Factors That Encourage Student Participation in Surveys
As demand for accountability increases, students’ willingness to participate in surveys appears to be decreasing, making it more difficult for faculty and staff involved in retention work to improve their efforts. A study at a southern flagship university, however, has uncovered some factors that motivate students to complete surveys.
Researchers interviewed eight full-time undergraduate students who had lived in university housing for at least one academic year. (Residential students are typically easier to reach than commuter students and are more likely to have been surveyed frequently by the university.)
The study found that the students’ attitudes toward surveys varied. Some students described survey participation as interesting, while others described it as an annoyance. Despite these differences, three factors that encourage student survey participation emerged:
- the view that surveys are agents of institutional change
- the perception that frequent surveying is simply a part of the university experience
- trust that survey participation would actually lead to improvements
William K. Tschepikow, the University of Georgia’s director of student affairs assessment and staff development, reports the study’s findings in “Why Don’t Our Students Respond? Understanding Declining Participation in Survey Research among College Students” in the November 2012 issue of the Journal of Student Affairs Research and Practice.
Tschepikow recently participated in an email interview about the study’s findings and what implications they may have for increasing student response to surveys about program effectiveness.
Your study found that “while some students … found pleasure in receiving surveys, other participants felt encumbered by the level of exposure to the survey process.” What do you think accounts for the difference?
Tschepikow: My study suggested that the difference may be accounted for—at least to a certain extent—by psychological factors rather than organizational ones. For example, some participants who acknowledged a propensity for survey research located this preference within a broader intellectual curiosity.
In other words, these participants found the survey process fascinating in general and therefore chose to participate.
Not all respondents perceived the survey process this way. One participant felt harassed by the number of solicitations she received. Again, psychological rather than organizational factors are probably at play here.
What missteps in administering a particular survey might contribute to a student’s belief that it won’t really lead to anything? What might signal to students that the survey results will indeed be used for improvement?
Tschepikow: I think the biggest misstep is failing to communicate to students on a regular basis how survey results have led to concrete and positive changes (big and small) in the educational environment—of course, “concrete” and “positive” are subjectively defined.
This misstep is often the result of poor planning. For example, an institution may elect to administer a survey without first identifying specific issues, questions, or problems to which the survey might be responsive.
When a survey is not responsive to a set of guiding questions—grounded in the educational environment—it is unlikely that the results from that survey will lead to meaningful action by administrators.
When surveys are administered, officials should develop strategies for effectively communicating the results and any related action to students.
For example, administrators may decide to post on the institution’s home page three ways that information collected through a particular survey was used to improve a program or service. Systematic and regular communications like these will, in effect, function as signals over time that the institution values, needs, and uses feedback from students.
How can those administering a particular survey convey its salience to students?
Tschepikow: It begins with choosing the right sample. … Administrators can also identify in invitations to participate specific connections between the student’s educational experience at the institution and the content of the survey. For example, a solicitation might state, “We are administering this survey to improve the curricula in our residential learning communities. As a member of one of these communities, you can offer particularly valuable feedback.”
It seems from your study that increasing response rates is as much about building an institution-wide environment of trust and communication as it is about administering a specific study well. What can institutions do to build this environment?
Tschepikow: In many respects the strategy should differ from institution to institution, based on organizational dynamics such as culture, size, student demographics, mission, etc. So, my first thought is that any effort to build this environment should begin with a thoughtful examination of the institution as a distinct social collective. Literature on organizational change may be helpful in this endeavor.
My findings also suggest that trust, in this context, is the belief that survey data will be used to engender positive change in the educational environment. In order for this type of trust to develop, administrators must actually use data collected from surveys in important decision-making processes and communicate their use directly to students.
A challenge to this logic arises when one considers that multiple surveys may be administered by multiple units at any given point in time. This reality is what makes coordination across the institution so critical to building an institution-wide environment of trust and communication. In practical terms, coordination can include the timing of various surveys, sampling, communication strategies, branding, and other elements.
The article recommends that institutions establish campus-wide survey policies and that these policies include expectations for providing direct feedback to students. Could you refer us to policies that you find serve as useful examples?
Tschepikow: I would take a look at Duke’s: http://ir.provost.duke.edu/surveys/policy.html.
I would also take a look at Northwestern’s: www.adminplan.northwestern.edu/ir/sspg/index.htm.