How we get a great response rate on student surveys

Spoiler alert: we get a captive audience

In a previous position, we sent out an email survey to all students and were lucky to get a 20% response rate.  And by “we were lucky”, I mean we usually got closer to a 10% response rate.  It was so low that I felt guilty reporting the results, but because of yearly assessments and/or reaccreditation, we had to report something, and that was all we had.

Now, at my current library we get a near 100% response rate twice a year on a paper survey administered to about 1/4th of the student body.  How?  By coordinating with other offices to administer the survey during a mandatory process.  At the end of each semester, there is a mandatory portfolio review for a section of the student body (again, about 1/4th of all students).  During this process, students must present their portfolio to faculty for review and feedback.  At some point during the process, the students are excused from the room for 5-10 minutes while the faculty deliberate.  While the students are waiting, they are asked to take the library survey.  We’re fortunate to be able to inject ourselves into the review process so that we can administer our surveys, and students and faculty have been nothing but receptive to our involvement.

Preparation for our survey is relatively simple.  We coordinate with other college staff to put survey instructions and the survey itself into faculty packets for review days.  The instructions are a simple reminder to ask students to complete the library survey while they are waiting outside the review room.  After faculty have their instructions, we manually place library survey collection folders in each review room (we get a list of review rooms from the staff coordinating the reviews).  After each review day, we collect the surveys from the folders.

Since the survey is administered on paper, we have to digitize it at some point.  We create the survey in our SurveyMonkey account, then enter each respondent’s answers into SurveyMonkey by hand.  We typically have our work-study students do this.  Since we’re small and the number of surveys isn’t unmanageable, entering them doesn’t take long.  And since we’re inputting into SurveyMonkey, all the data analysis is automatic.
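If you don’t have a SurveyMonkey account (or just want a sanity check on its reports), the same tallying can be done with a short script.  Here’s a minimal sketch in Python, assuming the paper responses have been keyed into a CSV with one row per respondent and one column per question; the file name and column layout are hypothetical, not our actual survey.

```python
"""Minimal sketch: tally manually entered survey responses.

Assumes the paper responses have been keyed into a CSV with one row per
respondent and one column per question (the file name and headers are
hypothetical).
"""
import csv
from collections import Counter, defaultdict


def summarize(path):
    """Count how often each answer was given for each question."""
    counts = defaultdict(Counter)
    respondents = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            respondents += 1
            for question, answer in row.items():
                if answer and answer.strip():  # skip unanswered questions
                    counts[question][answer.strip()] += 1
    return respondents, counts


if __name__ == "__main__":
    respondents, counts = summarize("spring_library_survey.csv")
    print(f"{respondents} respondents")
    for question, answers in counts.items():
        print(f"\n{question}")
        for answer, n in answers.most_common():
            print(f"  {answer}: {n} ({n / respondents:.0%})")
```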

The downsides of administering a survey this way are typical of print surveys: the extra preparation and data entry.  But it’s well worth the extra coordination on the front and back ends to get such a consistently high response rate.  For comparison, we still administer faculty surveys either by mass email or by putting a print survey in their mailboxes.  In both cases, the response rate is around 20%.  We settled on the email survey since it’s easier to administer and collect.  Now if only we could find a captive faculty audience *coughfacultymeetingscough*

So my advice for libraries that want a significantly better response rate on student surveys is to find a captive audience you can take advantage of.  Check with faculty, student affairs, or other campus offices to see if there are any events or processes that could accommodate administering a survey.  This may be trickier at larger institutions, where there may not be as much flexibility, but it’s worth exploring if you want a consistently excellent response rate.
