Article
Effects of Mentioning the Incentive Prize in the
Email Subject Line on Survey Response
Robert Janke
Learning Services Librarian
Okanagan Campus Library, University of British
Columbia
Kelowna, British Columbia, Canada
Email: robert.janke@ubc.ca
Received: 15 Oct. 2013    Accepted: 4 Feb. 2014
© 2014 Janke. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 2.5 Canada License (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – This study examined the effects that
mentioning the survey incentive prize in the subject line of a reminder email
had on the response rate and data quality. To date, manipulation of the subject
line, specifically in terms of mentioning the incentive prize, has received
limited attention in the survey design literature.
Methods
– The delivery of the survey invitation is
discussed in terms of the timing of the launch and reminder emails. Particular
emphasis is given to the design of the email subject line and justification of
the format. Weekly response rates from four LibQUAL+TM surveys were
compared. In addition, weekly responses for one year were analyzed using SPSS
to investigate whether there were any between-means differences in terms of three
elements of data quality. The three elements were: length of time it took to
complete the survey, the number of core questions with an N/A response, and the
number of illogical responses where minimum scores were higher than desired.
Results
– The response rates for the second week were
grouped together based on the presence or absence of the subject line
manipulation. There was a significant difference between these means (4.75%, p = 0.033). There was no statistically significant difference in the measures of data quality as determined by a one-way ANOVA test.
Conclusions –
Reminding survey participants with an email that mentions the incentive prize
in the subject line appears to increase response rates with no deleterious
effects on data quality. The results of this investigation are encouraging, and
those running the LibQUAL+TM survey in their universities should
consider implementing this method to increase response rates. Further research
to replicate these findings in other contexts and using an experimental design
would be beneficial.
Introduction
The library at
the Okanagan campus of the University of British Columbia (UBCO) has surveyed
all of its faculty and students on three occasions using LibQUAL+TM. LibQUAL+TM is a standardized
survey developed by the Association of Research Libraries to measure the
service quality perceptions of library users. The surveys have taken place in
2007, 2010, and 2013. As with many other libraries that survey their users,
UBCO offered lottery incentive prizes in the hopes that it would increase
response rate. In survey design, lottery incentive prizes differ from paid incentives: with a lottery, the individual has only a chance of winning the prize, whereas paid incentives, provided either before or after survey completion, guarantee a reward for every participant. As is typically the case for lottery
incentives, UBCO Library made participants aware of the incentive prize in the
body of the email inviting them to complete the survey (see the Appendix for a
sample of the invitation email). The response rate to the 2007 survey was
17.9%. However, with the explosion in the popularity of smart phones and other
mobile devices, the author became curious in the lead-up to the launch of the
2010 survey about the extent to which students were reading the full invitation
to become aware that an incentive prize was being offered. The reasoning behind
this concern is that smaller screens may make reading long emails less appealing, or that the configuration of some email programs may place greater emphasis on the subject line in the decision to open or delete the full
message. As a result, this study addressed the following research questions:
1) Would giving
the existence of the lottery incentive prize more prominence, by mentioning it
in the subject line of the email invitation, increase the survey response
rate?
2) If
mentioning the incentive prize increased the response rate, would this have a
negative effect on the quality of the survey responses?
Response rates
should be a concern to all survey administrators. According to Manzo and Burke
(2012), response rates for all types of surveys have been declining over the
last decade, and low response rates threaten the validity of surveys. This is
because, as a group, non-responders may share similar characteristics; by failing to capture their data, the survey yields a biased sample and biased results.
Literature Review
There is
considerable interest in the use of lottery survey incentive prizes on
university campuses, so much so that there have been surveys by institutional
researchers (Porter & Whitcomb, 2003) and librarians (Buck, Nutefall, &
Bridges, 2012) to gauge the level of their use. This obvious interest aside,
the evidence regarding the effects of lottery incentives on survey response
rates is contradictory.
A meta-analysis
by Cook, Heath, and Thompson (2000) noted that surveys using an incentive
seemed to be associated with a lower response rate. In contrast, another
meta-analysis by Göritz (2006) concluded that there was a significant odds
ratio of 1.19 showing that incentives encourage individuals to start web
surveys and complete them (odds ratio of 1.27). Expressed differently, an
incentive should increase the odds of a person beginning the survey by 19% and
completing it by 27% over the odds without incentives. In terms of absolute
percentage differences, Göritz (2006) concluded that an incentive should
increase the response rate by an average of 2.8% and retention by 4.2%.
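To make the relationship between these odds ratios and absolute percentage changes concrete, here is a minimal sketch in Python; the 15% baseline rate is a hypothetical illustration, not a figure from Göritz's data:

```python
# Convert an odds ratio into the response rate it implies, assuming a
# hypothetical baseline response rate.
def apply_odds_ratio(baseline_rate: float, odds_ratio: float) -> float:
    odds = baseline_rate / (1 - baseline_rate)  # rate -> odds
    new_odds = odds * odds_ratio                # incentive multiplies the odds
    return new_odds / (1 + new_odds)            # odds -> rate

baseline = 0.15  # assumed 15% baseline, for illustration only
print(f"{apply_odds_ratio(baseline, 1.19):.3f}")  # 0.174: starting the survey
print(f"{apply_odds_ratio(baseline, 1.27):.3f}")  # 0.183: completing it
```

At this assumed baseline, the 1.19 odds ratio corresponds to roughly a 2.4 percentage point increase, in line with the 2.8% average Göritz reports.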
These two
meta-analyses studied the effects of incentive prizes on response rates. There
is an implied assumption, however, that the responder is aware that a prize is
offered when the incentive prize is mentioned in the body of the invitation.
This assumption is particularly interesting in light of a report from a
Canadian post-secondary library, using Constant Contact (email tracking
software), that as few as 33% of their users bothered to open the email
inviting them to take their LibQUAL+TM survey in 2012 (Reed, 2012).
One of the reasons that invitees may or may not open and read an email is the
subject line. The subject line, in contrast to the surveyor’s name and email
address, may be the most likely element of the invitation that encourages
recipients to open and review its contents (Manzo & Burke, 2012).
Research to date
on the effects of the subject line of the invitation email has been sparse. In
research with high school seniors as well as undergraduates, Porter and
Whitcomb (2005) manipulated the reason for the email, survey sponsor, and
whether or not the subject line included a plea for help. Trouteaud (2004) also
used a plea manipulation in the subject line (Please help... vs. Share your
advice with...) in an experiment conducted on American subscribers to a large
company’s daily email newsletter. Concerning research into the effects of
mentioning the incentive prize in the email invitation to a web-based survey,
only two studies were found.
Linegang and
Moroney (2012), in an experiment at the University of
Dayton (UD), manipulated the subject line to gauge its effects on response
rates among undergraduates invited to take a survey. Both experimental groups
received a pre-survey notification, a survey notification, and a reminder
notice. All recipients were entered into a draw for gift certificates from
local restaurants, and this information was communicated in the text of the
email. Linegang and Moroney reported neither the value of these gift
certificates nor the number that they would be giving away. One group received
the subject line “UD Computer Survey” while the other received the subject line
“FREE FOOD!!! UD Computer Survey” in all of their communications. It was
reported that the response rate for the group that received the email subject
line that mentioned the incentive prize was 24.1% compared to 30.3% for the
group where the subject line did not mention the incentive prize, a 6.2% lower
response rate for the group invited to the survey with an emphasis on the
incentive prize.
Similarly, Kent
and Brandal (2003) also found a decreased response rate among their
experimental group that received an e-mail subject line that mentioned the
incentive prize. The subjects for their experiment were taken from the customer
database of loyalty cardholders from a Norwegian company. Kent and Brandal do
not specify the subject line that did not emphasize the incentive prize, only
saying that it was a survey from the company, whereas the other group’s email
had the subject line “Win a weekend for two to Nice.” The response rates for the
two groups were 66% and 52%, respectively, a 14% lower response rate for the
group invited to the survey with an emphasis on the incentive prize.
These
counterintuitive results suggest that perhaps the recipients of the email which
emphasized the incentive prize believed the email was “spam” - that is,
unsolicited email from a dubious source where there may or may not be an actual
incentive prize to be won.
Methods
Sample
This research
is based on data collected at UBCO using the LibQUAL+TM web survey
instrument. The first time LibQUAL+TM was used at the campus was in
2007, and this serves as a baseline for purposes of comparison. To test the
effect of mentioning the incentive prize on survey response the author
manipulated the content of the subject lines in the 2010 and 2013 survey
cycles.
All three
survey cycles launched at the start of the fourth week of classes in the second
term of the winter session. In all cases, there was a single incentive prize
worth approximately $300 mentioned in the body of the email invitation. With
minor exceptions, the email invitations were identical each year: an example is
in the Appendix. The reminder email contained all of the text of the invitation
with the addition of a paragraph at the beginning apologizing to those
individuals who already completed the survey. (As no personal data were
collected, the library could not determine who completed the survey and remove
them from the invitee list.) In 2007, the incentive prize was a digital camera.
The subject line of the invitation and reminder emails, sent at one week
intervals, was neutral with respect to mentioning the incentive prize, and
simply said Library Survey.
Data Collection
Because of the
negative effects found in other research that mentioned the incentive prize in
the email invitation (Kent & Brandal, 2003; Linegang & Moroney, 2012),
great attention was paid to ways to alleviate the spam effect that may have
influenced the outcome of these studies. In both of the intervention years,
2010 and 2013, the invitation email had the following subject line: Library Survey - Please let us know what you
think of our service. Only the reminder email, sent one week after the
invitation, mentioned the incentive prize: Library
Survey – you could win an [name of incentive prize]. The purpose of this
two-stage approach was to build trust and familiarity with the initial
invitation and then mention the incentive prize with the follow-up. A final
strategy employed was to construct the subject lines in a consistent manner
that made it clear that the email was indeed an invitation to a reputable
survey. This strategy avoided the use of excessive capitalization, as in the
case with Linegang and Moroney, and mentioned the word survey, unlike in the
research by Kent and Brandal.
The final
reminder, sent a week after the first, had the following subject line: Library Survey – your last chance to win an
[name of incentive prize]. In 2010, the incentive prize was an iPod touch,
while in 2013 it was an iPad mini. Enrolment Services sent out the invitations
and reminders, and the author’s institutional email address appeared as the sender so that any replies with questions would reach the author.
In addition to
response rate data from 2007, data from another institution that also ran
LibQUAL+TM in 2013 were included for comparison purposes. The
Vancouver campus of the University of British Columbia was chosen as they also
ran the survey in the second term of the winter session and sent survey
reminders at one week intervals. Instead of surveying all undergraduate and
graduate students, they sampled from their population. The subject line for
their invitation and first reminder email mentioned the existence of incentive
prizes but did not specify what they were. More importantly, there was not the
specific manipulation of mentioning the incentive prize beginning with the
reminder email. The subject lines were: initial invitation UBC Library Survey – Tell us what you think & enter to win prizes;
first reminder UBC Library Survey –
Provide your feedback & enter to win prizes; final reminder UBC Library Survey – One week left to
provide your feedback. The body of the email invitation mentioned the
incentive prizes of an iPad mini and six $25 gift cards.
Data Analysis
Excel files
containing all survey data from the various survey cycles, supplied by LibQUAL+TM,
were used to generate the weekly and overall response rates. Data quality was
assessed using the raw data from the 2013 survey and analyzed using SPSS,
version 19, to see if there were differences in data quality between the
responses submitted during each week using a one-way ANOVA test. The raw data
file from LibQUAL+TM includes the date that the survey was
submitted, facilitating the grouping of responses. Variables supplied in the
survey data that were analyzed for quality include STime, the length of time it
took the respondent to fill out the survey in seconds, CountNA, the number of
core items where there was an N/A response, and finally CountINV, the number of
illogical responses where minimum scores are higher than desired. CountINV is
unique to the core questions in LibQUAL+TM, where for each question
the respondent provides three responses on a scale of 1-9, their desired and
minimum levels of service as well as where they perceive the service quality of
the library to be. Because an individual’s minimum score should not exceed their desired score, instances where this occurs indicate a respondent who is not paying close attention to the actual content of the questions.
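As an illustration of how CountNA and CountINV relate to the raw scores, consider the sketch below; the column names are invented for the example, since LibQUAL+TM supplies these counts precomputed:

```python
import pandas as pd

# Two hypothetical respondents, three core questions. Each question has a
# minimum and a desired score on a 1-9 scale; a missing value marks an N/A.
df = pd.DataFrame({
    "q1_min": [3, 8], "q1_des": [7, 5],        # respondent 2: minimum > desired
    "q2_min": [4, None], "q2_des": [6, None],  # respondent 2: N/A response
    "q3_min": [5, 6], "q3_des": [9, 9],
})

core = ["q1", "q2", "q3"]
# CountNA: number of core items left as N/A.
df["CountNA"] = sum(df[f"{q}_min"].isna().astype(int) for q in core)
# CountINV: number of illogical items where the minimum exceeds the desired.
df["CountINV"] = sum((df[f"{q}_min"] > df[f"{q}_des"]).astype(int) for q in core)
print(df[["CountNA", "CountINV"]])  # respondent 1: 0, 0; respondent 2: 1, 1
```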
Results
Table 1 details
the number of students surveyed in a given year and the number of valid surveys
received after the initial invitation and after each of the two reminders, as well
as corresponding weekly response rates. 2013V represents the response rate from
the Vancouver campus survey. The calculation of the weekly response rates was
based on the premise that those individuals who had responded in the previous
week(s) were unlikely to respond again and were therefore removed from the
denominator. The final column reflects the overall response rate for the
different years and was calculated by simply dividing the total number of
responses received over the course of the survey by the number of students
surveyed.
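A short sketch of this denominator adjustment, checked against the 2010 row of Table 1:

```python
# Weekly response rate: respondents from earlier weeks are removed from the
# denominator, since they are unlikely to respond again.
def weekly_rates(n_surveyed: int, weekly_responses: list[int]) -> list[float]:
    rates, remaining = [], n_surveyed
    for n in weekly_responses:
        rates.append(100 * n / remaining)
        remaining -= n  # shrink the pool of potential respondents
    return rates

# 2010: 6,160 students surveyed; 541, 599, and 278 valid surveys per week.
print([round(r, 1) for r in weekly_rates(6160, [541, 599, 278])])
# [8.8, 10.7, 5.5]; Table 1 reports week 2 as 10.6%, a small rounding difference
```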
Although there is an increase in overall response rate from 2007 to the 2010 and 2013 iterations (5.1% and 1.2%, respectively), 2007 marked the only year that the long version of the LibQUAL+TM instrument was used. As a result, attention
should be paid to the significant differences in response rates for week 2,
following the reminder email, and the change in the subject line to emphasize
the incentive prize in years 2010 and 2013, rather than a comparison of the overall
response rate. This is because the much shorter LibQUAL+TM Lite
survey was used in 2010 and 2013, which may confound inter-year comparisons of
overall response rate. Figure 1 compares the weekly response rate
graphically.
Table 1
Valid Surveys Received per Week Expressed as a Weekly
Response Rate
Year  | Number of Students Surveyed | Valid Surveys Week 1 | Response Rate Week 1 | Valid Surveys Week 2 | Response Rate Week 2 | Valid Surveys Week 3 | Response Rate Week 3 | Overall Response Rate
2007  | 4132 | 325 | 7.9% | 239 | 6.3%  | 176 | 4.9% | 17.9%
2010  | 6160 | 541 | 8.8% | 599 | 10.6% | 278 | 5.5% | 23.0%
2013  | 8069 | 403 | 5.0% | 778 | 10.1% | 358 | 5.2% | 19.1%
2013V | 4376 | 376 | 8.6% | 195 | 4.9%  | 112 | 2.9% | 15.6%
Table 2 focuses
on the response rates for week 2 and groups them together based on whether or
not the incentive prize was mentioned. The mean response rate for the baseline
data (no manipulation) for 2007 and 2013V was 5.6%, while the mean response
rate for the years where there was a manipulation of the subject line, 2010 and
2013, was 10.35%. When the incentive prize was mentioned in the subject line in
the reminder
email, the response rate for that week was significantly higher (4.75%, p = 0.033).
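The comparison can be sketched outside SPSS as follows; whether the original analysis assumed equal variances is not stated in the paper, so the unequal-variances choice here is an assumption:

```python
from scipy import stats

# Week 2 response rates (Table 2), grouped by subject line treatment.
mentioned = [10.6, 10.1]    # 2010 and 2013: incentive in the subject line
not_mentioned = [6.3, 4.9]  # 2007 and 2013V: no mention

# One-tailed two-sample t test of whether the "mentioned" mean is higher.
t, p = stats.ttest_ind(mentioned, not_mentioned,
                       equal_var=False, alternative="greater")
print(f"t = {t:.2f}, one-tailed p = {p:.3f}")  # the paper reports p = 0.033
```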
With respect to
data quality, Göritz (2006) raises some concerns about offering
incentives, namely, individuals completing the survey multiple times or simply
entering “rubbish” responses in order to get to the end of the survey and be
eligible for the incentive. For this investigation, there were no statistically
significant differences between group means as determined by a one-way ANOVA
(p values were all above .05) for STime, CountNA, and CountINV, indicating there
was no more “rubbish” entered when the incentive prize was mentioned than when
it was not.
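A sketch of the equivalent check in Python; the file name and the Week grouping column are assumptions for illustration, as the original analysis was run in SPSS:

```python
import pandas as pd
from scipy import stats

# Assumed raw 2013 export with one row per respondent: the week the survey
# was submitted plus the three data quality measures.
df = pd.read_csv("libqual_2013_raw.csv")  # hypothetical file name

for measure in ["STime", "CountNA", "CountINV"]:
    # One-way ANOVA: do the weekly group means differ on this measure?
    groups = [g[measure].dropna() for _, g in df.groupby("Week")]
    f_stat, p = stats.f_oneway(*groups)
    print(f"{measure}: F = {f_stat:.2f}, p = {p:.3f}")  # p > .05: no difference
```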
Figure 1
Comparisons of
weekly response rates.
Table 2
Response Rates for Week 2
                                       | Mean   | Standard Deviation
Incentive Mentioned (2010 & 2013)      | 10.35% | 0.35
Incentive Not Mentioned (2007 & 2013V) | 5.6%   | 0.99
Difference                             | 4.75%* |

*One-tailed t-test of significant differences (p < .05)
Discussion
The results of
this study suggest that mentioning the incentive prize in the subject line of the
reminder email yields, on average, a 4.75% higher response rate for the given
week. This contrasts with Linegang and Moroney (2012), who found a 6.2% lower overall response rate for the group invited with emphasis on the incentive prize, and with Kent and Brandal (2003), who found a 14% lower response rate. One explanation for this
difference could be how the subject lines were constructed in the current
investigation, most notably the absence of excessive capitalization and
inclusion of the word “survey.”
Unfortunately,
it is difficult to gauge the effects this study had on the overall survey
response rate because of a lack of experimental comparisons in the design. The
largest external baseline for overall response rates that could be found
averaged the rates from 13 post-secondary libraries that used a 100% Lite
version of LibQUAL+TM in 2010. That survey of LibQUAL+TM
administrators by Buck et al. (2012) reported an average response rate of 17%,
which is 6% lower than the response rate of 23% that UBCO obtained in 2010. Taken together, that 6% difference and the 3.5% difference observed in the 2013 response rates between the two campuses of UBC indicate a positive trend for the effects of mentioning the incentive prize. The decision to complete a survey is
complex, with multiple variables at play, making these inter-institutional
comparisons less than ideal. See below for a suggested course of research that
would better establish the effects on the overall response rate.
With regard to
data quality, analysis of the 2013 survey responses indicated no inter-week
differences on the measures chosen for analysis. These results are similar to
those reported elsewhere that found no statistically significant differences in
the response speed (Heerwegh, 2006) and item non-response (Heerwegh,
2006; Sánchez-Fernández, Muñoz-Leiva, & Montoro-Ríos, 2012) when comparing
groups that were either offered or not offered an incentive prize to complete a
survey.
A practical
implication of this research for librarians and information professionals who
are delivering LibQUAL+TM surveys via email is that mentioning the
incentive prize in the reminder email will increase response rates for the
given week and may improve overall response rates. Survey administrators should
also have confidence that implementing the strategies outlined in this study
will not have a negative impact on the quality of responses provided by the
respondents.
Limitations & Future Research
The post hoc
design of this investigation does not permit strong conclusions with regard to
the exact effects mentioning the incentive prize in the subject line has on
overall response rates for the LibQUAL+TM survey. However, when
contrasted with the limited literature in this area (Kent & Brandal, 2003;
Linegang & Moroney, 2012) and their findings of a negative influence on
response rates, it does make a strong argument for further research. In the
future, it would be beneficial to create an experimental design in which three
groups are randomly generated. One group would be invited and reminded about a
survey with an email that does not mention the incentive in the subject line. A
second group would be invited with a neutral subject line, but reminded with a
subject line that mentions the incentive prize. Lastly, a third group would be
invited and reminded with a subject line that mentions the incentive prize in
both instances. Of course, if such research were carried out on a single
university campus, one would have to keep in mind the concern about communication between experimental groups raised by Porter and Whitcomb (2003), communication that is all the more likely in this time of hyper-connectedness
and when there are potentially valuable incentive prizes available to be
won.
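A sketch of how such a three-group assignment might be generated; the subject lines and addresses below are placeholders, not the wording any future study would be bound to:

```python
import random

# Subject lines for (invitation, reminder) in each proposed condition.
CONDITIONS = {
    "neutral/neutral":     ("Library Survey", "Library Survey"),
    "neutral/incentive":   ("Library Survey",
                            "Library Survey - you could win a [prize]"),
    "incentive/incentive": ("Library Survey - you could win a [prize]",
                            "Library Survey - you could win a [prize]"),
}

def assign(invitees: list[str], seed: int = 1) -> dict[str, list[str]]:
    """Shuffle the roster, then deal it round-robin into the three arms."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    pool = invitees[:]
    rng.shuffle(pool)
    return {arm: pool[i::3] for i, arm in enumerate(CONDITIONS)}

arms = assign([f"student{i}@example.ca" for i in range(9)])
for arm, members in arms.items():
    print(arm, len(members))  # three arms of equal size
```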
Related to survey data quality, another aspect that deserves further attention is a comparison of click-through rates and completion rates. Although outside the scope of this investigation, comparing these two rates for instances when the prize was mentioned and when it was not would give a more complete picture of whether mentioning the incentive encouraged invitees to click through to the survey but then, for whatever reason, decline to fill it out.
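A small illustration of the distinction, using invented counts:

```python
# Hypothetical counts: invitations delivered, survey links clicked,
# and surveys actually completed.
delivered, clicked, completed = 8000, 1200, 780

click_through = clicked / delivered  # did the subject line get them to the survey?
completion = completed / clicked     # did they finish once there?
print(f"click-through {click_through:.1%}, completion {completion:.1%}")
# A prize mention that raises click-through but lowers completion would point
# to invitees drawn in by the subject line who then decline the survey itself.
```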
Another
interesting line of inquiry would be other aspects of subject line composition
that may have an influence on response rate. One example would be moving the
mentioning of the incentive prize to earlier in the subject line. For instance,
it might be interesting to compare the effects of the subject line Library Survey – you could win an [name of incentive prize] with You could win an [name of incentive prize] – Fill out the library survey.
Conclusion
Increasing
response rates for surveys conducted on university campuses is an area of interest
for both librarians and institutional researchers alike. A common approach to
attempt to increase response rates is to offer a lottery incentive prize. This
study demonstrated a beneficial way to increase the response rates for the
LibQUAL+TM survey following the reminder email by manipulating the
email subject line. In contrast to earlier studies, this study found that
mentioning the incentive prize in the subject line of the reminder email
increased the response rate. Further investigation would permit conclusions on
the effect that this manipulation had on the overall response rate for the
survey, not just for a given week that it was open. With regard to data quality, this study found no differences between the weeks with and without the subject line manipulation for the three variables chosen for investigation. These results echo research
conducted elsewhere on incentives and data quality. Results of this study
should give survey administrators confidence that adopting the strategy
outlined in this investigation should not only increase the response rate for
the week following the reminder email but should also not attract an inordinate number of careless responses submitted merely as a vehicle for entry into the draw for the
incentive prize.
References
Buck, S.,
Nutefall, J. E., & Bridges, L. M. (2012). “We thought it might encourage
participation." Using lottery incentives to improve LibQUAL+TM
response rates among students. Journal of Academic Librarianship, 38(6),
400-408. doi:10.1016/j.acalib.2012.07.004
Cook, C.,
Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in
web- or internet-based surveys. Educational and Psychological Measurement,
60(6), 821-836. doi:10.1177/00131640021970934
Göritz, A. S.
(2006). Incentives in web studies: Methodological issues and a review.
International Journal of Internet Science, 1(1), 58-70. Retrieved 10 Feb.
2014 from http://www.ijis.net/
Heerwegh, D.
(2006). An investigation of the effect of lotteries on web survey response rates.
Field Methods, 18(2), 205-220. doi:10.1177/1525822X05285781
Kent, R., &
Brandal, H. (2003). Improving email response in a permission marketing context.
International Journal of Market Research, 45(4), 489-503. Retrieved 10 Feb. 2014 from https://www.mrs.org.uk/ijmr
Linegang, M.
P., & Moroney, W. F. (2012). Effects of cover letter subject line and
open-ended question response area on responding to an internet survey. Proceedings
of the Human Factors and Ergonomics Society Annual Meeting, 56(1),
1268-1272. doi:10.1177/1071181312561225
Manzo, A. N.,
& Burke, J. M. (2012). Increasing response rate in web-based/internet
surveys. In L. Gideon (Ed.), Handbook of survey methodology for the social
sciences (pp. 327-343). New York, NY: Springer.
doi:10.1007/978-1-4614-3876-2_19
Porter, S. R.,
& Whitcomb, M. E. (2003). The impact of lottery incentives on student
survey response rates. Research in Higher Education, 44(4), 389-407.
doi:10.1023/A:1024263031800
Porter, S. R., & Whitcomb, M. E. (2005). E-mail subject
lines and their effect on web survey viewing and response. Social Science
Computer Review, 23(3), 380-387. doi:10.1177/0894439305275912
Reed, K. (2012, November 27). Getting
ready for LibQual 2013. Message posted to LibQual-Canada electronic mailing
list, archived at
http://www.mailman.srv.ualberta.ca/mailman/listinfo/libqual-canada
Sánchez-Fernández,
J., Muñoz-Leiva, F., & Montoro-Ríos, F. J. (2012). Improving retention
rate and response quality in web-based surveys. Computers in Human Behavior,
28(2), 507-514. doi:10.1016/j.chb.2011.10.023
Trouteaud, A.
R. (2004). How you ask counts: A test of internet-related components of
response rates to a web-based survey. Social Science Computer Review, 22(3),
385-392. doi:10.1177/0894439304265650
Appendix
Example of
Invitation Email
You are invited to participate in a comprehensive survey of library
service quality. The survey, known as LibQUAL+TM, assesses
satisfaction with collections, services, access, and space at participating
academic libraries throughout the world. Because so many libraries use LibQUAL+TM,
it allows us to compare how we are doing with other libraries in Canada, as
well as with our colleagues at the UBC Vancouver campus. These results help us
create the library you need in the future. We value your input. Speak up!
Past experience indicates it takes an average of only 5 minutes to
complete the web-based survey. Please keep in mind that all the core questions must be completed for your results to be tallied in the overall totals and that if you do not wish to respond to a particular question, you can just select the "NA" box in the right-hand column.
To take the
web-based survey, please click on: [survey URL]
The survey will
be open from [survey dates]
Incentive draw:
Your time is important! Thank you! The Library is offering an iPad mini
to a randomly selected participant in the survey. If you choose, you may enter
the draw by entering your e-mail address at the end of the survey.
Confidentiality:
All responses are held in strictest confidence. No identifying links
between responses and the individual are retained. The only identifying piece of information (your e-mail address, if you choose to enter the draw) is stored separately from the survey results and is discarded after the winner has been
identified.
More
information:
To see more information about the survey and its goals, please see: [URL
for survey information]
Whom to
contact:
If you have any difficulty accessing or taking the survey or have any
other questions or comments about the LibQUAL+TM survey at UBC
Okanagan, please contact [survey administrator] by e-mail at [administrator’s
address]
Thank you for your help.
[name and rank of Chief Librarian]