[Case Study] Do Surveys Get Better Responses with an Incentive?

Last week you were part of an experiment. I’m sorry for not telling you sooner, but it would have skewed the results. But now that the experiment has concluded, I’m happy to share with you the methodology and my findings.

For a few months, I had been floating the idea of sending out a survey to sage subscribers, asking a few simple questions, and learning more about all of you. I wanted to know what type of business you run, which topics interest you, and where my team can help.

While it would have been easy enough to just send you a survey link via email, that would be too generic for my taste. I wanted to devise a clever method for delivering the survey, in hopes that it would inspire you to respond. Not only that, but I wanted to see if an incentive was required to get you to take the survey.

So I wrote a sage post about how much you can charge, and tied the story together by conducting my own pricing survey. In that article, I lamented how lame it is to offer entry into a drawing for a chance to win an Amazon gift card, and resolved to do something better than that.

That meant I was balancing two objectives. First, I wanted to learn more about my list. Second, I wanted to conduct an experiment to see if incentives were needed.

To test my hypothesis, I did what every marketer should do in this situation: a split test.

Half of you received an offer, the other half did not

This experiment was very easy to execute. I did a 50/50 subject line test to see if offering an incentive would improve response rates.
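For readers who want to run the same kind of test themselves, here is a minimal sketch of how a 50/50 split could be assigned. The post's email tool handled this automatically, so the function and subscriber names here are purely illustrative:

```python
import random

def assign_arms(subscribers, seed=42):
    """Randomly split a subscriber list into two equal-sized arms.

    An illustrative sketch of a 50/50 split test assignment; a real
    email service provider would do this for you behind the scenes.
    """
    pool = list(subscribers)
    rng = random.Random(seed)   # fixed seed so the split is reproducible
    rng.shuffle(pool)
    half = len(pool) // 2
    return pool[:half], pool[half:]  # (arm A, arm B)

# Hypothetical list of ~1,500 subscribers, matching the post's list size.
arm_a, arm_b = assign_arms(f"subscriber{i}" for i in range(1500))
```

The key property is that assignment is random, so any difference in response rates can be attributed to the subject line and incentive rather than to who landed in which group.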

Half of you received the title of the post in your inbox. The other half received a slightly more “sensational” title, with me stating [I have never done this before] in the subject line.

For the first subject, I simply offered a survey link at the top and bottom of the email. For the second one, I offered to answer any question you had about business, if you filled out the survey. The survey link was only available at the bottom of the email.

Side note: Neither of these invites offered you a chance to win an Amazon gift card (although I did get offered a chance to play the $100 gift card lottery on Monday. I took a screenshot, printed out that email, crumpled the paper, and made a “swish” into the wastebasket).

End side note. 

So how did the experiment do?

Survey invite test results

Two subject lines were sent: one included an incentive offer, the other did not.

[one_half_first]Headline: How Much Should you Charge for Your Services?

Offer: None

Open rate: 40.67% unique opens

Overall Click Rate: 6.13% uniquely clicked

Survey Click Rate: 2.9% clicked to survey

Survey Complete Rate: 1.8% of people who received email

[/one_half_first][one_half_last]Headline: [I’ve Never Done This Before] How Much Should You Charge?

Offer: I will record a video answering your business question

Open rate: 48.10% unique opens

Overall Click Rate: 9.70% uniquely clicked

Survey Click Rate: 4.1% clicked to survey

Survey Complete Rate: 2.9% of people who received email
[/one_half_last]


Looking at these numbers, it’s clear that the incentive worked on our list of ~1,500 people: the survey completion rate was roughly 60% higher in the group that received an incentive (2.9% vs. 1.8%).
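The caveat about statistical significance (point 5 below) can be checked with a standard two-proportion z-test. A back-of-the-envelope sketch, assuming a clean 50/50 split of the ~1,500-person list (~750 recipients per arm; the post reports rates, not raw counts):

```python
from math import sqrt, erfc

# Reported survey completion rates from the results above.
p_control = 0.018      # no incentive
p_incentive = 0.029    # incentive

# Assumption (not stated in the post): ~750 recipients per arm.
n = 750

# Relative lift in completion rate.
lift = (p_incentive - p_control) / p_control   # ~0.61, i.e. roughly 60% more

# Two-proportion z-test with a pooled rate.
pooled = (p_control * n + p_incentive * n) / (2 * n)
se = sqrt(pooled * (1 - pooled) * (2 / n))
z = (p_incentive - p_control) / se
p_value = erfc(abs(z) / sqrt(2))   # two-sided normal tail probability

print(f"lift: {lift:.0%}, z = {z:.2f}, p = {p_value:.2f}")
# p comes out above 0.05: suggestive, but not significant at this sample size
```

Under these assumptions the difference does not clear the conventional 5% significance bar, which is consistent with treating the result as a practical signal rather than a proof.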

And of those who received the incentive, three of you asked me a question. I spent 30-60 minutes recording these videos and sending them over in response.

What I learned from this experiment

This experiment taught me several valuable business lessons.

  1. Your subject line matters. Much of the success of these response rates starts with the total number of people who open the email. It’s always a good idea to test your subject lines and headlines!
  2. An incentive matters. Even with a small sample size, I received roughly 60% more survey completions from those who were given an incentive.
  3. The incentive doesn’t have to result in crippling financial stress, and you can experiment with different methods to see what works best.
  4. The survey results have my head spinning more than I expected. I thought I knew my audience, but it turns out that only a few of you have agencies or want to identify as an agency. After I picked my jaw up off the floor, I thought to myself “well, let’s do something about that!”
  5. Survey results should be taken seriously, but they represent a very small share of overall subscribers (about 2.5% of the list responded). So while these responses are valuable, they are not statistically significant. From a practical perspective, though, this is enough data to make important decisions for my business.
  6. There are two sides to the survey coin. First, I think it’s smart to conduct a survey. Second, be careful what you wish for! More on this to come.

Thank you to those of you who responded; it was great to hear from you.

P.S. Have you ever heard of the Baader-Meinhof phenomenon?

It’s a theory that when you notice something “for the first time” it suddenly appears everywhere. I am having one of those moments with Amazon $100 gift cards. This one came in from Amazon themselves right after I finished drafting this post. You can’t make this stuff up!