Ben Yurchak

8 Steps That Will Make Your Website Intercept Surveys Highly Actionable


1. Use invitations that appeal to all visitor segments, not just your loyal customers.

One of the biggest causes of bias is an invitation that is far more enticing to customers than to prospects. Use active invitations, such as a simple lightbox that asks visitors to give their input (they have to tell you yes or no). Avoid passive invitations, such as an HTML link to ‘Take our survey’ or ‘Give Feedback’, anywhere a representative sample is important. While these links are fine for hearing about problems, the responses they gather are unrepresentative of visitors as a whole.


2. Create a positive user experience on the survey.

A bad user experience can cause the least-represented visitors to drop out first. In our testing, when user experience is not taken into account, over 80% of those who agree to take the survey drop out before completing it. Even worse, we have found that those who drop off are disproportionately prospects, as they have a lower propensity to give feedback.


NEVER put all of the questions on one page. If you start with just a few questions on the first page and spread the rest across multiple pages, completion rates will increase. Start with questions about their needs and what they liked or disliked; this shows that you really care what they think. Never start with age, gender, or other demographic questions – this is akin to asking someone how much money they make while you’re just getting to know them!


3. Segment everything.

When analyzing results and performance, make sure to look at responses across a variety of visitor segments. For example, how does the site perform for customers vs. prospects? You might be doing okay overall but terribly for prospects. Segment by visitor task, first-time vs. repeat visitors, stage in the purchase process, and so on.
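
To make this concrete, here is a minimal sketch of that kind of segmentation in Python with pandas. The file name and columns (visitor_type, first_visit, found_what_needed) are hypothetical placeholders for whatever your survey tool actually exports.

```python
import pandas as pd

# Hypothetical export of intercept-survey responses; the file and column
# names are illustrative, not from any particular survey tool.
responses = pd.read_csv("survey_responses.csv")

# Overall rate of visitors who found what they were looking for
overall = responses["found_what_needed"].mean()

# The same metric broken out by segment: customers vs. prospects,
# first-time vs. repeat visitors, and so on.
by_segment = (
    responses
    .groupby(["visitor_type", "first_visit"])["found_what_needed"]
    .agg(rate="mean", n="size")
)

print(f"Overall: {overall:.0%}")
print(by_segment)
```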


4. Ask more than 4 questions.

Conventional wisdom holds that a short 1-4 question survey will get several times the response rate of a medium or lengthy survey. This conventional wisdom is rational, intuitive, and wrong.


We’ve run extensive testing across websites in many different industries and found that you’ll typically only get 20% to 30% more responses for a very short survey vs. a longer one (e.g. 10-20 questions). The value you get from 10 to 20 questions is so much greater than the value from only 1 to 4 questions that we usually prefer running lengthier surveys.


There are some exceptions to this rule: if you are trying to get insight from visitors who bounce, the survey has to be brief. Also, if you are intercepting and inviting those who just abandoned their cart, briefer is better.


5. Don’t take everything respondents say at face value.

Dig deeper and add follow-up questions to uncover the real problem. Let’s assume that 65% of respondents said that they didn’t convert because the price is too high. Does that mean that you should lower the prices? Maybe, maybe not. Is the price too high because it is more than the person wants to spend (wrong target audience), or because it is higher than competitors’ prices (the price probably is too high)? Is it really about the price, or did the site not clearly communicate the value proposition (poor communication of the product’s value proposition is one of the biggest problems we see across sites)?
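
As a rough illustration, the sketch below assumes the survey included such follow-up questions and breaks down the ‘price too high’ respondents by their answers; the column names (reason_not_converted, within_budget, understood_value_prop) are made up for the example.

```python
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Keep only respondents who blamed price for not converting
price_complaints = responses[responses["reason_not_converted"] == "Price too high"]

# Follow-up answers show whether this is really a pricing problem,
# a targeting problem, or a value-communication problem.
print(price_complaints["within_budget"].value_counts(normalize=True))
print(price_complaints["understood_value_prop"].value_counts(normalize=True))
```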


6. Measure more than just visitor satisfaction.

Many website surveys focus almost exclusively on satisfaction. This makes sense if your site’s only goal is to satisfy your visitors. If your goal is to sell something, though, satisfaction doesn’t tell you whether your site has convinced visitors that your products are worthwhile. Instead, measure the following (a rough sketch of reporting these measures follows the list):


  • Target Audience: Measure whether you are even reaching visitors who are likely to buy from you. If not, your advertising is the problem, not the site.

  • Satisfaction: Measure if you are meeting your target audience’s needs. If not, they’ll leave before you get an opportunity to convince them to buy from you.

  • Persuasion: Measure whether you are persuading your target that you are the right choice for them. (This is the area where companies struggle the most.)

  • Action: Measure if you are effective at getting your target to take the next steps in the purchase process.
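
Here is that rough sketch, assuming one yes/no question per measure; the column names are invented for the example, and a real survey would use several questions and richer scales for each measure.

```python
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# One illustrative yes/no (1/0) question per measure
measures = {
    "Target Audience": "in_market_for_product",   # likely to buy at all?
    "Satisfaction":    "found_what_needed",        # needs met on the site?
    "Persuasion":      "convinced_right_choice",   # sees you as the right choice?
    "Action":          "plans_next_step",          # will take the next step?
}

for name, column in measures.items():
    print(f"{name:>15}: {responses[column].mean():.0%}")
```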

7. Benchmark results.

When looking at survey results, one of the first questions you should ask is “how does this compare to others?” For example, if only 65% of people said that they found what they were looking for, that might seem a little low. However, compared to the industry benchmark, it is actually pretty average.


8. Integrate survey data with behavioral data to uncover the exact cause of good or bad performance.

Website surveys yield summary metrics for each question, such as the percentage satisfied with the site and the percentage saying that the prices are too high. This is nice to know, but it doesn’t show the cause of good or bad performance. By integrating the survey responses with behavioral data, we can see how performance varies by the content viewed and pinpoint exactly which content is doing well or poorly.


For example, a survey might tell you that 65% of respondents were satisfied. By integrating with behavior, we could find that only 20% of the people going to Product Page A were satisfied while 70% of those going to Product Page B were satisfied. If Page A is heavily visited, fixing it should be the priority.
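
One way this integration might look in practice is a simple join on a shared visitor ID. Everything here (the file names and the visitor_id, satisfied, and page columns) is an assumption about what your survey and analytics exports contain, not a prescribed format.

```python
import pandas as pd

# Hypothetical inputs: survey responses and analytics page views that share
# a visitor_id set by the intercept tool.
responses = pd.read_csv("survey_responses.csv")   # visitor_id, satisfied (1/0)
pageviews = pd.read_csv("pageviews.csv")          # visitor_id, page

# Attach each respondent's satisfaction answer to the pages they viewed
joined = pageviews.merge(responses[["visitor_id", "satisfied"]], on="visitor_id")

# Satisfaction rate and respondent volume by page: a heavily visited page
# with low satisfaction is the first thing to fix.
by_page = (
    joined.groupby("page")["satisfied"]
    .agg(satisfaction_rate="mean", respondents="size")
    .sort_values("respondents", ascending=False)
)
print(by_page)
```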


Conclusion

Voice of customer surveys can be an extremely useful way to identify friction, dig deep into website issues, and benchmark your brand/site against competitors or other industries. However, without taking deliberate steps to make a survey actionable, results will be inconclusive at best, and misleading at worst.
