Are Repeat Visits a Good Measure of Success?

We’ve long been skeptical of using web analytics data (such as time on site & visits to key pages) as the main success measure for brand & information websites. What people choose to view on a website tells us little about whether the site meets their needs, influences them, or drives them to take action after their visit.

A good example of this is using repeat visits as a measure of success. Surely visitors who return must have a better experience than those who don’t come back, right?

Not always.

For one of our clients, visitors who were dissatisfied with the website were actually THREE TIMES more likely to return than those who were satisfied.

When dissatisfied visitors returned to the site, they typically viewed a few pages and quickly left. From web analytics alone, these look like returning visitors coming back to consume more content or reconsider a purchase. Because we surveyed these visitors, we know they were actually unsatisfied prospects who were disappointed on their first visit: they were earlier in the purchase cycle and needed extensive background information, which the site did not provide.

Satisfied visitors, on the other hand, didn’t spend as much time on the site, typically found what they were looking for, and were more likely to convert offsite. These visitors were further along in their purchase process and didn’t return because they had already found the in-depth product information the site offered.

We could never have learned this from web analytics alone. We could see the repeat visits, but we’d know nothing about why visitors came back or why some repeat visitors spent so little time on the site. Nor could we have learned it from surveys alone, since we’d never see visitors’ behavior on the site. We’d have wrongly assumed that more satisfied visitors would be more likely to come back.

In this case, repeat visits and time on site were two of our client’s main Key Performance Indicators (KPIs). By combining survey and behavioral data, we showed that while repeat visits are not bad in themselves (here, they were a second chance to make an impression), they are a poor measure of site success for this brand. This relatively simple insight helped the client stop wasting time and money building irrelevant content and refocus efforts on the content that actually performs best.
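As a minimal sketch of the kind of join described above, the snippet below combines per-visitor survey answers with an analytics "returned" flag and computes return rates by satisfaction segment. The field names and the sample numbers are hypothetical, chosen so the dissatisfied segment returns three times as often, as in the client example.

```python
from collections import defaultdict

# Hypothetical joined records: one row per surveyed visitor, combining
# a survey answer (satisfied yes/no) with analytics (did they return?).
visitors = [
    {"id": 1, "satisfied": False, "returned": True},
    {"id": 2, "satisfied": False, "returned": True},
    {"id": 3, "satisfied": False, "returned": True},
    {"id": 4, "satisfied": False, "returned": False},
    {"id": 5, "satisfied": True,  "returned": False},
    {"id": 6, "satisfied": True,  "returned": True},
    {"id": 7, "satisfied": True,  "returned": False},
    {"id": 8, "satisfied": True,  "returned": False},
]

def return_rate_by_satisfaction(rows):
    """Return {segment: share of surveyed visitors who came back}."""
    counts = defaultdict(lambda: [0, 0])  # segment -> [returned, total]
    for r in rows:
        seg = "satisfied" if r["satisfied"] else "dissatisfied"
        counts[seg][0] += int(r["returned"])
        counts[seg][1] += 1
    return {seg: ret / total for seg, (ret, total) in counts.items()}

rates = return_rate_by_satisfaction(visitors)
print(rates)  # dissatisfied 0.75 vs. satisfied 0.25 in this toy sample
```

With real data, the join key would be whatever visitor identifier links your survey tool to your analytics export.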

Website Surveys: Moving from Interesting to Actionable

Eight Steps That Will Make Your Website Intercept Surveys Highly Actionable

1. Use invitations that appeal to all visitor segments, not just your loyal customers.

One of the biggest causes of bias is an invitation that is far more enticing to customers than to prospects. Use active invitations, such as a simple lightbox that asks visitors for their input (they must answer yes or no). Avoid passive invitations such as an HTML link to ‘Take our survey’ or ‘Give Feedback’ whenever a representative sample is important. While these links are fine for hearing about problems, they are unrepresentative of visitors as a whole.

2. Create a positive user experience on the survey.

A bad user experience can cause the least-represented visitors to drop out first. In our testing, when user experience is not taken into account, over 80% of those who agree to take the survey drop out before completing it. Even worse, we have found that those who drop off are disproportionately prospects, as they have a lower propensity to give feedback.

NEVER put all the questions on one page. If you start with a few questions on the first page and spread the rest across multiple pages, completion rates will increase. Start with questions about visitors’ needs and what they liked or disliked; this shows that you really care what they think. Never start with age, gender, or other demographic questions – that is akin to asking someone how much money they make when you’re just getting to know them!

3. Segment everything.

When analyzing results and performance, be sure to look at responses across a variety of visitor segments. For example, how does the site perform for customers vs. prospects? You might be doing okay overall but terribly for prospects. Segment by visitor task, first-time vs. repeat visitors, stage in the purchase process, and so on.
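A rough sketch of this segmentation, assuming a hypothetical survey export where each response carries a segment label and a "found what I was looking for" answer (all field names and numbers are invented for illustration):

```python
from collections import defaultdict

# Hypothetical survey export: each response is tagged with a visitor
# segment and whether the visitor found what they were looking for.
responses = [
    {"segment": "customer", "found_info": True},
    {"segment": "customer", "found_info": True},
    {"segment": "customer", "found_info": False},
    {"segment": "prospect", "found_info": False},
    {"segment": "prospect", "found_info": False},
    {"segment": "prospect", "found_info": True},
]

def rate_by_segment(rows, flag):
    """Share of responses where `flag` is True, per segment."""
    counts = defaultdict(lambda: [0, 0])  # segment -> [hits, total]
    for r in rows:
        counts[r["segment"]][0] += int(r[flag])
        counts[r["segment"]][1] += 1
    return {seg: hits / total for seg, (hits, total) in counts.items()}

rates = rate_by_segment(responses, "found_info")
# An overall average would hide that prospects fare far worse here.
```

The same function can be reused for any segment column (task, visit frequency, purchase stage) by changing the tag on each response.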

4. Ask more than 4 questions.

Conventional wisdom holds that a short 1-4 question survey will get several times the response rate of a medium or lengthy survey. This conventional wisdom is rational, intuitive, and wrong.

We’ve run extensive testing across websites in many different industries and found that you’ll typically get only 20% to 30% more responses for a very short survey than for a longer one (10 to 20 questions). The value you get from 10 to 20 questions is so much greater than the value from only 1 to 4 questions that we usually prefer running lengthier surveys.

There are some exceptions to this rule: if you are trying to get insight from visitors who bounce, the survey has to be brief. Likewise, if you are intercepting visitors who have just abandoned their cart, briefer is better.

5. Don’t take everything respondents say at face value.

Dig deeper and add follow-up questions to uncover the real problem. Suppose 65% of respondents said they didn’t convert because the price is too high. Does that mean you should lower prices? Maybe, maybe not. Is the price more than the person wants to spend (wrong target audience), or is it higher than competitors’ prices (the price probably is too high)? Is it really about price at all, or did the site fail to clearly communicate the value proposition (poor communication of a product’s value proposition is one of the biggest problems we see across sites)?

6. Measure more than just visitor satisfaction.

Many website surveys focus almost exclusively on satisfaction. This makes sense if your site’s only goal is to satisfy your visitors. If your goal is to sell something, though, satisfaction doesn’t tell you if your site has convinced them that your products are worthwhile. Instead, measure the following:

Target Audience: Measure whether you are even reaching visitors who are likely to buy from you. If not, your advertising is the problem, not the site.

Satisfaction: Measure if you are meeting your target audience’s needs. If not, they’ll leave before you get an opportunity to convince them to buy from you.

Persuasion: Measure whether you are persuading your target that you are the right choice for them. (This is where companies struggle more than in any other area.)

Action: Measure if you are effective at getting your target to take the next steps in the purchase process.

7. Benchmark results.

When looking at survey results, one of the first questions you should ask is “how does this compare to others?” For example, if only 65% of people said they found what they were looking for, that might seem a little low. Compared to the industry benchmark, however, it may actually be about average.

8. Integrate survey data with behavioral data to uncover the exact cause of good or bad performance.

Website surveys yield summary metrics for each question, such as the % satisfied with the site or the % saying prices are too high. This is nice to know, but it doesn’t reveal the cause of good or bad performance. By integrating survey responses with behavioral data, you can see how performance varies by content viewed and pinpoint exactly which content is doing well or poorly.

For example, a survey might tell you that 65% of respondents were satisfied. By integrating with behavior, you might find that only 20% of people who viewed Product Page A were satisfied, while 70% of those who viewed Product Page B were. If Page A is heavily visited, it must be fixed.
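The integration step can be sketched as a simple join between survey answers and a pageview log, keyed on visitor id. Everything below (ids, page names, sample values) is hypothetical and only meant to show the shape of the computation:

```python
from collections import defaultdict

# Hypothetical data: survey answers keyed by visitor id (satisfied or
# not), plus a behavioral log of (visitor id, page viewed) events.
survey = {101: True, 102: False, 103: False, 104: True}
pageviews = [
    (101, "product-b"),
    (102, "product-a"),
    (103, "product-a"),
    (104, "product-b"),
    (102, "product-b"),
]

def satisfaction_by_page(survey, pageviews):
    """Join survey answers onto pageviews; return % satisfied per page."""
    stats = defaultdict(lambda: [0, 0])  # page -> [satisfied, total]
    for visitor, page in pageviews:
        if visitor in survey:  # keep only surveyed visitors
            stats[page][0] += int(survey[visitor])
            stats[page][1] += 1
    return {page: sat / total for page, (sat, total) in stats.items()}

by_page = satisfaction_by_page(survey, pageviews)
# A low-satisfaction, high-traffic page is the first fix candidate.
```

In practice the visitor id would come from whatever identifier your survey tool and analytics platform share (e.g., a first-party cookie id passed into the survey).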


Voice of customer surveys can be an extremely useful way to identify friction, dig deep into website issues, and benchmark your brand/site against competitors or other industries. However, without taking deliberate steps to make a survey actionable, results will be inconclusive at best, and misleading at worst.

Web Measurement: How Far Have We Come Since the 90s?

From its humble beginnings as server logs used by IT, web analytics has advanced over the years to become the de facto standard of online measurement, used by every serious online marketer (and quite a few unserious ones!).

Here is our list of the top 5 most significant advances in web measurement, with some great resources for further reading on each topic:

1. Democratization of Web Analytics

Google Analytics – the first robust, easy-to-implement, and, most importantly, free web analytics tool – has made web analytics available to everyone. Not only has it driven much greater adoption of web analytics, it has also forced the other web analytics players to offer more value or be steamrolled.

For more information about the history of web analytics, listen to this informative and entertaining podcast by the Digital Analytics Power Hour.

2. Industry “Ecosystem”

As the industry has matured, we now have the institutions, vendors, consultants, best practices, etc. to help advance web measurement. While this “advance” is more abstract than the others, the codification of best practices and the spread of practicing companies has pushed the industry into the mainstream.

Avinash Kaushik has a great post on some digital analytics best practices.

3. Testing

Testing tools have made tremendous progress, moving from simple A/B tests to multivariate tests of tens of thousands of permutations of content. But more important than the actual tools is the adoption of testing as a core tactic used by companies. Got a great idea but your company is skeptical? Test it. Trying to choose between two different messages? Test them. This “testing mindset” holds that we don’t know what will work best until we test it. We still need to get much better at determining what to test and how to measure the results, but we are on the right track.
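One way to measure the results of a simple two-message test is a pooled two-proportion z-test on conversion rates. This is a generic statistical sketch, not any particular vendor's method; the conversion counts and the 0.05 threshold below are hypothetical.

```python
import math

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion
    rates, using a pooled two-proportion z-test."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    return math.erfc(abs(z) / math.sqrt(2))  # = 2 * (1 - normal CDF)

# Hypothetical result: message A converted 200 of 5000 visitors,
# message B converted 260 of 5000.
p = ab_test_p_value(200, 5000, 260, 5000)
significant = p < 0.05  # common (and debatable) threshold
```

This is the classical frequentist check; many testing platforms instead use sequential or Bayesian methods, which behave better when results are peeked at mid-test.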

Forrester released a report on the top testing platforms of 2015. If you’d like to download it for free (in exchange for your information), SiteSpect offers it as a free download.

4. Marketing Optimization (particularly for eCommerce and Lead Gen)

Here, I am referring to the idea of automatically taking behavioral data (clicks) and using software to automate tactical improvements to campaigns and content (usually through testing and behavioral targeting). Of all the tools out there, paid search software has probably made the most progress, thanks to the innate measurability of search. Other tools still need much more development, but it is great to see the idea achieve widespread adoption and so much attention from software providers.

Eric Greenberg and Alex Kates wrote a good primer on measuring returns through digital marketing.

5. Voice of Customer (VOC)

While these tools still need dramatic improvement and greater integration with behavioral data, capturing visitor opinions and feedback is now a fairly common practice. It is a good complement to the behavioral data that still dominates web measurement.

Gary Angel delivers a remarkable overview of the benefits of Voice of Customer data on his blog. You can also find some great content in his recent book, Measuring the Digital World.


While the industry has made progress, it remains woefully inadequate in many other ways. Much of the industry lives and breathes onsite behavioral data (clicks) and clicks alone. We often know nothing about how a site affected brand perceptions or offline actions, and sometimes we know little about the actual people who are visiting. Without these, we see only one piece of the puzzle, we struggle to convince others of the value of our online marketing, and we therefore struggle to get the resources we need to get more from the web.