Is your advice content working?

Good advice content helps people to do, get or know something. But how can we show our advice content has actually done that? Using metrics and analytics can show when advice content is working and when it might need some help.

Advice content should be ‘actionable’. This means that it helps you to do something, usually:

  • find out if something applies to you,
  • decide what to do,
  • plan how to do it.

Page views and other ‘conversion’ metrics designed for sales content are not good at measuring this. Just because someone visits your web page does not mean they have used it to solve their problem.

But you can work out what content to improve next by using more meaningful metrics, like:

  • low scroll rate,
  • not following external links,
  • rating pages as unhelpful. 

When you do this, you can influence decision makers to make smarter content, not (just) more content.

Better page views (not more page views)

Does this sound familiar?

X: How many page views did our advice content get?

Y: Over 100,000!

X: Is that good?

Y: I don’t know!

X: Well, next year your target is 120,000!

Y: ...great!

When people are asked to show whether their content is succeeding, page views are an easy place to start. But just reporting page views means you’re only looking at the amount of traffic, not at the quality of that traffic.

Page views are not a meaningful target. A page view does not tell you that the content meets the user’s needs. And using them as a target means you have an incentive to make content that gets clicks, even if it’s not useful.

Measure ‘engagement’ in a better way

Google Analytics counts a ‘session’ (visit) as ‘engaged’ if people do either of these things:

  • stay for longer than 10 seconds,
  • visit 2 or more pages.

These are not good measures for advice content. 

For example, after viewing a page, these journeys would both count as ‘engaged’:

  • realising it’s not what you need and clicking on an internal link (that also does not help),
  • looking around for 10 seconds and leaving, disappointed.

New ‘conversions’ can help

There are better ‘good’ behaviours you can measure: things that indicate your advice content is working.

You can set up Google Analytics to register these as ‘events’ and ‘conversions’ that count as ‘engagement’.

For example:

  • scrolling 25% or 50% of the way through the page (not the 90% used by the default ‘scroll’ event, which is too high)
  • following a link, probably to an external site 

This means that if someone lands on a page, finds a solution to their problem and uses an outbound (external) link to leave, that will also count as ‘engagement’. 
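
If you’re adding this with gtag.js yourself, a rough sketch could look like the one below. It assumes gtag.js is already loaded on the page; the ‘scroll_depth’ event name, its ‘percent_scrolled’ parameter and the 25% and 50% thresholds are our own choices, not GA4 defaults. You’d then mark the event as a conversion in the GA4 admin.

```typescript
// A sketch of firing custom scroll-depth events with gtag.js.
// Assumes gtag.js is already loaded on the page. The 'scroll_depth' event name,
// the 'percent_scrolled' parameter and the 25%/50% thresholds are our own
// choices; you would mark the event as a conversion in the GA4 admin.

declare function gtag(...args: unknown[]): void;

const thresholds = [25, 50];      // percentages we care about
const fired = new Set<number>();  // each threshold fires once per page view

window.addEventListener('scroll', () => {
  const scrollable = document.documentElement.scrollHeight - window.innerHeight;
  if (scrollable <= 0) return;
  const percent = (window.scrollY / scrollable) * 100;

  for (const t of thresholds) {
    if (percent >= t && !fired.has(t)) {
      fired.add(t);
      gtag('event', 'scroll_depth', { percent_scrolled: t });
    }
  }
}, { passive: true });
```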

You could also add a “was this page helpful?” question with ‘yes’ and ‘no’ options, and attach the ‘yes’ to a conversion event.
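
As a rough sketch of the wiring (the button ids and the ‘page_helpful’ event name below are placeholders we’ve made up, not anything a real site uses):

```typescript
// A sketch of wiring a 'was this page helpful?' widget to a GA4 event.
// The button ids and the 'page_helpful' event name are placeholders;
// you would mark 'page_helpful' (or just the 'yes' answers) as a conversion.

declare function gtag(...args: unknown[]): void;

function recordAnswer(answer: 'yes' | 'no'): void {
  gtag('event', 'page_helpful', { answer, page_path: location.pathname });
}

document.getElementById('helpful-yes')?.addEventListener('click', () => recordAnswer('yes'));
document.getElementById('helpful-no')?.addEventListener('click', () => recordAnswer('no'));
```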

Scope, the disability charity:

[Screenshot of Scope’s website: a purple box labelled ‘Was this page helpful?’ with 2 buttons, a thumbs up for ‘yes’ and a thumbs down for ‘no’.]

Citizens Advice:

[Screenshot from the Citizens Advice website: ‘Did this advice help?’ with 2 radio button answers, ‘yes’ and ‘no’.]

GOV.UK:

[Screenshot from GOV.UK: ‘Is this page useful?’ with ‘yes’ and ‘no’ buttons, and a ‘Report a problem with this page’ button.]

These are much better indicators that someone either:

  • read the content and understood it,
  • took the next step in their journey.

Measure ‘failure’ to start improving content

If you want to make content that solves problems, starting by measuring success or ‘conversions’ won’t help.

Answering the question ‘What content should we improve next?’ is a better way to start. It shows that you:

  • accept that some bits of content will do better than others,
  • will use metrics to plan the work, not to punish team members.

It also means you can use more than 1 data point. The pages on your site you need to improve will probably have at least 2 of these:

  • low scroll rate,
  • low external link clicks (relative to page views),
  • low time on page.

If you can also include what proportion of people answered ‘no’ to ‘was this page helpful?’, you’re on to a winner. And when the ‘no’ rating goes down for your site? That’s a metric you can be proud of if your page views are going up or holding steady.
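
Here’s a sketch of one way to combine those signals. The ‘PageStats’ fields and the threshold values are illustrative assumptions; you’d export the real figures from your own analytics and tune the cut-offs for your site.

```typescript
// A sketch of ranking pages by how many 'failure' signals they show.
// The PageStats fields and the threshold values are illustrative assumptions;
// export the real figures from your analytics and tune the cut-offs.

interface PageStats {
  path: string;
  scrollRate: number;         // share of views reaching 50% scroll, 0 to 1
  outboundClickRate: number;  // outbound link clicks per page view, 0 to 1
  avgTimeOnPageSec: number;   // average time on page, in seconds
  unhelpfulRate: number;      // share of 'no' answers to 'was this page helpful?'
}

function pagesToImproveNext(pages: PageStats[]): PageStats[] {
  return pages
    .map((page) => {
      let signals = 0;
      if (page.scrollRate < 0.3) signals += 1;
      if (page.outboundClickRate < 0.05) signals += 1;
      if (page.avgTimeOnPageSec < 20) signals += 1;
      if (page.unhelpfulRate > 0.4) signals += 1;
      return { page, signals };
    })
    .filter((item) => item.signals >= 2)    // at least 2 failure signals
    .sort((a, b) => b.signals - a.signals)  // worst pages first
    .map((item) => item.page);
}
```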

How to improve your content

You can use web and search analytics for some of this, but testing is what will really help.

Search analytics

Your content may be failing before people view it. Use Google Search Console to see if your page is appearing in search (impressions) and if people are choosing it (clicks).
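
For example, if you export the ‘Pages’ report from Search Console, a small sketch like this could flag pages that show up in search but are rarely chosen. The ‘SearchRow’ shape and the thresholds are illustrative assumptions, not part of Search Console itself.

```typescript
// A sketch of spotting pages that appear in search but are rarely chosen,
// using figures exported from Search Console. The SearchRow shape and the
// thresholds are illustrative assumptions, not part of Search Console itself.

interface SearchRow {
  page: string;
  impressions: number;  // times the page appeared in search results
  clicks: number;       // times someone chose it
}

function lowClickThroughPages(
  rows: SearchRow[],
  minImpressions = 500,
  maxClickThroughRate = 0.02
): SearchRow[] {
  return rows.filter(
    (row) =>
      row.impressions >= minImpressions &&
      row.clicks / row.impressions < maxClickThroughRate
  );
}
```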

Web analytics

Web analytics can be good at telling you what to improve. They can tell you a little bit about how. 

For example:

  • on-site searches starting from a page suggest what people were hoping to find on that page but could not (see the sketch after this list)
  • keywords that are not on your page but are marked as relevant in tools like Google Trends or Keywords Everywhere could suggest you need to change the scope of your content
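
Here’s a minimal sketch of capturing which page an on-site search started from. ‘search’ and ‘search_term’ are standard GA4 names; the ‘origin_page’ parameter and the ‘#site-search’ form selector are our own assumptions.

```typescript
// A sketch of recording which page an on-site search started from.
// 'search' and 'search_term' are standard GA4 names; the 'origin_page'
// parameter and the '#site-search' form selector are our own assumptions.

declare function gtag(...args: unknown[]): void;

const searchForm = document.querySelector<HTMLFormElement>('#site-search');

if (searchForm) {
  searchForm.addEventListener('submit', () => {
    const input = searchForm.querySelector<HTMLInputElement>('input[type="search"]');
    gtag('event', 'search', {
      search_term: input?.value ?? '',
      origin_page: location.pathname,  // lets you group searches by the page they began on
    });
  });
}
```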

On-page feedback forms

If you have a form on your pages where people can leave feedback, that can be really helpful. If someone says they couldn’t find the information they needed on that page, believe them, even if it is there, somewhere!

Testing

Highlighter testing can give users a chance to tell you which bits are hard to understand. You can also ask if the piece has met its user story and acceptance criteria. It’s a versatile testing method. You can do it:

  • in person or remotely,
  • on Google Docs or on paper,
  • on Zoom or over the phone.

The main weakness of highlighter testing is that it relies on people self-reporting that they do not understand. And people may not want to share that information.

Scenario testing can also be useful. For example, if your content is there to help people make a choice, you would:

  1. describe a situation,
  2. ask them to read your content,
  3. ask them to talk about the choice they would make.

You’ve got this!

Now that you’re using the right metrics for your advice content, you’re creating a culture where it feels safe to test, learn and make content that works for users.

Links

User stories and acceptance criteria

How to work with user needs

Content design: planning, writing and managing content (GOV.UK)

Google Analytics 4

Scroll tracking (Analytics Mania)

Measure outbound clicks for a website (Google)

Create or modify conversion events (Google)
