Google Analytics: Avg. Time on Page vs. Avg. Session Duration

When we drive traffic to a landing page -- whether it's through organic search, paid media, email or any other channel -- it's important that we're able to determine how well a piece of content has been engaged with in order for us to be able to report on the success of a page or campaign.

But when reviewing data around a recent campaign in Google Analytics, I noticed that the landing page Avg. Session Duration was really, really low -- just nine seconds in fact. My first thought was that this campaign had bombed spectacularly. Well, it turns out it didn't!

On closer investigation, I looked at the 'All Pages' report -- specifically at the campaign page with the landing page as the source -- and saw that the Avg. Time on Page figure was completely different from, and a lot more positive than, the Avg. Session Duration figure.

So I thought I'd carry out an experiment to find out why these metrics were so inconsistent, with the aim of answering two simple questions: 'What is Avg. Time on Page?' and 'What is Avg. Session Duration?'

The test

We conducted three unique visits to a landing page:

  • One visit stays for 5 seconds, then exits (a bounce).

  • One visit stays for 15 seconds, visits another page for 5 seconds, then exits.

  • One visit stays for 30 seconds, visits another page for 15 seconds, then exits.

The results

All Pages

Expected results: We expected the Avg. Time on Page to be 17 seconds, because the average of 5, 15 and 30 seconds is 16.7.


Actual results:

[Table: actual Avg. Time on Page results]

The actual Avg. Time on Page showed as 23 seconds. This is because the figure excludes the visit which bounced, so it is calculated from just the visits which lasted 15 and 30 seconds -- an average of 22.5.
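In other words, Avg. Time on Page appears to be averaged over non-bouncing visits only. Here's a quick sketch of that calculation using the timings from our test (the variable names are our own, not anything from Google Analytics):

```javascript
// Time on the landing page for each of the three test visits, in seconds.
// A visit that exits without viewing a second page is a bounce, and GA
// records no time on page for it at all.
const visits = [
  { timeOnPage: 5,  bounced: true  },
  { timeOnPage: 15, bounced: false },
  { timeOnPage: 30, bounced: false },
];

// The naive average over all three visits -- what we expected GA to report.
const naiveAvg =
  visits.reduce((sum, v) => sum + v.timeOnPage, 0) / visits.length;

// GA's Avg. Time on Page: bounced visits are excluded entirely.
const tracked = visits.filter((v) => !v.bounced);
const avgTimeOnPage =
  tracked.reduce((sum, v) => sum + v.timeOnPage, 0) / tracked.length;

console.log(naiveAvg.toFixed(1));      // 16.7 -- the ~17s we expected
console.log(avgTimeOnPage.toFixed(1)); // 22.5 -- reported as 23s
```

So the more visitors bounce, the less your Avg. Time on Page reflects your audience as a whole.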

If you click the question mark next to the Avg. Time on Page metric, this is how it's defined:

[Screenshot: Google's definition of Avg. Time on Page]

But, based on the results of our experiment, this is actually a completely inaccurate way of describing the metric.

Landing Pages

Expected results: We expected the Avg. Session Duration to be 23 seconds, because the three sessions lasted 5, 20 and 45 seconds in total (including the time spent on the second page), which averages to 23.3.

[Table: expected Avg. Session Duration results]

Actual results:

[Table: actual Avg. Session Duration results]

The Avg. Session Duration showed as 15 seconds. This is because the figure is calculated from durations of 0 seconds (the visit which bounced), 15 seconds and 30 seconds. All three of these timings are incorrect, because the time a user spends on the final page of their session (or the first page, if that is also their final page) is always counted as 0 seconds in Session Duration.

The table below shows the data when we include a secondary dimension of 'Session Duration'.

[Table: Session Duration by page, as reported]

If the data was accurate it would read as follows:

[Table: Session Duration by page, if accurately recorded]

The reason Google Analytics doesn't take into account the time spent on the second page in this scenario is that it measures time on a page as the gap between that pageview and the next hit. The user ends the session after the second page, so there is no further hit to measure against, and Google reports the time spent on that final page as 0 seconds.
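The same calculation can be sketched for Session Duration: GA sums the time per page, but because the last page of every session has no following hit, it always contributes 0 seconds. Again, the timings come from our test and the helper names are our own:

```javascript
// Each session is the ordered list of per-page timings, in seconds.
const sessions = [
  [5],      // bounce: one page, exits after 5 seconds
  [15, 5],  // two pages: 15 seconds, then 5 seconds
  [30, 15], // two pages: 30 seconds, then 15 seconds
];

const sum = (xs) => xs.reduce((a, b) => a + b, 0);
const avg = (xs) => sum(xs) / xs.length;

// What we'd expect: the full duration of every session.
const actualDurations = sessions.map(sum); // [5, 20, 45]

// How GA measures it: the last page of each session counts as 0 seconds,
// so only the pages before it contribute any time.
const gaDurations = sessions.map((pages) => sum(pages.slice(0, -1))); // [0, 15, 30]

console.log(avg(actualDurations)); // 23.33... -- the ~23s we expected
console.log(avg(gaDurations));     // 15 -- what GA reports
```

Note that the bounce contributes a hard 0 to the GA figure, which is why a high bounce rate drags Avg. Session Duration down so severely.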

This means that, just like the Avg. Time on Page metric description, the definitions that Google provides for both Sessions and Avg. Session Duration (see below) are also wrong.

[Screenshot: Google's definition of Sessions]

[Screenshot: Google's definition of Avg. Session Duration]

The conclusion

Here's what we learnt from this experiment:

  • Avg. Time on Page is calculated based on the average amount of time the users who did not bounce spend on the page.
  • If you have a page which has a very high bounce rate then it will massively skew the Avg. Session Duration as these users will be reported as spending 0 seconds on the page.
  • This means Avg. Session Duration is calculated based on completely inaccurate Session Duration data and, therefore, should not be used for reporting.

Clients often ask 'What was the bounce rate?' or 'How much time were users on the page?'. But, rather than use the inaccurate data outlined above, which provides a skewed view of the actual success of the page, we've found that a better way to track engagement on a page is to use event tracking to see:

  • When a page is loaded
  • When a user starts scrolling
  • When a user gets to the bottom of the content
  • When a user gets to the bottom of the page.

Because each of these interactions fires an event with a timestamp, we can work out how much time a user spends getting to the end of the page. A demo example of how this works can be found on the JSFiddle site.
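As a rough sketch of the idea (the milestone labels and threshold logic here are our own illustration, not anything Google Analytics prescribes -- only the commented-out `ga('send', 'event', ...)` call is the real analytics.js API):

```javascript
// Scroll milestones we want to report as events. A 'page-loaded' event
// would be fired separately on load; these cover the scrolling behaviour.
const MILESTONES = [
  { label: "started-scrolling", fraction: 0.01 },
  { label: "bottom-of-content", fraction: 0.75 }, // assumed content depth
  { label: "bottom-of-page",    fraction: 1.0  },
];

// Given how far down the page the user has scrolled (0..1),
// return the labels of every milestone passed so far.
function milestonesReached(scrolledFraction) {
  return MILESTONES
    .filter((m) => scrolledFraction >= m.fraction)
    .map((m) => m.label);
}

// In the browser, this would be wired to a scroll listener, and each
// newly reached milestone sent once as an event, e.g. with analytics.js:
//
//   window.addEventListener("scroll", () => {
//     const fraction =
//       (window.scrollY + window.innerHeight) /
//       document.documentElement.scrollHeight;
//     for (const label of milestonesReached(fraction)) {
//       ga("send", "event", "Engagement", label); // de-duplicate per label
//     }
//   });

console.log(milestonesReached(0.8)); // started-scrolling, bottom-of-content
```

Because each event carries its own timestamp, comparing the 'page loaded' event against the 'bottom of content' event gives a reading time that isn't distorted by bounces or exit pages.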
