What is Web Analytics Good For?
Introduction
People talk about web analytics, saying how amazing it is and how it will revolutionize your online presence. But in my experience, very few people will give actual examples of the ways it does so, and the lack of concrete examples can be frustrating. I’ve seen disappointingly little evidence of impactful insights that web analytics has provided. At the end of the day, what is the beneficial output of web analytics? What is it good for!?!
The following is my attempt to answer that question from my own 5 years of experience practicing web analytics, primarily by using Google Analytics.
By the end of this post I hope you have a much better idea of the kinds of insights web analytics can provide. I know I do.
An Analogy
To help guide us through this investigation I want to introduce an analogy:
Web Analytics : Website Owner :: Military Intelligence : General
In order to command and control the huge, complicated thing that is an army, a general must have a comprehensive, trustworthy, and detailed yet summarizable intelligence gathering system.
Likewise, a website owner, in order to command and control a website, has to have a comprehensive, trustworthy, and detailed yet summarizable web analytics infrastructure.
And of course, just as the general isn’t the only person who benefits from military intelligence, the “owner” of the website isn’t the only person who benefits from web analytics. Managers, marketers, writers, designers, and developers all benefit too. So, although all of the examples in this post are framed for higher-level website decision makers, they are often applicable to more specialized disciplines as well.
Defining Goals: What does it mean to win?
Before we can really understand what web analytics is good for, we do have to start at the beginning.
Before a war, a general has to decide what it means to “win” the war. Does it mean minimizing casualties and getting to peace as soon as possible? Defending the homeland at all costs? Showing loyalty to an ally?
Similarly, a website owner has to decide what “winning” looks like for their website. Does it mean driving enrollment? Increasing brand awareness? Improving public perception? Driving subscriptions or sales?
The first requirement for web analytics to be helpful at all is that the goals of the website are well defined. There may be more than one. And they may evolve over time. But without an understanding of why the website exists, web analytics is a waste of time. Without a clear direction an army just ends up wandering around some foreign place doing more harm than good.
I won’t take the time here to go into detail on how this process works, but you can read more about it from analytics guru Avinash Kaushik here and here, or read about metrics models, which is how I did it at NewCity.
Collecting Good Data
Another prerequisite for insight is good data. You have to have the data collection infrastructure in place to be able to tell if you’re succeeding. A general, when going into a war, needs to have a trustworthy intelligence gathering system in place.
And this doesn’t just happen. As I’ve written about before, there are steps that must be taken to ensure that the data you’re collecting is accurate and relevant to your goals. The larger and more complicated your website, the less likely it is that just slapping the basic Google Analytics code on your site will be sufficient.
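To make that concrete: out of the box, Google Analytics records pageviews, but interactions that are tied to your specific goals usually need their own tracking. Here’s a minimal sketch using the Universal Analytics (analytics.js) event API; the form selector and the category/action/label values are hypothetical placeholders:

```ts
// Minimal sketch: send a goal-relevant interaction to Universal Analytics
// (analytics.js) as an event. The selector and the category/action/label
// values are hypothetical placeholders for your own site.
declare function ga(...args: unknown[]): void; // global provided by analytics.js

const form = document.querySelector<HTMLFormElement>('#request-info-form');
form?.addEventListener('submit', () => {
  // ga('send', 'event', category, action, label) is the documented
  // analytics.js event signature.
  ga('send', 'event', 'Lead', 'submit', 'Request Info form');
});
```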
1. Prioritizing
Now on to the good stuff.
What is web analytics good for? It’s good for prioritizing.
Having defined goals and begun to gather trustworthy, relevant information, a general can then begin deciding how to prioritize resources in the best way possible: what are the keys to victory? Where are the crucial battlefields? What troops should go where?
Similarly, a website owner can use analytics to help prioritize the aspects or sections of the site that need to perform well in order for the site to accomplish its goals.
Insight Pattern
These kinds of insights typically follow this pattern:
x is some amount (relative to y), therefore we should spend more/less time/money on x (relative to y).
Examples
- 35% of our new, external traffic is on mobile and that percentage is growing, therefore we should invest time/money into making our mobile experience a good one, especially for new, external users.
- Our site had over 100,000 unique users over the last year, including 25,000 who viewed at least 3 pages. But only 50 people attended our conference last year. Therefore, in planning for the redesign of our website, we shouldn’t solely focus on serving people who are coming to the conference. We should try to learn more about the other 24,950 users who engaged with the site but didn’t come to the conference. Did they want to come but couldn’t? If so, why not? Were they even interested in coming to the conference? If not, why not? What were they using the site for then? How can we serve them and accomplish our goals?
- Only 1% of users are on IE8 or older and that number’s dropping, therefore we shouldn’t invest time/money in trying to support those browsers, even though VIP x says we should.
- These 100 pages account for 95% of the entrances on our site, therefore when we launch our new site we need to make absolutely sure that we have set up meaningful 301 redirects for them to their closest equivalents on the new site.
- One of our goals for the site is to attract new users who aren’t familiar with our brand, but we don’t have any valuable third-party referral sources, therefore we should think of sites that could link to our site and provide qualified new traffic. We should then try to get those sites to link to us.
- We are maintaining a separate mobile site with the intent of serving our mobile users better, but only 20% of mobile users actually view our mobile site. Therefore we should investigate why that’s happening, whether there are obvious problems, and maybe whether maintaining a separate mobile site is even worth it.
- 7% of pageviews on our site are 404 errors, and users who have seen a 404 page are less likely to convert, therefore we should invest time/money into fixing those 404s.
- These are the top 10 404s that people are seeing, therefore we should try to fix them first.
- 1.5% of pageviews on our site had JavaScript errors, therefore we should at least investigate those briefly to see if any of them are hampering the user experience and preventing the website from accomplishing its goals. (A sketch of how to capture these errors in the first place follows this list.)
- These are the top 10 pages with the most JavaScript errors, therefore we should investigate them first.
- Although we haven’t completely moved our site to https, our site’s pages are available over https, and over the last month 5% of our pageviews were of the https version of the page. Somehow users are seeing that version. Therefore we should make sure either that users can’t access the https versions of pages (by using a 301 redirect) or that all the pages load correctly over https. And if it’s feasible, we might want to consider transitioning completely to https.
- Only 0.2% of our traffic comes from Baidu (and expanding our reach in China isn’t one of our goals), therefore we shouldn’t optimize our site for it at the expense of larger sources of traffic even though one of our board members travels to China every year and has found it hard to find our site when searching on Baidu.
- Only 0.4% of our users are using our site’s internal site search, therefore it’s probably not worth investing a lot of time and money upgrading/improving our site search, unless we plan on making it a more prominent feature on our site.
- Our virtual campus tour only gets 5 visits/day, therefore we should seriously question whether it’s worth what we’re paying for it. Further questions we should ask: Does it drive conversions (lift)? If not, then it’s probably not worth it. But if it does, then we should try to find a way to get users to use it more.
- One of our goals for the homepage carousel was to engage different kinds of users, but only 1.24% of visitors to the homepage in the last 30 days interacted with our carousel, therefore we should reevaluate the need for a carousel. Maybe we should run some A/B tests to see if other ways of organizing and displaying content on the homepage result in greater engagement by new users.
- Users click the button to get the printer-friendly version of a page only about 0.08% of the time (as a share of pageviews of pages that have those buttons), therefore it’s probably not worth including this functionality in the new layout/design, or we should at least make it less prominent.
- When our program list page is viewed, it’s almost always as the entrance page — 92% of its unique pageviews were entrances. So most users who come to this page are not navigating to it from another page on the site — most are entering directly on that page. Therefore we need to make sure this page performs well as a landing page, and we need to investigate which traffic sources are sending people to this page so that we can tailor the page even further to users coming from those particular sources.
- Users are using the tabs on our Student Life landing page 37% of the time, therefore we don’t necessarily need to totally scrap the tab structure (at least not without some testing). But users are clicking some tabs more than others: tabs with strong keywords (e.g., “clubs”) are being clicked more than tabs with less clear words and phrases (e.g., “a better world”), even though the content behind the less-clicked tabs is important. Therefore we should A/B test the wording on these tabs to see if we can increase click-throughs to those items.
- Our program pages are some of the most viewed pages on our site, and are viewed by 90% of users who end up starting the application, therefore it’s important that we invest time/money in making those pages the best we can. We should do further investigation and analysis of those pages in terms of a number of things — content, layout, speed, calls-to-action, design, broken links, bugs, browser/device compatibility, accessibility, etc.
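A quick aside on the JavaScript error examples above: Google Analytics doesn’t capture JS errors out of the box, so reports like “top 10 pages with the most JavaScript errors” require some setup. One common approach, sketched here assuming the standard analytics.js global, is to forward uncaught errors as non-interaction events:

```ts
// Sketch: report uncaught JavaScript errors to Universal Analytics as
// non-interaction events (so they don't skew bounce rate). Assumes the
// standard analytics.js global; the 150-character truncation is arbitrary.
declare function ga(...args: unknown[]): void;

window.addEventListener('error', (e: ErrorEvent) => {
  ga('send', 'event', 'JavaScript Error', e.message.substring(0, 150),
     `${e.filename}:${e.lineno}`, { nonInteraction: true });
});
```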
You may have noticed that most of these insights don’t tell you exactly what to do or fix on your site. They don’t tell you whether your call-to-action button should be moved. Or whether your subtitle fonts should change. Or whether you need to redesign your entire website. Or whether the content on your program pages needs to be rewritten. Or what your content needs to say. Or whether users are frustrated by what they’re not finding. They don’t tell you a lot of stuff.
But what they do tell you is where (or where not) to focus your time and energy to figure out the answers to those questions. And using that information you can then go on and use other methods to find the answers to those questions.
Some of these methods were mentioned in the examples above, but here’s a more complete list:
- A/B tests
- User tests
- Perception tests
- Content audits
- Site surveys / voice of customer
- Site speed tests
- Accessibility tests
- Browser compatibility tests
- Site scans for broken links, misspellings, and other errors
Applicable in Multiple Contexts
It’s also important to note that prioritization insights can be applicable in a variety of contexts. Some are most useful when you’re planning a complete overhaul of your entire website presence, while others make more sense when you’re trying to optimize some piece of what you’ve already got.
Put This Into Practice: How to start using analytics to prioritize
- Define your website’s goals in light of your organization’s goals.
- Make sure your data’s not shady, i.e., is trustworthy and relevant to your goals.
- Look at your analytics to learn the basics of how your site is being used and to answer any initial questions, then prioritize your further work and research around what you find.
2. Identifying Problem Areas
What is web analytics good for? It’s good for identifying problem areas.
As a war rages on, a general needs to know if certain parts of their army are underperforming. Are certain fronts weak? Are certain supply chains failing? Are certain technologies limiting?
Similarly, a website owner needs to know if there are certain parts or aspects of the website that are underperforming.
Insight Pattern
These kinds of insights typically follow this pattern:
x is performing poorly relative to y, therefore we should try to improve x.
Examples
- We use a third-party solution for our online store. The conversion rate of our store is 18.4% for desktop/tablet users, but only 4.8% for mobile users. This difference holds up even after controlling for differences in traffic source, location, and apparent buying intent. That means there is likely something suboptimal about the mobile buying experience compared to the desktop experience. Therefore we should:
- Test and compare these two experiences.
- Fix any issues we have control over.
- If there are larger problems that only the third-party vendor can address, we should put pressure on them to improve their mobile experience, or if that’s not likely to yield results, consider switching vendors.
- The conversion rate on Firefox is significantly lower than the site average, therefore we should test our site on Firefox to see if there are any problems. And we estimate that these lost conversions are costing us $1,000 per day, therefore we should make this a priority.
- Our website lost approximately 10 orders yesterday because our e-commerce server was down for 30 minutes, therefore we should invest time/money to make sure that doesn’t happen again.
- After seeing that our homepage accounts for 50% of new, external user entrances, we’ve prioritized the improvement of this page. In investigating its performance we found that it has a 20% slower load time than the site average. Therefore we should figure out why and fix any problems that we find (using Google’s PageSpeed test and other tests).
- The president of the university wants to increase enrollment from out-of-state students. Last year 12% of enrollment was from out-of-state, 14% of our external users were from out-of-state, and 15% of external user traffic to our admissions pages was from out-of-state, but only 3% of application starts were from out-of-state. Therefore we should try to figure out why so many out-of-state users viewed our admissions pages but failed to start the application — why are out-of-state users “dropping off” in our “funnel”? (The funnel arithmetic itself is sketched in code after these examples.) We could figure this out using a combination of:
- Diving deeper into analytics:
- How are the out-of-state users who don’t start the application different from those that do? Can we learn anything from that?
- Has this dropoff from 15% to 3% always existed? Or have some things changed to make it better or worse?
- Asking around internally to see if anyone has any possible explanations. We could then do further testing and research to see if any of those hold up.
- Doing audience research to figure out what it is that out-of-state prospective students are interested in and why they would apply to our school.
- Doing user tests to figure out what it’s like for a prospective out-of-state student to use our website, especially the admissions section.
- We’re looking for more students to enroll in our accounting program. Accounting is above average among all of our school’s programs in terms of getting users to request information once they’ve seen the program page, but it’s far, far below average in terms of traffic to the program page. Therefore we should try to figure out why it’s receiving less traffic than the other program pages, and whether there are ways we can increase the amount of qualified traffic to it.
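The out-of-state example above is really a segmented funnel analysis. Once you’ve pulled the per-stage numbers out of analytics, the arithmetic is simple; here’s a sketch with hypothetical raw counts chosen to match the percentages in that example:

```ts
// Sketch: stage-by-stage segment shares for the out-of-state funnel.
// The raw counts are hypothetical, chosen to match the example's percentages.
interface Stage { name: string; all: number; outOfState: number }

const funnel: Stage[] = [
  { name: 'External users',       all: 200000, outOfState: 28000 },
  { name: 'Admissions pageviews', all: 60000,  outOfState: 9000 },
  { name: 'Application starts',   all: 4000,   outOfState: 120 },
];

for (const stage of funnel) {
  const share = (100 * stage.outOfState / stage.all).toFixed(0);
  console.log(`${stage.name}: ${share}% out-of-state`);
}
// Output: 14%, 15%, 3%. The share collapses at the application step,
// which is exactly where to focus further research.
```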
Put This Into Practice: How to start using analytics to identify problem areas
- Based on your goals and/or the aspects and sections of the site you’ve prioritized, look in your analytics to see if there are:
- Steps in your conversion funnel (not necessarily a step-by-step sequence of pages) where users, or certain segments of users, are dropping off at high rates.
- Certain things that are performing worse than others (e.g., pages, devices, browsers, or locations that are performing worse than other comparable pages, devices, browsers, or locations).
3. Detecting Unexpected Changes
What is web analytics good for? It’s good for detecting unexpected changes.
As much as they might like to be, generals can’t be in control of everything. There are, of course, enemies trying to defeat you; mother nature, who doesn’t give a crap about your stupid war; and the unpredictability of your own troops.
A general has to be able to adapt quickly when stuff happens — a surprise attack, an October blizzard, or a suddenly underperforming division.
And of course, how can you adapt if you don’t know that these things have happened? You need to have a system of alerts and regular reporting to inform you of what’s going on.
The same, of course, is true for your website.
Insight Pattern
These kinds of insights typically follow this pattern:
We noticed that x changed, therefore we should try to either amplify or mitigate this change, and if nothing else, learn from it.
Examples
- Referral traffic and conversions from our parent site were up 50% year-over-year in the first quarter. Why? Largely because one page on their site was redesigned and a bunch of links to our site were added. Therefore we should:
- Thank them and give them positive feedback by showing the effect it had on our site. Hopefully this will encourage other similar behaviors.
- Learn from it ourselves — hmm, so when you add a bunch of links to a page, user behavior changes, and in this case it changed in this way.
- Organic traffic was down 15% year-over-year in the third quarter. Why? After investigating further we found that part of the reason is that some important content wasn’t migrated over to the new site that we launched at the beginning of the quarter. Therefore we should migrate that content over to the new site and 301 redirect the URLs of the old content to their new equivalent URLs. We should also learn from this mistake so we don’t do it again — we should put in place a more rigorous process to ensure we don’t miss any important content when doing content migrations in the future.
- Information requests are down 60% this week compared to last week. After investigating further, it appears that a prominent link to the form was mistakenly broken when some content was rewritten. Therefore, we should fix that link, give our content managers’ wrists a gentle slap, and work to improve our QA process.
- The conversion rate of users who have viewed our FAQs page was down 30% year-over-year in February. Therefore, we should investigate why. Has our content on that page changed? Are there questions that users have this year that they didn’t have last year? Consider running a VoC survey on that page to see what it is customers are looking for there.
- Referral traffic from third-party sites is up. Therefore, we should investigate the individual sources from which traffic increased in order to learn something about the way outsiders view our website and its programs. In what context are they linking to us? Why do they find our site valuable? Are there other similar sites that aren’t linking to us that could be? Should we be linking back to them? Are we being talked about in a way that is worth repeating on our own site?
Put This Into Practice: How to start using analytics to detect unexpected changes
- Set up custom alerts in Google Analytics for your most important conversion goals, overall traffic, and any other key metrics for your site so that whenever those numbers change dramatically and suddenly, you are notified within a day. (A home-grown version of the same idea is sketched after this list.)
- Start doing periodic reporting and analysis. This will help you notice larger, more gradual changes in things like where traffic is coming from, how engaged it is with your content, and whether it’s converting.
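Google Analytics’ custom alerts cover the sudden swings; if you also export daily numbers yourself (via the reporting API or a CSV export), the same idea takes only a few lines. A sketch, assuming you’ve already pulled a recent history of daily sessions from your own export:

```ts
// Sketch: flag a day whose metric falls far outside the recent mean.
// `history` would come from your own export of daily sessions; the
// three-standard-deviation threshold is an arbitrary starting point.
function isAnomalous(history: number[], today: number, sdThreshold = 3): boolean {
  const mean = history.reduce((sum, v) => sum + v, 0) / history.length;
  const variance =
    history.reduce((sum, v) => sum + (v - mean) ** 2, 0) / history.length;
  return Math.abs(today - mean) > sdThreshold * Math.sqrt(variance);
}

// e.g., if isAnomalous(last30Days, todaysSessions) is true, email yourself
```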
4. Getting Feedback on Changes
What is web analytics good for? It’s good for getting feedback.
Generals might not be in control of everything, but they are in control of some things. And for those things, it’d be nice to know whether their efforts are helping or hurting the cause. Did fortifying that position deter the enemy? Did sending books to the front lines improve morale? Did employing new technology save lives?
In this case what generals need is some way of getting feedback.
Similarly, a website owner needs to get feedback on their decisions. Did redesigning the entire website help to increase enrollment? Did changing the color of that button increase clicks? Did enabling caching improve the site speed? Did rewriting that content lead to increased engagement?
Insight Pattern
These kinds of insights typically follow this pattern:
When we did x, y changed — therefore we should try to either amplify or mitigate this change, and if nothing else, learn from it.
Examples
- We launched a new website. Since launch, the click-through-rate of the main call-to-action button on our program page is up 77%. Therefore, we should:
- Give the people who designed this page a raise, or at least a firm milkshake handshake.
- Determine exactly why the new version of the page was better. What was different about the new page? We have some clues, but to know for sure it may be necessary to run some tests. Then we should use the insights gained from this to inform future work — could we make the effect even more pronounced on this page? Are there principles we learned that we can apply to other pages?
- In order to try to increase information requests, we added a call-to-action button to our academic program pages that points to our request info page. Since then, overall views of the request info page are up 230% year-over-year. Therefore it’s probably safe to say that this change worked, although to be sure we should confirm 1) that there weren’t any confounding factors, and 2) that actual completions of the form increased.
- We launched a new website. About 81% of mobile users on the old site weren’t even viewing the mobile site — they were only seeing the desktop site. That means that about 81% of mobile users weren’t getting an experience that was optimized for their device. The conversion rate for these users was only 1.64%. This resulted in an overall 2.31% conversion rate for sessions on mobile devices. But on the new responsive site, 100% of mobile users were seeing a mobile-optimized site and the conversion rate for sessions on mobile devices was a much higher 3.81%. The effect that this increase had on the overall conversion rate was substantial — had 81% of mobile users continued to see an unoptimized site and converted at a rate of 1.64%, the overall conversion rate (mobile, desktop, and tablet) would have been about 2.89% instead of the 3.28% that it actually was. That means that we can estimate that this improvement alone increased the conversion rate by about 0.39 percentage points (3.28%-2.89%). Therefore we should translate that increase into the bottom line for our organization and relay that success back to project stakeholders. Raises or budget increases may be in order. (This counterfactual arithmetic is worked in code after these examples.)
- We launched a new website. On the old site the conversion rate for mobile users who viewed the mobile site was 5.26%. So far on the new responsive site the conversion rate for mobile users is 3.81%. That is a sizable drop. So what impact did this drop have on the overall conversion rate? Well, because the number of mobile users who actually viewed the old mobile site was relatively small (only 19% of mobile users), it didn’t have a huge impact, but it did have some — had 19% of mobile users continued to use the old mobile site and converted at a rate of 5.26% instead of using the new responsive site (and converting at a rate of 3.81%), the overall conversion rate would have been about 3.34% instead of the 3.28% that it actually was. That means that we can estimate that this change alone decreased the overall conversion rate by about 0.06 percentage points (3.28%-3.34%). Therefore, we shouldn’t get too cocky. There’s still room for improvement. We should investigate further to see if there are clear reasons why the new responsive site isn’t performing as well for mobile users as the old mobile site was.
- We launched a new website yesterday. We’ve already noticed a 10-fold increase in 404 errors since then. Therefore, we should investigate and fix the most important ones. There may have been important pages on the old site that weren’t migrated over to the new site or were migrated over, but weren’t redirected.
- We launched a new website. One of the changes we made was to make the links geared towards internal users less prominent (e.g. email and online class access). As a result, clicks on these links are down significantly since launch. Therefore, we should ask: if fewer internal users are using the website to access those services, how are they accessing them? Did we make the experience worse? We should somehow ask some internal users to find out. We may need to revise our placement of those links.
- We launched a new website last year. Since then, there has been a 119% year-over-year increase in application starts. Therefore we should:
- Investigate whether that increase translated into an increase in completed applications, and ultimately in enrollment. If it didn’t, then we should figure out why. If it did, then that positive feedback should be relayed back to those who were responsible for the new site.
- Investigate why this increase might have happened so that improvements can be amplified and lessons can be learned.
- We launched a new website last year. Since then, there has been a 40% year-over-year drop in information requests. The primary reason for the drop was that the number of people who even reached the form dropped by 56%. And the biggest reason for that drop was the drop in the click-through rate to the request info form from the main admissions page from 4.5% to 1.5%. This decrease in CTR resulted in about 1,600 fewer pageviews of the request info form over Nov 14 – Feb 28 than would have been expected given the CTR of the old admissions page. Part of the reason users may not be clicking through to the request info form as much is that they may instead be using the contact info (email addresses and phone numbers) for admissions that is now displayed on the main admissions page (it wasn’t on the old version of the page) — there were a significant number of clicks on the mailto links on this page. Therefore we should ask:
- Is that what’s actually happening? Are form submissions being replaced by direct calls and emails? Or is admissions noticing a drop in the overall number of students who are contacting them?
- And is it ok that RFIs are down? Would we prefer users to fill out the request info form instead of calling or emailing admissions directly?
- And if there is indeed a problem (overall contacts are down; or they’re not, but RFI form submissions are, and we’d prefer those to direct contacts), then we should investigate ways to remedy the problem. Some user or A/B tests may be in order.
- Last month we added more prominent links on the department pages to the individual major pages, in order to increase the traffic to the major pages. (We did this because most of the specific degree information that prospective students are looking for is on the major pages.) Since then, the traffic to these two types of pages has changed, albeit just a little:
- Pageviews of the major pages are up from 3.45% of all pageviews prior to the changes to 4.11% since, while pageviews of the department pages are down from 9.53% prior to 8.16% since.
- Entrances to the major pages are up from 0.36% of all entrances prior to the changes to 0.42% since, while entrances on the department pages are down from 9.34% prior to 8.19% since.
- So it appears that there is a bit of a transition going on — more traffic is going to the major pages while less is going to the department pages. But the change hasn’t been dramatic, at least not so far. Therefore, we should feel like the changes we made are having some effect, but we should continue to review the overall user experience of exploring majors. We should also see whether that additional traffic to the major pages has had any impact on the bottom line — has it increased the conversion rate at all?
- We launched a new website last month and as part of the redesign we wanted to increase traffic to our really neat (and expensive) virtual campus map. So one thing we did was move the link to it from inside a “quick links” dropdown in the header to the footer. Since then there has been a 130% year-over-year increase in pageviews of the virtual campus map. Part of the reason was an increase in entrances on the map (+77%), but that doesn’t account for all of it — non-entrance unique pageviews of the map increased by 142%. So it appears that moving this link had an effect on how often it was clicked. Therefore we shouldn’t get rid of the virtual campus map, at least not yet. If we haven’t already, we should try to figure out what impact it’s having on conversions and whether, even with the additional traffic to it, it’s worth the expense.
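For the curious, here’s the counterfactual arithmetic from the responsive-site examples above, worked in code. The examples don’t state what share of all sessions came from mobile, so the 26% below is an assumed value chosen to reproduce the 2.89% figure:

```ts
// Worked sketch of the responsive-site counterfactual. The mobile share
// of all sessions is an ASSUMED value (not stated in the examples above),
// picked so the output matches the 2.89% counterfactual rate.
const mobileShare = 0.26;      // assumed share of all sessions on mobile
const oldMobileRate = 0.0231;  // old site's blended mobile conversion rate
const newMobileRate = 0.0381;  // new responsive site's mobile conversion rate
const actualOverall = 0.0328;  // observed overall conversion rate

// Subtract mobile's improvement (weighted by its share of sessions) from
// the overall rate to estimate what the rate would have been without it.
const counterfactual =
  actualOverall - mobileShare * (newMobileRate - oldMobileRate);
console.log((counterfactual * 100).toFixed(2) + '%'); // 2.89%, i.e., a ~0.39-point lift
```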
Caveats
Using analytics to get feedback on changes does have one serious limitation: it can only show correlation; it can’t prove causation. If you’re trying to judge the impact of a change, in order to establish causality you need to run a randomized controlled experiment, and that’s usually not what’s happening when you’re analyzing changes in your website engagement over time. So whenever you’re attempting to determine causality using analytics, you need to:
- Be wary about making definitive statements of causality. As large and obvious as the effect might be, you don’t really know that such-and-such was the reason. It could be random, or there could be confounding factors.
- Do your best to control for other factors after the fact. So your conversion rate increased post-launch. But besides launching a new site, what else changed? The time of year? The sources of your traffic? The types of users coming to the site? To have a better idea whether the conversion rate increase really was due to the new site, you’ll need to check whether it persists after controlling for these factors and others. It may be that a change in one of these other factors happened to coincide with the launch of the new site, and if so, it may be that that, not your new site, was the reason for the increase in conversion rate.
- Run actual randomized experiments. If you really want to know, test it. Run some A/B tests or multivariate tests. (A quick significance check for such a test is sketched below.)
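If you do run such a test, the standard quick check of whether a difference in conversion rates is bigger than chance is a two-proportion z-test. A minimal sketch with hypothetical numbers:

```ts
// Sketch: two-proportion z-test for an A/B test. conv = conversions and
// n = sessions per variant; |z| > 1.96 roughly means significant at the
// 5% level (for a two-sided test).
function twoProportionZ(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se;
}

// Hypothetical results: 3.6% vs. 4.8% conversion over 5,000 sessions each.
console.log(twoProportionZ(180, 5000, 240, 5000).toFixed(2)); // ≈ 2.99
```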
Put This Into Practice: How to start using analytics to get feedback
- First make sure you have good baseline data that you can compare your post-change data to. If you don’t have that baseline data, start gathering it as soon as possible (but not before doing some planning of course :)).
- Make analytics a part of your normal website management process — whether you’re just making a small change or whether you’re redesigning an entire site, form hypotheses about what effects you think your changes will have on the key metrics that you want to improve. Then, after you’ve made the change, check analytics to see if your hypotheses were correct. Remember to use controlled experiments if you want proof of causality.
5. Forecasting
What is web analytics good for? It’s good for forecasting.
Experienced generals know what it takes to be victorious. They can more accurately predict the outcome of an upcoming battle based on their knowledge of how similar battles turned out in the past.
Website owners can do the same thing using analytics. They can better anticipate the future because they have a good understanding of the past. They can use forecasts to plan budgets and timelines, and as benchmarks against which to measure future performance.
Insight Pattern
These kinds of insights typically follow this pattern:
Based on x results from past experiences, we expect y results this time — therefore we can plan for y results and as time goes on we can compare our actual results to y, which will help us understand how things have changed. We can then amplify the good changes, mitigate the bad changes, and learn from them no matter what.
Examples
- In past lead generation marketing campaigns, we’ve gotten certain levels of conversions from certain sources at a certain pace based on seasonality and spend amounts. So, based on that information along with this year’s marketing plan and budget, we can project how many leads we’ll get from where and when over the course of this year’s campaign.
- Therefore we can estimate the number of people we’ll need to follow up with the leads that are generated.
- Therefore, as the campaign progresses we’ll be able to compare its performance to our projections to gauge how well it’s performing. If current performance varies from past performance, we can then investigate why, amplify positive developments, mitigate negative ones, and generally learn from what’s happening. Having accurate projections allows us to press the panic button when we should, but also to not press it when we shouldn’t.
- Over the past year we’ve gotten an average of 183,000 pageviews per month, with a high of 268,000 in March.
- Therefore we’ll have to pay $x for a hosting service that can handle that amount of traffic.
- Therefore we’ll have to pay $y to use a particular font on the site.
- Based on the last 5 years of data, we’ve found that traffic to our program pages correlates fairly closely with enrollment in those programs (confirm this yourself for your institution — it may not be true for you). Therefore, at a certain point in the upcoming year we can look at the traffic so far to the program pages and get a rough sense of whether there will be any large changes in enrollment in certain programs. If certain programs show a significant bump in traffic compared to last year, we can take this information to our admissions teams and the individual departments so they can better prepare for the additional volume. On the other hand, if certain programs are trending below average, we can put extra effort into promoting those programs to help avoid an enrollment gap. Similar things could be done for international or out-of-state traffic, or for different markers of engagement. (A sketch of this kind of correlation check follows these examples.)
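Confirming the traffic-to-enrollment correlation in that last example is straightforward once you’ve assembled per-cycle numbers. A sketch with made-up data:

```ts
// Sketch: Pearson correlation between program-page traffic and enrollment
// across past cycles. All numbers are hypothetical; a consistently strong
// r across programs is what would justify treating traffic as a leading indicator.
function pearson(xs: number[], ys: number[]): number {
  const mean = (a: number[]) => a.reduce((sum, v) => sum + v, 0) / a.length;
  const mx = mean(xs);
  const my = mean(ys);
  let cov = 0, vx = 0, vy = 0;
  for (let i = 0; i < xs.length; i++) {
    cov += (xs[i] - mx) * (ys[i] - my);
    vx += (xs[i] - mx) ** 2;
    vy += (ys[i] - my) ** 2;
  }
  return cov / Math.sqrt(vx * vy);
}

const pageviews  = [12400, 13100, 15800, 14900, 17200]; // last 5 cycles
const enrollment = [118, 121, 149, 140, 163];
console.log(pearson(pageviews, enrollment).toFixed(2)); // ≈ 1.00 (r ≈ 0.997) for this made-up data
```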
Caveats
- One caveat to keep in mind with regard to using analytics for forecasting is that in order to make accurate, precise, and useful predictions, you have to have good historical data covering multiple cycles. For higher-ed institutions that are trying to drive enrollment, that usually means you need at least one, if not two, full years of good data.
- Another thing to keep in mind is that there have to be enough similarities between previous cycles and this cycle. If your entire marketing plan is different from last year’s, specific forecasts using that historical data might be useless.
- And lastly, you of course need to be aware of larger market or environmental trends. If there is evidence that prospective students in general are going to school websites less, then you should account for that if you try to project enrollment based on traffic or leads.
Put This Into Practice: How to start using analytics to forecast
- Again, the first thing you need to have is good historical data — in this case at least one cycle’s worth.
- Once you have that you need to analyze it to see what kind of correlations there are from year-to-year and across factors like season and marketing efforts.
- If those correlations are strong enough you can begin making projections for the upcoming cycle.
Conclusion
So what is web analytics good for? Well, I’m sure I don’t know everything it’s good for, but I do know it’s good for prioritizing, identifying problem areas, detecting unexpected changes, getting feedback, and forecasting.
Is it good for anything else? You tell me. I’d love to hear from you.