Tag Archive | "Aren’t"

When Bounce Rate, Browse Rate (PPV), and Time-on-Site Are Useful Metrics… and When They Aren’t – Whiteboard Friday

Posted by randfish

When is it right to use metrics like bounce rate, pages per visit, and time on site? When are you better off ignoring them? There are endless opinions on whether these kinds of metrics are valuable or not, and as you might suspect, the answer is found in the shades of grey. Learn what Rand has to say about the great metrics debate in today’s episode of Whiteboard Friday.

When bounce rate, browse rate (PPV), and time on site are useful metrics and when they suck

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about times at which bounce rate, browse rate, which is pages per visit, and time on site are terrible metrics and when they’re actually quite useful metrics.

This happens quite a bit. I see in the digital marketing world people talking about these metrics as though they are either dirty-scum, bottom-of-the-barrel metrics that no one should pay any attention to, or that they are these lofty, perfect metrics that are what we should be optimizing for. Neither of those is really accurate. As is often the case, the truth usually lies somewhere in between.

So, first off, some credit to Wil Reynolds, who brought this up during a discussion that I had with him at Siege Media’s offices, an interview that Ross Hudgens put together with us, and Sayf Sharif from Seer Interactive, their Director of Analytics, who left an awesome comment about this discussion on the LinkedIn post of that video. We’ll link to those in this Whiteboard Friday.

So Sayf and Wil were both basically arguing that these are kind of crap metrics. We don’t trust them. We don’t use them a lot. I think, a lot of the time, that makes sense.

Instances when these metrics aren’t useful

Here’s when these metrics (bounce rate, pages per visit, and time on site) kind of suck.

1. When they’re used instead of conversion actions to represent “success”

So they suck when you use them instead of conversion actions. A conversion means someone took an action that I wanted on my website. They filled in a form. They purchased a product. They put in their credit card. Whatever it is, they got to a page that I wanted them to get to.

Bounce rate is basically the percentage of people who landed on a page and then left your website without continuing on to any other page on that site.

Pages per visit is essentially exactly what it sounds like: the average number of pages per visit for people who landed on that particular page. So for people who came in through one of these pages, how many pages did they visit on my site?

Then time on site is essentially a very raw and rough metric. If I leave my computer to use the restroom or I basically switch to another tab or close my browser, it’s not necessarily the case that time on site ends right then. So this metric has a lot of imperfections. Now, averaged over time, it can still be directionally interesting.
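For concreteness, here's a minimal sketch of how these three metrics are typically derived from raw hit data. The session IDs, pages, and timestamps are invented, and real analytics packages layer on sessionization rules (timeouts, midnight cutoffs) that this ignores:

```python
from collections import defaultdict

# Hypothetical raw hits: (session_id, page, seconds since session start)
hits = [
    ("s1", "/landing", 0),                                        # a bounce
    ("s2", "/landing", 0), ("s2", "/pricing", 40), ("s2", "/signup", 90),
    ("s3", "/landing", 0), ("s3", "/blog", 25),
]

sessions = defaultdict(list)
for session_id, page, ts in hits:
    sessions[session_id].append(ts)

# Bounce rate: share of sessions that saw exactly one page
bounce_rate = sum(1 for s in sessions.values() if len(s) == 1) / len(sessions)

# Pages per visit: average number of hits per session
pages_per_visit = sum(len(s) for s in sessions.values()) / len(sessions)

# Time on site: last hit minus first hit, averaged. Note the flaw mentioned
# above: a one-page session contributes 0 seconds, no matter how long the
# visitor actually read the page.
avg_time_on_site = sum(max(s) - min(s) for s in sessions.values()) / len(sessions)

print(bounce_rate, pages_per_visit, avg_time_on_site)
```

The zero-second contribution of session `s1` is exactly why time on site is a raw and rough metric: the last page of every visit, and the only page of a bounced visit, records no dwell time at all.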

But when you use these instead of conversion actions, which is what we all should be optimizing for ultimately, you can definitely get into some suckage with these metrics.

2. When they’re compared against non-relevant “competitors” and other sites

When you compare them against non-relevant competitors, so when you compare, for example, a product-focused, purchase-focused site against a media-focused site, you’re going to get big differences. First off, if your pages per visit look like a media site’s pages per visit and you’re product-focused, that is crazy. Either the media site is terrible or you’re doing something absolutely amazing in terms of keeping people’s attention and energy.

Time on site is a little bit misleading in this case too, because if you look at the time on site, again, of a media property or a news-focused, content-focused site versus one that’s very e-commerce focused, you’re going to get vastly different things. Amazon probably wants your time on site to be pretty small. Dell wants your time on site to be pretty small. Get through the purchase process, find the computer you want, buy it, get out of here. If you’re taking 10 minutes to do that or 20 minutes to do that instead of 5, we’ve failed. We haven’t provided a good enough experience to get you quickly through the purchase funnel. That can certainly be the case. So there can be warring priorities inside even one of these metrics.

3. When they’re not considered over time or with traffic sources factored in

Third, you get some suckage when they are not considered over time or against the traffic sources that brought them in. For example, if someone visits a web page via a Twitter link, chances are really good, really, really good, especially on mobile, that they’re going to have a high bounce rate, a low number of pages per visit, and a low time on site. That’s just how Twitter behavior is. Facebook is quite similar.

Now, if they've come via an informational Google search and they've clicked on an organic listing, you should see just the reverse: a relatively lower bounce rate, a relatively higher pages per visit, and a relatively higher time on site.
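A tiny sketch of what factoring in traffic sources looks like in practice. The sources and page counts below are invented; the point is that a blended bounce rate (0.5 across all six sessions here) hides two very different behaviors:

```python
# Hypothetical sessions tagged with the source that brought them in:
# (source, pages viewed in that session)
sessions = [
    ("twitter", 1), ("twitter", 1), ("twitter", 2),
    ("google_organic", 1), ("google_organic", 4), ("google_organic", 3),
]

by_source = {}
for source, pages in sessions:
    by_source.setdefault(source, []).append(pages)

# Bounce rate per source: share of one-page visits
bounce_by_source = {
    src: sum(1 for p in visits if p == 1) / len(visits)
    for src, visits in by_source.items()
}
# Twitter bounces far more often than organic search in this sample
print(bounce_by_source)
```

Comparing the Twitter segment against the organic-search segment, rather than staring at the site-wide number, is what keeps the metric from being misleading.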

Instances when these metrics are useful

1. When they’re used as diagnostics for the conversion funnel

So there's complexity inside these metrics, for sure. When these metrics are truly useful is when they're used as a diagnostic. You look at a conversion funnel and you see: our conversion funnel looks like this. People come in through the homepage or through our blog or news sections, and eventually, we hope, they make it to our product page, our pricing page, and our conversion page.

We have these metrics for all of these. When we make changes to some of these, significant changes, minor changes, we don’t just look at how conversion performs. We also look at whether things like time on site shrank or whether people had fewer pages per visit or whether they had a higher bounce rate from some of these sections.

So perhaps, for example, we changed our pricing and we actually saw that people spent less time on the pricing page and had about the same number of pages per visit and about the same bounce rate from the pricing page. At the same time, we saw conversions dip a little bit.

Should we intuit that pricing negatively affected our conversion rate? Well, perhaps not. Perhaps we should look and see whether other changes were made or whether our traffic sources shifted, because given that bounce rate didn't increase, pages per visit didn't really change, and time on site only went down a little bit, it seems like people are making it just fine through the pricing page and on to the conversion page. So let's look at something else.
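This before/after reasoning can be mechanized into a simple check. Everything here is hypothetical (the metric names, the numbers, the 10% threshold); the idea is just to flag which diagnostics actually moved after a change:

```python
def diagnose(before, after, threshold=0.10):
    """Flag metrics whose relative change after a site change exceeds `threshold`."""
    flags = {}
    for metric, old in before.items():
        change = (after[metric] - old) / old
        if abs(change) > threshold:
            flags[metric] = change
    return flags

# Hypothetical pricing-page metrics before and after a pricing change
before = {"time_on_page": 80, "pages_per_visit": 2.1,
          "bounce_rate": 0.40, "conversion_rate": 0.031}
after  = {"time_on_page": 65, "pages_per_visit": 2.0,
          "bounce_rate": 0.41, "conversion_rate": 0.028}

print(diagnose(before, after))
```

In this made-up data, only time on page moved meaningfully while bounce rate and pages per visit held steady, which mirrors the conclusion above: people are still getting through the pricing page fine, so look elsewhere for the conversion dip.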

This is the type of diagnostics that you can do when you have metrics at these levels. If you’ve seen a dip in conversions or a rise, this is exactly the kind of dig into the data that smart, savvy digital marketers should and can be doing, and I think it’s a powerful, useful tool to be able to form hypotheses based on what happens.

So again, another example, did we change this product page? We saw pages per visit shrink and time on site shrink. Did it affect conversion rate? If it didn’t, but then we see that we’re getting fewer engaged visitors, and so now we can’t do as much retargeting and we’re losing email signups, maybe this did have a negative effect and we should go back to the other one, even if conversion rate itself didn’t seem to take a particular hit in this case.

2. When they’re compared over time to see if internal changes or external forces shifted behavior

The second useful way to apply these metrics is comparing them over time to see whether your internal changes or some external forces shifted behavior. For example, we can look at the engagement rate on the blog. It's tough to tie the blog to a conversion event. We could maybe look at subscriptions, but in general, pages per visit is a nice metric for the blog. It tells us whether people make it past the page they landed on and into deeper sections, stick around our site, and check out what we do.

So if we see that it had a dramatic fall down here in April and that was when we installed a new author and now they’re sort of recovering, we can say, “Oh, yeah, you know what? That takes a little while for a new blog author to kind of come up to speed. We’re going to give them time,” or, “Hey, we should interject here. We need to jump in and try and fix whatever is going on.”

3. When they’re benchmarked versus relevant industry competitors

Third and final useful case is when you benchmark versus truly relevant industry competitors. So if you have a direct competitor, very similar focus to you, product-focused in this case with a homepage and then some content sections and then a very focused product checkout, you could look at you versus them and their homepage and your homepage.

If you could get the data from a source like SimilarWeb or Jumpshot, if there’s enough clickstream level data, or some savvy industry surveys that collect this information, and you see that you’re significantly higher, you might then take a look at what are they doing that we’re not doing. Maybe we should use them when we do our user research and say, “Hey, what’s compelling to you about this that maybe is missing here?”

Otherwise, a lot of the time people will take direct competitors and say, “Hey, let’s look at what our competition is doing and we’ll consider that best practice.” But if you haven’t looked at how they’re performing, how people are getting through, whether they’re engaging, whether they’re spending time on that site, whether they’re making it through their different pages, you don’t know if they actually are best practices or whether you’re about to follow a laggard’s example and potentially hurt yourself.

So definitely a complex topic, definitely many, many different things that go into the uses of these metrics, and there are some bad and good ways to use them. I agree with Sayf and with Wil, but I think there are also some great ways to apply them. I would love to hear from you if you’ve got examples of those down in the comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Google: Three Reasons Your Rich Snippets Aren’t Showing In Search

An SEO asked Google’s John Mueller why his rich cards (rich snippets) aren’t showing up in search. John responded with three possible reasons over Twitter…


Search Engine Roundtable


Google: Those Spammy Links Aren’t Benefiting Your Competitors

I often see SEOs, webmasters and site owners complaining that their competitors are building spammy links, buying links, doing PBNs and more and yet their competitor ranks well and is doing excellent in the Google results…


Search Engine Roundtable


Aren’t 301s, 302s, and Canonicals All Basically the Same? – Whiteboard Friday

Posted by Dr-Pete

They say history repeats itself. In the case of the great 301 vs 302 vs rel=canonical debate, it repeats itself about every three months. In today’s Whiteboard Friday, Dr. Pete explains how bots and humans experience pages differently depending on which solution you use, why it matters, and how each choice may be treated by Google.

Aren't 301s, 302s, and canonicals all basically the same?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans, it’s Dr. Pete, your friendly neighborhood marketing scientist here at Moz, and I want to talk today about an issue that comes up probably about every three months since the beginning of SEO history. It’s a question that looks something like this: Aren’t 301s, 302s, and canonicals all basically the same?

So if you’re busy and you need the short answer, it’s, “No, they’re not.” But you may want the more nuanced approach. This popped up again about a week [month] ago, because John Mueller on the Webmaster Team at Google had posted about redirection for secure sites, and in it someone had said, “Oh, wait, 302s don’t pass PageRank.”

John said, “No. That’s a myth. It’s incorrect that 302s don’t pass PR,” which is a very short answer to a very long, technical question. So SEOs, of course, jumped on that, and it turned into, “301s and 302s are the same, cats are dogs, cakes are pie, up is down.” We all did our freakout that happens four times a year.

So I want to get into why this is a difficult question, why these things are important, why they are different, and why they’re different not just from a technical SEO perspective, but from the intent and why that matters.

I’ve talked to John a little bit. I’m not going to put words in his mouth, but I think 95% of this will be approved, and if you want to ask him, that’s okay afterwards too.

Why is this such a difficult question?

So let’s talk a little bit about classic 301, 302. So a 301 redirect situation is what we call a permanent redirect. What we’re trying to accomplish is something like this. We have an old URL, URL A, and let’s say for example a couple years ago Moz moved our entire site from seomoz.org to moz.com. That was a permanent change, and so we wanted to tell Google two things and all bots and browsers:

  1. First of all, send people to the new URL, and,
  2. second, pass all the signals: all the equity, PR, ranking signals, authority, whatever you want to call them, should go to the new page as well.

So people and bots should both end up on this new page.

A classic 302 situation is something like a one-day sale. So what we’re saying is for some reason we have this main page with the product. We can’t put the sale information on that page. We need a new URL. Maybe it’s our CMS, maybe it’s a political thing, doesn’t matter. So we want to do a 302, a temporary redirect that says, “Hey, you know what? All the signals, all the ranking signals, the PR, for Google’s sake keep the old page. That’s the main one. But send people to this other page just for a couple of days, and then we’re going to take that away.”

So these do two different things. One of these tells the bots, “Hey, this is the new home,” and the other one tells it, “Hey, stick around here. This is going to come back, but we want people to see the new thing.”
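On the wire, the only difference between these two signals is the status line; the Location header sends people to the new URL either way, and the status code is what tells bots whether the move is permanent. A minimal sketch (the sale URL is hypothetical):

```python
def redirect_response(location: str, permanent: bool) -> str:
    """Build a minimal HTTP/1.1 redirect. 301 = permanent move, 302 = temporary."""
    status = "301 Moved Permanently" if permanent else "302 Found"
    return f"HTTP/1.1 {status}\r\nLocation: {location}\r\n\r\n"

# Permanent: the seomoz.org -> moz.com style migration
print(redirect_response("https://moz.com/", permanent=True))

# Temporary: a one-day sale page that will go away again
print(redirect_response("https://example.com/sale", permanent=False))
```

Because the human-visible behavior is identical, it's easy for devs to ship whichever one the CMS defaults to, which is exactly how the mixed-signal messes described below happen.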

So I think sometimes Google interprets our meaning and can change things around, and we get frustrated because we go, “Why are they doing that? Why don’t they just listen to our signals?”

Why are these differentiations important?

The problem is this. In the real world, we end up with things like this, we have page W that 301s to page T that 302s to page F and page F rel=canonicals back to page W, and Google reads this and says, “W, T, F.” What do we do?

We sent bad signals. We’ve done something that just doesn’t make sense, and Google is forced to interpret us, and that’s a very difficult thing. We do a lot of strange things. We’ll set up 302s because that’s what’s in our CMS, that’s what’s easy in an Apache rewrite file. We forget to change it to a 301. Our devs don’t know the difference, and so we end up with a lot of ambiguous situations, a lot of mixed signals, and Google is trying to help us. Sometimes they don’t help us very well, but they just run into these problems a lot.

In this case, the bots have no idea where to go. The people are going to end up on that last page, but the bots are going to have to choose, and they’re probably going to choose badly because our intent isn’t clear.

How are 301s, 302s, and rel=canonical different?

So there are a couple situations I want to cover, because I think they’re fairly common and I want to show that this is complex. Google can interpret, but there are some reasons and there’s some rhyme or reason.

1. Long-term 302s may be treated as 301s.

So the first one is that long-term 302s are probably going to be treated as 301s. They don’t make any sense. If you set up a 302 and you leave it for six months, Google is going to look at that and say, “You know what? I think you meant this to be permanent and you made a mistake. We’re going to pass ranking signals, and we’re going to send people to page B.” I think that generally makes sense.

Some types of 302s just don’t make sense at all. So if you’re migrating from non-secure to secure, from HTTP to HTTPS and you set up a 302, that’s a signal that doesn’t quite make sense. Why would you temporarily migrate? This is probably a permanent choice, and so in that case, and this is actually what John was addressing in this post originally, in that case Google is probably going to look at that and say, “You know what? I think you meant 301s here,” and they’re going to pass signals to the secure version. We know they prefer that anyway, so they’re going to make that choice for you.

If you’re confused about where the signals are going, then look at the page that’s ranking, because in most cases the page that Google chooses to rank is the one that’s getting the ranking signals. It’s the one that’s getting the PR and the authority.

So if you have a case like this, a 302, and you leave it up permanently and you start to see that Page B is the one that’s being indexed and ranking, then Page B is probably the one that’s getting the ranking signals. So Google has interpreted this as a 301. If you leave a 302 up for six months and you see that Google is still taking people to Page A, then Page A is probably where the ranking signals are going.

So that can give you an indicator of what their decision is. It’s a little hard to reverse that. But if you’ve left a 302 in place for six months, then I think you have to ask yourself, “What was my intent? What am I trying to accomplish here?”

Part of the problem with this is that when we ask this question, “Aren’t 302s, 301s, canonicals all basically the same?” what we’re really implying is, “Aren’t they the same for SEO?” I think this is a legitimate but very dangerous question, because, yes, we need to know how the signals are passed and, yes, Google may pass ranking signals through any of these things. But for people they’re very different, and this is important.

2. Rel=canonical is for bots, not people.

So I want to talk about rel=canonical briefly because rel=canonical is a bit different. We have Page A and Page B again, and we’re going to canonical from Page A to Page B. What we’re basically saying with this is, “Look, I want you, the bots, to consider Page B to be the main page. You know, for some reason I have to have these near duplicates. I have to have these other copies. But this is the main one. This is what I want to rank. But I want people to stay on Page A.”
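Unlike a redirect, rel=canonical lives inside the page that people stay on: it's a single link tag in the head that only bots act on. Here's a sketch of how a crawler might read it, using Python's standard-library HTML parser (the example.com URLs are hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of <link rel="canonical"> if the page declares one."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and attr.get("rel") == "canonical":
            self.canonical = attr.get("href")

# Page A keeps people on it but points bots at Page B:
page_a = ('<html><head>'
          '<link rel="canonical" href="https://example.com/page-b">'
          '</head><body>Near-duplicate content stays visible here.</body></html>')

finder = CanonicalFinder()
finder.feed(page_a)
print(finder.canonical)  # the URL bots should treat as the main page
```

A person loading Page A never sees any of this; only the crawler consolidates the signals onto Page B. That asymmetry is the whole point of the next paragraph.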

So this is entirely different from a 301 where I want people and bots to go to Page B. That’s different from a 302, where I’m going to try to keep the bots where they are, but send people over here.

So take it from a user perspective. People in Q&A ask me all the time, "Well, I've heard that rel=canonical passes ranking signals. Which should I choose? Should I choose that or a 301? What's better for SEO?"

That's true. We do think it generally passes ranking signals. But "which is better for SEO" is a bad question, because these are completely different user experiences: either you want people to stay on Page A or you want people to go to Page B.

Why this matters, both for bots and for people

So I just want you to keep in mind, when you look at these three things, it’s true that 302s can pass PR. But if you’re in a situation where you want a permanent redirect, you want people to go to Page B, you want bots to go to Page B, you want Page B to rank, use the right signal. Don’t confuse Google. They may make bad choices. Some of your 302s may be treated as 301s. It doesn’t make them the same, and a rel=canonical is a very, very different situation that essentially leaves people behind and sends bots ahead.

So keep in mind what your use case actually is, keep in mind what your goals are, and don't get over-focused on the ranking signals themselves or the SEO uses, because all of these three things have different purposes.

So I hope that makes sense. If you have any questions or comments or you’ve seen anything weird actually happen on Google, please let us know and I’ll be happy to address that. And until then, we’ll see you next week.



Moz Blog


Why the Links You’ve Built Aren’t Helping Your Page Rank Higher – Whiteboard Friday

Posted by randfish

Link building can be incredibly effective, but sometimes a lot of effort can go into earning links with absolutely no improvement in rankings. Why? In today’s Whiteboard Friday, Rand shows us four things we should look at in these cases, help us hone our link building skills and make the process more effective.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high resolution image in a new tab!

Why the Links You've Built to That Page Aren't Helping it Move up the Rankings Whiteboard

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about why link building sometimes fails.

So I've got an example here. I'm going to do a search for artificial sweeteners. Let's say I'm working for these guys, ScienceMag.org. Well, this is actually in position 10. I put it in position 3 here, but I see that I'm position 10. I think to myself, "Man, if I could get higher up on this page, that would be excellent. I've already produced the content. It's on my domain. Google seems to have indexed it fine. It's performing well enough to appear on page one, granted at the bottom of page one, for this competitive query. Now I want to move my rankings up."

So a lot of SEOs, naturally and historically, for a long time have thought, “I need to build more links to that page. If I can get more links pointing to this page, I can move up the rankings.” Granted, there are some other ways to do that too, and we’ve discussed those in previous Whiteboard Fridays. But links are one of the big ones that people use.

I think one of the challenges that we encounter is sometimes we invest that effort. We go through the process of that outreach campaign, talking to bloggers and other news sites and looking at where our link sources are coming from and trying to get some more of those. It just doesn’t seem to do anything. The link building appears to fail. It’s like, man, I’ve got all these nice links and no new results. I didn’t move up at all. I am basically staying where I am, or maybe I’m even falling down. Why is that? Why does link building sometimes work so well and so clearly and obviously, and sometimes it seems to do nothing at all?

What are some possible reasons link acquisition efforts may not be effective?

Oftentimes if you get a fresh set of eyes on it, an outside SEO perspective, they can do this audit, and they’ll walk through a lot of this stuff and help you realize, “Oh yeah, that’s probably why.” These are things that you might need to change strategically or tactically as you approach this problem. But you can do this yourself as well by looking at why a link building campaign, why a link building effort, for a particular page, might not be working.

1) Not the right links

First one, it’s not the right links. Not the right links, I mean a wide range of things, even broader than what I’ve listed here. But a lot of times that could mean low domain diversity. Yeah, you’re getting new links, but they’re coming from all the same places that you always get links from. Google, potentially, maybe views that as not particularly worthy of moving you up the rankings, especially around competitive queries.
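One rough way to eyeball the domain diversity point is to count unique linking domains against total links. The backlink URLs below are invented for illustration; real audits would pull this from a link index:

```python
from urllib.parse import urlparse

# Hypothetical backlink profile for the page being audited
backlinks = [
    "https://blog-a.example/post-1",
    "https://blog-a.example/post-2",
    "https://blog-a.example/post-3",
    "https://news-b.example/story",
    "https://forum-c.example/thread",
]

linking_domains = {urlparse(url).netloc for url in backlinks}
diversity = len(linking_domains) / len(backlinks)  # 1.0 = every link from a new domain

print(f"{len(linking_domains)} domains across {len(backlinks)} links "
      f"(diversity {diversity:.2f})")
```

A profile where new links keep arriving from the same few domains, like blog-a.example above, is exactly the pattern being described: lots of links, little fresh endorsement.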

It might be trustworthiness of source. So maybe they're saying, "Yeah, you got some links, but they're not from particularly trustworthy places." Tied into that, maybe we think, or we're sure, that they're not editorial. Maybe we think they're paid, or promotional in some way, rather than being truly editorially given by an independent source.

They might not come from a site or from a page that has the authority that’s necessary to move you up. Again, particularly for competitive queries, sometimes low-value links are just that. They’re not going to move the needle, especially not like they used to three, four, five or six years ago, where really just a large quantity of links, even from diverse domains, even if they were crappy links on crappy pages on relatively crappy or unknown websites would move the needle, not so much anymore. Google is seeing a lot more about these things.

Where else does the source link to? Is that source pointing to other stuff that is potentially looking manipulative to Google and so they discounted the outgoing links from that particular domain or those sites or those pages on those sites?

They might look at the relevance and say, "Hey, you know what? Yeah, you got linked to by some technology press articles. That doesn't really have anything to do with artificial sweeteners, this topic, this realm, or this region." So you're not getting the same result. Now, we've shown that off-topic links can oftentimes move the rankings, but health, in fact, may be one of those areas where Google is more topically sensitive to where the links are coming from than in other places.

Location on page. So I’ve got a page here and maybe all of my links are coming from a bunch of different domains, but it’s always in the right sidebar and it’s always in this little feed section. So Google’s saying, “Hey, that’s not really an editorial endorsement. That’s just them showing all the links that come through your particular blog feed or a subscription that they’ve got to your content or whatever it is promotionally pushing out. So we’re not going to count it that way.” Same thing a lot of times with footer links. Doesn’t work quite as well. If you’re being honest with yourself, you really want those in content links. Generally speaking, those tend to perform the best.

Or uniqueness. So they might look and they might say, “Yeah, you’ve got a ton of links from people who are republishing your same article and then just linking back to it. That doesn’t feel to us like an editorial endorsement, and so we’re just going to treat those copies as if those links didn’t exist at all.” But the links themselves may not actually be the problem. I think this can be a really important topic if you’re doing link acquisition auditing, because sometimes people get too focused on, “Oh, it must be something about the links that we’re getting.” That’s not always the case actually.

2) Not the right content

Sometimes it’s not the right content. So that could mean things like it’s temporally focused versus evergreen. So for different kinds of queries, Google interprets the intent of the searchers to be different. So it could be that when they see a search like “artificial sweeteners,” they say, “Yeah, it’s great that you wrote this piece about this recent research that came out. But you know what, we’re actually thinking that searchers are going to want in the top few results something that’s evergreen, that contains all the broad information that a searcher might need around this particular topic.”

That speaks to the next point: it might not answer searchers' questions. You might think, "Well, I'm answering a great question here." The problem is, yeah, you're answering one. Searchers may have many questions that they're asking around a topic, and Google is looking for something comprehensive, something that doesn't mean a searcher clicks your result and then says, "Well, that was interesting, but I need more from a different result." They're looking for the one true result, the one true answer that tells them, "Hey, searchers are very happy with these types of results."

It could be poor user experience causing people to bounce back. That could be speed things, UI things, layout things, browser support things, multi-device support things. It might not use language formatting or text that people or engines can interpret as on the topic. Perhaps this is way over people’s heads, far too scientifically focused, most searchers can’t understand the language, or the other way around. It’s a highly scientific search query and a very advanced search query and your language is way dumbed down. Google isn’t interpreting that as on-topic. All the Hummingbird and topic modeling kind of things that they have say this isn’t for them.

Or it might not match expectations of searchers. This is distinct and different from searchers' questions. So a searcher's question is, "I want to know how artificial sweeteners might affect me." Expectations might be, "I expect to learn this kind of information. I expect to find out these things." For example, if you go down a rabbit hole of "artificial sweeteners will make your skin shiny," they're like, "Well, that doesn't meet my expectations. I don't think that's right." Even if you have some data around that, it's not what they were expecting to find. They might bounce back. Engines might not interpret you as on-topic, etc. So lots of content kinds of things.

3) Not the right domain

Then there are also domain issues. You might not have the right domain. Your domain might not be associated with the topic or content that Google and searchers are expecting. So they see Mayo Clinic, they see MedicineNet, and they go, “ScienceMag? Do they do health information? I don’t think they do. I’m not sure if that’s an appropriate one.” It might be perceived, even if you aren’t, as spammy or manipulative by Google, more probably than by searchers. Or searchers just won’t click your brand for that content. This is a very frustrating one, because we have seen a ton of times when search behavior is biased by the brand itself, by what’s in this green text here, the domain name or the brand name that Google might show there. That’s very frustrating, but it means that you need to build brand affinity between that topic, that keyword, and what’s in searchers’ heads.

4) Accessibility or technical issues

Then finally, there could be some accessibility or technical issues. Usually when that’s the case, you will notice pretty easily because the page will have an error. It won’t show the content properly. The cache will be an issue. That’s a rare one, but you might want to check for it as well.

But hopefully, using this kind of an audit system, you can figure out why a link building campaign, a link building effort isn’t working to move the needle on your rankings.

With that, we will see you again next week for another edition of Whiteboard Friday. Take care.



Moz Blog



Why Visitor Analytics Aren’t Enough for Modern Marketers

Posted by randfish

For the first two decades of the web, the vast majority of those performing web marketing tasks used visitor analytics tools (from log files and hit counters all the way up to today’s full-featured visitor analytics tools) to do their jobs. We’d look at how many visits came in, where they were coming from, and what pages they saw, and that was enough.

But, web marketing has evolved. It’s become far more complex and competitive. And in 2013, visitor analytics alone doesn’t cut it.

The key challenges marketers face usually fall into one of three buckets:

  1. Measuring & reporting (and the analysis of those reports)
  2. Uncovering problem issues
  3. Identifying areas of opportunity

If we visualize these challenges, we can see the missing holes compared to the features of visitor analytics software:

(Note: this graphic isn’t meant to be an exhaustive list of metrics or of tools, and there’s plenty of overlap; e.g., Moz Analytics and Raven both track visit data, Mixpanel and KISSmetrics both measure revenue and usage, etc.)

It’s been my experience that most of the great web marketing teams have access to several tools that fill in the gaps on both sides of what visitor analytics provide. These marketers analyze how they’re doing in the leading indicator metrics against the competition, and follow that methodology (as far as possible) down to marketing KPIs, and finally business metrics.

Why does this matter so much?

Because a competitive web marketing world means we have less room for failure over a long period of time. If a tactic or channel isn’t succeeding, we have to know whether that’s because it’s a bad channel, or whether we’re just bad at it. Competitive comparisons are critical to getting that analysis right.

If your key competitors are kicking butt on Pinterest, but your CMO doesn’t “believe” in the channel, you need data to make the case. Likewise, if you’re attracting lots of converting visitors through Pinterest, but the lifetime value of those customers is 1/10th that of your email list based on your recidivism and amplification data, you need to know that, too. Google Analytics is great, but it can’t give you the answer to either of those questions, no matter how you customize it.
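The Pinterest-vs-email comparison above is simple arithmetic once you have revenue and customer counts per channel. Here is a sketch with invented numbers (none of these figures come from the post) showing how the “1/10th the lifetime value” conclusion falls out:

```python
def lifetime_value(total_revenue: float, customers: int) -> float:
    """Average revenue per acquired customer over the period measured."""
    return total_revenue / customers if customers else 0.0


channels = {
    # channel: (revenue from those customers to date, customers acquired)
    # All numbers are invented for illustration.
    "pinterest": (12_000.0, 400),  # lots of converting visitors...
    "email":     (90_000.0, 300),  # ...but far more valuable customers here
}

ltv = {name: lifetime_value(rev, n) for name, (rev, n) in channels.items()}
# Pinterest LTV = 30.0 per customer; email LTV = 300.0 per customer.
```

Visitor analytics alone would show Pinterest converting well; only when you join in revenue-over-time data does the 10x gap between the channels appear.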

Obviously, I’m biased. Moz makes marketing software that’s focused on comparing your leading indicator metrics against your competition’s (go read Matt’s Field Guide to Moz Analytics if you’re curious about the details). We have a vested interest in marketers feeling the need for this type of data. But the truth is that we built software to help solve that problem because I/we believe it’s such an important part of the story.

We’re also not the only ones in the field.

Raven Tools provides a lot of this data, too, as do SearchMetrics, WooRank, and others. For individual pieces of this picture, tools like SEMRush, Majestic SEO, Sprout Social, and many more can help. Companies that make analytics software focused on those bottom-of-funnel, lead tracking, and lifetime value/retention-focused metrics are equally essential – KISSmetrics, Mixpanel, Intercom.io, HubSpot, etc. There’s a reason so many players are in this field – marketers clearly need the data.

Visitor analytics like Google Analytics, Omniture, and Webtrends aren’t going anywhere. They’re still a huge part of what we need to do in our jobs. But alone, they’re not enough.

We need to see how the competitive landscape is trending, and how our efforts compare. We need to see how channels perform beyond simple conversion and sales tracking. There’s no single piece of software that does all of this in one place, and I strongly doubt there will be. Instead, I believe the future will have marketers on the organic side doing what our brethren in paid channels do – visiting several sources, aggregating information, and making smart decisions based on the nuance their collective brain power can help deduce.



Moz Blog


Why Google Wants to Know About Small Websites That Aren’t Ranking Well

Do you have a small website that isn’t performing as well in Google’s search results as you think it should? Google’s Matt Cutts is asking small website owners to submit sites and explain why they deserve to outrank other websites on Google.
Search Engine Watch – Latest


Successful Crowdfunding Campaigns Aren’t About Cash, Says Shark Tank Winner

Before starting your crowdfunding campaign, be very clear about the purpose and what will resonate with many people. A clear, meaningful purpose will also help you overcome inevitable challenges along the way. SEW talked to Shark Tank winner Tiffany Krumins.
Search Engine Watch – Latest


Is F-commerce a Flop? Why Retailers Aren’t Sold on Facebook

F-commerce seems to be on the decline among big brands, as major retailers forgo selling through Facebook and redirect budget to their own e-commerce sites. Is F-commerce a fad, and how can Facebook help e-tailers succeed in their social space?
Search Engine Watch – Latest


