Tag Archive | "Setting"

Google: Large Sites Should Not Use The Google Crawl Setting

Google’s John Mueller said “really large sites” should not use the crawl setting because the “setting maximum value is probably too small for what you need.” Larger sites need to go beyond that maximum value in Google Search Console, and that isn’t possible with the crawl setting.


Search Engine Roundtable

Posted in IM News

A Guide to Setting Up Your Very Own Search Intent Projects

Posted by TheMozTeam

This post was originally published on the STAT blog.


Whether you’re tracking thousands or millions of keywords, if you expect to extract deep insights and trends just by looking at your keywords from a high level, you’re not getting the full story.

Smart segmentation is key to making sense of your data. And you’re probably already applying this outside of STAT. So now, we’re going to show you how to do it in STAT to uncover boatloads of insights that will help you make super data-driven decisions.

To show you what we mean, let’s take a look at a few ways we can set up a search intent project to uncover the kinds of insights we shared in our whitepaper, Using search intent to connect with consumers.

Before we jump in, there are a few things you should have down pat:

1. Picking a search intent that works for you

Search intent is the motivating force behind search and it can be:

  • Informational: The searcher has identified a need and is looking for information on the best solution, e.g. [blender], [food processor]
  • Commercial: The searcher has zeroed in on a solution and wants to compare options, e.g. [blender reviews], [best blenders]
  • Transactional: The searcher has narrowed their hunt down to a few best options and is on the precipice of purchase, e.g. [affordable blenders], [blender cost]
    • Local (sub-category of transactional): The searcher plans to do or buy something locally, e.g. [blenders in dallas]
    • Navigational (sub-category of transactional): The searcher wants to locate a specific website, e.g. [Blendtec]

We left navigational intent out of our study because it’s brand-specific and we didn’t want to bias our data.

Our keyword set was a big list of retail products — from kitty pooper-scoopers to pricey speakers. We needed a straightforward way to imply search intent, so we added keyword modifiers to characterize each type of intent.

As always, different strokes for different folks: The modifiers you choose and the intent categories you look at may differ, but it’s important to map that all out before you get started.
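As a concrete sketch, mapping keywords to intent by modifier can be as simple as a lookup table. Everything below (the modifier lists, the function name) is hypothetical; swap in the modifiers and categories you mapped out for yourself:

```python
import re

# Hypothetical modifier lists -- yours will differ.
INTENT_MODIFIERS = {
    "commercial": ["best", "reviews", "top"],
    "transactional": ["affordable", "cost", "cheap"],
    "local": ["near me", "in dallas"],
}

def classify_intent(keyword):
    """Return the first intent whose modifier appears in the keyword;
    keywords with no modifier fall back to informational."""
    for intent, modifiers in INTENT_MODIFIERS.items():
        for modifier in modifiers:
            if re.search(r"\b" + re.escape(modifier) + r"\b", keyword.lower()):
                return intent
    return "informational"
```

With this sketch, [best blenders] lands in the commercial bucket, while a bare [blender] falls through to informational.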

2. Identifying the SERP features you really want

For our whitepaper research, we pretty much tracked every feature under the sun, but you certainly don’t have to.

You might already know which features you want to target, the ones you want to keep an eye on, or questions you want to answer. For example, are shopping boxes taking up enough space to warrant a PPC strategy?

In this blog post, we’re going to focus in on our most beloved SERP feature: featured snippets (called “answers” in STAT). And we’ll be using a sample project where we’re tracking 25,692 keywords against Amazon.com.

3. Using STAT’s segmentation tools

Setting up projects in STAT means making use of the segmentation tools. Here’s a quick rundown of what we used:

  • Standard tag: Best used to group your keywords into static themes — search intent, brand, product type, or modifier.
  • Dynamic tag: Like a smart playlist, automatically returns keywords that match certain criteria, like a given search volume, rank, or SERP feature appearance.
  • Data view: Houses any number of tags and shows how those tags perform as a group.

Learn more about tags and data views in the STAT Knowledge Base.

Now, on to the main event…

1. Use top-level search intent to find SERP feature opportunities

To kick things off, we’ll identify the SERP features that appear at each level of search intent by creating tags.

Our first step is to filter our keywords and create standard tags for our search intent keywords (read more about filtering keywords). Second, we create dynamic tags to track the appearance of specific SERP features within each search intent group. And our final step, to keep everything organized, is to place our tags in tidy little data views, according to search intent.

Here’s a peek at what that looks like in STAT:

What can we uncover?

Our standard tags (the blue tags) show how many keywords are in each search intent bucket: 2,940 commercial keywords. And our dynamic tags (the sunny yellow stars) show how many of those keywords return a SERP feature: 547 commercial keywords with a snippet.

This means we can quickly spot how much opportunity exists for each SERP feature by simply glancing at the tags. Boom!

By quickly crunching some numbers, we can see that snippets appear on 5 percent of our informational SERPs (27 out of 521), 19 percent of our commercial SERPs (547 out of 2,940), and 12 percent of our transactional SERPs (253 out of 2,058).
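That number-crunching is just the dynamic-tag count over the standard-tag count for each bucket; a quick sketch using the figures above:

```python
# (keywords returning a featured snippet, total keywords) per intent bucket
buckets = {
    "informational": (27, 521),
    "commercial": (547, 2940),
    "transactional": (253, 2058),
}

for intent, (with_snippet, total) in buckets.items():
    print(f"{intent}: {with_snippet / total:.0%} of SERPs return a snippet")
# informational: 5%, commercial: 19%, transactional: 12%
```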

From this, we might conclude that optimizing our commercial intent keywords for featured snippets is the way to go since they appear to present the biggest opportunity. To confirm, let’s click on the commercial intent featured snippet tag to view the tag dashboard…

Voilà! There are loads of opportunities to gain a featured snippet.

Though, we should note that most of our keywords rank below where Google typically pulls the answer from. So, what we can see right away is that we need to make some serious ranking gains in order to stand a chance at grabbing those snippets.

2. Find SERP feature opportunities with intent modifiers

Now, let’s take a look at which SERP features appear most often for our different keyword modifiers.

To do this, we group our keywords by modifier and create a standard tag for each group. Then, we set up dynamic tags for our desired SERP features. Again, to keep track of all the things, we contained the tags in handy data views, grouped by search intent.

What can we uncover?

Because we saw that featured snippets appear most often for our commercial intent keywords, it’s time to drill on down and figure out precisely which modifiers within our commercial bucket are driving this trend.

Glancing quickly at the numbers in the tag titles in the image above, we can see that “best,” “reviews,” and “top” are responsible for the majority of the keywords that return a featured snippet:

  • 212 out of 294 of our “best” keywords (72%)
  • 109 out of 294 of our “reviews” keywords (37%)
  • 170 out of 294 of our “top” keywords (59%)

This shows us where our efforts are best spent optimizing.

By clicking on the “best — featured snippets” tag, we’re magically transported into the dashboard. Here, we see that our average ranking could use some TLC.

There is a lot of opportunity to snag a snippet here, but we (actually, Amazon, who we’re tracking these keywords against) don’t seem to be capitalizing on that potential as much as we could. Let’s drill down further to see which snippets we already own.

We know we’ve got content that has won snippets, so we can use that as a guideline for the other keywords that we want to target.

3. See which pages are ranking best by search intent

In our blog post How Google dishes out content by search intent, we looked at what type of pages — category pages, product pages, reviews — appear most frequently at each stage of a searcher’s intent.

What we found was that Google loves category pages, which are the engine’s top choice for retail keywords across all levels of search intent. Product pages weren’t far behind.

By creating dynamic tags for URL markers, or portions of your URL that identify product pages versus category pages, and segmenting those by intent, you too can get all this glorious data. That’s exactly what we did for our retail keywords.
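As a sketch, that segmentation boils down to matching URL markers. The patterns below are hypothetical Amazon-style markers; substitute whatever identifies product and category pages on your own site:

```python
import re

# Hypothetical URL markers; adjust to your site's URL structure.
PAGE_MARKERS = {
    "product": re.compile(r"/dp/|/gp/product/"),
    "category": re.compile(r"/b/|/b\?node="),
}

def page_type(url):
    """Label a URL as a product page, category page, or other."""
    for label, pattern in PAGE_MARKERS.items():
        if pattern.search(url):
            return label
    return "other"
```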

What can we uncover?

Looking at the tags in the transactional page types data view, we can see that product pages are appearing far more frequently (526) than category pages (151).

When we glanced at the dashboard, we found that slightly more than half of the product pages were ranking on the first page (sah-weet!). That said, more than thirty percent appeared on page three and beyond. So despite the initial visual of “doing well”, there’s a lot of opportunity that Amazon could be capitalizing on.

We can also see this in the Daily Snapshot. In the image above, we compare category pages (left) to product pages (right), and we see that while there are fewer category pages ranking, their rank is significantly better. Amazon could take some of the lessons they’ve applied to their category pages to help their product pages out.

Wrapping it up

So what did we learn today?

  1. Smart segmentation starts with a well-crafted list of keywords, grouped into tags, and housed in data views.
  2. The more you segment, the more insights you’re gonna uncover.
  3. Rely on the dashboards in STAT to flag opportunities and tell you what’s good, yo!

Want to see it all in action? Get a tailored walkthrough of STAT here.

Or get your mitts on even more intent-based insights in our full whitepaper: Using search intent to connect with consumers.

Read on, readers!

More in our search intent series:

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


SearchCap: Google Assistant GMB setting, voice assistance & SMX West

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Big, Fast, and Strong: Setting the Standard for Backlink Index Comparisons

Posted by rjonesx.

It’s all wrong

It always was. Most of us knew it. But with limited resources, we just couldn’t really compare the quality, size, and speed of link indexes very well. Frankly, most backlink index comparisons would barely pass for a high school science fair project, much less a rigorous peer review.

My most earnest attempt at determining the quality of a link index was back in 2015, before I joined Moz as Principal Search Scientist. But I knew at the time that I was missing a huge key to any study of this sort that hopes to call itself scientific, authoritative or, frankly, true: a random, uniform sample of the web.

But let me start with a quick request. Please take the time to read this through. If you can’t today, schedule some time later. Your business depends on the data you bring in, and this article will allow you to stop taking data quality on faith alone. If you have questions about some of the technical aspects, I will respond in the comments, or you can reach me on Twitter at @rjonesx. I desperately want our industry to finally get this right and to hold ourselves as data providers to rigorous quality standards.


Quick links:

  1. Getting it right
  2. What’s the big deal with random?
  3. Now what? Defining metrics
  4. Caveats
  5. The metrics dashboard
  6. Size matters
  7. Speed
  8. Quality
  9. The Link Index Olympics
  10. What’s next?
  11. About PA and DA
  12. Quick takeaways

Getting it right

One of the greatest things Moz offers is a leadership team that has given me the freedom to do what it takes to “get things right.” I first encountered this when Moz agreed to spend an enormous amount of money on clickstream data so we could make our keyword tool search volume better (a huge, multi-year financial risk with the hope of improving literally one metric in our industry). Two years later, Ahrefs and SEMRush now use the same methodology because it’s just the right way to do it.

About 6 months into this multi-year project to replace our link index with the huge Link Explorer, I was tasked with the open-ended question of “how do we know if our link index is good?” I had been thinking about this question ever since that article published in 2015 and I knew I wasn’t going to go forward with anything other than a system that begins with a truly “random sample of the web.” Once again, Moz asked me to do what it takes to “get this right,” and they let me run with it.

What’s the big deal with random?

It’s really hard to overstate how important a good random sample is. Let me digress for a second. Let’s say you look at a survey that says 90% of Americans believe that the Earth is flat. That would be a terrifying statistic. But later you find out the survey was taken at a Flat-Earther convention and the 10% who disagreed were employees of the convention center. Suddenly it would all make sense. The problem is that the people surveyed weren’t a random sample of Americans — the sample was biased because it was taken at a Flat-Earther convention.

Now, imagine the same thing for the web. Let’s say an agency wants to run a test to determine which link index is better, so they look at a few hundred sites for comparison. Where did they get the sites? Past clients? Then they are probably biased towards SEO-friendly sites and not reflective of the web as a whole. Clickstream data? Then they would be biased towards popular sites and pages — once again, not reflective of the web as a whole!

Starting with a bad sample guarantees bad results.

It gets even worse, though. Indexes like Moz report top-line statistics (the number of links or domains in the index). However, these numbers can be terribly misleading. Imagine a restaurant that claimed to have the largest wine selection in the world with over 1,000,000 bottles. It could make that claim, but the number wouldn’t be useful if it actually had 1,000,000 bottles of the same type, or only Cabernet, or half-bottles. It’s easy to mislead when you just throw out big numbers. Instead, it would be much better to take a random selection of wines from around the world and measure whether the restaurant has each in stock, and how many. Only then would you have a good measure of its inventory. The same is true for measuring link indexes — this is the theory behind my methodology.

Unfortunately, it turns out getting a random sample of the web is really hard. The first intuition most of us at Moz had was to just take a random sample of the URLs in our own index. Of course we couldn’t — that would bias the sample towards our own index, so we scrapped that idea. The next thought was: “We know all these URLs from the SERPs we collect — perhaps we could use those.” But we knew they’d be biased towards higher-quality pages. Most URLs don’t rank for anything — scratch that idea. It was time to take a deeper look.

I fired up Google Scholar to see if any other organizations had attempted this process and found literally one paper, which Google produced back in June of 2000, called “On Near-Uniform URL Sampling.” I hastily whipped out my credit card to buy the paper after reading just the first sentence of the abstract: “We consider the problem of sampling URLs uniformly at random from the Web.” This was exactly what I needed.

Why not Common Crawl?

Many of the more technical SEOs reading this might ask why we didn’t simply select random URLs from a third-party index of the web like the fantastic Common Crawl data set. There were several reasons why we considered, but ultimately passed on, this methodology (despite it being far easier to implement).

  1. We can’t be certain of Common Crawl’s long-term availability. Top million lists (which we used as part of the seeding process) are available from multiple sources, which means if Quantcast goes away we can use other providers.
  2. We have contributed crawl sets in the past to Common Crawl and want to be certain there is no implicit or explicit bias in favor of Moz’s index, no matter how marginal.
  3. The Common Crawl data set is quite large and would be harder to work with for many who are attempting to create their own random lists of URLs. We wanted our process to be reproducible.

How to get a random sample of the web

The process of getting to a “random sample of the web” is fairly tedious, but the general gist of it is this. First, we start with a well-understood biased set of URLs. We then attempt to remove or balance this bias out, making the best pseudo-random URL list we can. Finally, we use a random crawl of the web starting with those pseudo-random URLs to produce a final list of URLs that approach truly random. Here are the complete details.

1. The starting point: Getting seed URLs

The first big problem with getting a random sample of the web is that there is no true random starting point. Think about it. Unlike a bag of marbles where you could just reach in and blindly grab one at random, if you don’t already know about a URL, you can’t pick it at random. You could try to just brute-force create random URLs by shoving letters and slashes after each other, but we know language doesn’t work that way, so the URLs would be very different from what we tend to find on the web. Unfortunately, everyone is forced to start with some pseudo-random process.

We had to make a choice. It was a tough one. Do we start with a known strong bias that doesn’t favor Moz, or do we start with a known weaker bias that does? We could use a random selection from our own index for the starting point of this process, which would be pseudo-random but could potentially favor Moz, or we could start with a smaller, public index like the Quantcast Top Million which would be strongly biased towards good sites.

We decided to go with the latter as the starting point because Quantcast data is:

  1. Reproducible. We weren’t going to make “random URL selection” part of the Moz API, so we needed something others in the industry could start with as well. Quantcast Top Million is free to everyone.
  2. Not biased towards Moz: We would prefer to err on the side of caution, even if it meant more work removing bias.
  3. Well-known bias: The bias inherent in the Quantcast Top 1,000,000 was easily understood — these are important sites and we need to remove that bias.
  4. Quantcast bias is natural: Any link graph itself already shares some of the Quantcast bias (powerful sites are more likely to be well-linked).

With that in mind, we randomly selected 10,000 domains from the Quantcast Top Million and began the process of removing bias.
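A sketch of that seeding step, assuming a “rank,domain”-style CSV in the spirit of the Quantcast Top Million (the real file’s format may differ; the fixed seed simply keeps the sample reproducible):

```python
import csv
import random

def sample_seed_domains(csv_lines, n=10000, seed=42):
    """Uniformly sample n seed domains from a top-million-style list.
    csv_lines is any iterable of 'rank,domain' lines."""
    domains = [row[1] for row in csv.reader(csv_lines) if len(row) > 1]
    return random.Random(seed).sample(domains, n)
```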

2. Selecting based on size of domain rather than importance

Since we knew the Quantcast Top Million was ranked by traffic and we wanted to mitigate that bias, we introduced a new bias based on the size of the site. For each of the 10,000 sites, we identified the number of pages on the site according to Google using the “site:” command and also grabbed the top 100 pages from the domain. Now we could balance the “importance bias” against a “size bias,” which is more reflective of the number of URLs on the web. This was the first step in mitigating the known bias of only high-quality sites in the Quantcast Top Million.

3. Selecting pseudo-random starting points on each domain

The next step was randomly selecting domains from that 10,000 with a bias towards larger sites. When the system selects a site, it then randomly selects from the top 100 pages we gathered from that site via Google. This helps mitigate the importance bias a little more. We aren’t always starting with the homepage. While these pages do tend to be important pages on the site, we know they aren’t always the MOST important page, which tends to be the homepage. This was the second step in mitigating the known bias. Lower-quality pages on larger sites were balancing out the bias intrinsic to the Quantcast data.
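Steps 2 and 3 together amount to a size-weighted domain pick followed by a uniform pick from that domain’s top-100 pages. A minimal sketch with hypothetical inputs:

```python
import random

def pick_start_url(top_pages, page_counts, rng=None):
    """top_pages: domain -> its top-100 URLs; page_counts: domain -> page
    count (e.g. from a site: query). A domain is chosen with probability
    proportional to its size, then one of its pages is chosen uniformly."""
    rng = rng or random.Random()
    domains = list(top_pages)
    weights = [page_counts[d] for d in domains]
    domain = rng.choices(domains, weights=weights, k=1)[0]
    return rng.choice(top_pages[domain])
```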

4. Crawl, crawl, crawl

And here is where we make our biggest change. We actually crawl the web starting with this set of pseudo-random URLs to produce the actual set of random URLs. The idea here is to take all the randomization we have built into the pseudo-random URL set and let the crawlers randomly click on links to produce the truly random URL set. The crawler will select a random link from our pseudo-random crawl set and then start a process of randomly clicking links, each time with a 10% chance of stopping and a 90% chance of continuing. Wherever the crawler ends, the final URL is dropped into our list of random URLs. It is this final set of URLs that we use to run our metrics. We generate around 140,000 unique URLs through this process monthly to produce our test data set.
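The walk itself is just a stop-or-continue loop; with a 10% stop chance, the expected walk length works out to ten clicks. A sketch over an in-memory link graph (standing in for the live crawl the real system performs):

```python
import random

def random_walk(link_graph, start_url, stop_prob=0.1, rng=None):
    """Follow random links from start_url, stopping with probability
    stop_prob at each step (or when a page has no outlinks). The URL
    where the walk ends goes into the random test set."""
    rng = rng or random.Random()
    url = start_url
    while True:
        outlinks = link_graph.get(url, [])
        if not outlinks or rng.random() < stop_prob:
            return url
        url = rng.choice(outlinks)
```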

Phew, now what? Defining metrics

Once we have the random set of URLs, we can start really comparing link indexes and measuring their quality, quantity, and speed. Luckily, in their quest to “get this right,” Moz gave me generous paid access to competitor APIs. We began by testing Moz, Majestic, Ahrefs, and SEMRush, but eventually dropped SEMRush after their partnership with Majestic.

So, what questions can we answer now that we have a random sample of the web? This is the exact wishlist I sent out in an email to leaders on the link project at Moz:

  1. Size:
    • What is the likelihood a randomly selected URL is in our index vs. competitors?
    • What is the likelihood a randomly selected domain is in our index vs. competitors?
    • What is the likelihood an index reports the highest number of backlinks for a URL?
    • What is the likelihood an index reports the highest number of root linking domains for a URL?
    • What is the likelihood an index reports the highest number of backlinks for a domain?
    • What is the likelihood an index reports the highest number of root linking domains for a domain?
  2. Speed:
    • What is the likelihood that the latest article from a randomly selected feed is in our index vs. our competitors?
    • What is the average age of a randomly selected URL in our index vs. competitors?
    • What is the likelihood that the best backlink for a randomly selected URL is still present on the web?
    • What is the likelihood that the best backlink for a randomly selected domain is still present on the web?
  3. Quality:
    • What is the likelihood that a randomly selected page’s index status (included or not included in index) in Google is the same as ours vs. competitors?
    • What is the likelihood that a randomly selected page’s index status in Google SERPs is the same as ours vs. competitors?
    • What is the likelihood that a randomly selected domain’s index status in Google is the same as ours vs. competitors?
    • What is the likelihood that a randomly selected domain’s index status in Google SERPs is the same as ours vs. competitors?
    • How closely does our index compare with Google’s expressed as “a proportional ratio of pages per domain vs our competitors”?
    • How well do our URL metrics correlate with US Google rankings vs. our competitors?

Reality vs. theory

Unfortunately, like all things in life, I had to make some cutbacks. It turns out that the APIs provided by Moz, Majestic, Ahrefs, and SEMRush differ in some important ways — in cost structure, feature sets, and optimizations. For the sake of politeness, I am only going to mention the name of the provider when it is Moz that was lacking. Let’s look at each of the proposed metrics and see which ones we could keep and which we had to put aside…

  1. Size: We were able to monitor all 6 of the size metrics!
  2. Speed:
    • What is the likelihood that the latest article from a randomly selected feed is in our index vs. our competitors?
      We were able to include this Fast Crawl metric.
    • What is the average age of a randomly selected URL in our index vs. competitors?
      Getting the age of a URL or domain is not possible in all APIs, so we had to drop this metric.
    • What is the likelihood that the best backlink for a randomly selected URL is still present on the web?
      Unfortunately, doing this at scale was not possible because one API is cost prohibitive for top link sorts and another was extremely slow for large sites. We hope to run a set of live-link metrics independently from our daily metrics collection in the next few months.
    • What is the likelihood that the best backlink for a randomly selected Domain is still present on the web?
      Once again, doing this at scale was not possible because one API is cost prohibitive for top link sorts and another was extremely slow for large sites. We hope to run a set of live-link metrics independently from our daily metrics collection in the next few months.
  3. Quality:
    • What is the likelihood that a randomly selected page’s index status in Google is the same as ours vs. competitors?
      We were able to keep this metric.
    • What is the likelihood that a randomly selected page’s index status in Google SERPs is the same as ours vs. competitors?
      Chose not to pursue due to internal API needs, looking to add soon.
    • What is the likelihood that a randomly selected domain’s index status in Google is the same as ours vs. competitors?
      We were able to keep this metric.
    • What is the likelihood that a randomly selected domain’s index status in Google SERPs is the same as ours vs. competitors?
      Chose not to pursue due to internal API needs at the beginning of project, looking to add soon.
    • How closely does our index compare with Google’s expressed as a proportional ratio of pages per domain vs our competitors?
      Chose not to pursue due to internal API needs. Looking to add soon.
    • How well do our URL metrics correlate with US Google rankings vs. our competitors?
      Chose not to pursue due to known fluctuations in DA/PA as we radically change the link graph. The metric would be meaningless until the index became stable.

Ultimately, I wasn’t able to get everything I wanted, but I was left with 9 solid, well-defined metrics.

On the subject of live links:

In the interest of being TAGFEE, I will openly admit that I think our index has more deleted links than others like the Ahrefs Live Index. As of writing, we have about 30 trillion links in our index, 25 trillion of which we believe to be live, though we know some proportion of those likely are not. While I believe we have the most live links, I don’t believe we have the highest proportion of live links in an index. That honor probably does not go to Moz. I can’t be certain because we can’t test it fully and regularly, but in the interest of transparency and fairness, I felt obligated to mention this. I might, however, devote a later post to just testing this one metric for a month and describe the proper methodology to do this fairly, as it is a deceptively tricky metric to measure. For example, if a link is retrieved from a chain of redirects, it is hard to tell if that link is still live unless you know the original link target. We weren’t going to track any metric if we couldn’t “get it right,” so we had to put live links as a metric on hold for now.

Caveats

Don’t read any more before reading this section. If you ask a question in the comments that shows you didn’t read the Caveats section, I’m just going to say “read the Caveats section.” So here goes…

  • This is a comparison of data that comes back via APIs, not within the tools themselves. Many competitors offer live, fresh, historical, etc. types of indexes which can differ in important ways. This is just a comparison of API data using default settings.
  • Some metrics are hard to estimate, especially ones like “whether a link is in the index,” because no API — not even Moz — has a call that just tells you whether they have seen the link before. We do our best, but any errors here are on the API provider. I think we (Moz, Majestic, and Ahrefs) should all consider adding an endpoint like this.
  • Links are counted differently. Whether duplicate links on a page are counted, whether redirects are counted, whether canonicals are counted (which Ahrefs just changed recently), etc. all affect these metrics. Because of this, we can’t be certain that everything is apples-to-apples. We just report the data at face value.
  • Consequently, the most important takeaway in all of these graphs and metrics is direction. How are the indexes moving relative to one another? Is one catching up? Is another falling behind? Those are the questions best answered here.
  • The metrics are adversarial. For each random URL or domain, a link index (Moz, Majestic, or Ahrefs) gets 1 point for being the biggest, for tying with the biggest, or for being “correct.” They get 0 points if they aren’t the winner. This means that the graphs won’t add up to 100 and it also tends to exaggerate the differences between the indexes.
  • Finally, I’m going to show everything, warts and all, even when it was my fault. I’ll point out why some things look weird on graphs and what we fixed. This was a huge learning experience and I am grateful for the help I received from the support teams at Majestic and Ahrefs who, as a customer, responded to my questions honestly and openly.
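For clarity, the adversarial scoring described above can be sketched in a few lines; note how ties award a point to every tied index, which is part of why the graphs won’t add up to 100:

```python
def adversarial_scores(counts_by_index):
    """Award 1 point to each index reporting the (possibly tied) highest
    count for a given URL or domain, and 0 to the rest."""
    best = max(counts_by_index.values())
    return {name: int(count == best) for name, count in counts_by_index.items()}
```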

The metrics dashboard

The dashboard for all metrics

We’ve been tracking these 9 core metrics (albeit with improvements) since November of 2017. With a close eye on quality, size, and speed, we have methodically built an amazing backlink index, not driven by broad counts but instead by intricately defined and measured metrics. Let’s go through each of those metrics now.

Size matters

It does. Let’s admit it. The diminutive size of the Mozscape index has been a limitation for years. Maybe someday we will write a long post about all the efforts Moz has made to grow the index and what problems stood in our way, but that’s a post for a different day. The truth is, as much as quality matters, size is huge for a number of specific use-cases for a link index. Do you want to find all your bad links? Bigger is better. Do you want to find a lot of link opportunities? Bigger is better. So we came up with a number of metrics to help us determine where we were relative to our competitors. Here are each of our Size metrics.

Index Has URL

What is the likelihood a randomly selected URL is in our index vs. competitors?

This is one of my favorite metrics because I think it’s a pure reflection of index size. It answers the simple question of “if we grabbed a random URL on the web, what’s the likelihood an index knows about it?” However, you can see my learning curve in the graph (I was misreporting the Ahrefs API due to an error on my part) but once corrected, we had a nice reflection of the indexes. Let me restate this — these are comparisons in APIs, not in the web tools themselves. If I recall correctly, you can get more data out of running reports in Majestic, for example. However, I do think this demonstrates that Moz’s new Link Explorer is a strong contender, if not the largest, as we have led in this category every day except one. As of writing this post, Moz is winning.

Index Has Domain

What is the likelihood a randomly selected domain is in our index vs competitors?

When I said I would show “warts and all,” I meant it. Determining whether a domain is in an index isn’t as simple as you would think. For example, perhaps a domain has pages in the index, but not the homepage. Well, it took me a while to figure this one out, but by February of this year I had it down.

The scale of this graph is important to note as well. The variation is between 99.4 and 100% between Moz, Majestic, and Ahrefs over the last few months. This indicates just how close the link indexes are in terms of knowing about root domains. Majestic has historically tended to win this metric with near 100% coverage, but you would have to select 100 random domains to find one that Moz or Ahrefs doesn’t have information on. However, Moz’s continued growth has allowed us to catch up. While the indexes are super close, as of writing this post, Moz is winning.

Backlinks Per URL

Which index has the highest backlink count for a randomly selected URL?

This is a difficult metric to really pin down. Unfortunately, it isn’t easy to determine what backlinks should count and what shouldn’t. For example, imagine a URL has one page linking to it, but that page includes that link 100 times. Is that 100 backlinks or one? Well, it turns out that the different link indexes probably measure these types of scenarios differently and getting an exact definition out of each is like pulling teeth because the definition is so complicated and there are so many edge cases. At any rate, I think this is a great example of where we can show the importance of direction. Whatever the metrics actually are, Moz and Majestic are catching up to Ahrefs, which has been the leader for some time. As of writing this post, Ahrefs is winning.

Root Linking Domains Per URL

Which index reports the highest RLD count for a randomly selected URL?

Simple, right? No, even this metric has its nuances. What is a root linking domain? Do subdomains count if they are on subdomain sites like Blogspot or WordPress.com? If so, how many sites are there on the web which should be treated this way? We used a machine learned methodology based on surveys, SERP data, and unique link data to determine our list, but each competitor does it differently. Thus, for this metric, direction really matters. As you can see, Moz has been steadily catching up and as of writing today, Moz is finally winning.

Backlinks Per Domain

Which index reports the highest backlink count for a randomly selected domain?

This metric was not kind to me, as I found a terrible mistake early on. (For the other techies reading this: I was storing backlink counts as INT(11) rather than BIGINT, which caused lots of ties for big domains, because any count larger than the column’s maximum was clamped to the same highest value.) Nevertheless, Majestic has been stealing the show on this metric for a little while, although the story is deeper than that. Their dominance is such an outlier that it needs to be explained.
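For the curious, the effect of that INT(11) mistake is easy to reproduce in miniature; the domain counts below are invented:

```python
INT32_MAX = 2**31 - 1  # ceiling of a signed 32-bit column like MySQL's INT(11)

def store(count):
    # What effectively happened: counts above the column maximum were
    # clamped to the same highest value, so distinct big domains tied.
    return min(count, INT32_MAX)

big_domain_a = store(5_000_000_000)
big_domain_b = store(9_000_000_000)
print(big_domain_a == big_domain_b)  # True: a spurious tie
```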

One of the hardest decisions a company has to make regarding its backlink index is how to handle spam. On one hand, spam is expensive to the index and probably ignored by Google. On the other hand, it is important for users to know if they have received tons of spammy links. I don’t think there is a correct answer to this question; each index just has to choose. A close examination of why Majestic is winning (and continuing to increase their advantage) reveals a particularly nefarious Wikipedia-clone spam network. Any site with backlinks from Wikipedia is getting tons of links from this network, which causes its backlink counts to increase rapidly. If you are worried about these types of links, take a look in Majestic for links from domains ending primarily in .space or .pro, including sites like tennis-fdfdbc09.pro, troll-warlord-64fa73ba.pro, and badminton-026a50d5.space. As of my last tests, there are over 16,000 such domains in this spam network within Majestic’s index. Majestic is winning this metric, but for purposes other than finding spam networks, it might not be the right choice.

Linking Root Domains Per Domain

Which index reports the highest LRD count for a randomly selected domain?

OK, this one took me a while to get just right. In the middle of this graph, I corrected an important error: I was looking at Ahrefs domains only for the root domain rather than the root domain and all subdomains. This was unfair to Ahrefs until I finally got everything corrected in February. Since then, Moz has been aggressively growing its index, Majestic has picked up LRD counts through the previously discussed spam network but has steadied out, and Ahrefs has remained relatively steady in size. Because of the “adversarial” nature of these metrics, the graph gives the false appearance that Ahrefs is dropping dramatically. They aren’t; they are still huge, and so is Majestic. The real takeaway is directional: Moz is growing dramatically relative to the other indexes. As of writing this post, Moz is winning.

Speed

Being the “first to know” matters in almost any industry, and link indexes are no different. You want to know as soon as possible when a link goes up or comes down, and how good that link is, so you can respond if necessary. Here is our current speed metric.

FastCrawl

What is the likelihood the latest post from a randomly selected set of RSS feeds is indexed?

Unlike the other metrics discussed, the sampling here is a little bit different. Instead of using the randomization above, we make a random selection from a million-plus known RSS feeds to find each feed’s latest post and check whether it has been included in the indexes of Moz and competitors. While there are a few errors in this graph, I think there is only one clear takeaway: Ahrefs is right about their crawlers. They are fast and they are everywhere. While Moz has increased its coverage dramatically and quickly, that has barely put a dent in this FastCrawl metric.

Now you may ask: if Ahrefs is so much faster at crawling, how can Moz catch up? There are a couple of answers, but probably the biggest is that new URLs represent only a fraction of the web. Most URLs aren’t new. Say two indexes (one new, one old) each have a bunch of URLs they’re considering crawling. Both might prioritize URLs on important domains that they’ve never seen before, but for the larger, older index, those make up a smaller percentage of the queue, because it has been crawling fast for a long time. So, over the course of a day, a higher percentage of the old index’s crawl is dedicated to re-crawling pages it already knows about, while the new index can dedicate more of its crawl capacity to new URLs.
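A rough back-of-the-envelope model shows the effect; every number here is invented purely to illustrate the argument:

```python
def new_url_share(known_urls, new_urls_per_day, recrawl_interval_days=30):
    # Assume every known URL is revisited once per interval, so the
    # daily queue is (new URLs) + (known URLs due for a re-crawl).
    recrawl_due = known_urls / recrawl_interval_days
    return new_urls_per_day / (new_urls_per_day + recrawl_due)

# Same discovery rate, very different index sizes:
print(round(new_url_share(50_000_000, 100_000), 3))  # older index: 0.057
print(round(new_url_share(5_000_000, 100_000), 3))   # newer index: 0.375
```

Under these made-up numbers, the older index spends roughly 94% of its day re-crawling, while the newer one can devote over a third of its crawl to brand-new URLs.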

It does, however, put the pressure on Moz now to improve crawl infrastructure as we catch up to and overcome Ahrefs in some size metrics. As of this post, Ahrefs is winning the FastCrawl metric.

Quality

OK, now we’re talking my language. This is the most important stuff, in my opinion. What’s the point of building a link graph to help people with SEO if it isn’t similar to Google’s? While we had to cut some of the metrics temporarily, we did keep a few that are really important and worth a look.

Domain Index Matches

What is the likelihood a random domain shares the same index status in Google and a link index?

Domain Index Matches seeks to determine when a domain shares the same index status with Google as it does in one of the competing link indexes. If Google ignores a domain, we want to ignore a domain. If Google indexes a domain, we want to index a domain. If we have a domain Google doesn’t, or vice versa, that is bad.

This graph is a little harder to read because of its scale (the first few days of tracking were failures), but what we actually see is a statistically insignificant difference between Moz and our competitors. We can make it look more competitive than it really is by just tallying wins and losses, but we have to account for an error in the way we determined Ahrefs’ index status up until around February. To do this, I show wins/losses for all time vs. wins/losses over the last few months.

As you can see, Moz wins the “all time” tally, but Majestic has been winning more over the last few months. Nevertheless, the differences are quite insignificant, often coming down to one or two domain index statuses out of 100. Just like the Index Has Domain metric we discussed above, nearly every link index has nearly every domain, and looking at the long-term day-by-day graph shows just how incredibly close they are. However, if we are keeping score, as of today (and the majority of the last week), Moz is winning this metric.

Domain URL Matches

What is the likelihood a random URL shares the same index status in Google as in a link index?

This one is the most important quality metric, in my honest opinion. Let me explain this one a little more. It’s one thing to say that your index is really big and has lots of URLs, but does it look like Google’s? Do you crawl the web like Google? Do you ignore URLs Google ignores while crawling URLs that Google crawls? This is a really important question and sets the foundation for a backlink index that is capable of producing good relational metrics like PA and DA.

This is one of the metrics where Moz just really shines. Once we corrected for an error in the way we were checking Ahrefs, we could accurately determine whether our index was more or less like Google’s than our competitors. Since the beginning of tracking, Moz Link Explorer has never been anything but #1. In fact, we only had 3 ties with Ahrefs and never lost to Majestic. We have custom-tailored our crawl to be as much like Google as possible, and it has paid off. We ignore the types of URLs Google hates, and seek out the URLs Google loves. We believe this will pay huge dividends in the long run for our customers as we expand our feature set based on an already high-quality, huge index.

The Link Index Olympics

Alright, we’ve just spent a lot of time delving into these individual metrics, so it’s probably worth putting them into an easy-to-understand context. Let’s pretend for a moment that this is the Link Index Olympics, and no matter how much you win or lose by, your finish determines whether you receive a gold, silver, or bronze medal. I’m writing this on Wednesday, April 25th. Let’s see how things play out if the Olympics happened today:

As you can see, Moz takes gold in six of the nine metrics we measure, silver in two, and bronze in one. Moreover, we’re continuing to grow and improve our index daily. As most of the above graphs indicate, we tend to be improving relative to our competitors, so I hope that by the time of publication in a week or so our scores will be even better. The reality is that based on the metrics above, our link index quality, quantity, and speed are excellent. I’m not going to say our index is the best; I don’t think that’s something anyone can really know, and it’s highly dependent on the specific use case. But I can say this: it is damn good. In fact, Moz has won or tied for the “gold” 27 out of the last 30 days.

What’s next?

We are going for gold. All gold. All the time. There’s a ton of great stuff on the horizon. Look forward to regular additions of features to Link Explorer based on the data we already have, faster crawling, and improved metrics all around (PA, DA, Spam Score, and potentially some new ones in the works!). There’s way too much to list here. We’ve come a long way, but we know we have a ton more to do. These are exciting times!

A bit about DA and PA

Domain Authority and Page Authority are powered by our link index. Since we’re moving from an old, much smaller index to a larger, much faster index, you may see small or large changes to DA and PA depending on what we’ve crawled in this new index that the old Mozscape index missed. Your best bet is to compare yourself to your competitors. Moreover, as our index grows, we have to constantly adjust the model to the size and shape of our index, so both DA and PA will remain in beta for a little while. They are absolutely ready for primetime, but that doesn’t mean we don’t intend to keep improving them over the next few months as our index growth stabilizes. Thanks!

Quick takeaways

Congratulations on getting through this post. Let me give you some key takeaways:

  1. The new Moz Link Explorer is powered by an industry-leading link graph and we have the data to prove it.
  2. Tell your data providers to put their math where their mouth is. You deserve honest, well-defined metrics, and it is completely right of you to demand it from your data providers.
  3. Doing things right requires that we sweat the details. I cannot begin to praise our leadership, SMEs, designers, and engineers who have asked tough questions, dug in, and solved tough problems, refusing to build anything but the best. This link index proves that Moz can solve the hardest problem in SEO: indexing the web. If we can do that, you can only expect great things ahead.

Thanks for taking the time to read! I look forward to answering questions in the comments or you can reach me on Twitter at @rjonesx.

Also, I would like to thank the non-Mozzers who offered peer reviews and critiques of this post in advance — they do not necessarily endorse any of the conclusions, but provided valuable feedback. In particular, I would like to thank Patrick Stox of IBM, JR Oakes of Adapt Partners, Alexander Darwin of HomeAgency, Paul Shapiro of Catalyst SEM, the person I most trust in SEO, Tony Spencer, and a handful of others who wished to remain anonymous.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


A Step-by-Step Guide to Setting Up and Growing Your YouTube Presence

Posted by AnnSmarty

When was the last time you saw a video on YouTube? I bet you’ve seen one today. YouTube is too huge and too popular for marketers to ignore.

If you don’t have a YouTube channel, now’s the time to start one.

If you have a channel and you never got it off the ground, now’s the time to take action.

This article will take you through the process of setting up your YouTube presence, listing steps, tools, and important tips to get you started and moving forward.

1. Define your goals

If your goal is to become a YouTube star, you might be a bit late to the party: with the sheer number of channels users have to choose from, it’s really hard to get noticed these days, and stardom will take years of hard work to achieve.

Even back in 2014, when I was reading about YouTube celebrity bloggers, one quote really stood out to me:

“We think, if we were coming to YouTube today, it would be too hard. We couldn’t do it.”

That’s not to say, however, that you cannot achieve other, more tangible goals on YouTube. It’s an excellent venue for business owners and marketers.

Here are three achievable goals that make more sense than fame from a business perspective:

1.1. YouTube for reputation management

Here’s one thing about reputation management on Google: You’re never finished.

Even if your reputation is fabulous and you love every single result that comes up in the SERPs for your business name, you may still want to publish more content around your brand.

The thing is, for reputation management purposes, the more navigational queries you can control, the better:

Reputation

YouTube is the perfect platform for reputation management. YouTube videos rank incredibly well in Google, especially when it comes to low-competition navigational queries that include your brand name.

Furthermore, YouTube videos almost always get that rich snippet treatment (meaning that Google shows the video thumbnail, author, and length of the video in the SERPs). This means you can more easily attract attention to your video search result.

That being said, think about putting videos on YouTube that:

  • Give your product/service overview
  • Show happy customers
  • Visualize customer feedback (for example, visual testimonials beautifully collected and displayed in a video)
  • Offer a glimpse inside your team (show people behind the brand, publish videos from events or conferences, etc.)

1.2 YouTube videos for improved conversions

Videos improve conversions for a clear reason: They offer a low-effort way for your customer to see why they need your product. Over the years, there have been numerous case studies proving the point:

  • An older study (dating back to 2011) states that customers are 144% more likely to add products to a shopping cart after watching the product video
  • Around 1 in 3 millennials state they have bought a product directly as a result of watching a how-to video on it
  • This Animoto survey found that almost all the participants (96%) considered videos “helpful when making purchasing decisions online”
  • Wistia found that visitors who engage with a video are much more likely to convert than those who don’t

That being said, YouTube is a perfect platform to host your video product overviews: it’s free, it offers the additional benefit of ranking well in Google, and it provides additional exposure to your products through their huge community, allowing people to discover your business via native search and suggested videos.

1.3 YouTube for creating alternative traffic and exposure channels

YouTube has huge marketing potential that businesses in most niches just cannot afford to ignore: it serves as a great discovery engine.

Imagine your video being suggested next after your competitor’s product review. Imagine your competitors’ customers stumbling across your video comparison when searching for an alternative service on YouTube.

Just being there increases your chances of getting found.

Again, it’s not easy to reach the YouTube Top 10, but for specific low-competition queries it’s quite doable.

Note: To be able to drive traffic from inside your YouTube videos, you need to build your channel up to 10,000 total public views to qualify for the YouTube Partner Program. Once approved, you’ll be able to add clickable links to your site from within your videos using cards, and actually build up your own site traffic via video views.

2. Develop a video editorial calendar

As with any type of content, video content requires a lot of brainstorming, organizing, and planning.

My regular routine when it comes to creating an editorial calendar is as follows:

  1. Start with keyword research
  2. Use question research to come up with more specific ideas
  3. Use seasonality to come up with timing for each piece of content
  4. Allocate sufficient time for production and promotion

You can read about my exact editorial process here. Here’s a sample of my content roadmap laying out a major content asset for each month of the year, based on keyword research and seasonality:

Content roadmap

For keyword and question research I use Serpstat because they offer a unique clustering feature. For each keyword list you provide, they use the Google search results page to identify overlapping and similar URLs, evaluate how related different terms in your list are, and based on that, cluster them into groups.

Keyword clustering

This grouping makes content planning easier, allowing you to see the concepts behind keyword groups and put them into your roadmap based on seasonality or other factors that come into play (e.g. is there a slot/gap you need to fill? Are there company milestones or events coming up?).
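If you’re curious what that kind of clustering looks like under the hood, here’s a simplified sketch that groups keywords by SERP URL overlap (Jaccard similarity). The keywords and URL sets are invented, and Serpstat’s actual algorithm is certainly more sophisticated:

```python
def jaccard(a, b):
    """Overlap between two sets of ranking URLs (0 = none, 1 = identical)."""
    return len(a & b) / len(a | b)

# Hypothetical top-ranking URLs for each keyword; a real run would pull
# these from the live search results for every keyword in your list.
serps = {
    "best blenders": {"u1", "u2", "u3", "u4"},
    "blender reviews": {"u1", "u2", "u3", "u5"},
    "food processor": {"u7", "u8", "u9"},
}

def cluster(serps, threshold=0.4):
    clusters = []
    for kw, urls in serps.items():
        for c in clusters:
            # Join the first cluster with a sufficiently similar SERP.
            if any(jaccard(urls, serps[other]) >= threshold for other in c):
                c.append(kw)
                break
        else:
            clusters.append([kw])
    return clusters

print(cluster(serps))  # [['best blenders', 'blender reviews'], ['food processor']]
```

Keywords whose search results share many URLs almost certainly express the same underlying concept, so each cluster becomes one slot in the roadmap.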

Depending on how much video content you plan to create, you can set up a separate calendar or include videos in your overall editorial calendar.

When creating your roadmap, keep your goals in mind, as well. Some videos, such as testimonials and product reviews, won’t be based on your keyword research but still need to be included in the roadmap.

3. Proceed to video production

Video production can be intimidating, especially if you have a modest budget, but these days it’s much easier and more affordable than you’d imagine.

Keeping lower-budget campaigns in mind, here are few types of videos and tools you can try out:

3.1 In-house video production

You can actually handle much of your video production in-house without the need to set up a separate room or purchase expensive gadgets.

Here are a few ideas:

  • Put together high-quality explanatory videos using Animatron (starts at $15/month): It takes a day or so to get to know all the available tools and options, but after that production goes quite smoothly.
  • Create beautiful visual testimonials, promo videos, and visual takeaways using Animoto ($8/month): You don’t need much time to learn it; it’s very easy and fun.
  • Create video tutorials using iMovie (free for Mac users): It will take you or your team about a week to properly figure out all its options, but you’ll get there eventually.
  • Create video interviews with niche influencers using Blue Jeans (starts at $12.49/month).
  • Create (whiteboard) presentations using ClickMeeting (starts at $25/month): Host a webinar first, then use the recording as a permanent brand asset. ClickMeeting will save your whiteboard notes and let you reuse them in your article, and you can brand your room to show your logo and colors in the video. Record your entire presentation using presentation mode, then upload it to your channel.

Clickmeeting

3.2 How to affordably outsource video production

The most obvious option for outsourcing video production is a site like Fiverr. Searching its gigs will actually give you even more ideas as to what kinds of videos you might create. While you may get burned there a few times, don’t let it discourage you — there are plenty of creative people who can put together awesome videos for you.

Another great idea is to reach out to YouTube bloggers in your niche. Some of them will be happy to work for you, and as a bonus you’ll be rewarded with additional exposure from their personal branding and social media channels.

I was able to find a great YouTube blogger to work for my client for as low as $75 per video; those videos were of top quality and upload-ready.

There’s lots of talent out there: just spend a few weeks searching and reaching out!

4. Optimize each video page

When uploading your videos to YouTube, spend some time optimizing each one. Add ample content to each video page, including a detailed title, a detailed description (at least 300–500 characters), and a lot of tags.

  • Title of the video: Generally, a more eye-catching and detailed title including:
    • Your core term/focus keyword (if any)
    • Product name and your brand name
    • The speaker’s name when applicable (for example, when you post interviews). This may include their other identifiable personal brand elements, such as their Twitter handle
    • Event name and hashtag (when applicable)
    • City, state, country (especially if you’re managing a local business)
  • Description of the video: The full transcript of the video. This can be obtained via services such as Speechpad.
  • A good readable and eye-catching thumbnail: These can be created easily using a tool like Canva.

Use a checklist:

Youtube SEO checklist

5. Generate clicks and engagement

Apart from basic keyword matching using video title and description, YouTube uses other video-specific metrics to determine how often the video should be suggested next to related videos and how high it should rank in search results.

Here’s an example of how that might work:

The more people who watch past the first half of your video, the better. If more than 50% of all your viewers watched more than 50% of the video, YouTube would assume your video is high quality, and it could pop up in “suggested” results next to or at the end of other videos. (Please note: These numbers are examples, made up using my best judgment. No one knows the exact thresholds YouTube uses, but you get the general idea of how this works.)
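As a quick sketch, the “deep view” share from that example could be computed like this (the watch data is, again, made up):

```python
# Fraction of the video each (hypothetical) viewer watched.
watch_fractions = [0.9, 0.6, 0.55, 0.3, 0.1, 0.8]

# Viewers who got past the halfway mark.
deep_views = sum(1 for f in watch_fractions if f > 0.5)
deep_view_rate = deep_views / len(watch_fractions)
print(round(deep_view_rate, 2))  # 0.67, above the hypothetical 50% bar
```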

That being said, driving “deep” views to your videos is crucial when it comes to getting the YouTube algorithm to favor you.

5.1 Create a clickable table of contents to drive people in

Your video description and/or the pinned comment should have a clickable table of contents to draw viewers into the video. This will improve deep views into the video, which are a crucial factor in YouTube rankings.

Table of contents

5.2 Use social media to generate extra views

Promoting your videos on social media is an easy way to bring in some extra clicks and positive signals.

5.2.1 First, embed the video to your site

Important: Embed videos to your web page and promote your own URL instead of the actual YouTube page. This approach has two important benefits:

  • Avoid auto-plays: Don’t screw up your YouTube stats! YouTube pages auto-play videos by default, so if you share a YouTube URL on Twitter, many people will click and immediately leave (social media users are mostly lurkers). However, if you share your page with the video embedded on it, it won’t play until the user clicks to play. This way you’ll ensure the video is played only by people who seriously want to watch it.
  • Invest time and effort into your own site promotion instead of marketing the youtube.com page: Promoting your own site URL with the video embedded on it, you can rest assured that more people will keep interacting with your brand rather than leave to watch other people’s videos from YouTube suggested results.

There are also plenty of ways to embed YouTube videos naturally in your blog and offer more exposure. Look at some of these themes, for example, for ideas to display videos in ways that invite views and engagement.

Video sharing WordPress

5.2.2 Use tools to partially scale social media promotion

For better, easier social media exposure, consider these options:

  • Invest in paid social media ads, especially Facebook ads, as they work best for engagement.
  • Use recurring tweets to scale video promotion. There are a few tools you can try, such as DrumUp: schedule the same update to go live several times on your chosen social media channels, generating more YouTube views from each repeated share. This is especially helpful on Twitter, because the lifespan of a tweet is just several minutes (between two and ten, depending on how active and engaged your Twitter audience is). With recurring tweets, you’ll make sure more of your followers see your update.

  • A project I co-founded, Viral Content Bee, can put your videos in front of niche influencers on the lookout for more content to share on their social media accounts.

5.3 Build playlists

By sorting your videos into playlists, you achieve two important goals:

  • Keeping your viewers engaged with your brand videos longer: Videos within one playlist keep playing on autopilot until stopped
  • Creating separate brand assets of their own: Playlist URLs are able to rank both in YouTube and Google search results, driving additional exposure to your videos and brand overall, as well as allowing you to control more of those search results:

Playlists

Using playlists, you can also customize the look and feel of your YouTube channel more effectively to give your potential subscribers a glimpse into additional topics you cover:

Customize Youtube channel

Furthermore, by customizing the look of your YouTube channel, you transform it into a more effective landing page, highlighting important content that might otherwise get lost in the archives.

6. Monitor your progress

6.1 Topvisor

Topvisor is the only rank tracker I am aware of that monitors YouTube rankings. You’ll have to create a new project for each of your videos (which is somewhat of a pain), but you can monitor multiple keywords you’re targeting for each video. I always monitor my focus keyword, my brand name, and any other specific information I’m including in the video title (like location and the speaker’s name):

Topvisor

6.2 YouTube Analytics

YouTube provides a good deal of insight into how your channel and each individual video is doing, allowing you to build on your past success.

  • You’ll see traffic sources, i.e. where the views are coming from: suggested videos, YouTube search, external (traffic from websites and apps that embed your videos or link to them on YouTube), etc.
  • The number of times your videos were included in viewers’ playlists, including favorites, for the selected date range, region, and other filters. This is equal to additions minus removals.
  • Average view duration for each video.
  • How many interactions (subscribers, likes, comments) every video brought.

Youtube Analytics

You can see the stats for each individual video, as well as for each of your playlists.

6.3 Using a dashboard for the full picture

If you produce at least one video a month, you may want to set up a dashboard to get an overall picture of how your YouTube channel is growing.

Cyfe (disclaimer: Cyfe recently became a content marketing client of mine) offers a great way to stay organized when tracking your stats across multiple platforms and assets. I have a separate dashboard there which I use to keep an eye on my YouTube channels.

Cyfe Youtube

Conclusion

Building a YouTube channel is hard work. You’re likely to see little or no activity for weeks at a time, maybe even months after you start working on it. Don’t let this discourage you. It’s a big platform with lots of opportunity, and if you keep working consistently, you’ll see your views and engagement steadily growing.

Do you have a YouTube channel? What are you doing to build it up and increase its exposure? Let us know in the comments.



Moz Blog


Social Media Marketing: Setting expectations both internally and externally [Video]

In this MarketingSherpa Blog post, discover two strategies for preventing social media fails from Andrew Jones, Industry Analyst, Altimeter Group. The key is managing expectations both internally and externally. Watch this video from the MarketingSherpa Media Center at IRCE to learn how to address, and prevent, social media fails.

MarketingSherpa Blog

Back when Hans-Peter Zimmermann was still giving seminars on Internet marketing, the guiding principle was “Content is King” (content is the most important thing). For today’s Internet marketers, however, …
Video Rating: 5 / 5


Setting Up 4 Key Customer Loyalty Metrics in Google Analytics

Posted by Tom.Capper

Customer loyalty is one of the strongest assets a business can have, and one that any can aim to improve. However, improvement requires iteration and testing, and iteration and testing require measurement.

Traditionally, customer loyalty has been measured using customer surveys. The Net Promoter Score, for example, is based on the question “How likely is it that you would recommend our company/product/service to a friend or colleague?” (on a scale of one to ten). Regularly monitoring metrics like this with any accuracy is going to get expensive (and/or annoying to customers), and is never going to be hugely meaningful, as advocacy is only one dimension of customer loyalty. Even with a wider range of questions, there’s also some risk that you end up tracking what your customers claim about their loyalty rather than their actual loyalty, although you might expect the two to be strongly correlated.

Common mistakes

Google Analytics and other similar platforms collect data that could give you more meaningful metrics for free. However, they don’t always make them completely obvious. Before writing this post, I checked to be sure there weren’t any very similar ones already published, and I found some fairly dubious recurring recommendations. The most common of these was using % of return visitors as a sole or primary metric for customer loyalty. If the percentage of visitors to your site who are return visitors drops, there are plenty of reasons that could be behind that besides a drop in loyalty: a large number of new visitors from a successful marketing campaign, for example. Similarly, if the absolute number of return visitors rises, this could be as easily caused by an increase in general traffic levels as by an increase in the loyalty of existing customers.

Visitor frequency is another easily misinterpreted metric; infrequent visits do not always indicate a lack of loyalty. If you were a loyal Mercedes customer who never bought any car that wasn’t a new Mercedes, you wouldn’t necessarily visit their website on a weekly basis, and someone who did wouldn’t necessarily be a more loyal customer than you.

The metrics

Rather than starting with the metrics Google Analytics shows us and deciding what they mean about customer loyalty (or anything else), a better approach is to decide what metrics you want, then work out how to replicate them in Google Analytics.

To measure the various dimensions of (online) customer loyalty well, I felt the following metrics would make the most sense:

  • Proportion of visitors who want to hear more
  • Proportion of visitors who advocate
  • Proportion of visitors who return
  • Proportion of macro-converters who convert again

Note that a couple of these may not be what they initially seem. If your registration process contains an awkwardly worded checkbox for email signup, for example, it’s not a good measure of whether people want to hear more. Secondly, “proportion of visitors who return” is not the same as “proportion of visitors who are return visitors.”

1. Proportion of visitors who want to hear more

This is probably the simplest of the above metrics, especially if you’re already tracking newsletter signups as a micro-conversion. If you’re not, you probably should be, so see Google’s guidelines for event tracking using the
analytics.js tracking snippet or Google Tag Manager, and set your new event as a goal in Google Analytics.
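As a rough sketch of what such an event might look like (the category, action, and label values here are my own illustrative choices, not anything Google prescribes), an event hit built against the documented analytics.js field names could be:

```javascript
// Hypothetical newsletter-signup event hit for analytics.js.
// hitType, eventCategory, eventAction, and eventLabel are documented
// analytics.js field names; the 'Newsletter'/'signup' values are illustrative.
function newsletterSignupEvent(label) {
  return {
    hitType: 'event',
    eventCategory: 'Newsletter',
    eventAction: 'signup',
    eventLabel: label,
  };
}

// On the page, you would send it when the form is submitted, e.g.:
//   ga('send', newsletterSignupEvent('Footer form'));
```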

2. Proportion of visitors who advocate

It’s never possible to track every public or private recommendation, but there are two main ways that customer advocacy can be measured in Google Analytics: social referrals and social interactions. Social referrals may be polluted as a customer loyalty metric by existing campaigns, but these can be segmented out if properly tracked, leaving the social acquisition channel measuring only organic referrals.

Social interactions can also be tracked in Google Analytics, although surprisingly, with the exception of Google+, tracking them does require additional code on your site. Again, this is probably worth tracking anyway, so if you aren’t already doing so, see Google’s guidelines for
analytics.js tracking snippets, or this excellent post for Google Tag Manager analytics implementations.
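As a sketch of the shape of such a hit (the network, action, and target values are illustrative; wiring this into a real share button’s callback is site-specific), analytics.js social interactions use the documented socialNetwork/socialAction/socialTarget fields:

```javascript
// Hypothetical social-interaction hit for analytics.js.
// hitType, socialNetwork, socialAction, and socialTarget are documented
// field names; the example values below are assumptions for illustration.
function socialHit(network, action, targetUrl) {
  return {
    hitType: 'social',
    socialNetwork: network,
    socialAction: action,
    socialTarget: targetUrl,
  };
}

// e.g. inside a Twitter share callback:
//   ga('send', socialHit('Twitter', 'tweet', 'https://example.com/blog/post'));
```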

3. Proportion of visitors who return

As mentioned above, this isn’t the same as the proportion of visitors who are return visitors. Fortunately, Google Analytics does give us a feature to measure this.

Even though date of first session isn’t available as a dimension in reports, it can be used as a criterion for custom segments. This allows us to start building a data set for how many visitors who made their first visit in a given period have returned since.

There are a couple of caveats. First, we need to pick a sensible time period based on our frequency and recency data. Second, this data obviously takes a while to produce; I can’t tell how many of this month’s new visitors will make further visits at some point in the future.

In Distilled’s case, I chose 3 months as a sensible period within which I would expect the vast majority of loyal customers to visit the site at least once. Unfortunately, due to the 90-day limit on time periods for this segment, this required adding together the totals for two shorter periods. I was then able to compare the number of new visitors in each month with how many of those new visitors showed up again in the subsequent 3 months:

As ever with data analysis, the headline figure doesn’t tell the story. Instead, it’s something we should seek to explain. Looking at the above graph, it would be easy to conclude “Distilled’s customer loyalty has bombed recently; they suck.” However, the fluctuation in the above graph is mostly due to the enormous amount of organic traffic that’s been generated by
Hannah‘s excellent blog post 4 Types of Content Every Site Needs.

Although many new visitors who discovered the Distilled site through this blog post have returned since, the return rate is unsurprisingly lower than some of the most business-orientated pages on the site. This isn’t a bad thing—it’s what you’d expect from top-of-funnel content like blog posts—but it’s a good example of why it’s worth keeping an eye out for this sort of thing if you want to analyse these metrics. If I wanted to dig a little deeper, I might start by segmenting this data to get a more controlled view of how new visitors are reacting to Distilled’s site over time.

4. Proportion of macro-converters who convert again

While a standard Google Analytics implementation does allow you to view how many users have made multiple purchases, it doesn’t allow you to see how these fell across their sessions. Similarly, while you can see how many users have had two sessions and two goal conversions, you can’t see whether those conversions happened in different visits; it’s entirely possible that some had one accidental visit that bounced, and one visit with two different conversions (note that you cannot perform the same conversion twice in one session).

It would be possible to create custom dimensions for first (and/or second, third, etc.) purchase dates using internal data, but this is a complex and site-specific implementation. Unfortunately, for the time being, I know of no good way of documenting user conversion patterns over multiple sessions using only Google Analytics, despite the fact that it collects all the data required to do this.

Contribute

These are only my favourite customer loyalty metrics. If you have any of your own that you’re already tracking, or any you’re unsure how to track, please share them in the comments below.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM NewsComments Off

Setting Goals (Not Tools) as the Foundation of Your Marketing – Whiteboard Friday

Posted by MackenzieFogelson

With new tools introduced so regularly, it’s easy for marketers to spend an inordinate amount of time trying to figure out which ones are most effective for their own work. That focus, though, shifts our attention from what really matters: setting the right goals for our companies. In today’s Whiteboard Friday, Mackenzie Fogelson walks us through the five-stage process she uses to make sure her team’s attention is on what really matters.

For reference, here’s a still of this week’s whiteboard!

Video Transcription


Hey there, Moz community! I’m so excited to be here with you today. I wanted to share something with you that has been really powerful for the businesses we’ve been working with in the last year or so about building community. It’s a concept that we call “goals not tools,” and it works in this pyramid format where you start with your goals, you move on to KPIs, you develop a strategy, you execute that strategy, and then you analyze your data. And this is something that has been really powerful and helped businesses really grow. So I’m going to walk you through it here.

We start down at the bottom with goals. So the deal with goals is that you want to make sure that you’re setting goals for your entire business, not just for SEO or social media or content marketing, because you’re trying to grow your whole business. So keep your focus there. Then once you develop your goals (and those goals might be to improve customer communication, or to become a thought leader), whatever your goal is, that’s where you’re going to set it.

Then you move on to determining what your key performance indicators are and what you’re going to use to actually measure the fact that you may or may not be reaching your goals. So in terms of KPIs, it’s really going to depend on your business. When we determine KPIs with companies, we sit down and we have that discussion with them before we develop the strategy, and that helps us to have a very authentic and realistic discussion about expectations and how this is all going to work and what kind of data they’re expecting to see so that we’re proving that we’re actually making a difference in their business.

So once you’ve determined those KPIs, then you move on to developing a creative strategy, a creative way to meet those goals and to measure it the way you’ve determined in your KPIs. So this is your detailed roadmap, and it’s two to three months at a time. A lot of companies will go for maybe 12 months and try to get that high level overview of where they’re going for the year, and that’s fine. Just make sure that you’re not detailing out everything that you’re doing for the next year because it makes it harder to be agile. So we’d recommend two- to three-month iterations at a time. Go through, test things, and see how that works.

During your strategy development you’re also going to select the tools that you’re going to use. Maybe it’s Facebook, maybe it’s SEO, maybe it’s content marketing, maybe it’s email marketing, PPC. There’s all kinds of tools that could be used, and they don’t all have to be digital. So you just need to be creative and determine what you need to plan out so that you can reach the goals that you’ve set.

Then once you’ve got your strategy developed (that’s really some of the hardest part), you get to execution. Then you’re actually doing all the work. You need to be consistent. You need to make sure that you’re staying focused and following that strategy that you’ve set. You also want to test things because you want as much data as possible so that you can determine if things are working or not. During execution there are going to be things that come up: emergent things, shiny things, exciting things. So what you’ll have to do is weigh whether those things wait for the next iteration in two to three months, or whether you deviate from your plan and integrate them at the time that they come up.

So once you’re through execution, then really what you’re doing is analyzing that data that you’ve collected. You’re trying to determine: Should we spend more time on something? Should we pull something? Should we determine if something else needs to completely change our plans so that we’re making sure that we’re adding value? So analysis is probably the most important part because you’re always going to want to be looking at the data.

So in this whole process, what we always do is try to make sure that we’re focusing on two questions, and the most important one is: Where can we add more value? So always be thinking about what you’re doing, and if you can’t answer the value question, you know, “Why are we doing this? Does this provide value for our customers or for something internal that we’re working on?” If you can’t answer that question, it’s probably not something valuable, and you don’t need to spend your time on it. Go somewhere else where you’re adding the value.

Then the last question is where you can make the biggest difference in your business, because that’s what this is all about: growing your business. So if you stay focused on goals, not tools, it’s going to be really easy to do that.

Thanks for having me today, Moz. Hope I helped you out. Let me know in the questions if you need any assistance.

Video transcription by Speechpad.com



Moz Blog

Posted in IM NewsComments Off

Setting Up Actionable SEO Dashboards in the New Google Analytics

There have been many mixed reviews about the latest Google Analytics UI. Putting the frustration of having to learn a new UI aside (here’s a great guide to navigating the new Google Analytics interface), the new Google Analytics actually brings great customization options to the table. One of my favorites is custom dashboards.

Both the old and new interfaces offer a standard dashboard that acts as an overview of your analytics profile. But where the new UI has its advantage is with your ability to create your own dashboard – in fact, you can create up to 20 of them for each profile.

Creating Dashboards

The first thing we’ll want to do is click the “+ New Dashboard” link on the left navigation of your profile’s Home tab. Google will then ask you to name the dashboard and to choose either a “Blank Canvas” or a “Starter Dashboard.” The Starter Dashboard is just like the default dashboard you already have in Google Analytics, so let’s choose “Blank Canvas.” Now it’s time to populate your dashboard with widgets.

There are two ways you can customize your new dashboard:

  1. Use the “Add Widget” feature on your dashboard
  2. Navigate to the view you want in Google Analytics and click the “Add to Dashboard” link

When you use the “Add Widget” feature, there are four types of widgets you can choose from:

  1. Metric – This will show you a single metric as well as a “sparkline” for that metric (which is basically a tiny line graph)
  2. Pie – Displays a breakdown of various metrics in pie chart form
  3. Timeline – A graph (only) of any metric (or compare two metrics) over any period of time
  4. Table – Your traditional Google Analytics table, but it can be customized to only display what you’ve set up (including filters)

You build each widget the same way you would segment/filter data in Google Analytics normally. The key here is saving the view to your dashboard so you can quickly login and review performance without having to set everything up again.

As you add more widgets to your custom dashboard, you can easily drag, drop and rearrange your widgets into one of the three dashboard columns.

Now that we know how to set up dashboards, let’s take a look at some useful SEO dashboards you should consider creating.

SEO Monitoring Dashboard

The purpose of this dashboard is simple: a quick look into the health of your SEO campaign.

Widget #1: Total Organic Non-Branded Keyword Traffic (Metric/Timeline)

With this metric/timeline widget, we simply want to look at our total organic, non-branded search traffic. Remember, with the metric widget, you can only look at a single metric. If you only want to see the total number of visits, add a metric widget. However, if you’d like to see the total visit count broken out over the selected date range, you’ll want to add it as a timeline widget.

For this widget, we’ll add a Metric/Timeline with the following dimensions:

Nonbranded Organic Traffic

Widget #2: Total Organic Non-Branded Keyword Conversions (Metric/Timeline)

In this widget we’re looking to get a snapshot of just how many total conversions (or transactions) have been generated by our non-branded organic keyword referrals.

For this widget, we’ll add a Metric/Timeline with the following dimensions:

Non-Branded Organic Conversions

Just like before, if you’d prefer to see this over time you can change this widget to be a timeline instead of a metric widget.

Widget #3: Total Organic Non-Branded Keyword Traffic (Table)

This widget filters out your branded search keyword referrals so you can get right to the keywords you’re most interested in. You may also consider adding an additional filter to remove (not provided) if it takes up a significant number of the results.

For this widget, we’ll add a Table with the following dimensions:

Non-Branded Organic Keywords
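As an illustration of that kind of exclude filter (the brand term here is a placeholder for your own), the filter logic amounts to a regular expression matching branded queries and, optionally, (not provided):

```javascript
// Hypothetical exclude filter: drop branded queries and (not provided).
// 'acme' stands in for your own brand terms.
const excludePattern = /acme|\(not provided\)/i;

const keywords = ['best blenders', 'acme blender', '(not provided)'];
const nonBranded = keywords.filter((kw) => !excludePattern.test(kw));

console.log(nonBranded); // ['best blenders']
```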

You’ll notice that I didn’t choose any goals for the secondary metric. We’ll cover that in the next widget. For now, we want to get a good understanding of what keywords are driving traffic.

Widget #4: Total Organic Non-Branded Keyword Conversions/Transactions (Table)

In this widget we’re looking to get a quick look at our top converting/transaction keywords. Once again, I recommend filtering out your branded search terms. Depending on how many important conversion points you want to keep track of, you may need to add more than one widget of this type because you can only view two metrics in each Table widget.

For this widget, we’ll add a Table with the following dimensions:

Non-Branded Organic Keyword Conversions

Widget #5: Top Social Action Content (Table)

You’ll find it easier to navigate to this report in the Standard Reporting section of Google Analytics (Audience > Social > Pages) and add the widget using the top navigation bar in Google Analytics. The goal of this particular widget is to quickly see which content on your site is getting shared the most in social media. That way you’ll know what content topics have the best chance of going viral.

By default, Google will show you information for only Google+; in a future post I’ll walk you through how to get other sites like Twitter and Facebook set up here, too.

If your blog content lives under a /blog/ subfolder, you may want to consider filtering the report to only look at that content.

For this widget, we’ll add a Table with the following dimensions:

Social Action Content

After I added the widget to our SEO Monitoring dashboard, I went back and edited it to also include total visits as well.

Widget #6: Top Content Traffic & Conversions (Table)

In addition to knowing what content is getting shared the most, I like to keep an eye on what blog content is getting the most traffic and conversions.

For this widget, we’ll add a Table with the following dimensions:

Top Organic Landing Pages

Don’t forget to filter in just your blog content if that is the area you want to focus on.

Widget #7: Organic Search Engine Referrals (Pie)

I like to keep an eye on which search engines are sending me traffic and how it changes over time. The best way to get a snapshot of this is to add a pie chart widget.

For this widget, we’ll add a Pie with the following dimensions:

Search Engine Referrals

I chose to only look at the top three organic search engine referrals, but you can select up to six for your pie chart.

Widget #8: Page Load Speed (Table)

We also need to keep an eye on any pages that are loading slowly. We can actually set up the widget to only look at organic traffic page load speeds, although it would be in your best interest to look at all your content, not just pages with organic visits.

For this widget, we’ll add a Table with the following dimensions:

Page Load Speed

The above table shows you your top ten slowest loading landing pages, and also includes how many visits each page receives. You can sort by either, but it’s probably best to tackle the pages with the slowest load time first.

Widget #9: Site Search Keywords (Table)

The final piece to our monitoring puzzle: a list of keywords being searched for the most on our internal site search. This is a great way to generate new keyword ideas and to find new usability ideas (more on that later).

For this widget, we’ll add a Table with the following dimensions:

Site Search

I also like to add conversions as a dimension to this widget so I can not only keep an eye on which terms are getting searched for the most, but also which lead to the most conversions.

Website Redesign Dashboard – SEO Focus

So it’s time for the dreaded redesign process. You have a pretty good idea of what’s ahead: long nights, lots of frustration and hopefully, a great looking website not too far down the line. With this dashboard you can quickly gain insight into what changes you should be making in the upcoming redesign to help out your SEO campaign.

You might also consider renaming this dashboard to be a Usability dashboard so you can frequently check in on how well your site is performing for your visitors.

We’ll be borrowing a few of the widgets in our SEO Monitoring dashboard, but also adding a few. Let’s first look at which widgets we should be re-adding to this new dashboard:

Widget #1: Top Converting Keywords (SEO Monitoring Widget #2)

A website redesign offers a great opportunity for keyword inclusion throughout our site’s architecture (navigation, URLs, etc.). With this widget we can keep an eye on which keywords we should be focusing these optimization efforts on.

Widget #2: Top Social Action Content (SEO Monitoring Widget #5)

Which social networks are engaging the most with your content? What pages are getting the most engagements? Answering these questions will help you create a user experience that is not only tailored to your top social network traffic drivers, but that also encourages social sharing.

You’ll also want to look closely at what makes the content in this report so shareable. Is it because of the way they are laid out? The images they use? These insights can really help you carry that experience throughout your new site.

Widget #3: Top Converting Content (SEO Monitoring Widget #6)

Just like with the top social action content, you want to keep an eye on the content that is working best (and worst). This will allow you to duplicate your successes and (hopefully) eliminate your failures.

Widget #4: Page Load Speed (SEO Monitoring Widget #8)

The redesign is the perfect time to address page load speed problems. Take a look at the slowest rendering pages in this table and determine what the common problems are that are slowing the load speed down.

Widget #5: Site Search Keywords (SEO Monitoring Widget #9)

Site search is great for finding new keywords; it’s also a great way to figure out what problems people are having navigating your site. With this widget you can quickly see the types of content people are expecting to find on your site – but aren’t able to.

On to our new widgets!

Widget #6 & #7: SEO Geographic Summary (Table) & Language (Table)

Is it time to consider translating your site for a new geographic audience? This type of change will definitely need your attention as an SEO. It’s also an opportunity for you to branch out your link building into new languages.

For this widget, we’ll add a Table with the following dimensions:

Geographic Summary

The organic traffic filter I have in place is definitely optional. I think it helps keep the data set you’re looking at more consistent by restricting it to organic visits only like the other widgets are set to.

For the Language widget, we’ll add a Table with the following dimensions:

Language Summary

You’ll note that I also filtered out all non-organic traffic here, too.

Widget #8: Top Exit/Bounce Pages (Table)

For this particular widget, we’re once again trying to identify problem pages. Any pages that have a high exit/bounce rate should get a close review to see if the cause for people leaving can easily be identified.

For this widget, we’ll add a Table with the following dimensions:

Exit and Bounces Summary

It’s important that we filter out any blog content that naturally creates high bounce rates. If you also have an event like an Account Login on your site, you may wish to use Google’s Event Tracking to filter out those visits as well.

Widget #9: Mobile Devices (Pie)

Which mobile devices are your visitors using to access your site? Are you getting a substantial number of visits? Do you anticipate it growing during the life of the next site design? More than likely this will be an area of focus for your redesign. It’s important that you know exactly which devices your consumers are using to view your site so you can ensure compatibility.

For this widget, we’ll add a Pie with the following dimensions:

Mobile Summary

Widget #10: Browser Conversion Rate (Table)

Finally, I like to take a look into what browser our visitors are using most, and what their conversion rate currently is. We all say we test all browsers for compatibility, but there are always pages that were rushed or that just fell through the cracks that might not be presenting themselves the way you had hoped.

For this widget, we’ll add a Table with the following dimensions:

Browsers Summary

Holistic Dashboard

It’s no secret that to succeed in today’s online marketing world you need to be doing more than just SEO. Not just in the sense that other marketing efforts can help drive in new leads, but because they help your SEO campaign succeed.

This dashboard highlights how your PPC and social media efforts are performing, so you can take that information and apply it to your SEO campaigns.

Widget #1: Top Social Action Content (SEO Monitoring Widget #5)

This widget will allow us to keep track of what types of content are performing best from a social perspective.

Widget #2: Top Referral Conversion/Transaction Sources (Table)

Within this report we’ll be able to quickly see which social networks are the most profitable in terms of conversions and/or actual transactions. This is a great way to see which social networks respond well to your offering, and that you should be investing more time in.

For this widget, we’ll add a Table with the following dimensions:

Social Conversion Sources

Ideally you’ll want to set up a filter to only look at social networks. If you’re good about tagging your URLs with custom variables, then you can change the filter to look at the medium and enter the medium value you use for social URLs (example: social).
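For example, a social URL tagged with the standard Google Analytics campaign parameters might look like this (the domain, path, and campaign name are illustrative):

```
https://example.com/blog/post?utm_source=twitter&utm_medium=social&utm_campaign=spring-launch
```

With the medium consistently set to social across all your tagged URLs, the dashboard filter only needs to match on medium = social.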

Widget #3: Top Paid Converting/Transaction Keywords (Table)

Ever since the (not provided) update, we’ve all lost out on valuable keyword data. But just as Google hoped we would, we can get this information from our PPC spend. With this widget we’ll look at the keywords that are driving the most conversions/transactions for our PPC marketing, so we can look into targeting them in our SEO marketing, too.

For this widget, we’ll add a Table with the following dimensions:

Top Paid Converting Keywords

Widget #4: Top Paid Revenue Generating Ad Groups

Just like with our previous keyword widget, I also like to look at the top performing ad groups. This is a good way to know what top level topics are performing the best for your paid search campaigns, so you can prioritize them in your SEO campaigns.

For this widget, we’ll add a Table with the following dimensions:

Top PPC Ad Groups

Widget #5: Top Paid Landing Pages (Table)

If you’re not using custom landing pages for your paid search campaigns, this is a great way to see which keywords are working best for the various pages on your site. I like to run these types of tests before I commit to any keywords for SEO.

For this widget, we’ll add a Table with the following dimensions:

Top Paid Landing Pages

That’s just three of the 20 dashboards you could set up in Google Analytics. What are you adding to your dashboards to make them more actionable?

SEO Book.com

Posted in IM NewsComments Off

New Google+ Setting: Control Your Inbound Notifications by @mattmcgee

Google has just announced a new Google+ setting that I’m sure will be a big hit: the ability to control who can send you notifications.

As Google’s Kathleen Ko explains in a Google+ post, this is a feature that many users have requested. Ko says that the new setting won’t prevent you from getting…

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM NewsComments Off

