Tag Archive | "Whiteboard"

10 Basic SEO Tips to Index + Rank New Content Faster – Whiteboard Friday

Posted by Cyrus-Shepard

In SEO, speed is a competitive advantage.

When you publish new content, you want users to find it ranking in search results as fast as possible. Fortunately, there are a number of tips and tricks in the SEO toolbox to help you accomplish this goal. Sit back, turn up your volume, and let Cyrus Shepard show you exactly how in this week’s Whiteboard Friday.

[Note: #3 isn't covered in the video, but we've included it in the post below. Enjoy!]

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. I’m Cyrus Shepard, back in front of the whiteboard. So excited to be here today. We’re talking about ten tips to index and rank new content faster.

You publish some new content on your blog, on your website, and you sit around and you wait. You wait for it to be in Google’s index. You wait for it to rank. It’s a frustrating process that can take weeks or months to see those rankings increase. There are a few simple things we can do to help nudge Google along, to help them index it and rank it faster. Some very basic things and some more advanced things too. We’re going to dive right in.

Indexing

1. URL Inspection / Fetch & Render

So basically, getting content indexed in Google is not that hard. Google provides us with a number of tools. The simplest and fastest is probably the URL Inspection tool. It's in the new Search Console, and it replaces Fetch and Render. As of this filming, both tools still exist, but Google is deprecating Fetch and Render. The new URL Inspection tool allows you to submit a URL and tell Google to crawl it. When you do that, they put it in their priority crawl queue. That simply means Google has a list of URLs to crawl, yours goes into the priority queue, and it's going to get crawled faster and indexed faster.

2. Sitemaps!

Another common technique is simply using sitemaps. If you're not using sitemaps, they're one of the easiest, quickest ways to get your URLs indexed. When you have your URLs in your sitemap, you want to let Google know that the sitemap is actually there. There are a number of different techniques that can optimize this process a little bit more.

The first and most basic one that everybody talks about is simply putting it in your robots.txt file. In your robots.txt you have a list of directives, and at the end of it you simply add a Sitemap line telling Google where your sitemaps are. You can do that for sitemap index files, and you can list multiple sitemaps. It's really easy.

Sitemap in robots.txt
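For example, a minimal robots.txt that points crawlers at a sitemap might look like this, using a hypothetical domain:

    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml
    Sitemap: https://example.com/sitemap-index.xml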

You can also do it using the Sitemaps report, another report in the new Search Console. You can go in there and submit sitemaps, remove sitemaps, and validate them. You can also do this via the Search Console API.
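If you want to go the API route, here's a rough sketch in Python using the Webmasters (Search Console) API. It assumes the google-api-python-client and google-auth libraries are installed; the key file, property, and sitemap URL are hypothetical, and the account you authenticate with has to already be a verified user of the property:

    # Sketch: submit a sitemap via the Search Console (Webmasters v3) API.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters"]

    # Hypothetical service-account key file with access to the property.
    credentials = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)

    service = build("webmasters", "v3", credentials=credentials)

    # Hypothetical property and sitemap URLs.
    service.sitemaps().submit(
        siteUrl="https://example.com/",
        feedpath="https://example.com/sitemap.xml",
    ).execute()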

But a really cool way of informing Google of your sitemaps, one that a lot of people don't use, is simply pinging Google. You can do this right in your browser's address bar: you type in google.com/ping and append your sitemap URL as a parameter. You can try this out right now with your current sitemaps. Type it into the browser bar and Google will instantly queue that sitemap for crawling, and all the URLs in there should get indexed quickly if they meet Google's quality standards.

Example: https://www.google.com/ping?sitemap=https://example.com/sitemap.xml
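If you publish often, you can automate that ping, for example right after your CMS pushes a new post live. A minimal Python sketch, assuming the requests library and a hypothetical sitemap URL:

    # Sketch: ping Google with a sitemap URL right after publishing.
    import requests

    SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical sitemap

    response = requests.get(
        "https://www.google.com/ping",
        params={"sitemap": SITEMAP_URL},
        timeout=10,
    )
    print(response.status_code)  # 200 means Google accepted the ping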

3. Google Indexing API

(BONUS: This wasn’t in the video, but we wanted to include it because it’s pretty awesome)

Within the past few months, both Google and Bing have introduced new APIs to help speed up and automate the crawling and indexing of URLs.

Both of these solutions allow for the potential of massively speeding up indexing by submitting hundreds or thousands of URLs via an API.

While the Bing API is intended for any new or updated URL, Google states that their API is specifically for "either job posting or livestream structured data." That said, many SEOs, like David Sottimano, have experimented with the Google API and found it to work with a variety of content types.

If you want to use these indexing APIs yourself, you have a number of potential options:

Yoast announced they will soon support live indexing across both Google and Bing within their SEO WordPress plugin.
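If you'd rather call the Google Indexing API directly, the sketch below shows the general shape of a request in Python. The service-account key file and the URL are hypothetical; the service account needs to be added as an owner of the property in Search Console, and keep in mind Google officially scopes this API to job posting and livestream structured data:

    # Sketch: tell the Google Indexing API a URL was added or updated.
    from google.oauth2 import service_account
    from google.auth.transport.requests import AuthorizedSession

    SCOPES = ["https://www.googleapis.com/auth/indexing"]
    ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

    # Hypothetical key file for a service account that owns the property.
    credentials = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)
    session = AuthorizedSession(credentials)

    body = {"url": "https://example.com/new-post/", "type": "URL_UPDATED"}
    response = session.post(ENDPOINT, json=body)
    print(response.status_code, response.json())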

Indexing & ranking

That’s talking about indexing. Now there are some other ways that you can get your content indexed faster and help it to rank a little higher at the same time.

4. Links from important pages

When you publish new content, the most basic step, if you do nothing else, is to make sure that you are linking to it from important pages. Important pages may be your homepage, your blog, your resources page: add links from those to the new content. This is a basic step that you want to do. You don't want to orphan those new pages on your site with no incoming links.

Adding the links tells Google two things. It says we need to crawl this link sometime in the future, and it gets put in the regular crawling queue. But it also makes the link more important. Google can say, “Well, we have important pages linking to this. We have some quality signals to help us determine how to rank it.” So linking from important pages.

5. Update old content 

But a step that people oftentimes forget is to not only link from your important pages, but to go back to your older content and find relevant places to put those links. A lot of people will add a link on their homepage or link out to older articles, but they forget the step of going back to the older articles on the site and adding links to the new content.

Now which pages should you add links from? One of my favorite techniques is to use this search operator, where you type in the keywords that your content is about and then add site:example.com. This finds the pages on your site that are relevant to your target keywords, and those make really good pages to link to the new content from.
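For example, if the hypothetical new post were about standing desks and your site were example.com, you would search:

Example: standing desks site:example.com

The pages that come back are the ones Google already associates with that topic, which makes them natural places to add a link pointing at the new post.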

6. Share socially

A really obvious step: sharing socially. There's a high correlation between social shares and content ranking. Sharing on content aggregators like Reddit and Hacker News is especially useful, because those sites create actual links for Google to crawl. Google can see those signals and that social activity, and those links do the same thing as links from your own content, except it's even a little better, because they're external links, external signals.

7. Generate traffic to the URL

This is kind of an advanced technique, which is a little controversial in terms of its effectiveness, but we see it anecdotally working time and time again. That’s simply generating traffic to the new content. 

Now there is some debate whether traffic is a ranking signal. There are some old Google patents that talk about measuring traffic, and Google can certainly measure traffic using Chrome. They can see where that traffic is coming from. But as an example, take Facebook ads: you launch some new content and you drive a massive amount of traffic to it via Facebook ads. You're paying for that traffic, but in theory Google can see it, because they're measuring things using the Chrome browser.

When they see all that traffic going to a page, they can say, “Hey, maybe this is a page that we need to have in our index and maybe we need to rank it appropriately.”

Ranking

Once we get our content indexed, let's talk about a few ideas for ranking it faster.

8. Generate search clicks

Along with generating traffic to the URL, you can actually generate search clicks.

Now what do I mean by that? Imagine you share a URL on Twitter. Instead of sharing the URL directly, you share a link to a Google search result. People click that link, they land on a Google search for the keywords you're trying to rank for, and they click on your result.

You see television commercials do this, like in a Super Bowl commercial they’ll say, “Go to Google and search for Toyota cars 2019.” What this does is Google can see that searcher behavior. Instead of going directly to the page, they’re seeing people click on Google and choosing your result.

  1. Instead of this: https://moz.com/link-explorer
  2. Share this: https://www.google.com/search?q=link+tool+moz

This does a couple of things. It helps increase your click-through rate, which may or may not be a ranking signal. But it also helps you rank for auto-suggest queries. So when Google sees people search for “best cars 2019 Toyota,” that might appear in the suggest bar, which also helps you to rank if you’re ranking for those terms. So generating search clicks instead of linking directly to your URL is one of those advanced techniques that some SEOs use.

9. Target query deserves freshness

When you’re creating the new content, you can help it to rank sooner if you pick terms that Google thinks deserve freshness. It’s best maybe if I just use a couple of examples here.

Consider a user searching for the term "cafes open Christmas 2019." That's a query Google wants to deliver a very fresh result for. You want the freshest news about cafes and restaurants that are going to be open Christmas 2019. Google is going to give preference to pages that were created more recently. So when you target those queries, you can maybe rank a little faster.

Compare that to a query like “history of the Bible.” If you Google that right now, you’ll probably find a lot of very old pages, Wikipedia pages. Those results don’t update much, and that’s going to be harder for you to crack into those SERPs with newer content.

The way to tell is simply to type in the queries that you're trying to rank for and see how old the most recent results are. That will give you an indication of how much freshness Google thinks the query deserves. Choose queries that deserve a little more freshness and you might be able to get in a little sooner.

10. Leverage URL structure

Finally, last tip, this is something a lot of sites do and a lot of sites don’t do because they’re simply not aware of it. Leverage URL structure. When Google sees a new URL, a new page to index, they don’t have all the signals yet to rank it. They have a lot of algorithms that try to guess where they should rank it. They’ve indicated in the past that they leverage the URL structure to determine some of that.

Consider how The New York Times puts all its book reviews under the same URL structure, newyorktimes.com/book-reviews. They have a lot of established ranking signals for all of those URLs. When a new URL is published using the same structure, they can assign it some temporary signals to rank it appropriately.

If you have URLs that are high authority, maybe it’s your blog, maybe it’s your resources on your site, and you’re leveraging an existing URL structure, new content published using the same structure might have a little bit of a ranking advantage, at least in the short run, until Google can figure these things out.

These are only a few of the ways to get your content indexed and ranking quicker. It is by no means a comprehensive list. There are a lot of other ways. We’d love to hear some of your ideas and tips. Please let us know in the comments below. If you like this video, please share it for me. Thanks, everybody.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


The One-Hour Guide to SEO: Technical SEO – Whiteboard Friday

Posted by randfish

We’ve arrived at one of the meatiest SEO topics in our series: technical SEO. In this fifth part of the One-Hour Guide to SEO, Rand covers essential technical topics from crawlability to internal link structure to subfolders and far more. Watch on for a firmer grasp of technical SEO fundamentals!

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome back to our special One-Hour Guide to SEO Whiteboard Friday series. This is Part V – Technical SEO. I want to be totally upfront. Technical SEO is a vast and deep discipline like any of the things we’ve been talking about in this One-Hour Guide.

There is no way in the next 10 minutes that I can give you everything that you’ll ever need to know about technical SEO, but we can cover many of the big, important, structural fundamentals. So that’s what we’re going to tackle today. You will come out of this having at least a good idea of what you need to be thinking about, and then you can go explore more resources from Moz and many other wonderful websites in the SEO world that can help you along these paths.

1. Every page on the website is unique & uniquely valuable

First off, every page on a website should be two things — unique, unique from all the other pages on that website, and uniquely valuable, meaning it provides some value that a user, a searcher would actually desire and want. Sometimes the degree to which it’s uniquely valuable may not be enough, and we’ll need to do some intelligent things.

So, for example, if we’ve got a page about X, Y, and Z versus a page that’s sort of, “Oh, this is a little bit of a combination of X and Y that you can get through searching and then filtering this way.Oh, here’s another copy of that XY, but it’s a slightly different version.Here’s one with YZ. This is a page that has almost nothing on it, but we sort of need it to exist for this weird reason that has nothing to do, but no one would ever want to find it through search engines.”

Okay, when you encounter these types of pages as opposed to these unique and uniquely valuable ones, you want to think about: Should I be canonicalizing those, meaning point this one back to this one for search engine purposes? Maybe YZ just isn’t different enough from Z for it to be a separate page in Google’s eyes and in searchers’ eyes. So I’m going to use something called the rel=canonical tag to point this YZ page back to Z.
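In practice that's a single line in the head of the duplicate page. For instance, with hypothetical URLs, the YZ page would carry:

Example: <link rel="canonical" href="https://example.com/z/" />

which tells Google to treat https://example.com/z/ as the version to index and rank.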

Maybe I want to remove these pages. Oh, this is totally non-valuable to anyone. 404 it. Get it out of here. Maybe I want to block bots from accessing this section of our site. Maybe these are search results that make sense if you’ve performed this query on our site, but they don’t make any sense to be indexed in Google. I’ll keep Google out of it using the robots.txt file or the meta robots or other things.
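Both of those blocking options are small, standard snippets. For instance, with a hypothetical /search/ section of internal search result pages:

    # robots.txt: keep crawlers out of internal search result pages
    User-agent: *
    Disallow: /search/

    <!-- or, on an individual page, keep it out of the index -->
    <meta name="robots" content="noindex">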

2. Pages are accessible to crawlers, load fast, and can be fully parsed in a text-based browser

Secondarily, pages are accessible to crawlers. They should be accessible to crawlers. They should load fast, as fast as you possibly can. There’s a ton of resources about optimizing images and optimizing server response times and optimizing first paint and first meaningful paint and all these different things that go into speed.

But speed is good not only because of technical SEO issues, meaning Google can crawl your pages faster. Oftentimes, when people speed up the load time of their pages, they find that Google crawls more of them and crawls them more frequently, which is a wonderful thing. Speed is also good because pages that load fast make users happier. When you make users happier, you make it more likely that they will link and amplify and share and come back and keep loading and not click the back button, all these positive things, while avoiding all these negative things.

They should be able to be fully parsed in essentially a text browser, meaning that if you have a relatively unsophisticated browser that is not doing a great job of processing JavaScript or post-loading of script events or other types of content, Flash and stuff like that, it should be the case that a spider should be able to visit that page and still see all of the meaningful content in text form that you want to present.

Google still is not processing every image at the "I'm going to analyze everything that's in this image and extract out the text from it" level, nor are they doing that with video, nor are they doing that with many kinds of JavaScript and other scripts. So I would urge you, and I know many other SEOs would too (notably Barry Adams, a famous SEO who says that JavaScript is evil, which may be taking it a little bit far, but we catch his meaning), to make sure that everything on these pages can be loaded in HTML, in text.

3. Thin content, duplicate content, spider traps/infinite loops are eliminated

Thin content and duplicate content — thin content meaning content that doesn’t provide meaningfully useful, differentiated value, and duplicate content meaning it’s exactly the same as something else — spider traps and infinite loops, like calendaring systems, these should generally speaking be eliminated. If you have those duplicate versions and they exist for some reason, for example maybe you have a printer-friendly version of an article and the regular version of the article and the mobile version of the article, okay, there should probably be some canonicalization going on there, the rel=canonical tag being used to say this is the original version and here’s the mobile friendly version and those kinds of things.

If your site's internal search results pages are showing up in Google's search results, Google generally prefers that you don't do that. If you have slight variations, Google would prefer that you canonicalize those, especially if the filters on them are not meaningfully and usefully different for searchers.

4. Pages with valuable content are accessible through a shallow, thorough internal links structure

Number four, pages with valuable content on them should be accessible through just a few clicks, in a shallow but thorough internal link structure.

Now this is an idealized version. You’re probably rarely going to encounter exactly this. But let’s say I’m on my homepage and my homepage has 100 links to unique pages on it. That gets me to 100 pages. One hundred more links per page gets me to 10,000 pages, and 100 more gets me to 1 million.

So that’s only three clicks from homepage to one million pages. You might say, “Well, Rand, that’s a little bit of a perfect pyramid structure. I agree. Fair enough. Still, three to four clicks to any page on any website of nearly any size, unless we’re talking about a site with hundreds of millions of pages or more, should be the general rule. I should be able to follow that through either a sitemap.

If you have a complex structure and you need to use a sitemap, that’s fine. Google is fine with you using an HTML page-level sitemap. Or alternatively, you can just have a good link structure internally that gets everyone easily, within a few clicks, to every page on your site. You don’t want to have these holes that require, “Oh, yeah, if you wanted to reach that page, you could, but you’d have to go to our blog and then you’d have to click back to result 9, and then you’d have to click to result 18 and then to result 27, and then you can find it.”

No, that’s not ideal. That’s too many clicks to force people to make to get to a page that’s just a little ways back in your structure. 

5. Pages should be optimized to display cleanly and clearly on any device, even at slow connection speeds

Five, I think this is obvious, but for many reasons, including the fact that Google considers mobile friendliness in its ranking systems, you want to have a page that loads clearly and cleanly on any device, even at slow connection speeds, optimized for both mobile and desktop, optimized for 4G and also optimized for 2G and no G.

6. Permanent redirects should use the 301 status code, dead pages the 404, temporarily unavailable the 503, and all okay should use the 200 status code

Permanent redirects. So this page was here. Now it’s over here. This old content, we’ve created a new version of it. Okay, old content, what do we do with you? Well, we might leave you there if we think you’re valuable, but we may redirect you. If you’re redirecting old stuff for any reason, it should generally use the 301 status code.

If you have a dead page, it should use the 404 status code. You could maybe sometimes use 410, permanently removed, as well. Temporarily unavailable, like we’re having some downtime this weekend while we do some maintenance, 503 is what you want. Everything is okay, everything is great, that’s a 200. All of your pages that have meaningful content on them should have a 200 code.

Anything beyond these status codes (and maybe the 410) should generally be avoided. There are some very occasional, rare, edge use cases. But if you find status codes other than these, for example if you're using Moz, which crawls your website every week and reports all this data back in a technical audit, or other software like it, Screaming Frog or Ryte or DeepCrawl, the tool will say, "Hey, this looks problematic to us. You should probably do something about this."
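If you just want a quick spot-check outside of those tools, here's a minimal Python sketch using the requests library; the URLs and the expected codes are hypothetical:

    # Sketch: spot-check HTTP status codes for a handful of URLs.
    import requests

    URLS = [
        "https://example.com/",           # expect 200
        "https://example.com/old-page/",  # expect 301 to the new location
        "https://example.com/gone/",      # expect 404 (or 410)
    ]

    for url in URLS:
        # allow_redirects=False so we see the 301/302 itself, not its target
        r = requests.head(url, allow_redirects=False, timeout=10)
        print(url, r.status_code, r.headers.get("Location", ""))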

7. Use HTTPS (and make your site secure)

When you are building a website that you want to rank in search engines, it is very wise to use a security certificate and to have HTTPS rather than HTTP, the non-secure version. Those should also be canonicalized. There should never be a time when the HTTP version is the one that loads preferentially. Google also gives a small reward (I'm not even sure it's that small anymore; it might be fairly significant at this point) to pages that use HTTPS, or a penalty to those that don't.
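In practice that canonicalization usually means a sitewide 301, for example (hypothetical URLs):

Example: http://example.com/page/ redirects with a 301 to https://example.com/page/, and canonical tags, sitemaps, and internal links all point at the https version.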

8. One domain > several, subfolders > subdomains, relevant folders > long, hyphenated URLs

In general, well, I don’t even want to say in general. It is nearly universal, with a few edge cases — if you’re a very advanced SEO, you might be able to ignore a little bit of this — but it is generally the case that you want one domain, not several. Allmystuff.com, not allmyseattlestuff.com, allmyportlandstuff.com, and allmylastuff.com.

Allmystuff.com is preferable for many, many technical reasons and also because the challenge of ranking multiple websites is so significant compared to the challenge of ranking one. 

You want subfolders, not subdomains, meaning I want allmystuff.com/seattle, /la, and /portland, not seattle.allmystuff.com.

Why is this? Google’s representatives have sometimes said that it doesn’t really matter and I should do whatever is easy for me. I have so many cases over the years, case studies of folks who moved from a subdomain to a subfolder and saw their rankings increase overnight. Credit to Google’s reps.

I’m sure they’re getting their information from somewhere. But very frankly, in the real world, it just works all the time to put it in a subfolder. I have never seen a problem being in the subfolder versus the subdomain, where there are so many problems and there are so many issues that I would strongly, strongly urge you against it. I think 95% of professional SEOs, who have ever had a case like this, would do likewise.

Relevant folders should be used rather than long, hyphenated URLs. This is one where we agree with Google. Google generally says, hey, if you have allmystuff.com/seattle/storage-facilities/top-10-places, that is far better than allmystuff.com/seattle-storage-facilities-top-10-places. It's just the case that Google is good at folder structure analysis and organization, users like it as well, and good breadcrumbs come from there.

There’s a bunch of benefits. Generally using this folder structure is preferred to very, very long URLs, especially if you have multiple pages in those folders. 

9. Use breadcrumbs wisely on larger/deeper-structured sites

Last, but not least (at least the last thing we'll talk about in this technical SEO discussion), is using breadcrumbs wisely. Breadcrumbs are actually both a technical and an on-page topic, and they're good for this.

Google generally learns some things from the structure of your website from using breadcrumbs. They also give you this nice benefit in the search results, where they show your URL in this friendly way, especially on mobile, mobile more so than desktop. They’ll show home > seattle > storage facilities. Great, looks beautiful. Works nicely for users. It helps Google as well.
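The video doesn't get into markup, but one common way to make that breadcrumb trail explicit for Google (an addition here, not something Rand covers) is schema.org BreadcrumbList structured data, sketched below with hypothetical URLs:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Seattle",
         "item": "https://example.com/seattle/"},
        {"@type": "ListItem", "position": 3, "name": "Storage Facilities",
         "item": "https://example.com/seattle/storage-facilities/"}
      ]
    }
    </script>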

So there are plenty more in-depth resources that we can go into on many of these topics and others around technical SEO, but this is a good starting point. From here, we will take you to Part VI, our last one, on link building next week. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


The One-Hour Guide to SEO: Searcher Satisfaction – Whiteboard Friday

Posted by randfish

Satisfying your searchers is a big part of what it means to be successful in modern SEO. And optimal searcher satisfaction means gaining a deep understanding of them and the queries they use to search. In this section of the One-Hour Guide to SEO, Rand covers everything you need to know about how to satisfy searchers, including the top four priorities you need to have and tips on how to avoid pogo-sticking in the SERPs.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to our special edition One-Hour Guide to SEO Part III on searcher satisfaction. So historically, if we were doing a guide to SEO in the long-ago past, we probably wouldn’t even be talking about searcher satisfaction.

What do searchers want from Google’s results?

But Google has made such a significant number of advances in the last 5 to 10 years that searcher satisfaction is now a huge part of how you can be successful in SEO. I’ll explain what I mean here. Let’s say our friend Arlen here is thinking about going on vacation to Italy.

So she goes to Google. She types in “best places to visit in Italy,” and she gets a list of results. Now Google sorts those results in a number of ways. They sort them by the most authoritative, the most comprehensive. They use links and link data in a lot of different ways to try and get at that. They use content data, what’s on the page, and keyword data.

They use historical performance data about which sites have done well for searchers in the past. All of these things sort of feed into searcher satisfaction. So when Arlen performs this query, she has a bunch of questions in her head, things like I want a list of popular Italian vacation destinations, and I want some comparison of those locations.

Maybe I want the ability to sort and filter based on my personal preferences. I want to know the best times of year to go. I want to know the weather forecast and what to see and do and hotel and lodging info and transportation and accessibility information and cultural tips and probably dozens more questions that I can't even list out here. But when you, as a content creator and as a search engine optimization professional, are creating and crafting content and trying to optimize it so that it performs well in Google's results, you need to be considering all of these questions.

How to craft content that satisfies your searchers

This is why searcher empathy, customer empathy, being able to get inside Arlen’s head or your customer’s head and say, “What does she want? What is she looking for?” is one of the most powerful ways to craft content that performs better than your competition in search engines, because it turns out a lot of people don’t do this.

Priority 1: Answer the searcher’s questions comprehensively and with authority

So if I’m planning my page, what is the best page I could possibly craft to try and rank for “best places to visit in Italy,” which is a very popular search term, extremely competitive? I would think about obviously there’s all sorts of keyword stuff and on-page optimization stuff, which we will talk about in Part IV, but my priorities are answer the searcher’s primary questions comprehensively and authoritatively. If I can do that, I am in good shape. I’m ahead of a lot of the pack.

Priority 2: Provide an easy-to-use, fast-loading, well-designed interface that’s a pleasure to interact with

Second, I want to provide a great user experience. That means easy to use, fast-loading, well-designed, that’s a pleasure to interact with. I want the experience of a visitor, a searcher who lands on this page to be, “Wow, this is much better than the typical experience that I get when I land on a lot of other sites.”

Priority 3: Solve the searcher’s next tasks and questions with content, tools, or links

Priority three, I want to solve the searcher’s next tasks and questions with either content on my own site or tools and resources or links or the ability to do them right here so that they don’t have to go back to Google and do other things or visit other websites to try and accomplish the tasks, like figuring out a good hotel or figuring out the weather forecast. A lot of sites don’t do this comprehensively today, which is why it’s an advantage if you do. 

Priority 4: Consider creative elements that may give you a long-term competitive advantage

Priority four is consider some creative elements, maybe interactive tools or an interactive map or sorting and filtering options that could give you a long-term, competitive advantage, something that’s difficult for other people who want to rank for this search term to build.

Maybe that’s the data that you get. Maybe it’s the editorial content. Maybe it’s your photographs. Maybe it’s your tools and interactive elements. Whatever the case. 

Do NOT give searchers a reason to click that back button!

One of the biggest goals of searcher satisfaction is to make sure that this scenario does not happen to you. You do not want to give searchers a reason to click that back button and choose someone else.

The search engine literature calls this "pogo sticking." Basically, if I do a search for "best places to visit in Italy" and I click on, let's say, US News & World Reports and I find that that page does not do a great job answering my query, or it does a fine job, but it's got a bunch of annoying popovers and it's slow loading and it has all these things that it's trying to sell me, and so I click the back button and I choose a different result from Touropia or Earth Trackers.

Over time, Google will figure out that US News & World Reports is not doing a good job of answering the searcher’s query, of providing a satisfactory experience, and they will push them down in the results and they will push these other ones, that are doing a good job, up in the results. You want to be the result that satisfies a searcher, that gets into their head and answers their questions and helps them solve their task, and that will give you an advantage over time in Google’s rankings.

All right, we’ll see you next time for Part IV on on-page optimization. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


The One-Hour Guide to SEO, Part 2: Keyword Research – Whiteboard Friday

Posted by randfish

Before doing any SEO work, it’s important to get a handle on your keyword research. Aside from helping to inform your strategy and structure your content, you’ll get to know the needs of your searchers, the search demand landscape of the SERPs, and what kind of competition you’re up against.

In the second part of the One-Hour Guide to SEO, the inimitable Rand Fishkin covers what you need to know about the keyword research process, from understanding its goals to building your own keyword universe map. Enjoy!


Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another portion of our special edition of Whiteboard Friday, the One-Hour Guide to SEO. This is Part II – Keyword Research. Hopefully you’ve already seen our SEO strategy session from last week. What we want to do in keyword research is talk about why keyword research is required. Why do I have to do this task prior to doing any SEO work?

The answer is fairly simple. If you don't know which words and phrases people type into Google or YouTube or Amazon or Bing, whatever search engine you're optimizing for, you're not going to know how to structure your content. You won't be able to get into the searcher's brain, into their head, to imagine and empathize with what they actually want from your content. You probably won't do correct targeting, which means your competitors, who are doing keyword research and choosing wise search phrases, wise words and terms that searchers are actually looking for, will outperform you, while you are unfortunately optimizing for words and phrases that no one is actually looking for, that not as many people are looking for, or that are much more difficult to rank for than what you can actually achieve.

The goals of keyword research

So let’s talk about some of the big-picture goals of keyword research. 

Understand the search demand landscape so you can craft more optimal SEO strategies

First off, we are trying to understand the search demand landscape so we can craft better SEO strategies. Let me just paint a picture for you.

I was helping a startup here in Seattle, Washington, a number of years ago — this was probably a couple of years ago — called Crowd Cow. Crowd Cow is an awesome company. They basically will deliver beef from small ranchers and small farms straight to your doorstep. I personally am a big fan of steak, and I don’t really love the quality of the stuff that I can get from the store. I don’t love the mass-produced sort of industry around beef. I think there are a lot of Americans who feel that way. So working with small ranchers directly, where they’re sending it straight from their farms, is kind of an awesome thing.

But when we looked at the SEO picture for Crowd Cow, for this company, what we saw was that there was more search demand for competitors of theirs, people like Omaha Steaks, which you might have heard of. There was more search demand for them than there was for “buy steak online,” “buy beef online,” and “buy rib eye online.” Even things like just “shop for steak” or “steak online,” these broad keyword phrases, the branded terms of their competition had more search demand than all of the specific keywords, the unbranded generic keywords put together.

That is a very different picture from a world like “soccer jerseys,” where I spent a little bit of keyword research time today looking, and basically the brand names in that field do not have nearly as much search volume as the generic terms for soccer jerseys and custom soccer jerseys and football clubs’ particular jerseys. Those generic terms have much more volume, which is a totally different kind of SEO that you’re doing. One is very, “Oh, we need to build our brand. We need to go out into this marketplace and create demand.” The other one is, “Hey, we need to serve existing demand already.”

So you’ve got to understand your search demand landscape so that you can present to your executive team and your marketing team or your client or whoever it is, hey, this is what the search demand landscape looks like, and here’s what we can actually do for you. Here’s how much demand there is. Here’s what we can serve today versus we need to grow our brand.

Create a list of terms and phrases that match your marketing goals and are achievable in rankings

The next goal of keyword research is to create a list of terms and phrases that we can then use to match our marketing goals and achieve rankings. We want to make sure that the rankings we promise, the keywords that we say we're going to try and rank for, actually have real demand and that we can actually optimize for them and potentially rank for them. Or, in the cases where that's not true, because they're too difficult or too hard to rank for, or because organic results don't really show up in those types of searches, we know we should go after paid or maps or images or videos or some other type of search result instead.

Prioritize keyword investments so you do the most important, high-ROI work first

We also want to prioritize those keyword investments so we’re doing the most important work, the highest ROI work in our SEO universe first. There’s no point spending hours and months going after a bunch of keywords that if we had just chosen these other ones, we could have achieved much better results in a shorter period of time.

Match keywords to pages on your site to find the gaps

Finally, we want to take all the keywords that matter to us and match them to the pages on our site. If we don’t have matches, we need to create that content. If we do have matches but they are suboptimal, not doing a great job of answering that searcher’s query, well, we need to do that work as well. If we have a page that matches but we haven’t done our keyword optimization, which we’ll talk a little bit more about in a future video, we’ve got to do that too.

Understand the different varieties of search results

So an important part of understanding how search engines work — we’re going to start down here and then we’ll come back up — is to have this understanding that when you perform a query on a mobile device or a desktop device, Google shows you a vast variety of results. Ten or fifteen years ago this was not the case. We searched 15 years ago for “soccer jerseys,” what did we get? Ten blue links. I think, unfortunately, in the minds of many search marketers and many people who are unfamiliar with SEO, they still think of it that way. How do I rank number one? The answer is, well, there are a lot of things “number one” can mean today, and we need to be careful about what we’re optimizing for.

So if I search for "soccer jersey," I get these shopping results from Macy's and soccer.com and all these other places. Google sort of has this sliding box of sponsored shopping results. Then they've got advertisements below that, notated with this tiny green ad box. Then below that, there are a couple of organic results, what we would call classic SEO, 10-blue-links-style organic results. There are two of those. Then there's a box of maps results that show me local soccer stores in my region, which is a totally different kind of optimization, local SEO. So you need to make sure that you understand, and that you can convey that understanding to everyone on your team, that these different kinds of results mean different types of SEO.

Now I’ve done some work recently over the last few years with a company called Jumpshot. They collect clickstream data from millions of browsers around the world and millions of browsers here in the United States. So they are able to provide some broad overview numbers collectively across the billions of searches that are performed on Google every day in the United States.

Click-through rates differ between mobile and desktop

The click-through rates look something like this. For mobile devices, on average, paid results get 8.7% of all clicks, organic results get a little under 40% of all clicks, and zero-click searches, where a searcher performs a query but doesn't click anything because Google essentially answers it right in the results or the searcher is so unhappy with the potential results that they don't bother clicking anything, make up 62%. So the vast majority of searches on mobile are no-click searches.

On desktop, it’s a very different story. It’s sort of inverted. So paid is 5.6%. I think people are a little savvier about which result they should be clicking on desktop. Organic is 65%, so much, much higher than mobile. Zero-click searches is 34%, so considerably lower.

There are a lot more clicks happening on a desktop device. That being said, right now we think it’s around 60–40, meaning 60% of queries on Google, at least, happen on mobile and 40% happen on desktop, somewhere in those ranges. It might be a little higher or a little lower.

The search demand curve

Another important and critical thing to understand about the keyword research universe and how we do keyword research is that there’s a sort of search demand curve. So for any given universe of keywords, there is essentially a small number, maybe a few to a few dozen keywords that have millions or hundreds of thousands of searches every month. Something like “soccer” or “Seattle Sounders,” those have tens or hundreds of thousands, even millions of searches every month in the United States.

But people searching for “Sounders FC away jersey customizable,” there are very, very few searches per month, but there are millions, even billions of keywords like this. 

The long-tail: millions of keyword terms and phrases, low number of monthly searches

When Sundar Pichai, Google’s current CEO, was testifying before Congress just a few months ago, he told Congress that around 20% of all searches that Google receives each day they have never seen before. No one has ever performed them in the history of the search engines. I think maybe that number is closer to 18%. But that is just a remarkable sum, and it tells you about what we call the long tail of search demand, essentially tons and tons of keywords, millions or billions of keywords that are only searched for 1 time per month, 5 times per month, 10 times per month.

The chunky middle: thousands or tens of thousands of keywords with ~50–100 searches per month

If you want to get into this next layer, what we call the chunky middle in the SEO world, this is where there are thousands or tens of thousands of keywords potentially in your universe, but they only have between say 50 and a few hundred searches per month.

The fat head: a very few keywords with hundreds of thousands or millions of searches

Then this fat head has only a few keywords. There’s only one keyword like “soccer” or “soccer jersey,” which is actually probably more like the chunky middle, but it has hundreds of thousands or millions of searches. The fat head is higher competition and broader intent.

Searcher intent and keyword competition

What do I mean by broader intent? That means when someone performs a search for “soccer,” you don’t know what they’re looking for. The likelihood that they want a customizable soccer jersey right that moment is very, very small. They’re probably looking for something much broader, and it’s hard to know exactly their intent.

However, as you drift down into the chunky middle and into the long tail, where there are more keywords but fewer searches for each keyword, your competition gets much lower. There are fewer people trying to compete and rank for those, because they don’t know to optimize for them, and there’s more specific intent. “Customizable Sounders FC away jersey” is very clear. I know exactly what I want. I want to order a customizable jersey from the Seattle Sounders away, the particular colors that the away jersey has, and I want to be able to put my logo on there or my name on the back of it, what have you. So super specific intent.

Build a map of your own keyword universe

As a result, you need to figure out what the map of your universe looks like so that you can present it, and at the end of the keyword research process you should be able to build a list that looks something like this. We featured a screenshot from Moz's Keyword Explorer, which is a tool that I really like to use and find super helpful whenever I'm helping companies. Even now that I have left Moz and been gone for a year, I still use Keyword Explorer because the volume data is so good and it puts all the stuff together. However, there are two or three other tools that a lot of people like: one from Ahrefs, which I think also has the name Keyword Explorer, and one from SEMrush, which I like, although some of the volume numbers, at least in the United States, are not as good as what I might hope for. There are a number of other tools that you could check out as well. A lot of people like Google Trends, which is totally free and interesting for some of that broad volume data.



So I might have terms like “soccer jersey,” “Sounders FC jersey”, and “custom soccer jersey Seattle Sounders.” Then I’ll have these columns: 

  • Volume, because I want to know how many people search for it; 
  • Difficulty, how hard will it be to rank. If it’s super difficult to rank and I have a brand-new website and I don’t have a lot of authority, well, maybe I should target some of these other ones first that are lower difficulty. 
  • Organic Click-through Rate, just like we talked about back here. There are different levels of click-through rate, and the tools, at least Moz's Keyword Explorer tool, use Jumpshot data on a per-keyword basis to estimate what percent of people are going to click the organic results. Should you optimize for it? Well, if the organic click-through rate is only 60%, pretend that instead of 100 searches, this keyword only has 60 searches available for your organic clicks. Ninety-five percent, though, great, awesome. All four of those monthly searches are available to you.
  • Business Value, how useful is this to your business? 
  • Then set some type of priority. So I might look at this list and say, "Hey, for my new soccer jersey website, this is the most important keyword. I want to go after 'custom soccer jersey' for each team in the U.S., and then I'll go after team jerseys, and then I'll go after 'customizable away jerseys.' Then maybe last I'll go after 'soccer jerseys,' because it's just so competitive and so difficult to rank for. There's a lot of volume, but the search intent is not as great, the business value to me is not as good, all those kinds of things." (A rough scoring sketch follows this list.)
  • Last, but not least, I want to know the types of searches that appear — organic, paid. Do images show up? Does shopping show up? Does video show up? Do maps results show up? If those other types of search results, like we talked about here, show up in there, I can do SEO to appear in those places too. That could yield, in certain keyword universes, a strategy that is very image centric or very video centric, which means I’ve got to do a lot of work on YouTube, or very map centric, which means I’ve got to do a lot of local SEO, or other kinds like this.
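As a rough illustration of that prioritization step, here's a Python sketch that scores a few keywords. The keywords, numbers, and weighting are all made-up placeholders for illustration, not real Moz or Jumpshot data:

    # Sketch: a crude priority score for a keyword list.
    # All volumes, difficulties, CTRs, and values are illustrative placeholders.
    keywords = [
        {"kw": "soccer jersey",           "volume": 9000, "difficulty": 75, "ctr": 0.60, "value": 2},
        {"kw": "custom soccer jersey",    "volume": 1200, "difficulty": 45, "ctr": 0.80, "value": 3},
        {"kw": "Sounders FC away jersey", "volume": 150,  "difficulty": 25, "ctr": 0.95, "value": 3},
    ]

    for k in keywords:
        # More volume, organic clicks, and business value is good; difficulty is bad.
        k["priority"] = (k["volume"] * k["ctr"] * k["value"]) / k["difficulty"]

    for k in sorted(keywords, key=lambda row: row["priority"], reverse=True):
        print(f'{k["kw"]:<26} priority={k["priority"]:.0f}')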

Once you build a keyword research list like this, you can begin the prioritization process and the true work of creating pages, mapping the pages you already have to the keywords that you’ve got, and optimizing in order to rank. We’ll talk about that in Part III next week. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


The One-Hour Guide to SEO, Part 1: SEO Strategy – Whiteboard Friday

Posted by randfish

Can you learn SEO in an hour? Surprisingly, the answer is yes, at least when it comes to the fundamentals! 

With this edition of Whiteboard Friday, we’re kicking off something special: a six-part series of roughly ten-minute-long videos designed to deliver core SEO concepts efficiently and effectively. It’s our hope that this will serve as a helpful resource for a wide range of people:

  • Beginner SEOs looking to get acquainted with the field concisely & comprehensively
  • Clients, bosses, and stakeholders who would benefit from an enhanced understanding of your work
  • New team members who need quick and easy onboarding
  • Colleagues with SEO-adjacent roles, such as web developers and software engineers

Today we’ll be covering Part 1: SEO Strategy with the man who wrote the original guide on SEO, our friend Rand. Settle in, and stay tuned next Friday for our second video covering keyword research!

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to a special edition of the Whiteboard Friday series. I’m Rand Fishkin, the founder and former CEO of Moz, and I’m here with you today because I’m going to deliver a one-hour guide to SEO, front and back, so that you can learn in just an hour the fundamentals of the practice and be smarter at choosing a great SEO firm to work with, hiring SEO people. 

A handy SEO resource for your clients, team, and colleagues

If you are already in SEO, you might pick up some tips and tactics that you didn't otherwise know or hadn't previously considered. I want to ask those of you who are intermediate and advanced level SEOs, and I know there are many of you who have historically watched me on Whiteboard Friday and I really appreciate that, to give this video a chance even though it is at the beginner level. My hope is that it will be valuable for you to send to your clients, your potential customers, people who join your team and work with you, and the developers, software engineers, or web devs whose help you need but who you want to understand the fundamentals of SEO.

If those are the people that you’re talking to, excellent. This series is for you. We’re going to begin with SEO strategy. That is our first part. Then we’ll get into things like keyword research and technical SEO and link building and all of that good stuff as well. 

The essentials: What is SEO, and what does it do?

So first off, SEO is search engine optimization. It is essentially the practice of influencing or being able to control some of the results that Google shows when someone types in or speaks a query to their system.

I say Google. You can influence other search engines, like Bing and DuckDuckGo and Yahoo and Seznam if you’re in the Czech Republic or Baidu. But we are primarily focused on Google because Google has more than a 90% market share in the United States and, in fact, in North America and South America, in most of Europe, Asia, and the Middle East with a few exceptions.

Start with business goals

So SEO is a tactic. It’s a way to control things. It is not a business goal. No one forms a new company or sits down with their division and says, “Okay, we need to rank for all of these keywords.” Instead what you should be saying, what hopefully is happening in your teams is, “We have these business goals.”

Example: “Grow our online soccer jersey sales to a web-savvy, custom heavy audience.”

Let’s say we’re an online e-commerce shop and we sell customized soccer jerseys, well, football for those of you outside of the United States. So we want to grow our online soccer jersey sales. Great, that is a true business goal. We’re trying to build a bigger audience. We want to sell more of these jerseys. In order to do that, we have marketing goals that we want to achieve, things like we want to build brand awareness.

Next, marketing goals

Build brand awareness

We want more people to know who we are, to have heard of our particular brand, because people who have heard of us are going to be more likely to buy from us. The first time you hear about someone, very unlikely to buy. The seventh time you’ve heard about someone, much more likely to buy from them. So that is a good marketing goal, and SEO can help with that. We’ll talk about that in a sec.

Grow top-of-funnel traffic

You might want to grow top-of-funnel traffic. We want more people coming to the site overall so that we can do a better job of figuring out who is the right audience for us and converting some of those people, retargeting some of those people, capturing emails from some of those people, all those good things. 

Attract ready-to-buy fans

We want to attract ready-to-buy fans, people who are chomping at the bit to buy our soccer jerseys, customize them and get them shipped.

SEO, as a strategy, is essentially a set of tactics, things that you will do in the SEO world to rank for different keywords in the search engines or control and influence what already ranks in there so that you can achieve your marketing goals so that you can achieve your business goals.

Don’t get this backwards. Don’t start from a place of SEO. Especially if you are an SEO specialist or a practitioner or you’re joining a consulting firm, you should always have an excellent idea of what these are and why the SEO tactics that you are undertaking fit into them. If you don’t, you should be asking those questions before you begin any SEO work.

Otherwise you’re going to accomplish things and do things that don’t have the impact or don’t tie directly to the impact that the business owners care about, and that’s going to mean probably you won’t get picked up for another contract or you won’t accomplish the goals that mean you’re valuable to the team or you do things that people don’t necessarily need and want in the business and therefore you are seen as a less valuable part of it.

Finally, move into SEO strategy

But if you’re accomplishing things that can clearly tie to these, the opposite. People will really value what you do. 

Rank for low-demand, high-conversion keywords

So SEO can do things like rank for low-demand keywords, things that don't have a lot of searches per month but that are highly likely to convert, keywords like "I am looking for a customized Seattle Sounders soccer jersey that's in the away colors." Well, there's not a lot of search demand for that exact phrase. But if you're searching for it, you're very likely to convert.

Earn traffic from high-demand, low-competition, less commerce-focused keywords

You could try and earn traffic from high-demand, low competition keywords that are less focused directly on e-commerce. So it could be things like “Seattle Sounders news” or “Seattle Sounders stats” or a comparison of “Portland Timbers versus Seattle Sounders.” These are two soccer or football clubs in the Pacific Northwest. 

Build content that attracts links and influencer engagement

Or you might be trying to do things like building content that attracts links and influencer engagement so that in the future you can rank for more competitive keywords. We’ll talk about that in a sec. SEO can do some amazing things, but there are also things that it cannot do.

What SEO can do:

If you put things in here, if you as an SEO pitch to your marketing team or your business owners that SEO can do things that it can’t, you’re going to be in trouble. So when we compose an SEO strategy, a set of tactics that tries to accomplish marketing goals that tie to business goals, SEO can do things like:

  • Attract searchers that are seeking your content.
  • Control how your brand is seen in search results when someone searches for your particular name. 
  • Nudge searchers toward queries by influencing what gets suggested in the auto suggest or by suggesting related searches or people also ask boxes. 

Anything that shows up in the search results, nearly anything can be influenced by what we as SEOs can do.

What SEO cannot do:

Grow or create search demand on its own

But SEO cannot grow or create search demand by itself. So if someone says, “Hey, I want us to get more traffic for this specific keyword,” if you’re already ranking number one and you have some videos showing in the results and you’re also in the image results and you’ve got maybe a secondary page that links off to you from the results, you might say, “Hey, there’s just not more demand,” and SEO by itself can’t create that additional demand.

Build brand (by itself)

SEO also can’t build brand, at least not by itself. It can certainly be a helpful part of that structure. But if someone says, “Hey, I want us to be better known among this audience,”you can say, “Well, SEO can help a little, but it can’t build a brand on its own, and it certainly can’t build brand perception on its own.” People are going to go and visit your website. They’re going to go and experience, have an interaction with what you’ve created on the web. That is going to be far more of a brand builder, a brand indicator than just what appears in the search results. So SEO can’t do that alone. 

Directly convert customers

It also can’t directly convert customers. A lot of the time what we find is that someone will do a great job of ranking, but when visitors actually reach the website, they’re unsatisfied by what they find, which, by the way, is one of the reasons why this one-hour guide is going to include a section on searcher satisfaction.

When Google sees over time that searchers are unsatisfied by a result, they will push that result down in the rankings and find someone who does a great job of satisfying searchers, and they will rank them instead. So the website has to do this. It is part of SEO. It’s certainly part of the equation, but SEO can’t influence it or control it on its own.

Work overnight

Finally, last but not least, SEO cannot work overnight. It just won’t happen. SEO is a long-term investment. It is very different from paid search ads (PPC, also sometimes called SEM), where you buy from Google Ads or Bing Ads and appear in the sponsored results. That is a tactic where you can pour money in, optimize, and get results out in 24 hours. SEO is more like a 24-month process.

The SEO Growth Path

I’ve tried to show that here. The fundamental concept is when you have a new website, you need to earn these things — links and engagement and historical performance in the rankings.

As you earn those things (other people linking to you from around the web, people talking about you, people engaging with your pages and your brand, people searching for your brand specifically, people clicking you more in the search results and then having good experiences on your website), you will grow your historical engagement, links, and ranking factors: all the things we put into the bucket of a website’s authority and influence.

3–6 months: Begin to rank for things in the long tail of search demand

As that grows, over time, and this might be three to six months in, you might be able to rank for a few keywords in the long tail of search demand.

6–9 months: Begin to rank for more and more competitive keywords

After six to nine months, if you’re very good at this, you may be able to rank for more and more competitive keywords.

12–18 months: Compete for tougher keywords

As you truly grow a brand that is well-known and well thought of on the internet and by search engines, 12 to 18 months in, maybe longer, you may be able to compete for tougher and tougher keywords. When I started the Moz website, back in the early days of Google, it took me years, literally two or three years before I was ranking for anything in Google, anything in the search engines, and that is because I had to first earn that brand equity, that trust, that relationship with the search engines, those links and that engagement.

Today this is more true than ever because Google is so good at estimating these things. All right. I look forward to hearing all about the amazing strategies and structures that you’ve got probably in the comments down below. I’m sure it will be a great thread. We’ll move on to the second part of our one-hour guide next time — keyword research. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!



How to Use Domain Authority for SEO – Whiteboard Friday

Posted by Cyrus-Shepard

Domain Authority is an incredibly well-known metric throughout the SEO industry, but what exactly is the right way to use it? In this week’s edition of Whiteboard Friday, we’re delighted to welcome Cyrus Shepard as he explains both what’s new with the new Domain Authority 2.0 update, and how to best harness its power for your own SEO success. 

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, SEO fans. Welcome to a very special edition of Whiteboard Friday. I’m Cyrus Shepard. I’m honored to be here today with Moz to talk about the new Domain Authority. I want to talk about how to use Domain Authority to do actual SEO.

What is Domain Authority?

Let’s start with a definition of what Domain Authority actually is, because there’s a lot of confusion out there. Domain Authority is a metric, on a scale from 1 to 100, which predicts how well a domain will rank in Google. Now let’s break that down a little bit and talk about some of the myths of Domain Authority.

Is Domain Authority a ranking factor? No, Domain Authority is not a ranking factor. Does Google use Domain Authority in its algorithm? No, Google does not use Domain Authority in its algorithm. Now Google may use some domain-like metrics based on links similar to Domain Authority, but they do not use Domain Authority itself. In fact, it’s best if you don’t bring it up with them. They don’t tend to like that very much.

So if it’s not a ranking factor, if Google doesn’t use it, what does Domain Authority actually do? It does one thing very, very well. It predicts rankings. That’s what it was built to do. That’s what it was designed to do, and it does that job very, very well. And because of that, we can use it for SEO in a lot of different ways. So Domain Authority has been around since 2010, about 8 years now, and since then it’s become a very popular metric, used and abused in different ways.

What’s New With Domain Authority 2.0?

So what’s new about the new Domain Authority that makes it so great and less likely to be abused and gives it so many more uses? Before I go into this, a big shout-out to two of the guys who helped develop this — Russ Jones and Neil Martinsen-Burrell — and many other smart people at Moz. Some of our search scientists did a tremendous job of updating this metric for 2019.

1. Bigger Link Index

So the first thing is the new Domain Authority is based on a new, bigger link index, and that is Link Explorer, which was released last year. It contains 35 trillion links. There are different ways of judging index sizes, but that is one of the biggest, if not the biggest, link indexes publicly available that we know of.

Thirty-five trillion links: to give you an idea of how big that is, if you were to count one link per second, you would be counting for about 1.1 million years. That’s a lot of links, and that’s how many links are in the index that the new Domain Authority is based upon.

2. New Machine Learning Model

Second of all, it uses a new machine learning model. Part of Domain Authority looks at Google rankings and uses machine learning to fit a model that predicts how those rankings stack up. The new Domain Authority not only looks at what’s winning in Google search, but also at what’s not ranking in Google search. The old model used to just look at the winners. This makes it much more accurate at determining where you, or any domain or URL, might fall within that prediction.

3. Spam Score Incorporation

Next the new Domain Authority incorporates spam detection.

Spam Score is a proprietary Moz metric that looks at a bunch of on-page factors, and those have been incorporated into the new metric, which makes it much more reliable. 

4. Detects Link Manipulation

It also, and this is very important, detects link manipulation: people who are buying and selling links, PBNs, things like that.

It’s much better at this. In fact, Russ Jones said in a recent webinar that, with the new Domain Authority, link buyers will drop an average of 11 points. So the new Domain Authority is much better at rooting out link manipulation, just like Google is attempting to do, which means it much more closely resembles what Google is attempting.

5. Daily Updates

Lastly, the new Domain Authority is updated daily. This is a huge improvement. The old Domain Authority used to update approximately every month or so.* The new Domain Authority is constantly being updated, and our search scientists are constantly adding improvements as they come along.

So it’s being updated much more frequently and improved much more frequently. So what does this mean? The new Domain Authority is the most accurate domain-level metric we know of for predicting Google search results. When you look at known ranking factors, like title tags or even backlinks generally, they predict rankings to a certain degree. But Domain Authority blows those out of the water in its predictive power.

*Note: Our former link research tool, Open Site Explorer, updated on a monthly cadence, resulting in monthly updates to DA scores. With the launch of Link Explorer in April 2018, Domain Authority scores moved to a daily update cadence. This remains true with the new underlying algorithm, Domain Authority 2.0.

How to Use Domain Authority for SEO

So the question is how do we actually use this? We have this tremendous power with Domain Authority that can predict rankings to a certain degree. How do we use this for SEO? So I want to go over some general tips for success. 

The first tip, never use Domain Authority in isolation. You always want to use it with other metrics and in context, because it can only tell you so much.

It’s a powerful tool, but it’s limited. For example, when you’re looking at rankings on-page, you’re going to want to look at the keyword targeting. You’re going to want to look at the on-page content, the domain history, other things like that. So never use Domain Authority by itself. That’s a key tip. 

Second, you want to keep in mind that the scale of Domain Authority is roughly logarithmic.

It’s not linear. Now what does this mean? It’s fairly easy to move from a zero or a one Domain Authority to a ten. You can get a handful of links, and that works pretty well. But moving from, say, a 70 to an 80 is much, much harder. It gets harder as you get higher. So a DA 40 is not twice a DA 20.

It’s actually much, much more, because it keeps getting harder the higher you go, all the way up to 100. Sites like Google and Facebook are near the 100 range, and everything else fits in below them. It’s almost like a funnel.

Next, keep in mind that DA is a relative metric. When you’re using DA, you always want to compare between competitors or your past scores.

Having a DA 50 doesn’t really tell you much unless you’re comparing it to other DA scores. So if you’re looking in Google and a site has a DA of 50, it doesn’t make much sense unless you put it in the context of “what do the other sites have?” Are they 40? Are they 60? In that regard, when you’re looking at your own DA, you can compare against past performance or competitors.

So if I have a 50 this month and a 40 last month, that might tell me that my ability to rank in Google has increased in that time period. 

1. Evaluate Potential Value of a Link

So talking about SEO use cases, we have this. We understand how to use it. What are some practical ways to use Domain Authority? Well, a very popular one with the old DA as well is judging the potential value of a link.

For instance, you have 1,000 outreach targets that you’re thinking about asking for a link, but you only have time for 100 because you want to spend your time wisely and it’s not worth it to ask all 1,000. So you might use DA as a filter to find the most valuable link targets. A DA 90 might be more valuable than a DA 5 or a 10.

But again, you do not want to use it in isolation. You’d be looking at other metrics as well, such as Page Authority, relevance, and traffic. But DA can still be a valuable metric to add to that evaluation.
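
If you’re working from a spreadsheet of prospects, this kind of first-pass filtering is easy to script. Here’s a minimal Python sketch, assuming a hypothetical CSV export named outreach_targets.csv with “url” and “domain_authority” columns (the file name and column names are placeholders for whatever your export actually contains):

    import csv

    def top_prospects(path, limit=100):
        # Load every prospect row, then keep only the highest-DA targets.
        with open(path, newline="") as f:
            prospects = list(csv.DictReader(f))
        prospects.sort(key=lambda row: int(row["domain_authority"]), reverse=True)
        return prospects[:limit]

    for prospect in top_prospects("outreach_targets.csv"):
        print(prospect["url"], prospect["domain_authority"])

Treat the output as a shortlist to review by hand; you’d still check Page Authority, relevance, and traffic before doing any outreach.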

2. Judging Keyword Difficulty

Judging keyword difficulty means looking at a SERP and asking: what is my potential to rank in this SERP for this particular keyword?

If you look at a SERP and everybody has a DA 95, it’s going to be pretty hard to rank in that SERP. But if everybody has a lower DA, you might have a chance. But again, you’re going to want to look at other metrics, such as Page Authority, keyword volume, on-page targeting. You can use Moz’s Keyword Difficulty Score to run these calculations as well.

3. Campaign Performance

Very popular in the agency world is link campaign performance or campaign performance in general, and this kind of makes sense. If you’re building links for a client and you want to show progress, a common way of doing this is showing Domain Authority, meaning that we built these links for you and now your potential to rank is higher.

It’s a good metric, but it’s not the only metric I would report. I would definitely report rankings for targeted keywords. I would report traffic and conversions, because ranking potential is one thing, but I’d actually like to show that those links actually did something. So I’d be more inclined to show the other things. But DA is perfectly fine to report for campaign performance as long as you show it in context.

4. Purchasing Existing Domains

A popular one on the marketplaces is buying existing domains. Sites like Flippa often show DA or some similar metric like that. Again, the new Domain Authority is going to be much better at rooting out link manipulation, so these scores might be a little more trustworthy in this sense. But again, never buy a domain just on Domain Authority alone.

You’re going to want to look at a lot of factors, such as the content, the traffic, the domain history, things like that. But Domain Authority might be a good first-line filter for you. 

How to Find Domain Authority Metrics

So where can you find the new Domain Authority? It is available right now. You can go to Link Explorer. It’s available through the Moz API.
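
Because the score is exposed through the Moz API, you can also pull DA for a batch of domains programmatically. Here’s a rough Python sketch; the endpoint, auth scheme, and response field names shown are assumptions based on the Links API v2, so check Moz’s current API documentation before relying on them:

    import os
    import requests

    # Assumed endpoint and HTTP Basic auth (access ID / secret key) for the Links API v2.
    ENDPOINT = "https://lsapi.seomoz.com/v2/url_metrics"
    auth = (os.environ["MOZ_ACCESS_ID"], os.environ["MOZ_SECRET_KEY"])

    response = requests.post(ENDPOINT, auth=auth,
                             json={"targets": ["moz.com", "example.com"]})
    response.raise_for_status()

    # The "results", "page", and "domain_authority" field names are assumptions as well.
    for result in response.json().get("results", []):
        print(result.get("page"), result.get("domain_authority"))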

There’s the free MozBar: you can download it for free, turn on the SERP overlay, and it will show you the DA of every result as you browse through Google.

It’s available in Moz Campaigns and also Keyword Explorer. I hope this gives you some ideas about how to use Domain Authority. Please share your ideas and thoughts in the comments below. If you like this video, please share.

Thanks a lot, everybody. Have a great day.

Video transcription by Speechpad.com


7 Red Flags to Watch Out For When Auditing Your Link Profile – Whiteboard Friday

Posted by KameronJenkins

From irrelevant, off-topic backlinks to cookie-cutter anchor text, there are more than a few clues hidden in your backlink profile that something spammy is going on. Alone they might not be something to worry about, but in conjunction, common red flags can spell trouble when you’re performing an audit on your backlink profile. In this week’s Whiteboard Friday, Kameron Jenkins shares her best advice from years working with clients on what to watch out for in a link profile audit.


Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hey, guys. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz. Today we’re going to be talking about auditing your backlink profile, why you might want to do it, when you should do it, and then how to do it. So let’s just dive right in.

It might be kind of confusing to be talking about auditing your backlink profile. When I say auditing your backlink profile, I’m specifically talking about trying to diagnose if there’s anything funky or manipulative going on. There’s been quite a bit of debate among SEOs: in a post-Penguin 4.0 world, if Google can ignore spammy backlinks and low-quality backlinks, why would we also need to disavow, which essentially tells Google the same thing, “just ignore these links”?

I posed three reasons why we might still want to consider this in some situations. 

Why should you audit your backlink profile?

Disavow is still offered

Disavow is still an option: you can go to the disavow tool and submit a disavow file right now if you want to.

You can still get manual penalties

Google still has guidelines that outline all of the link schemes and types of link manipulation. If you violate those, you could get a manual penalty. In your Google Search Console, it will say something like unnatural links to your site detected, total or partial. You can still get those. That’s another reason I would say that the disavow is still something you could consider doing.

Google says their stance hasn’t changed

I know there’s like a little bit of back-and-forth about this, but technically Google has said, “Our stance hasn’t changed. Still use the disavow file carefully and when it’s appropriate.” So we’ll talk about when it might be appropriate, but that’s why we consider that this is still a legitimate activity that you could do.

When should you audit your backlink profile?

Look for signs of a link scheme or link manipulation

I would say that, in today’s climate, it’s probably best just to do this when you see overt signs of a link scheme or link manipulation, something that looks very wrong or very concerning. Because Google is so much better at uncovering manipulative links and simply ignoring them rather than penalizing a whole site for them, it’s not as important, I think, to be as aggressive as we used to be. But if you do see something, maybe you inherit a client, you look at their link profile for the first time, and you notice something sketchy in there, then you might want to consider doing it. You’re an SEO. You can detect the signs of whether there’s a link scheme going on.

How do you audit your backlink profile?

Check for red flags in Moz Link Explorer

But if you’re not quite sure how to diagnose that, check for red flags in Moz Link Explorer, and that’s the second part of this. We’re going to go through some red flags that I have noticed. But huge disclaimer — seven possible red flags. Please don’t just take one of these and say, “Oh, I found this,” and immediately disavow.

These are just things that I have noticed over time. I started in SEO in 2012, right around the time of Penguin, and so I did a lot of cleanup of so many spammy links. I just saw patterns, and this is the result of that. I think that’s stayed true over the last couple of years for links that haven’t been cleaned up. Some people are still doing these kinds of low-quality link building techniques that actually could get you penalized.

These are some things that I have noticed. They should just pique your interest. If you see something like this, if you detect one of these red flags, it should prompt you to look into it further, not immediately write off those links as “those are bad.” They’re just things to spark your interest so that you can explore further on your own. So with that big disclaimer, let’s dive into the red flags.

7 possible red flags

1. Irrelevance

Countries you don’t serve

A couple of examples of this. Maybe you are working on a client. They are US-based, and all of their locations are in the US. Their entire audience is US-based. But you get a quick glimpse of the inbound links. Maybe you’re on Link Explorer and you go to the inbound links report and you see a bunch of domains linking to you that are .ru and .pl, and that’s kind of confusing. Why is my site getting a huge volume of links from other countries that we don’t serve and we don’t have any content in Russian or Polish or anything like that? So that might spark my interest to look into it further. It could be a sign of something.

Off-topic links

Another thing is off-topic links. My favorite example, just because it was so ridiculous: I was working with an Atlanta DUI attorney, and he had a huge chunk of backlinks from low-quality party planning directories, and they didn’t make any sense. I clicked on them just to see what they were. You can go to the page and see that, yes, there really is no reason these sites should be linking to each other. It was clear he had just gone to Fiverr and said, “$5, here, build me links,” and he didn’t care where they came from. So you might notice a lot of totally off-topic, irrelevant stuff.

But obviously a disclaimer, it might look irrelevant, but then when you dive in further, they are in the same market and they kind of have a co-marketing relationship going on. Just be careful with that. But it could be a sign that there is some link manipulation going on if you have totally off-topic links in there.

2. Anchor text

The second red flag is anchor text. Again, this is another cool report in Moz Link Explorer. You can go in there and see the anchor text report. When I notice that there’s link manipulation going on, usually what I see is that there is a huge majority of their backlinks coming with the same exact anchor text, and usually it’s the exact match keyword that they want to rank for. That’s usually a huge earmark of, hey, they’ve been doing some shady linking.

There’s no magic percentage that automatically means something is manipulative. But if you see a really disproportionate percentage of links coming with the same exact anchor text, it might prompt you to look into it further. The example I like to use for why that is concerning: say you meet with five different friends throughout the course of your day, on different occasions. They’re not all in the same room with you. You talk to each of them, and they all say, “Hey, yeah, my weekend was great, but I broke my foot.” You would be suspicious: “What, they all broke their foot? This is weird. What’s going on?”

Same thing with anchor text. If you’re earning links naturally, they’re not all going to look the same and mechanical. Something suspicious is probably going on if they’re all linking with the exact same anchor text. So that’s that.
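
If you’d rather quantify this than eyeball it, you can tally anchor text from a CSV export of your inbound links. A small Python sketch, assuming a hypothetical inbound_links.csv with an “anchor_text” column (adjust the file and column names to match whatever your link tool exports):

    import csv
    from collections import Counter

    def anchor_share(path, top_n=10):
        # Count how often each anchor text appears and print its share of the profile.
        with open(path, newline="") as f:
            anchors = [row["anchor_text"].strip().lower() for row in csv.DictReader(f)]
        counts = Counter(anchors)
        total = sum(counts.values())
        for anchor, count in counts.most_common(top_n):
            print(f"{anchor!r}: {count} links ({count / total:.1%} of profile)")

    anchor_share("inbound_links.csv")

If one exact-match commercial phrase dominates the list, that’s the kind of disproportionate pattern worth digging into further.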

3. Nofollow/follow

Nofollow to follow, this is another one — please don’t use this as a sweeping rule, because I think even Russ Jones has come out and said at a mass scale that’s not a good predictor of spamminess. But what I have tended to see is usually if they also have spammy anchor text and they’re irrelevant, usually I also see that there’s a really, really disproportionate ratio of nofollow to follow. Use these red flags in conjunction with each other. When they start to pile on, it’s even more of a sign to me that there’s something fishy going on.

Nofollow to follow, you might see something ridiculous. Again, it’s something you can see in Link Explorer. Maybe like 99% of all of their backlinks are follow, which are the ones that pass PageRank. If you’re going to do a link scheme, you’re going to go out and get the ones that you think are going to pass PageRank to your site. Then one percent or no percent is nofollow. It may be something to look into.

4. Links/domains

Same thing with links to domains. Again, not an overt sign of spamminess. There’s no magic ratio here. But sometimes when I notice all of these other things, I will also notice that there’s a really disproportionate ratio of, say, they have 10,000 inbound links, but they’re coming from only 5 domains. Sometimes this happens. An example of this: I was auditing a client’s backlink profile, and they had set up five different websites, and on those websites they had put site-wide links to all of their other websites. They had created their own little network. By linking to each other, they were hoping to bolster all of their sites’ authority. Obviously, be careful with something like that. It could indicate that you’re self-creating follow links, which is a no-no.
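
Both of these ratios, follow versus nofollow and links per linking domain, are easy to compute from the same kind of export. A quick Python sketch, again assuming a hypothetical inbound_links.csv, this time with “source_url” and “nofollow” columns (placeholders for whatever your export calls them):

    import csv
    from collections import Counter
    from urllib.parse import urlparse

    with open("inbound_links.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    total = len(rows)
    followed = sum(1 for row in rows if row["nofollow"].strip().upper() != "TRUE")
    domains = Counter(urlparse(row["source_url"]).netloc for row in rows)

    print(f"Followed links: {followed}/{total} ({followed / total:.0%})")
    print(f"Linking domains: {len(domains)}")
    print(f"Average links per domain: {total / len(domains):.1f}")
    # Neither number is damning on its own, but a 99% follow share combined with
    # thousands of links from only a handful of domains is worth a closer look.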

5. Domain naming

“DIR” or “directory”

This one is just kind of the eyeball test, which I’ll get to later. If you go to your inbound links, you can start to notice domain names that just look weird, and they’ll start to look off the more you look into stuff like this. When I was doing a lot of backlink auditing, what I noticed was that a lot of these spammier links came from low-quality directory submission sites. A lot of those tend to have “directory” or “DIR” in the domain name, like bestlinkdir.co or whatever. When they have naming conventions like that, I have noticed that they tend to be low-quality directory submission sites. So you could eyeball or scan for any “DIR” or directory-type links.

“Article”

Same thing with articles. Back in the day, people used to submit to e-zine article sites or Article Base or something like that, so if a domain has the word “article” in its name, it might be something to look into. Maybe they were doing some low-quality article submission with backlinks to their site.

“SEO”/”links”

Then if you tend to see a lot of links in their backlink profile that have like SEO link type naming conventions, unless you’re working on a site that is in the SEO space, they shouldn’t have a bunch of links that say like bestSEOsite.com or bestlinksforyou.com. I’ve seen a lot of that. It’s just something that I have noticed. It’s something to maybe watch out for.

6. Tool metrics

These can be super helpful. If you see tool metrics like a really high Spam Score, it’s something to look into. Moz’s Help Hub has a list of all 27 criteria that Spam Score looks at when evaluating a site’s spamminess, which might be helpful for understanding how that score is calculated.

A note on DA and PA, Domain Authority and Page Authority: if you see links coming from low-DA or low-PA URLs, make sure you don’t write those off right off the bat. It could just be that those domains are very new. Maybe they haven’t engaged in a lot of marketing yet. It doesn’t necessarily mean they’re spammy. It just means they haven’t done much to earn any authority. So watch out for writing off links and thinking they’re spammy just because they have a low DA or PA. Just something to consider.

7. Eyeball test

Then finally we have the eyeball test. Like I said, the more you do this (and it’s not something you should be engaging in constantly nowadays), the more you’ll start to notice patterns if you’re working on clients with spammier link profiles. These low-quality sites tend to use the same template. You’ll have 100 sites that are all blue, with the exact same navigation and the exact same logo. They’re all on the same network. You’ll start to notice themes like that.

A lot of times they don’t have any contact information, because no one maintains these things. They’re just up for the purpose of links. Nobody cares about them, so there’s no phone number, no contact information, no email address, nothing.

Another telltale sign, which I tend to notice on these self-submission types of link sites, is that they’ll have a big PayPal button at the top that says “Pay to Submit Links,” or even worse, “Use this PayPal to get your links removed from this site,” because they know it’s low quality and people ask them all the time. Just something to consider on the eyeball test front.

I hope this was helpful. Hopefully it helped you understand when you might want to do this, when you might not want to do this, and then if you do try to engage in some kind of link audit, some things to watch out for. So I hope that was helpful. If you have any tips for this, if you’ve noticed anything else that you think would be helpful for other SEOs to know, drop it in the comments.

That’s it for this week’s Whiteboard Friday. Come back again next week for another one.


Video transcription by Speechpad.com


What a Two-Tiered SERP Means for Content Strategy – Whiteboard Friday

Posted by willcritchlow

If you’re a big site competing to rank for popular head terms, where’s the best place to focus your content strategy? According to a hypothesis by the good folks at Distilled, the answer may lie in perfectly satisfying searcher intent.

Click on the whiteboard image above to open a high resolution version in a new tab!

If you haven’t heard the news, the Domain Authority metric discussed in this episode will be updated on March 5th, 2019 to better correlate with Google algorithm changes.



Video Transcription

Hi, Whiteboard Friday fans. I’m Will Critchlow, one of the founders at Distilled, and what I want to talk about today is joining the dots between some theoretical work that some of my colleagues have been doing and some of the client work that we’ve been doing recently and the results that we’ve been seeing from that in the wild and what I think it means for strategies for different-sized sites going on from here.

Correlations and a hypothesis

The beginning of this I credit to one of my colleagues, Tom Capper, THCapper on Twitter, who presented at our Search Love London conference a presentation entitled “The Two-Tiered SERP,” and I’m going to describe what that means in just a second. But what I’m going to do today is talk about what I think that the two-tiered SERP means for content strategy going forward and base that a little bit on some of what we’re seeing in the wild with some of our client projects.

What Tom presented at Search Love London was he started by looking at the fact that the correlation between domain authority and rankings has decreased over time. So he pulled out some stats from February 2017 and looked at those same stats 18 months later and saw a significant drop in the correlation between domain authority and rankings. This ties into a bunch of work that he’s done and presented elsewhere around potentially less reliance on links going forward and some other data that Google might be using, some other metrics and ranking factors that they might be using in their place, particularly branded metrics and so forth.

But Tom saw this drop and had a hypothesis that it wasn’t just an across-the-board drop. This wasn’t just Google not using links anymore or using links less. It was actually a more granular effect than that. This is the two-tiered SERP or what we mean by the two-tiered SERP. So a search engine result page, a SERP, you’ve got some results at the top and some results further down the page.

What Tom found, and he had this hypothesis that was borne out in the data, was that the correlation between Domain Authority and rankings was much higher among positions 6 through 10 than among the top half of the search results page. This can be explained by essentially traditional ranking factors carrying more weight lower down the page and in lower-competition niches, while at the top of the page, where there’s more usage data, greater search volume, and so forth, traditional ranking factors play less of a part.

They maybe get you into the consideration set. There are no domains ranking up here that are very, very weak. But once you’re in the consideration set, there’s much less of a correlation between these different positions. So it’s still true on average that these positions 1 through 5 are probably more authoritative than the sites that are appearing in lower positions. But within this set there’s less predictive value.

Domain Authority is less predictive of ranking within those top positions than it is within the lower positions. So this is the two-tiered SERP, and it’s consistent with a bunch of data that we’ve seen across the place, and in particular with the outcomes that we’re seeing among content campaigns and content strategies for different kinds of sites.

At Distilled, we get quite a lot of clients coming to us wanting either a content strategy put together or in some cases coming to us essentially with their content strategy and saying, “Can you execute this? Can you help us execute this plan?” It’s very common for that plan to be, “We want to create a bunch of big pieces of content that get a ton of links, and we’re going to use that link authority to make our site more authoritative and that is going to result in our whole site doing better and ranking better.”

An anonymized case study

We’ve seen that this performs differently in different cases, and in particular it performs better on smaller sites than it does on big sites. So here’s a little anonymized case study. This is a real example from one of our consulting clients, where we put in place a content strategy that did include a plan to build domain authority, because this was a site that came to us with a Domain Authority significantly below that of their key competitors, and none of these sites had a ton of domain authority.

This was working in a B2B space, relatively small domains. They came to us with that, and we figured that actually growing the authority was a key part of this content strategy and over the next 18 months put out a bunch of pieces that have done really well and generated a ton of press coverage and traction and things. Over that time, they’ve actually outstripped their key competitors in the domain authority metrics, and crucially we saw that tie directly to increases in traffic that went hand-in-hand with this increase in domain authority.

But this contrasts to what we’ve seen with some much larger sites in much more competitive verticals where they’re already very, very high domain authority, maybe they’re already stronger than some of their competitors and adding to that. So adding big content pieces that get even more big authoritative links has not moved the needle in the way that it might have done a few years ago.

That’s totally consistent with this kind of setup. If you’re currently trying to edge in at the bottom or you’re competing for less competitive search terms, then this kind of approach might really work for you, and it might, in fact, be necessary to get into the consideration set for the more competitive end. But if you’re operating on a much bigger site, you’ve already got competitive domain authority, and you and your competitors are all very powerful sites, then our hypothesis is that you’re going to need to look more towards user experience, conversion rate, and intent research.

Are you satisfying searcher intent for competitive head terms?

What is somebody who performs this search actually looking to do? Can you satisfy that intent? Can you make sure that they don’t bounce back to the search results and click on a competitor? Can you make sure that in fact they stay on your site, they get done the thing they want to get done, and it all works out for them, because we think that these kinds of things are going to be much more powerful for moving up through the very top end of the most competitive head terms.

So when we’re working on a content strategy or putting our creative team to work on these kinds of things on bigger sites, we’re more likely to be creating content directly designed to rank. We might be creating content based off a ton of this research, and we’re going to be incrementally improving those things to try and say, “Have we actually satisfied the perfect intent for this super competitive head term?”

What we’re seeing is that’s more likely to move the needle up at this top end than growing the domain authority on a big site. So I hope you found that interesting. I’m looking forward to a vigorous discussion in the comments on this one. But thank you for joining me for this week’s Whiteboard Friday. I’ve been Will Critchlow from Distilled. Take care.

Video transcription by Speechpad.com



4 Ways to Improve Your Data Hygiene – Whiteboard Friday

Posted by DiTomaso

We base so much of our livelihood on good data, but managing that data properly is a task in and of itself. In this week’s Whiteboard Friday, Dana DiTomaso shares why you need to keep your data clean and some of the top things to watch out for.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi. My name is Dana DiTomaso. I am President and partner at Kick Point. We’re a digital marketing agency, based in the frozen north of Edmonton, Alberta. So today I’m going to be talking to you about data hygiene.

What I mean by that is the stuff that we see every single time we start working with a new client; this stuff is always messed up. Sometimes it’s one of these four things. Sometimes it’s all four, or sometimes there are extra things. So I’m going to cover this stuff today in the hopes that perhaps the next time we get a profile from someone it’s not quite as bad, or so that if you look at these things and see how bad it is, you’ll sit down and start cleaning this stuff up.

1. Filters

So what we’re going to start with first are filters. By filters, I’m talking about analytics here, specifically Google Analytics. When you go into the admin of Google Analytics, there’s a section called Filters. There’s a section on the left, which is all the filters for everything in that account, and then there’s a section for filters in each view. Filters help you exclude or include specific traffic based on a set of parameters.

Filter out office, home office, and agency traffic

So usually what we’ll find is one Analytics property for your website, and it has one view, All Website Data, which is the default that Analytics gives you, but then there are no filters, which means that you’re not excluding things like office traffic (your internal people visiting the website) or home office traffic. If you have a bunch of people who work from home, get their IP addresses and exclude them, because you don’t necessarily want your internal traffic mucking up things like conversions, especially if you’re doing stuff like checking your own forms.

You haven’t had a lead in a while and maybe you fill out the form to make sure it’s working. You don’t want that coming in as a conversion and then screwing up your data, especially if you’re a low-volume website. If you have a million hits a day, then maybe this isn’t a problem for you. But if you’re like the rest of us and don’t necessarily have that much traffic, something like this can be a big problem in terms of the volume of traffic you see. Then agency traffic as well.

So agencies, please make sure that you’re filtering out your own traffic. Again things like your web developer, some contractor you worked with briefly, really make sure you’re filtering out all that stuff because you don’t want that polluting your main profile.
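
If you manage a lot of accounts or views, you can also create these filters programmatically with the Google Analytics Management API instead of clicking through the admin UI. Here’s a rough Python sketch using google-api-python-client; the account ID, property ID, view ID, and office IP below are made up, and it assumes you’ve already created a service account with edit access to the GA account:

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/analytics.edit"]
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)
    analytics = build("analytics", "v3", credentials=creds)

    # Create an account-level filter that excludes traffic from the office IP.
    office_filter = analytics.management().filters().insert(
        accountId="12345678",
        body={
            "name": "Exclude office IP",
            "type": "EXCLUDE",
            "excludeDetails": {
                "field": "GEO_IP_ADDRESS",
                "expressionValue": "203.0.113.42",  # hypothetical office IP
            },
        },
    ).execute()

    # Link the filter to the main ("master") view so it actually affects the data.
    analytics.management().profileFilterLinks().insert(
        accountId="12345678",
        webPropertyId="UA-12345678-1",
        profileId="87654321",
        body={"filterRef": {"id": office_filter["id"]}},
    ).execute()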

Create a test and staging view

The other thing that I recommend is creating what we call a test and staging view. Usually in our Analytics profiles, we’ll have three different views. One we call master, and that’s the view that has all these filters applied to it.

So you’re only seeing the traffic that isn’t you. It’s the customers, people visiting your website, the real people, not your office people. Then the second view we call test and staging. So this is just your staging server, which is really nice. For example, if you have a different URL for your staging server, which you should, then you can just include that traffic. Then if you’re making enhancements to the site or you upgraded your WordPress instance and you want to make sure that your goals are still firing correctly, you can do all that and see that it’s working in the test and staging view without polluting your main view.

Test on a second property

That’s really helpful. Then the third thing is to make sure you test on a second property. This is easy to do with Google Tag Manager. In most of our Google Tag Manager accounts, we’ll have our usual Analytics setup, and most of the stuff goes there. But if we’re testing something new, say the content consumption metric we started putting out this summer, then we want to make sure we send the new stuff that we’re trying out to a second Analytics property, not just a second view.

So you have two different Analytics properties. One is your main property. This is where all the regular stuff goes. Then you have a second property, which is where you test things out, and this is really helpful to make sure that you’re not going to screw something up accidentally when you’re trying out some crazy new thing like content consumption, which can totally happen and has definitely happened as we were testing the product. You don’t want to pollute your main data with something different that you’re trying out.

So send something to a second property. You do this for websites. You always have a staging and a live. So why wouldn’t you do this for your analytics, where you have a staging and a live? So definitely consider setting up a second property.

2. Time zones

The next thing that we have a lot of problems with are time zones. Here’s what happens.

Let’s say you have a basic install of WordPress on your website and you didn’t change the time zone, so it’s set to UTC. That’s the default in WordPress unless you change it. So now you’ve got your website data recorded in UTC. Then let’s say your marketing team is on the East Coast, so they’ve got all of their tools set to Eastern time. Then your sales team is on the West Coast, so all of their tools are set to Pacific time.

So you can end up with a situation where, let’s say, you’ve got a website where you’re using a form plugin for WordPress. When someone submits a form, it’s recorded on your website, but then that data also gets pushed over to your sales CRM. So now your website is saying that this number of leads came in on this day, because it’s on UTC time, where the day has already ended or hasn’t started yet, while your analytics tools are recording the number of leads in Eastern time.

But then the third wrinkle is then you have Salesforce or HubSpot or whatever your CRM is now recording Pacific time. So that means that you’ve got this huge gap of who knows when this stuff happened, and your data will never line up. This is incredibly frustrating, especially if you’re trying to diagnose why, for example, I’m submitting a form, but I’m not seeing the lead, or if you’ve got other data hygiene issues, you can’t match up the data and that’s because you have different time zones.

So definitely check the time zones of every product you use: website, CRM, analytics, ads, all of it. If it has a time zone, pick one and stick with it. That’s your canonical time zone. It will save you so many headaches down the road, trust me.
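
When you do have to reconcile exports after the fact, the practical fix is to convert every timestamp into your canonical time zone before comparing. A minimal Python sketch (the canonical zone and the sample timestamps are just for illustration):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    CANONICAL_TZ = ZoneInfo("America/New_York")  # whichever zone you picked as canonical

    def normalize(timestamp, source_tz):
        # Parse a naive timestamp from a tool's export and convert it to the canonical zone.
        naive = datetime.fromisoformat(timestamp)          # e.g. "2019-03-05 22:30:00"
        aware = naive.replace(tzinfo=ZoneInfo(source_tz))  # tag it with that tool's zone
        return aware.astimezone(CANONICAL_TZ)

    # A WordPress form fill logged in UTC and the same lead logged by a CRM in Pacific
    # time turn out to be the same moment once both are normalized.
    print(normalize("2019-03-05 22:30:00", "UTC"))
    print(normalize("2019-03-05 14:30:00", "America/Los_Angeles"))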

3. Attribution

The next thing is attribution. Attribution is a whole other lecture in and of itself, beyond what I’m talking about here today.

Different tools have different ways of showing attribution

But what I find frustrating about attribution is that every tool has its own little special way of doing it. Analytics is like the last non-direct click. That’s great. Ads says, well, maybe we’ll attribute it, maybe we won’t. If you went to the site a week ago, maybe we’ll call it a view-through conversion. Who knows what they’re going to call it? Then Facebook has a completely different attribution window.

You can use a tool, such as Supermetrics, to change the attribution window. But if you don’t understand what the default attribution window is in the first place, you’re just going to make things harder for yourself. Then there’s HubSpot, which says the very first touch is what matters, and so, of course, HubSpot will never agree with Analytics and so on. Every tool has its own little special sauce and how they do attribution. So pick a source of truth.

Pick your source of truth

The best thing to do is just say, “You know what? I trust this tool the most,” and then that is your source of truth. Do not try to get this source of truth to match up with that source of truth. You will go insane. You do have to make sure that things like your time zones are consistent, so that much is all set.

Be honest about limitations

But then after that, really it’s just making sure that you’re being honest about your limitations.

Know where things are inevitably going to fall down, and that’s okay, but at least you’ve got this source of truth that you can trust. That’s the most important thing with attribution. Make sure to spend the time and read how each tool handles attribution, so that when someone comes to you and says, “Well, I see that we got 300 visits from this ad campaign, but Facebook says we got 6,000. Why is that?” you have an answer.

That might be a bit of an extreme example, but I’ve seen weirder things with Facebook attribution versus Analytics attribution. The same goes for tools like Mixpanel and Kissmetrics. Every tool has its own special way of recording attribution. It’s never the same as anyone else’s. We don’t have a standard in the industry for how this stuff works, so make sure you understand these pieces.

4. Interactions

Then the last thing is what I call interactions. The biggest thing that I find people do wrong here is in Google Tag Manager; it gives you a lot of rope, which you can hang yourself with if you’re not careful.

GTM interactive hits

One of the biggest things is what we call an interactive hit versus a non-interactive hit. So let’s say in Google Tag Manager you have a scroll depth.

You want to see how far down the page people scroll. At 25%, 50%, 75%, and 100%, it will send off an event saying how far down the page they scrolled. Well, the thing is that you can also make that hit interactive. So if somebody scrolls down the page 25%, you can say that’s an interactive hit, which means that person is no longer counted as a bounce, because it’s counting an interaction, which for your setup might be great.

Gaming bounce rate

But what I’ve seen are unscrupulous agencies who come in and say if the person scrolls 2% of the way down the page, now that’s an interactive hit. Suddenly the client’s bounce rate goes down from say 80% to 3%, and they think, “Wow, this agency is amazing.” They’re not amazing. They’re lying. This is where Google Tag Manager can really manipulate your bounce rate. So be careful when you’re using interactive hits.

Absolutely, maybe it’s totally fair that if someone is reading your content, they might just read that one page and then hit the back button and go back out. It’s totally fair to use something like scroll depth or a certain piece of the content entering the user’s view port, that that would be interactive. But that doesn’t mean that everything should be interactive. So just dial it back on the interactions that you’re using, or at least make smart decisions about the interactions that you choose to use. So you can game your bounce rate for that.

Goal setup

Then goal setup as well; that’s a big problem. A lot of people have destination goals set up in Analytics by default, maybe because they don’t know how to set up event-based goals. By destination goal, I mean you filled out the form, you got to a thank you page, and views of that thank you page are recorded as goals, which, yes, is one way to do it.

But the problem is that a lot of people, who aren’t super great at interneting, will bookmark that page or they’ll keep coming back to it again and again because maybe you put some really useful information on your thank you page, which is what you should do, except that means that people keep visiting it again and again without actually filling out the form. So now your conversion rate is all messed up because you’re basing it on destination, not on the actual action of the form being submitted.

So be careful on how you set up goals, because that can also really game the way you’re looking at your data.

Ad blockers

Ad blockers could account for anywhere from 2% to 10% of your audience, depending on how technically sophisticated your visitors are. So you’ll end up in situations where you have a form fill but no corresponding visit to match with that form fill.

It just goes into an attribution black hole. But they did fill out the form, so at least you got their data, but you have no idea where they came from. Again, that’s going to be okay. So definitely think about the percentage of your visitors, based on you and your audience, who probably have an ad blocker installed and make sure you’re comfortable with that level of error in your data. That’s just the internet, and ad blockers are getting more and more popular.

Stuff like Apple is changing the way that they do tracking. So definitely make sure that you understand these pieces and you’re really thinking about that when you’re looking at your data. Again, these numbers may never 100% match up. That’s okay. You can’t measure everything. Sorry.

Bonus: Audit!

Then the last thing I really want you to think about — this is the bonus tip — audit regularly.

So at least once a year, go through all the different stuff that I’ve covered in this video and make sure that nothing has changed or updated, and that you don’t have some secret, exciting new tracking code that somebody added and then forgot about, because you were trying out a trial of a product, you tossed it on, and it’s been running for a year even though the trial expired nine months ago. So definitely make sure that you’re running the stuff that you should be running, and do an audit at least on a yearly basis.

If you’re busy and you have a lot of different visitors to your website, it’s a pretty high-volume property, maybe monthly or quarterly would be a better interval, but at least once a year go through and make sure that everything that’s there is supposed to be there, because that will save you headaches when you look at trying to compare year-over-year and realize that something horrible has been going on for the last nine months and all of your data is trash. We really don’t want to have that happen.

So I hope these tips are helpful. Get to know your data a little bit better. It will like you for it. Thanks.

Video transcription by Speechpad.com


Redirects: One Way to Make or Break Your Site Migration – Whiteboard Friday

Posted by KameronJenkins

Correctly redirecting your URLs is one of the most important things you can do to make a site migration go smoothly, but there are clear processes to follow if you want to get it right. In this week’s Whiteboard Friday, Kameron Jenkins breaks down the rules of redirection for site migrations to make sure your URLs are set up for success.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, guys. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz. What we’re going to be talking about today is redirects and how they’re one way that you can make or break your site migration. Site migration can mean a lot of different things depending on your context.

Migrations?

I wanted to go over quickly what I mean before we dive into some tips for avoiding redirection errors. When I talk about migration, I’m coming from the experience of these primary activities.

CMS moving/URL format

One example of a migration I might be referring to is maybe we’re taking on a client, and they previously used a CMS that had a default kind of URL formatting, and it was date-based.

So it was like /2018/May/ and then the post. Then we’re changing the CMS. We have more flexibility with how our pages, our URLs are structured, so we’re going to move it to just /post or something like that. In that way a lot of URLs are going to be moving around because we’re changing the way that those URLs are structured.

“Keywordy” naming conventions

Another instance is that sometimes we’ll get clients that come to us with kind of dated or keywordy URLs, and we want to change this to be a lot cleaner, shorten them where possible, just make them more human-readable.

An example of that would be maybe the client used URLs like /best-plumber-dallas, and we want to change it to something a little bit cleaner, more natural, and not as keywordy, to just /plumbers or something like that. So that can be another example of lots of URLs moving around if we’re taking over a whole site and we’re kind of wanting to do away with those.

Content overhaul

Another example is if we’re doing a complete content overhaul. Maybe the client comes to us and they say, “Hey, we’ve been writing content and blogging for a really long time, and we’re just not seeing the traffic and the rankings that we want. Can you do a thorough audit of all of our content?” Usually what we notice is that you have maybe even thousands of pages, but four of them are ranking.

So there are a lot of just redundant pages, pages that are thin and would be stronger together, some pages that just don’t really serve a purpose and we want to just let die. So that’s another example where we would be merging URLs, moving pages around, just letting some drop completely. That’s another example of migrating things around that I’m referring to.

Don’t we know this stuff? Yes, but…

That’s what I’m referring to when it comes to migrations. But before we dive in, I kind of wanted to address the fact that like don’t we know this stuff already? I mean I’m talking to SEOs, and we all know or should know the importance of redirection. If there’s not a redirect, there’s no path to follow to tell Google where you’ve moved your page to.

It’s frustrating for users if they click on a link that no longer works, that doesn’t take them to the proper destination. We know it’s important, and we know what it does. It passes link equity. It makes sure people aren’t frustrated. It helps to get the correct page indexed, all of those things. So we know this stuff. But if you’re like me, you’ve also been in those situations where you have to spend entire days fixing 404s to correct traffic loss or whatever after a migration, or you’re fixing 301s that were maybe done but they were sent to all kinds of weird, funky places.

Mistakes still happen even though we know the importance of redirects. So I want to talk about why really quickly.

Unclear ownership

Unclear ownership is something that can happen, especially if you’re on a scrappier, smaller team and maybe you don’t handle these things often enough to have a defined process for them. I’ve been in situations where I assumed the tech was going to do it, and the tech assumed that the project assistant was going to do it.

We’re all kind of pointing fingers at each other with no clear ownership, and then the ball gets dropped because no one really knows whose responsibility it is. So just make sure that you designate someone to do it and that they know and you know that that person is going to be handling it.

Deadlines

Another thing is deadlines. Internal and external deadlines can affect this. So one example that I encountered pretty often is the client would say, “Hey, we really need this project done by next Monday because we’re launching another initiative. We’re doing a TV commercial, and our domain is going to be listed on the TV commercial. So I’d really like this stuff wrapped up when those commercials go live.”

So those kind of external deadlines can affect how quickly we have to work. A lot of times it just gets left by the wayside because it is not a very visible thing. If you don’t know the importance of redirects, you might handle things like content and making sure the buttons all work and the template looks nice and things like that, the visible things. Where people assume that redirects, oh, that’s just a backend thing. We can take care of it later. Unfortunately, redirects usually fall into that category if the person doing it doesn’t really know the importance of it.

Another thing with deadlines is internal deadlines. Sometimes you might have a quarterly goal or a monthly goal: we have to have all of our projects done by this date. It's the same story there. Redirects are unfortunately something that tends to miss the cutoff for those types of things.

Non-SEOs handling the redirection

Then another situation that can cause site migration errors and 404s is non-SEOs handling the redirection. Now you usually don't have to be a really experienced SEO to handle these types of things. It depends on your CMS and how complicated your redirect implementation is. But if your CMS makes redirection easy, it can be treated as a data entry-type job and delegated to someone who maybe doesn't know the importance of doing all of them, formatting them properly, or pointing them to the places they're supposed to go.

The rules of redirection for site migrations

Those are all situations where I've encountered issues. So now that we know what I mean by migrations and why these mistakes still happen, I'm going to launch into some rules that will hopefully help prevent site migration errors caused by failed redirects.

1. Create one-to-one redirects

Number one, always create one-to-one redirects. This is super important. What I've seen sometimes is, "Oh man, it could save me tons of time if I just use a wildcard and redirect all of these pages to the homepage or to the blog homepage." But what that tells Google is that Page A has moved to Page B, when that's not the case. You're not actually moving all of those pages to the homepage. So it's an irrelevant redirect, and Google has even said, I think, that they treat those essentially as soft 404s. They don't count. So make sure you don't do that. Make sure you're always mapping each URL to its new location, one-to-one, every single time, for every URL that's moving.
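If it helps to see what that looks like in practice, here's a rough sketch of generating one-to-one rules in bulk. It assumes a hypothetical redirect_map.csv with old_path and new_path columns and an Apache-style Redirect 301 output format, so adjust it for whatever server or CMS you're actually using:

```python
# Minimal sketch: turn a CSV of old -> new URL pairs into one-to-one 301 rules.
# Assumes a hypothetical file "redirect_map.csv" with columns "old_path,new_path"
# and an Apache-style server; adapt the output format to your own setup.
import csv

with open("redirect_map.csv", newline="") as f:
    rows = list(csv.DictReader(f))

with open("redirects.conf", "w") as out:
    for row in rows:
        old = row["old_path"].strip()
        new = row["new_path"].strip()
        # One rule per URL: each old location maps to exactly one new location.
        out.write(f"Redirect 301 {old} {new}\n")

print(f"Wrote {len(rows)} one-to-one redirects to redirects.conf")
```

The point is simply that every old URL gets its own rule pointing at its own new home, never a blanket wildcard to the homepage.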

2. Watch out for redirect chains

Two, watch out for chains. I think Google gives an oddly specific number, something like no more than three to five hops, but just try to limit chains as much as possible. By chains, I mean you have URL A, you redirect it to B, and then later you decide to move it to a third location, C. Instead of going through a middleman, A to B to C, shorten it if you can. Go straight from the source to the destination, A to C.
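If you keep your redirects as a simple old-to-new mapping, you can flatten chains in bulk before they ever hit the server. Here's a rough sketch; the example mapping is made up, so swap in your own list:

```python
# Minimal sketch: flatten redirect chains so every URL points straight
# at its final destination (A -> C instead of A -> B -> C).
# The mapping below is a hypothetical example; load your own redirect list.
redirects = {
    "/old-page": "/interim-page",
    "/interim-page": "/final-page",
}

def final_destination(url, mapping, max_hops=5):
    """Follow the mapping until a URL no longer redirects (or we give up)."""
    seen = set()
    while url in mapping and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = mapping[url]
    return url

flattened = {src: final_destination(dst, redirects) for src, dst in redirects.items()}
print(flattened)  # {'/old-page': '/final-page', '/interim-page': '/final-page'}
```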

3. Watch out for loops

Three, watch out for loops. Similarly, what can happen is you redirect URL A to URL B, then to another version C, and then back to A. It's chasing its tail and will never resolve, so you're redirecting in a loop. Watch out for things like that. One nifty tool for checking this is Screaming Frog, which has a redirect chains report, so you can see whether you're running into any of these issues after you've implemented your redirects.
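You can also sanity-check a redirect mapping for loops before you deploy it. Here's a minimal sketch along the same lines as the chain example above; again, the mapping is a hypothetical placeholder:

```python
# Minimal sketch: detect redirect loops in a mapping of old -> new URLs.
# The example mapping is hypothetical; load your own redirect list instead.
redirects = {
    "/a": "/b",
    "/b": "/c",
    "/c": "/a",   # loops back to /a
}

def find_loops(mapping):
    loops = []
    for start in mapping:
        seen = []
        url = start
        while url in mapping:
            if url in seen:
                # Report the cycle from the point where it repeats.
                loops.append(seen[seen.index(url):] + [url])
                break
            seen.append(url)
            url = mapping[url]
    return loops

for loop in find_loops(redirects):
    print(" -> ".join(loop))
```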

4. 404 strategically

Number four, 404 strategically. The presence of 404s on your site alone is not going to hurt your rankings. What causes issues is letting pages die that were ranking and bringing your site traffic. Obviously, if a page is 404ing, Google is eventually going to take it out of the index if you don't redirect it to its new location. If that page was ranking really well or bringing your site traffic, you're going to lose those benefits. If it had links pointing to it, you're going to lose the benefit of those backlinks if it dies.

So if you’re going to 404, just do it strategically. You can let pages die. Like in these situations, maybe you’re just outright deleting a page and it has no new location, nothing relevant to redirect it to. That’s okay. Just know that you’re going to lose any of the benefits that URL was bringing your site.

5. Prioritize “SEO valuable” URLs

Number five, prioritize "SEO valuable" URLs. I say that even though, ideally, you should redirect everything that's legitimately moving.

But because of situations like deadlines, when we're down to the wire, I think it's really important to at least start with your most important URLs. Those are URLs that are ranking really well, bringing you a lot of good traffic, and URLs that you've earned links to. If you have a deadline and you don't get to finish all of your redirects before the project goes live, at least you'll have those most critical, most important URLs handled first.

Again, it's obviously not ideal to save any of them until after the launch. It's best to have them all set up by the time the site goes live. But if that's not the case and you're getting rushed and have to launch, at least you will have handled the most important URLs for SEO value first.
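One practical way to do that prioritization is to pull your URLs into a file with traffic and link metrics and sort by value. Here's a rough sketch; the file name, column names, and weighting are all hypothetical stand-ins for whatever your analytics and link tools actually export:

```python
# Minimal sketch: rank URLs so the most valuable ones get redirected first.
# Assumes a hypothetical export "url_metrics.csv" with columns
# "url,organic_sessions,linking_domains".
import csv

with open("url_metrics.csv", newline="") as f:
    urls = list(csv.DictReader(f))

# Weight traffic and links however makes sense for your site; this weighting is arbitrary.
def value(row):
    return int(row["organic_sessions"]) + 10 * int(row["linking_domains"])

for row in sorted(urls, key=value, reverse=True):
    print(row["url"], value(row))
```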

6. Test!

Number six, to round things out, test. It's super important to monitor these things, because you could think you've set everything up right, but maybe there were formatting errors, or maybe you mistakenly redirected something to the wrong place. It is super important to test. One thing you can do is a site:domain.com search and just start clicking on the results that come up, checking whether any of them redirect to the wrong place or 404.
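If you'd rather script the spot-check instead of clicking through results by hand, here's a minimal sketch using the requests library. The URLs in the list are just placeholders; feed it the old URLs from your sitemap or index:

```python
# Minimal sketch: spot-check a handful of old URLs after launch and report
# where they end up, their final status code, and how many hops it took.
import requests

old_urls = [
    "https://example.com/best-plumber-dallas",
    "https://example.com/old-blog-post",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = len(resp.history)  # number of redirects followed on the way
    print(f"{url} -> {resp.url} ({resp.status_code}, {hops} hop(s))")
```

Anything landing on a 404, a long chain of hops, or an unexpected destination is worth fixing right away.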

You're checking all of those indexed URLs to make sure they're going to a proper new destination. I think Moz's Site Crawl is another huge benefit here for testing purposes. If you have a domain or URL set up in a campaign in Moz Pro, it checks this every week, and you can force another run if you want.

It will scan your site for errors like this, namely 404s. If there are any issues like that, 400- or 500-type errors, Site Crawl will catch them and notify you. If you're not managing the domain you're working on in a campaign in Moz Pro, there's On-Demand Crawl too, so you can run that on any domain to test for things like this.

There are plenty of other ways you can test and find errors, but the most important thing is just to do it: test, and make sure that even once you've implemented everything, you keep checking that there are no issues after launch. I would check right after a launch, then a couple of days later, and then taper off until you're absolutely positive that everything has gone smoothly.

So those are my tips, my rules for how to implement redirects properly, why you need to, when you need to, and the risks if you don't. If you have any tips of your own that you'd like to share, pop them in the comments and share them with the rest of the SEO community. That's it for this week's Whiteboard Friday.

Come back again next week for another one. Thanks, everybody.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

