Tag Archive | "Whiteboard"

YouTube SEO: Top Factors to Invest In – Whiteboard Friday

Posted by randfish

If you have an audience on YouTube, are you doing everything you can to reach them? Inspired by a large-scale study from Justin Briggs, Rand covers the top factors to invest in when it comes to YouTube SEO in this week’s episode of Whiteboard Friday.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about YouTube SEO. So I was lucky enough to be speaking at the SearchLove conference down in San Diego a little while ago, and Justin Briggs was there presenting on YouTube SEO and on a very large-scale study that he had conducted with, I think, 100,000 different video rankings across YouTube’s search engine, as well as looking at the performance of many thousands of channels and individual videos in YouTube.

Justin came up with some fascinating results. I’ve called them out here @JustinBriggs on Twitter, and his website is Briggsby.com. You can find this study, including an immense amount of data, there. But I thought I would try and sum up some of the most important points that he brought up and some of the conclusions he came to in his research. I do urge you to check out the full study, especially if you’re doing YouTube SEO.

5 crucial elements for video ranking success

So first off, there are some crucial elements for video ranking success. Now video ranking success, what do we mean by that? We mean if you perform a search query in YouTube for a specific keyword, and not necessarily a branded one, what are the things that will come up? So sort of like the same thing we talk about when we talk about Google success ranking factors, these are success factors for YouTube. That doesn’t necessarily mean that these are the things that will get you the most possible views. In fact, some of them work the other way.

1. Video views and watch time

First off, video views and watch time. So it turns out these are both very well correlated and, in Justin’s opinion, probably causal with higher rankings. So if you have a video and you’re competing against a competitor’s video and you get more views and a greater amount of watch time on average per view — so that’s how many people make it through a greater proportion of the video itself — you tend to do better than your competitors.

2. Keyword matching the searcher’s query in the title

Number two, keyword matching is, we think, still more important on YouTube than it is in classic Google search. That’s not to say it’s not important in classic Google, but in YouTube it’s even more important. It’s an even bigger factor. Essentially what Justin’s data showed is that exact match keywords, exactly matching the keyword phrase in the video title, tended to outperform partial matches by a little bit, and partial matches outperformed none or only some by a considerable margin.

So if you’re trying to rank your video for what pandas eat and your video is called “What Pandas Eat,” that’s going to do much better than, for example, “Panda Consumption Habits” or “Panda Food Choices.” So describe your video, name your video in the same way that searchers are searching, and you can get intel into how searchers are using YouTube.

You can also use the data that comes back from Google keyword searches; if videos appear at the top of Google’s results for a query, that probably means there’s a lot of demand on YouTube as well.

3. Shorter titles (<50 characters) with keyword-rich descriptions

Next up, shorter titles, less than 50 characters, with keyword-rich descriptions between 200 and 350 words tended to perform best in this dataset.

So if you’re looking for guidelines around how big should I make my YouTube title, how big should I make my description, those are generally good rules of thumb. If you go over a little bit, it’s not a huge deal. The curve doesn’t fall off dramatically. But certainly staying around there is a good idea.

4. Keyword tags

Number four, keyword tags. So YouTube will let you apply keyword tags to a video.

This is something that used to exist in Google SEO decades ago with the meta keywords tag. It still does exist in YouTube. These keyword tags seem to matter a little for rankings, but they seem to matter more for the recommended videos. So those recommended videos are sort of what appear on the right-hand side of the video player if you’re in a desktop view or below the video on a mobile player.

Those recommended videos are also kind of what plays when you keep watching a video; it’s what comes up next. So those both figure prominently into earning you more views, which can then help your rankings, of course. So use keyword tags in two- to three-word phrases, and usually the videos that Justin’s dataset saw performing best were those with 31 to 40 unique tags, which is a pretty hefty number.

That means folks are going through and they’re taking their “What Pandas Eat” and they’re tagging it with pandas, zoo animals, mammals, and they might even be tagging it with marsupials — pandas aren’t actually marsupials, but you get the idea — and those kinds of things. So they’re adding a lot of different tags on there, 31 to 40, and those tended to do the best.

So if you’re worried that adding too many keyword tags can hurt you, maybe it can, but not up until you get to a pretty high limit here.

5. Certain video lengths perform and rank well

Number five, the videos that perform best — I like that this correlates with how Whiteboard Fridays do well as well — 10 to 16 minutes in length tend to do best in the rankings. Under two minutes in length tend to be very disliked by YouTube’s audience. They don’t perform well. Four to six minutes get the most views. So it depends on what you’re optimizing for. At Whiteboard Friday, we’re trying to convey information and make it useful and interesting and valuable. So we would probably try and stick to 10 to 16 minutes. But if we had a promotional video, for example, for a new product that we were launching, we might try and aim for a four to six minute video to get the most views, the most amplification, the most awareness that we possibly could.
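
To make those guidelines easier to apply, here’s a minimal sketch of a pre-publish checker. It only encodes the rough thresholds mentioned above (title under ~50 characters, a 200 to 350 word description, 31 to 40 tags, 10 to 16 minutes for ranking); the function name and example values are made up for illustration, and these are correlations from one study, not YouTube rules.

```python
# Minimal sketch: check planned video metadata against the rough guidelines
# summarized above from Justin Briggs' study. Thresholds are the ones
# mentioned in the transcript, not official YouTube limits.

def check_video_metadata(title, description, tags, duration_minutes):
    """Return a list of human-readable warnings for a planned YouTube video."""
    warnings = []

    if len(title) > 50:
        warnings.append(f"Title is {len(title)} chars; under ~50 tended to rank best.")

    word_count = len(description.split())
    if not 200 <= word_count <= 350:
        warnings.append(f"Description is {word_count} words; ~200-350 performed best.")

    if not 31 <= len(tags) <= 40:
        warnings.append(f"{len(tags)} tags; top-performing videos used 31-40 unique tags.")

    if not 10 <= duration_minutes <= 16:
        warnings.append(f"{duration_minutes} min long; 10-16 min tended to rank best "
                        "(4-6 min tended to get the most views).")

    return warnings


if __name__ == "__main__":
    for warning in check_video_metadata(
        title="What Pandas Eat",
        description="Pandas eat bamboo... " * 60,  # stand-in for a real description
        tags=["pandas", "zoo animals", "mammals"],
        duration_minutes=3,
    ):
        print(warning)
```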

3 takeaways of interest

Three other takeaways of interest that I just found potentially valuable.

Older videos do better on average, but new videos get a boost

One is older videos on average tend to do better in the rankings, but new videos get a boost when they initially come out. So in the dataset, Justin created a great graph that looks like this: zero to two weeks after a video is published, two to six weeks, six to twelve weeks, and after a year, and there are a few other ones in here.

But you can see the slope of this curve follows this concept that there’s a fresh boost right here in those first two to six weeks, and it’s strongest in the first zero to two weeks. So if you are publishing regularly and you sort of have that like, “Oh, this video didn’t hit. Let me try again. This video didn’t hit. Oh, this one got it. This nailed what my audience was looking for. This was really powerful.” That seems to do quite well.

Channels help boost their videos

Channels are something Justin looked deeply into. I haven’t covered it much here, but he looked into channel optimization a lot. Channels do help boost their individual videos with things like subscribers who comment and like and have a higher watch time on average than videos that are disconnected from subscribers. He noted that about 1,000 or more subscribers is a really good target to start to benefit from the metrics that a good subscriber base can bring. These tend to have a positive impact on views and also on rankings, although whether that’s causal or merely correlated is hard to say.

Embeds and links are correlated, but unsure if causal

Again on the correlation but not causation, embeds and links. So the study looked at the rankings, higher rankings up here and lower rankings down there, versus embeds.

Videos that received more embeds, they were embedded on websites more, did tend to perform better. But through experimentation, we’re not quite clear if we can prove that by embedding a video a lot we can increase its rankings. So it could just be that as something ranks well and gets picked up a lot, many people embed it rather than many embeds lead to better rankings.

All right, everyone, if you’re producing video, which I probably recommend that you do if video is ranking in the SERPs that you care about or if your audience is on YouTube, hopefully this will be helpful, and I urge you to check out Justin’s research. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

The Difference Between URL Structure and Information Architecture – Whiteboard Friday

Posted by willcritchlow

Questions about URL structure and information architecture are easy to get confused, but it’s an important distinction to maintain. IA tends to be more impactful than URL decisions alone, but advice given around IA often defaults to suggestions on how to best structure your URLs. In this Whiteboard Friday, Will Critchlow helps us distinguish between the two disparate topics and shares some guiding questions to ask about each.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to a British Whiteboard Friday. My name is Will Critchlow. I’m one of the founders of Distilled, and I wanted to go back to some basics today. I wanted to cover a little bit of the difference between URL structure and information architecture, because I see these two concepts unfortunately mixed up a little bit too often when people are talking about advice that they want to give.

I’m thinking here particularly from an SEO perspective. So there is a much broader study of information architecture. But here we’re thinking really about: What do the search engines care about, and what do users care about when they’re searching? So we’ll link some basics about things like what is URL structure, but we’re essentially talking here about the path, right, the bit that comes after the domain www.example.com/whatever-comes-next.

There’s a couple of main ways of structuring your URL. You can have kind of a subfolder type of structure or a much flatter structure where everything is kind of collapsed into the one level. There are pros and cons of different ways of doing this stuff, and there’s a ton of advice. You’re generally trading off competing considerations: in general, it’s better to have shorter URLs than longer URLs, but it’s also better, on average, to have your keyword in the URL than not to have it there.

These are in tension. So there’s a little bit of art that goes into structuring good URLs. But too often I see people, when they’re really trying to give information architecture advice, ending up talking about URL structure, and I want to just kind of tease those things apart so that we know what we’re talking about.

So I think the confusion arises because both of them can involve questions around which pages exist on my website and what hierarchies are there between pages and groups of pages.

URL questions

So what pages exist is clearly a URL question at some level. Literally if I go to /shoes/womens, is that a 200 status? Is that a page that returns things on my website? That is, at its basics, a URL question. But zoom out a little bit and say what are the set of pages, what are the groups of pages that exist on my website, and that is an information architecture question, and, in particular, how they’re structured and how those hierarchies come together is an information architecture question.

But it’s muddied by the fact that there are hierarchy questions in the URL. So when you’re thinking about your red women’s shoes subcategory page on an e-commerce site, for example, you could structure that in a flat way like this or in a subfolder structure. That’s just a pure URL question. But it gets muddied with the information architecture questions, which we’ll come on to.

I think probably one of the key ones that comes up is: Where do your detail-level pages sit? So on an e-commerce site, imagine a product page. You could have just /product-slug. Ideally that would have some kind of descriptive keywords in it, rather than just being an anonymous number. But you can have it just in the root like this, or you can put it in a subfolder, the category it lives in.

So if this is a pair of red women’s shoes, then you could have it at something like /shoes/womens/red/product-slug, for example. There are pros and cons of both of these. I’m not going to get deep into it, but in general the point is you can make any of these decisions about your URLs independent of your information architecture questions.

Information architecture questions

Let’s talk about the information architecture, because these are actually, in general, the more impactful questions for your search performance. So these are things like, as I said at the beginning, it’s essentially what pages exist and what are their hierarchies.

  • How many levels of category and subcategory should we have on our website?
  • What do we do in our faceted navigation?
  • Do we go two levels deep?
  • Do we go three levels deep?
  • Do we allow all those pages to be crawled and indexed?
  • How do we link between things?
  • How do we link between the sibling products that are in the same category or subcategory?
  • How do we link back up the structure to the parent subcategory or category?
  • How do we crucially build good link paths out from the big, important pages on our website, so our homepage or major category pages?
  • What’s the link path that you can follow by clicking multiple links from there to get to detail level for every product on your website?

Those kind of questions are really impactful. They make a big difference, on an SEO front, both in terms of crawl depth, so literally a search engine spider coming in and saying, “I need to discover all these pages, all these detail-level pages on your website.” So what’s the click depth and crawl path out from those major pages?
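
As a quick illustration of click depth, here’s a minimal sketch that computes how many clicks each page is from the homepage using a breadth-first search over an internal link graph. The graph, URLs, and function name are hypothetical; in practice you would export the internal link graph from a crawler.

```python
from collections import deque

def click_depths(link_graph, start="/"):
    """Breadth-first search over an internal link graph, returning the minimum
    number of clicks needed to reach each page from the start page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:  # first time we reach this page = shallowest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical link graph: each page maps to the pages it links to.
site = {
    "/": ["/shoes/", "/bags/"],
    "/shoes/": ["/shoes/womens/", "/shoes/mens/"],
    "/shoes/womens/": ["/shoes/womens/red/", "/product/red-heels/"],
    "/shoes/womens/red/": ["/product/red-heels/"],
}

print(click_depths(site))
# e.g. {'/': 0, '/shoes/': 1, ..., '/product/red-heels/': 3}
```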

Think about link authority and your link paths

It’s also a big factor in a link authority sense. Your internal linking structure is how your PageRank and other link metrics get distributed out around your website, and so it’s really critical that you have these great linking paths down into the products, between important products, and between categories and back up the hierarchy. How do we build the best link paths from our important pages down to our detail-level pages and back up?

Make your IA decisions before your URL structure decisions

After you have made whatever IA decisions you like, then you can independently choose your preferred URLs for each page type.

These are SEO information architecture questions, and the critical thing to realize is that you can make all of your information architecture decisions — which pages exist, which subcategories we’re going to have indexed, how we link between sibling products, all of this linking stuff — we can make all these decisions, and then we can say, independently of whatever decisions we made, we can choose any of the URL structures we like for what those actual pages’ paths are, what the URLs are for those pages.

We need to not get those muddied, and I see that getting muddied too often. People talk about these decisions as if they’re information architecture questions, and they make them first, when actually you should be making these IA decisions first and then picking the best URLs. Like I said, it’s a bit more art than science sometimes to choose between longer, more descriptive URLs and shorter URL paths.

So I hope that’s been a helpful intro to a basic topic. I’ve written a bunch of this stuff up in a blog post, and we’ll link to that. But yeah, I’ve enjoyed this Whiteboard Friday. I hope you have too. See you soon.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

How Do Sessions Work in Google Analytics? – Whiteboard Friday

Posted by Tom.Capper

One of these sessions is not like the other. Google Analytics data is used to support tons of important work, ranging from our everyday marketing reporting all the way to investment decisions. To that end, it’s integral that we’re aware of just how that data works.

In this week’s edition of Whiteboard Friday, we welcome Tom Capper to explain how the sessions metric in Google Analytics works, several ways that it can have unexpected results, and as a bonus, how sessions affect the time on page metric (and why you should rethink using time on page for reporting).

How do sessions work in Google Analytics?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hello, Moz fans, and welcome to another edition of Whiteboard Friday. I am Tom Capper. I am a consultant at Distilled, and today I’m going to be talking to you about how sessions work in Google Analytics. Obviously, all of us use Google Analytics. Pretty much all of us use Google Analytics in our day-to-day work.

Data from the platform is used these days in everything from investment decisions to press reporting to the actual marketing that we use it for. So it’s important to understand the basic building blocks of these platforms. Up here I’ve got the absolute basics. So in the blue squares I’ve got hits being sent to Google Analytics.

So when you first put Google Analytics on your site, you get that bit of tracking code, you put it on every page, and what that means is when someone loads the page, it sends a page view. So those are the ones I’ve marked P. So we’ve got page view and page view and so on as you’re going around the site. I’ve also got events with an E and transactions with a T. Those are two other hit types that you might have added.

The job of Google Analytics is to take all this hit data that you’re sending it and try and bring it together into something that actually makes sense as sessions. So they’re grouped into sessions that I’ve put in black, and then if you have multiple sessions from the same browser, then that would be a user that I’ve marked in pink. The issue here is it’s kind of arbitrary how you divide these up.

These eight hits could be one long session. They could be eight tiny ones or anything in between. So I want to talk today about the different ways that Google Analytics will actually split up those hit types into sessions. So over here I’ve got some examples I’m going to go through. But first I’m going to go through a real-world example of a brick-and-mortar store, because I think that’s what they’re trying to emulate, and it kind of makes more sense with that context.

Brick-and-mortar example

So in this example, say a supermarket, we enter via passing trade. That’s going to be our source. Then our entrance is the lobby of the supermarket when we walk in. We go from there to the beer aisle to the cashier, or at least I do. So that’s one big, long session with the source passing trade. That makes sense.

In the case of a brick-and-mortar store, it’s not too difficult to divide that up and try and decide how many sessions are going on here. There’s not really any ambiguity. In the case of websites, when you have people leaving their keyboard for a while or leaving the computer on while they go on holiday or just having the same computer over a period of time, it becomes harder to divide things up, because you don’t know when people are actually coming and going.

So what they’ve tried to do is in the very basic case something quite similar: arrive by Google, category page, product page, checkout. Great. We’ve got one long session, and the source is Google. Okay, so what are the different ways that that might go wrong or that that might get divided up?

Several things that can change the meaning of a session

1. Time zone

The first and possibly most annoying one, although it doesn’t tend to be a huge issue for some sites, is that whatever time zone you’ve set in your Google Analytics settings, midnight in that time zone can break up a session. So say we’ve got midnight here. This is 12:00 at night, and we happen to be browsing. We’re doing some shopping quite late.

Because Google Analytics won’t allow a session to have two dates, this is going to be one session with the source Google, and this is going to be one session and the source will be this page. So this is a self-referral unless you’ve chosen to exclude that in your settings. So not necessarily hugely helpful.

2. Half-hour cutoff for “coffee breaks”

Another thing that can happen is you might go and make a cup of coffee. So ideally if you went and had a cup of coffee while you’re in Tesco or a supermarket that’s popular in whatever country you’re from, you might want to consider that one long session. Google has made the executive decision that we’re actually going to have a cutoff of half an hour by default.

If you leave for half an hour, then again you’ve got two sessions: one where the category page is the landing page and the source is Google, and one in this case where the blog is the landing page, and this would be another self-referral, because when you come back after your coffee break, you’re going to click through from here to here. This time period, the 30 minutes, is actually adjustable in your settings, but most people just leave it as it is, and there isn’t really an obvious number that would make this always correct either. It’s kind of, like I said earlier, an arbitrary distinction.
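
If it helps to see those two splitting rules together, here’s a simplified sketch of how hits might be grouped into sessions using the 30-minute inactivity timeout and the midnight boundary described above. This is an illustration only, not Google’s actual implementation, and it ignores the other session-splitting triggers (such as a change of campaign or source) covered next.

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # GA default; adjustable in the settings

def split_into_sessions(hit_times):
    """Group hit timestamps into sessions using two of the rules described above:
    a 30-minute inactivity gap, and a new session at midnight in the view's
    time zone. (Real GA also splits on campaign/source changes.)"""
    sessions = []
    current = []
    for hit in sorted(hit_times):
        if current and (hit - current[-1] > SESSION_TIMEOUT
                        or hit.date() != current[-1].date()):
            sessions.append(current)
            current = []
        current.append(hit)
    if current:
        sessions.append(current)
    return sessions

hits = [
    datetime(2018, 11, 2, 23, 50),  # shopping late at night
    datetime(2018, 11, 2, 23, 58),
    datetime(2018, 11, 3, 0, 5),    # past midnight -> new session
    datetime(2018, 11, 3, 0, 40),   # more than a 30-minute gap -> another new session
]
print(len(split_into_sessions(hits)))  # 3
```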

3. Leaving the site and coming back

The next issue I want to talk about is if you leave the site and come back. So obviously it makes sense that if you enter the site from Google, browse for a bit, and then enter again from Bing, you might want to count that as two different sessions with two different sources. However, where this gets a little murky is with things like external payment providers.

If you had to click through from the category page to PayPal to the checkout, then unless PayPal is excluded in your referral exclusion list, this would be two sessions: one entering from Google, and one entering on the checkout page with PayPal as the source. The last issue I want to talk about is not necessarily a way that sessions are divided, but a quirk of how they are.

4. Return direct sessions

If you were to enter by Google to the category page, go on holiday and then use a bookmark or something or just type in the URL to come back, then obviously this is going to be two different sessions. You would hope that it would be one session from Google and one session from direct. That would make sense, right?

But instead, what actually happens is that, because Google Analytics, in most of its reports, uses last non-direct click attribution, we pass through that source all the way over here, so you’ve got two sessions from Google. Again, you can change this timeout period. So those are some ways that sessions work that you might not expect.

As a bonus, I want to give you some extra information about how this affects a certain metric, mainly because I want to persuade you to stop using it, and that metric is time on page.

Bonus: Three scenarios where this affects time on page

So I’ve got three different scenarios here that I want to talk you through, and we’ll see how the time on page metric works out.

I want you to bear in mind that, basically, because Google Analytics really has very little data to work with typically, they only know that you’ve landed on a page, and that sent a page view and then potentially nothing else. If you were to have a single page visit to a site, or a bounce in other words, then they don’t know whether you were on that page for 10 seconds or the rest of your life.

They’ve got no further data to work with. So what they do is they say, “Okay, we’re not going to include that in our average time on page metrics.” So we’ve got the formula of time divided by views minus exits. However, this fudge has some really unfortunate consequences. So let’s talk through these scenarios.

Example 1: Intuitive time on page = actual time on page

In the first scenario, I arrive on the page. It sends a page view. Great. Ten seconds later I trigger some kind of event that the site has added. Twenty seconds later I click through to the next page on the site. In this case, everything is working as intended in a sense, because there’s a next page on the site, so Google Analytics has that extra data of another page view 20 seconds after the first one. So they know that I was on here for 20 seconds.

In this case, the intuitive time on page is 20 seconds, and the actual time on page is also 20 seconds. Great.

Example 2: Intuitive time on page is higher than measured time on page

However, let’s think about this next example. We’ve got a page view, event 10 seconds later, except this time instead of clicking somewhere else on the site, I’m going to just leave altogether. So there’s no data available, but Google Analytics knows we’re here for 10 seconds.

So the intuitive time on page here is still 20 seconds. That’s how long I actually spent looking at the page. But the measured time or the reported time is going to be 10 seconds.

Example 3: Measured time on page is zero

The last example, I browse for 20 seconds. I leave. I haven’t triggered an event. So we’ve got an intuitive time on page of 20 seconds and an actual time on page or a measured time on page of 0.

The interesting bit is when we then come to calculate the average time on page for this page that appeared here, here, and here, you would initially hope it would be 20 seconds, because that’s how long we actually spent. But your next guess, when you look at the reported or the available data that Google Analytics has in terms of how long we’re on these pages, the average of these three numbers would be 10 seconds.

So that would make some sense. What they actually do, because of this formula, is they end up with 30 seconds. So you’ve got the total time here, which is 30, divided by the number of views, we’ve got 3 views, minus 2 exits. Thirty divided by 3 minus 2, that’s 30 divided by 1, so we’ve got 30 seconds as the average across these 3 sessions.

Well, that’s the average across these three page views, sorry, not sessions, for the amount of time we’re spending, and that is longer than any of them, and it doesn’t make any sense given the constituent data. So that’s just one final tip to please not use average time on page as a reporting metric.
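
If you want to see that arithmetic laid out, here’s a small sketch reproducing the three example page views above with the time / (views - exits) formula. The numbers are the ones from the whiteboard examples, and the variable names are just for illustration.

```python
# Reproduce the three example page views discussed above.
# Each tuple is (measured_time_on_page_seconds, was_exit).
page_views = [
    (20, False),  # example 1: event at 10s, next pageview at 20s
    (10, True),   # example 2: event at 10s, then the visitor leaves
    (0,  True),   # example 3: no further hits, so GA measures nothing
]

total_time = sum(t for t, _ in page_views)               # 30
views = len(page_views)                                   # 3
exits = sum(1 for _, is_exit in page_views if is_exit)    # 2

avg_time_on_page = total_time / (views - exits)           # 30 / (3 - 2) = 30 seconds
print(avg_time_on_page)  # longer than any single visit actually lasted
```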

I hope that’s all been useful to you. I’d love to hear what you think in the comments below. Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Log File Analysis 101 – Whiteboard Friday

Posted by BritneyMuller

Log file analysis can provide some of the most detailed insights about what Googlebot is doing on your site, but it can be an intimidating subject. In this week’s Whiteboard Friday, Britney Muller breaks down log file analysis to make it a little more accessible to SEOs everywhere.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we’re going over all things log file analysis, which is so incredibly important because it really tells you the ins and outs of what Googlebot is doing on your sites.

So I’m going to walk you through the three primary areas, the first being the types of logs that you might see from a particular site, what that looks like, what that information means. The second being how to analyze that data and how to get insights, and then the third being how to use that to optimize your pages and your site.

For a primer on what log file analysis is and its application in SEO, check out our article: How to Use Server Log Analysis for Technical SEO

1. Types

So let’s get right into it. There are three primary types of logs, the primary one being Apache. But you’ll also see W3C and Elastic Load Balancing logs, which you might see a lot with things like Kibana. But you also will likely come across some custom log files. So for those larger sites, that’s not uncommon. I know Moz has a custom log file system. Fastly is a custom type setup. So just be aware that those are out there.

Log data

So what are you going to see in these logs? The data that comes in is primarily in these colored ones here.

So you will hopefully for sure see:

  • the request server IP;
  • the timestamp, meaning the date and time that this request was made;
  • the URL requested, so what page are they visiting;
  • the HTTP status code, was it a 200, did it resolve, was it a 301 redirect;
  • the user agent, and so for us SEOs we’re mostly just looking for the Googlebot user agents.

So log files traditionally house all data, all visits from individuals and traffic, but we want to analyze the Googlebot traffic. Method (GET/POST), and then time taken, client IP, and the referrer are sometimes included. So what this looks like, it’s kind of like glibbery gloop.

It’s a word I just made up, and it just looks like that. It’s just like bleh. What is that? It looks crazy. It’s a new language. But essentially you’ll likely see that IP, so that red IP address, that timestamp, which will commonly look like that, that method (GET/POST), which I don’t completely understand or necessarily need to use in some of the analysis, but it’s good to be aware of all these things, the URL requested, that status code, all of these things here.
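
For reference, here’s roughly what one line of that “glibbery gloop” looks like in the common Apache combined format, and a minimal way to pull out the fields listed above with Python. The regex assumes the standard combined log format, so a custom format would need a different pattern, and the example line itself is made up.

```python
import re

# One line in the standard Apache "combined" log format (example data).
line = ('66.249.66.1 - - [04/Oct/2018:21:15:05 +0000] '
        '"GET /what-pandas-eat HTTP/1.1" 200 5316 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

# Pull out the fields discussed above: IP, timestamp, method, URL, status, user agent.
pattern = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

match = pattern.match(line)
if match:
    hit = match.groupdict()
    is_googlebot = "Googlebot" in hit["user_agent"]
    print(hit["url"], hit["status"], is_googlebot)  # /what-pandas-eat 200 True
```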

2. Analyzing

So what are you going to do with that data? How do we use it? So there’s a number of tools that are really great for doing some of the heavy lifting for you. Screaming Frog Log File Analyzer is great. I’ve used it a lot. I really, really like it. But you have to have your log files in a specific type of format for them to use it.

Splunk is also a great resource. Sumo Logic is too, and I know there’s a bunch of others. If you’re working with really large sites, like I have in the past, you’re going to run into problems here because it’s not going to be in a common log file format. So what you can do is to manually do some of this yourself, which I know sounds a little bit crazy.

Manual Excel analysis

But hang in there. Trust me, it’s fun and super interesting. So what I’ve done in the past is I will import a CSV log file into Excel, and I will use the Text Import Wizard and you can basically delineate what the separators are for this craziness. So whether it be a space or a comma or a quote, you can sort of break those up so that each of those live within their own columns. I wouldn’t worry about having extra blank columns, but you can separate those. From there, what you would do is just create pivot tables. So I can link to a resource on how you can easily do that.

Top pages

But essentially what you can look at in Excel is: Okay, what are the top pages that Googlebot hits by frequency? What are those top pages by the number of times it’s requested?

Top folders

You can also look at the top folder requests, which is really interesting and really important. On top of that, you can also look into: What are the most common Googlebot types that are hitting your site? Is it Googlebot mobile? Is it Googlebot images? Are they hitting the correct resources? Super important. You can also do a pivot table with status codes and look at that. I like to apply some of these purple things to the top pages and top folders reports. So now you’re getting some insights into: Okay, how did some of these top pages resolve? What are the top folders looking like?
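
If Excel starts to choke on the file size, the same top-pages, top-folders, and status-code pivots can be done in a few lines of pandas. This is a sketch only; it assumes you’ve already parsed the log into rows like the ones shown earlier and filtered them down to verified Googlebot hits, and the example rows are invented.

```python
import pandas as pd

# Assume `hits` is a list of dicts parsed from the log, already filtered to Googlebot.
hits = [
    {"url": "/shoes/womens/red-heels", "status": 200},
    {"url": "/shoes/womens/red-heels", "status": 200},
    {"url": "/old-page", "status": 301},
    {"url": "/missing", "status": 404},
]
df = pd.DataFrame(hits)
df["folder"] = "/" + df["url"].str.strip("/").str.split("/").str[0]

# Top pages by crawl frequency (the "top pages" pivot described above).
print(df["url"].value_counts().head(10))

# Top folders, and how each folder's requests resolved (status code breakdown).
print(df.groupby(["folder", "status"]).size().unstack(fill_value=0))
```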

You can also do that for Googlebot IPs. This is the best hack I have found with log file analysis. I will create a pivot table just with Googlebot IPs, this right here. So I will usually get, sometimes it’s a bunch of them, but I’ll get all the unique ones, and then I can go to the terminal on my computer (most standard computers have one).

I tried to draw it. It looks like that. But all you do is you type in “host” and then you put in that IP address. You can do it in your terminal with that IP address, and you will see it resolve to a googlebot.com or google.com hostname. That verifies that it’s indeed Googlebot and not some other crawler spoofing Google. So that’s something that these tools tend to automatically take care of, but there are ways to do it manually too, which is just good to be aware of.
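
That “host” lookup can also be scripted if you have a lot of IPs to check. Here’s a minimal sketch that does the reverse DNS lookup, checks for a googlebot.com or google.com hostname, and then forward-confirms the hostname back to the same IP, which is the verification approach Google documents. The function name is made up, and it needs network access to run.

```python
import socket

def is_real_googlebot(ip):
    """Reverse-DNS the IP, check the hostname, then forward-confirm it,
    which is the same check the `host` command above does manually."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(hostname) == ip
    except socket.gaierror:
        return False

print(is_real_googlebot("66.249.66.1"))  # True for a genuine Googlebot IP
```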

3. Optimize pages and crawl budget

All right, so how do you optimize for this data and really start to enhance your crawl budget? When I say “crawl budget,” it primarily just means the number of times that Googlebot is coming to your site and the number of pages that they typically crawl. So what does that crawl budget look like, and how can you make it more efficient?

  • Server error awareness: So server error awareness is a really important one. It’s good to keep an eye on an increase in 500 errors on some of your pages.
  • 404s: Valid? Referrer?: Another thing to take a look at is all the 400s that Googlebot is finding. It’s so important to see: Okay, is that 400 request, is it a valid 400? Does that page not exist? Or is it a page that should exist and no longer does, but you could maybe fix? If there is an error there or if it shouldn’t be there, what is the referrer? How is Googlebot finding that, and how can you start to clean some of those things up?
  • Isolate 301s and fix frequently hit 301 chains: A lot of questions come up about 301s in these log files. The best trick that I’ve sort of discovered, and I know other people have discovered, is to isolate and fix the most frequently hit 301 chains. You can do that in a pivot table. It’s actually a lot easier to do this when you have paired it up with crawl data, because now you have some more insights into that chain. What you can do is look at the most frequently hit 301s and see: Are there any easy, quick fixes for that chain? Is there something you can remove and quickly resolve so it’s just one hop or two hops? (There’s a small sketch of resolving redirect chains after this list.)
  • Mobile first: You can keep an eye on mobile first. If your site has gone mobile first, you can dig into the logs and evaluate what that looks like. Interestingly, Googlebot is still going to identify itself as that compatible Googlebot/2.1, but it’s going to have all of the mobile implications in the parentheses before it. So I’m sure these tools can automatically detect that. But if you’re doing some of the stuff manually, it’s good to be aware of what that looks like.
  • Missed content: So what’s really important is to take a look at: What’s Googlebot finding and crawling, and what are they just completely missing? So the easiest way to do that is to cross-compare with your site map. It’s a really great way to take a look at what might be missed and why and how can you maybe reprioritize that data in the site map or integrate it into navigation if at all possible.
  • Compare frequency of hits to traffic: This was an awesome tip I got on Twitter, and I can’t remember who said it. They said compare frequency of Googlebot hits to traffic. I thought that was brilliant, because one, not only do you see a potential correlation, but you can also see where you might want to increase crawl traffic or crawls on a specific, high-traffic page. Really interesting to kind of take a look at that.
  • URL parameters: Take a look at if Googlebot is hitting any URLs with the parameter strings. You don’t want that. It’s typically just duplicate content or something that can be assigned in Google Search Console with the parameter section. So any e-commerce out there, definitely check that out and kind of get that all straightened out.
  • Evaluate days, weeks, months: You can evaluate days, weeks, and months that it’s hit. So is there a spike every Wednesday? Is there a spike every month? It’s kind of interesting to know, not totally critical.
  • Evaluate speed and external resources: You can evaluate the speed of the requests and if there’s any external resources that can potentially be cleaned up and speed up the crawling process a bit.
  • Optimize navigation and internal links: You also want to optimize that navigation, like I said earlier, and use meta noindex.
  • Meta noindex and robots.txt disallow: So if there are things that you don’t want in the index and if there are things that you don’t want to be crawled from your robots.txt, you can add all those things and start to help some of this stuff out as well.
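
For the 301-chain bullet above, here’s the small sketch mentioned there. It follows each redirect in a hypothetical source-to-target map until it reaches a final destination, so you can spot chains worth collapsing to a single hop; the URLs are made up, and a real version would work over the redirect pairs you pulled from your logs or crawl data.

```python
# Hypothetical redirect map pulled from logs/crawl data: source URL -> target URL.
redirects = {
    "/old-shoes": "/shoes",
    "/shoes": "/shoes/",
    "/shoes/": "/footwear/",
}

def resolve_chain(url, redirects, max_hops=10):
    """Follow a redirect chain to its final destination, returning every hop."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:  # guard against loops
        url = redirects[url]
        chain.append(url)
    return chain

# Frequently hit 301s whose chains are longer than one hop are the quick wins:
# point the original URL straight at the final destination.
for start in redirects:
    chain = resolve_chain(start, redirects)
    if len(chain) > 2:
        print(" -> ".join(chain), f"({len(chain) - 1} hops; point {start} at {chain[-1]})")
```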

Reevaluate

Lastly, it’s really helpful to connect the crawl data with some of this data. So if you’re using something like Screaming Frog or DeepCrawl, they allow these integrations with different server log files, and it gives you more insight. From there, you just want to reevaluate. So you want to kind of continue this cycle over and over again.

You want to look at what’s going on, have some of your efforts worked, is it being cleaned up, and go from there. So I hope this helps. I know it was a lot, but I want it to be sort of a broad overview of log file analysis. I look forward to all of your questions and comments below. I will see you again soon on another Whiteboard Friday. Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Overcoming Blockers: How to Build Your Red Tape Toolkit – Whiteboard Friday

Posted by HeatherPhysioc

Have you ever made SEO recommendations that just don’t go anywhere? Maybe you run into a lack of budget, or you can’t get buy-in from your boss or colleagues. Maybe your work just keeps getting deprioritized in favor of other initiatives. Whatever the case, it’s important to set yourself up for success when it comes to the tangled web of red tape that’s part and parcel of most organizations.

In this week’s Whiteboard Friday, Heather Physioc shares her tried-and-true methods for building yourself a toolkit that’ll help you tear through roadblocks and bureaucracy to get your work implemented.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

What up, Moz fans? This is Heather Physioc. I’m the Director of the Discoverability Group at VML, headquartered in Kansas City. So today we’re going to talk about how to build your red tape toolkit to overcome obstacles to getting your search work implemented. So do you ever feel like your recommendations are overlooked, ignored, forgotten, deprioritized, or otherwise just not getting implemented?

Common roadblocks to implementing SEO recommendations

#SEOprobs

If so, you’re not alone. So I asked 140-plus of our industry colleagues about the blockers that they run into and how they overcome them.

  • Low knowledge. So if you’re anything like every other SEO ever, you might be running into low knowledge and understanding of search, either on the client side or within your own agency.
  • Low buy-in. You may be running into low buy-in. People don’t care about SEO as much as you do.
  • Poor prioritization. So other things frequently come to the top of the list while SEO keeps falling further behind.
  • High bureaucracy. So a lot of red tape or slow approvals or no advocacy within the organization.
  • Not enough budget. A lot of times it’s not enough budget, not enough resources to get the work done.
  • Unclear and overcomplicated process. So people don’t know where they fit or even how to get started implementing your SEO work.
  • Bottlenecks. And finally bottlenecks where you’re just hitting blockers at every step along the way.

So if you’re in-house, you probably said that not enough budget and resources was your biggest problem. But on the agency side or individual practitioners, they said low understanding or knowledge of search on the client side was their biggest blocker.

So a lot of the time when we run into these blockers and it seems like nothing is getting done, we start to play the blame game. We start to complain that it’s the client who held up the project, or if only the client had listened, or that it’s something wrong with the client’s business.

Build out your red tape toolkit

But I don’t buy it. So we’re going to not do that. We’re going to build out our red tape toolkit. So here are some of the suggestions that came out of that survey.

1. Assess client maturity

First is to assess your client’s maturity. This could include their knowledge and capabilities for doing SEO, but also their organizational search program, the people, process, ability to plan, knowledge, capacity.

These are the problems that tend to stand in the way of getting our best work done. So I’m not going to go in-depth here because we’ve actually put out a full-length article on the Moz blog and another Whiteboard Friday. So if you need to pause, watch that and come back, no problem.

2. Speak your client’s language

So the next thing to put in your toolkit is to speak your client’s language. I think a lot of times we’re guilty of talking to fellow SEOs instead of the CMOs and CEOs who buy into our work. So unless your client is a super technical mind or they have a strong search background, it’s in our best interests to lift up and stay at 30,000 feet. Let’s talk about things that they care about, and I promise you that is not canonicalization or SSL encryption and HTTPS.

They’re thinking about ROI and their customers and operational costs. Let’s translate and speak their language. Now this could also mean using analogies that they can relate to or visual examples and data visualizations that tell the story of search better than words ever could. Help them understand. Meet them in the middle.

3. Seek greater perspective

Now let’s seek greater perspective. So what this means is SEO does not or should not operate in a silo. We’re one small piece of your client’s much larger marketing mix. They have to think about the big picture. A lot of times our clients aren’t just dedicated to SEO. They’re not even dedicated to just digital sometimes. A lot of times they have to think about how all the pieces fit together. So we need to have the humility to understand where search fits into that and ladder our SEO goals up to the brand goals, campaign goals, business and revenue goals. We also need to understand that every SEO project we recommend comes with a time and a cost associated with it.

Everything we recommend to a CMO is an opportunity cost as well for something else that they could be working on. So we need to show them where search fits into that and how to make those hard choices. Sometimes SEO doesn’t need to be the leader. Sometimes we’re the follower, and that’s okay.

4. Get buy-in

The next tool in your toolkit is to get buy-in. So there are two kinds of buy-in you can get.

Horizontal buy-in

One is horizontal buy-in. So a lot of times search is dependent on other disciplines to get our work implemented. We need copywriters. We need developers. So the number-one complaint SEOs have is not being brought in early. That’s the same complaint all your teammates on development and copywriting and everywhere else have.

Respect the expertise and the value that they bring to this project and bring them to the table early. Let them weigh in on how this project can get done. Build mockups together. Put together a plan together. Estimate the level of effort together.

Vertical buy-in

Which leads us to vertical buy-in. Vertical is up and down. When you do this horizontal buy-in first, you’re able to go to the client with a much smarter, better vetted recommendation. So a lot of times your day-to-day client isn’t the final decision maker. They have to sell this opportunity internally. So give them the tools and the voice that they need to do that by the really strong recommendation you put together with your peers and make it easy for them to take it up to their boss and their CMO and their CEO. Then you really increase the likelihood that you’re going to get that work done.

5. Build a bulletproof plan

Next, build a bulletproof plan.

Case studies

So the number-one recommendation that came out of this survey was case studies. Case studies are great. They talk about the challenge that you tried to overcome, the solution, how you actually tackled it, and the results you got out of that.

Clients love case studies. They show that you have the chops to do the work. They better explain the outcomes and the benefits of doing this kind of work, and you took the risk on that kind of project with someone else’s money first. So that’s going to reduce the perceived risk in the client’s mind and increase the likelihood that they’re going to do the work.

Make your plan simple and clear, with timelines

Another thing that helps here is building a really simple, clear plan so it’s stupid-easy for everybody who needs to be a part of it to know where they fit in and what they’re responsible for. So do the due diligence to put together a step-by-step plan and assign ownership to each step and put timelines to it so they know what pace they should be following.

Forecast ROI

Finally, forecast ROI. This is not optional. So a lot of times I think SEOs are hesitant to forecast the potential outcomes or ROI of a project because of the sheer volume of unknowns.

We live in a world of theory, and it’s very hard to commit to something that we can’t be certain about. But we have to give the client some sense of return. We have to know why we are recommending this project over others. There’s a wealth of resources out there to help you do that, even for a heavily caveated and conservative estimate, including case studies that others have published online.

Show the cost of inaction

Now sometimes forecasting the opportunity of ROI isn’t enough to light a fire for clients. Sometimes we need to show them the cost of inaction. I find that with clients the risk is not so much that they’re going to make the wrong move. It’s that they’ll make no move at all. So a lot of times we will visualize what that might look like. So we’ll show them this is the kind of growth we think that you can get if you invest and you follow this plan we put together.

Here’s what it will look like if you invest just a little to monitor and maintain, but you’re not aggressively investing in search. Oh, and here, dropping down and to the right, is what happens when you don’t invest at all. You stagnate and you get surpassed by your competitors. That can be really helpful for clients to contrast those different levels of investment and convince them to do the work that you’re recommending.

6. Use headlines & soundbites

Next use headlines, taglines, and sound bites. What we recommend is really complicated to some clients. So let’s help translate that into simple, usable language that’s memorable so they can go repeat those lines to their colleagues and their bosses and get that work sold internally. We also need to help them prioritize.

So if you’re anything like me, you love it when the list of SEO action items is about a mile long. But when we dump that in their laps, it’s too much. They get overwhelmed and bombarded, and they tune out. So instead, you are the expert consultant. Use what you know about search and know about your client to help them prioritize the single most important thing that they should be focusing on.

7. Patience, persistence, and parallel paths

Last in your toolkit, patience, persistence, and parallel paths. So getting this work done is a combination of communication, follow-up, patience, and persistence. While you’ve got your client working on this one big thing that you recommended, you can be building parallel paths, things that have fewer obstacles that you can own and run with.

They may not be as high impact as the one big thing, but you can start to get small wins that get your client excited and build momentum for more of the big stuff. But the number one thing out of all of the responses in the survey that our colleagues recommended to you is to stay strong. Have empathy and understanding for the hard decisions that your client has to make. But come with a strong, confident point of view on where to go next.

All right, gang, these are a lot of great tips to start your red tape toolkit and overcome obstacles to get your best search work done. Try these out. Let us know what you think. If you have other great ideas on how you overcome obstacles to get your best work done with clients, let us know down in the comments. Thank you so much for watching, and we’ll see you next week for another edition of Whiteboard Friday.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

How to Create a Local Marketing Results Dashboard in Google Data Studio – Whiteboard Friday

Posted by DiTomaso

Showing clients that you’re making them money is one of the most important things you can communicate to them, but it’s tough to know how to present your results in a way they can easily understand. That’s where Google Data Studio comes in. In this week’s edition of Whiteboard Friday, our friend Dana DiTomaso shares how to create a client-friendly local marketing results dashboard in Google Data Studio from start to finish.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, Moz fans. My name is Dana DiTomaso. I’m President and partner of Kick Point. We’re a digital marketing agency way up in the frozen north of Edmonton, Alberta. We work with a lot of local businesses, both in Edmonton and around the world, and small local businesses usually have the same questions when it comes to reporting.

Are we making money?

What I’m going to share with you today is our local marketing dashboard that we share with clients. We build this in Google Data Studio because we love Google Data Studio. If you haven’t watched my Whiteboard Friday yet on how to do formulas in Google Data Studio, I recommend you hit Pause right now, go back and watch that, and then come back to this because I am going to talk about what happened there a little bit in this video.

The Google Data Studio dashboard

This is a Google Data Studio dashboard which I’ve tried to represent in the medium of whiteboard as best as I could. Picture it being a little bit better design than my left-handedness can represent on a whiteboard, but you get the idea. Every local business wants to know, “Are we making money?” This is the big thing that people care about, and really every business cares about making money. Even charities, for example: money is important obviously because that’s what keeps the lights on, but there’s also perhaps a mission that they have.

But they still want to know: Are people filling out our donation form? Are people contacting us? These are important things for every business, organization, not-for-profit, whatever to understand and know. What we’ve tried to do in this dashboard is really boil it down to the absolute basics, one thing you can look at, see a couple of data points, know whether things are good or things are bad.

Are people contacting you?

Let’s start with this up here. The first thing is: Are people contacting you? Now you can break this out into separate columns. You can do phone calls and emails for example. Some of our clients prefer that. Some clients just want one mashed up number. So we’ll take the number of calls that people are getting.

If you’re using a call tracking tool, such as CallRail, you can import this in here. Emails, for example, or forms, just add it all together and then you have one single number of the number of times people contacted you. Usually this is a way bigger number than people think it is, which is also kind of cool.

Are people taking the action you want them to take?

The next thing is: Are people doing the thing that you want them to do? This is really going to decide on what’s meaningful to the client.

For example, if you have a client, again thinking about a charity, how many people filled out your donation form, your online donation form? For a psychologist client of ours, how many people booked an appointment? For a client of ours who offers property management, how many people booked a viewing of a property? What is the thing you want them to do? If they have online e-commerce, for example, then maybe this is how many sales did you have.

Maybe this will be two different things — people walking into the store versus sales. We’ve also represented in this field if a person has a people counter in their store, then we would pull that people counter data into here. Usually we can get the people counter data in a Google sheet and then we can pull it into Data Studio. It’s not the prettiest thing in the world, but it certainly represents all their data in one place, which is really the whole point of why we do these dashboards.

Where did visitors come from, and where are your customers coming from?

People contacting you, people doing the thing you want them to do, those are the two major metrics. Then we do have a little bit deeper further down. On this side here we start with: Where did visitors come from, and where are your customers coming from? Because they’re really two different things, right? Not every visitor to the website is going to become a customer. We all know that. No one has a 100% conversion rate, and if you do, you should just retire.

Filling out the dashboard

We really need to differentiate between the two. In this case we’re looking at channel, and there probably is a better word for channel. We’re always trying to think about, “What would clients call this?” But I feel like clients are kind of aware of the word “channel” and that’s how they’re getting there. But then the next column, by default this would be called users or sessions. Both of those are kind of cruddy. You can rename fields in Data Studio, and we can call this the number of people, for example, because that’s what it is.

Then you would use the users as the metric, and you would just call it number of people instead of users, because personally I hate the word “users.” It really boils down the humanity of a person to a user metric. Users are terrible. Call them people or visitors at least. Then unfortunately, in Data Studio, when you do a comparison field, you cannot rename and call it comparison. It does this nice percentage delta, which I hate.

It’s just like a programmer clearly came up with this. But for now, we have to deal with it. Although by the time this video comes out, maybe it will be something better, and then I can go back and correct myself in the comments. But for now it’s percentage delta. Then goal percentage and then again delta. They can sort by any of these columns in Data Studio, and it’s real live data.

Put a time period on this, and people can pick whatever time period they want and then they can look at this data as much as they want, which is delightful. If you’re not delivering great results, it may be a little terrifying for you, but really you shouldn’t be hiding that anyway, right? Like if things aren’t going well, be honest about it. That’s another talk for another time. But start with this kind of chart. Then on the other side, are you showing up on Google Maps?

We use the Supermetrics Google My Business plug-in to grab this kind of information. We hook it into the customer’s Google Maps account. Then we’re looking at branded searches and unbranded searches and how many times they came up in the map pack. Usually we’ll have a little explanation here. This is how many times you came up in the map pack and search results as well as Google Maps searches, because it’s all mashed in together.

Then what happens when they find you? So number of direction requests, number of website visits, number of phone calls. Now the tricky thing is phone calls here may be captured in phone calls here. You may not want to add these two pieces of data or just keep this off on its own separately, depending upon how your setup is. You could be using a tracking number, for example, in your Google My Business listing and that therefore would be captured up here.

Really just try to be honest about where that data comes from instead of double counting. You don’t want to have that happen. The last thing is if a client has messages set up, then you can pull that message information as well.

Tell your clients what to do

Then at the very bottom of the report we have a couple of columns, and usually this is a longer chart and this is shorter, so we have room down here to do this. Obviously, my drawing skills are not as good as as aligning things in Data Studio, so forgive me.

But we tell them what to do. Usually when we work with local clients, they can’t necessarily afford a monthly retainer to do stuff for clients forever. Instead, we tell them, “Here’s what you have to do this month.Here’s what you have to do next month. Hey, did you remember you’re supposed to be blogging?” That sort of thing. Just put it in here, because clients are looking at results, but they often forget the things that may get them those results. This is a really nice reminder of if you’re not happy with these numbers, maybe you should do these things.

Tell your clients how to use the report

Then the next thing is how to use. This is a good reference because if they only open it say once every couple months, they probably have forgotten how to do the stuff in this report or even things like up at the top make sure to set the time period for example. This is a good reminder of how to do that as well.

Because the report is totally editable by you at any time, you can always go in and change stuff later, and because the client can view the report at any time, they have a dashboard that is extremely useful to them and they don’t need to bug you every single time they want to see a report. It saves you time and money. It saves them time and money. Everybody is happy. Everybody is saving money. I really recommend setting up a really simple dashboard like this for your clients, and I bet you they’ll be impressed.

Thanks so much.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM NewsComments Off

Faceted Navigation Intro – Whiteboard Friday

Posted by sergeystefoglo

The topic of faceted navigation is bound to come up at some point in your SEO career. It’s a common solution to product filtering for e-commerce sites, but managing it on the SEO side can quickly spin out of control with the potential to cause indexing bloat and crawl errors. In this week’s Whiteboard Friday, we welcome our friend Sergey Stefoglo to give us a quick refresher on just what faceted nav is and why it matters, then dive into a few key solutions that can help you tame it.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. My name is Serge. I’m from Distilled. I work at the Seattle office as a consultant. For those of you that don’t know about Distilled, we’re a full-service digital marketing agency specializing in SEO, but have branched out since to work on all sorts of things like content, PR, and recently a split testing tool, ODN.

Today I’m here to talk to you guys about faceted navigation, just the basics. We have a few minutes today, so I’m just going to cover kind of the 101 version of this. But essentially we’re going to go through what the definition is, why we should care as SEOs, why it’s important, what are some options we have with this, and then also what a solution could look like.

1. What is faceted navigation?

For those that don’t know, faceted navigation is essentially something like this, probably a lot nicer than this to be honest. But it’s essentially a page that allows you to filter down or allows a user to filter down based on what they’re looking for. So this is an example we have here of a list of products on a page that sells laptops, Apple laptops in this case.

Right here on the left side, in the green, we have a bunch of facets. Essentially, if you’re a user and you’re going in here, you could look at the size of the screen you might want. You could look at the price of the laptop, etc. That’s what faceted navigation is. Previously, when I worked at my previous agency, I worked on a lot of local SEO things, not really e-commerce, big-scale websites, so I didn’t run into this issue often. I actually didn’t even know it was a thing until I started at Distilled. So this might be interesting for you even if it doesn’t apply at the moment.

2. Why does faceted navigation matter?

Essentially, we should care as SEOs because this can get out of control really quickly. While being very useful to users, obviously it’s helpful to be able to filter down to the specific thing you want. this could get kind of ridiculous for Googlebot.

Faceted navigation can result in indexing bloat and crawl issues

We’ve had clients at Distilled that come to us that are e-commerce brands that have millions of pages in the index being crawled that really shouldn’t be. They don’t bring any value to the site, any revenue, etc. The main reason we should care is because we want to avoid indexation bloat and kind of crawl errors or issues.

3. What options do we have when it comes to controlling which pages are indexed/crawled?

The third thing we’ll talk about is what are some options we have in terms of controlling some of that, so controlling whether a page gets indexed or crawled, etc. I’m not going to get into the specifics of each of these today, but I have a blog post on this topic that we’ll link to at the bottom.

The main, most common options that we have for controlling this kind of thing would be around no indexing a page and stopping Google from indexing it, using canonical tags to choose a page that’s essentially the canonical version, using a disallow rule in robots.txt to stop Google from crawling a certain part of the site, or using the nofollow meta directive as well. Those are some of the most common options. Again, we’re not going to go into the nitty-gritty of each one. They each have their kind of pros and cons, so you can research that for yourselves.

4. What could a solution look like?

So okay, we know all of this. What could be an ideal solution? Before I jump into this, I don’t want you guys to run in to your bosses and say, “This is what we need to do.”

Please, please do your research beforehand because it’s going to vary a lot based on your site. Based on the dev resources you have, you might have to get scrappy with it. Also, do some keyword research mainly around the long tail. There are a lot of instances where you could and might want to have three or four facets indexed.

So again, a huge caveat: this isn’t the end-all be-all solution. It’s something that we’ve recommended at times, when appropriate, to clients. So let’s jump into what an ideal solution, or not ideal solution, a possible solution could look like.

Category, subcategory, and sub-subcategory pages open to indexing and crawling

What we’re looking at here is we’re going to have our category, subcategory, and sub-subcategory pages open to indexation and open to being crawled. In our example here, that would be this page, so /computers/laptops/apple. Perfectly fine. People are probably searching for Apple laptops. In fact, I know they are.

Any pages with one or more facets selected = indexed, facet links get nofollowed

The second step here is any page that has one facet selected, so for example, if I was on this page and I wanted an Apple laptop with a solid state drive in it, I would select that from these options. Those are fine to be indexed. But any time you have one or more facets selected, we want to make sure to nofollow all of these internal links pointing to other facets, essentially to stop link equity from being wasted and to stop Google from wasting time crawling those pages.

Any pages with 2+ facets selected = noindex tag gets added

Then, past that point, if a user selects two or more facets, so if I was interested in an Apple laptop with a solid state hard drive that was in the $ 1,000 price range for example, the chances of there being a lot of search volume for an Apple laptop for $ 1,000 with a solid state drive is pretty low.

So what we want to do here is add a noindex tag to those two-plus facet options, and that will again help us control crawl bloat and indexation bloat.

Already set up faceted nav? Think about keyword search volume, then go back and whitelist

The final thing I want to mention here, I touched on it a little bit earlier. But essentially, if you’re doing this after the fact, after the faceted navigation is already set up, which you probably are, it’s worth, again, having a strong think about where there is keyword search volume. If you do this, it’s worth also taking a look back a few months in to see the impact and also see if there’s anything you might want to whitelist. There might be a certain set of facets that do have search volume, so you might want to throw them back into the index. It’s worth taking a look at that.

That’s what faceted navigation is as a quick intro. Thank you for watching. I’d be really interested to hear what you guys think in the comments. Again, like I said, there isn’t a one-size-fits-all solution. So I’d be really interested to hear what’s worked for you, or if you have any questions, please ask them below.

Thank you.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM NewsComments Off

Surprising SEO A/B Test Results – Whiteboard Friday

Posted by willcritchlow

You can make all the tweaks and changes in the world, but how do you know they’re the best choice for the site you’re working on? Without data to support your hypotheses, it’s hard to say. In this week’s edition of Whiteboard Friday, Will Critchlow explains a bit about what A/B testing for SEO entails and describes some of the surprising results he’s seen that prove you can’t always trust your instinct in our industry.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to another British Whiteboard Friday. My name is Will Critchlow. I’m the founder and CEO at Distilled. At Distilled, one of the things that we’ve been working on recently is building an SEO A/B testing platform. It’s called the ODN, the Optimization Delivery Network. We’re now deployed on a bunch of big sites, and we’ve been running these SEO A/B tests for a little while. I want to tell you about some of the surprising results that we’ve seen.

What is SEO A/B testing?

We’re going to link to some resources that will show you more about what SEO A/B testing is. But very quickly, the general principle is that you take a site section, so a bunch of pages that have a similar structure and layout and template and so forth, and you split those pages into control and variant, so a group of A pages and a group of B pages.

Then you make the change that you’re hypothesizing is going to make a difference just to one of those groups of pages, and you leave the other set unchanged. Then, using your analytics data, you build a forecast of what would have happened to the variant pages if you hadn’t made any changes to them, and you compare what actually happens to the forecast. Out of that you get some statistical confidence intervals, and you get to say, yes, this is an uplift, or there was no difference, or no, this hurt the performance of your site.

This is data that we’ve never really had in SEO before, because this is very different to running a controlled experiment in a kind of lab environment or on a test domain. This is in the wild, on real, actual, live websites. So let’s get to the material. The first surprising result I want to talk about is based off some of the most basic advice that you’ve ever seen.

Result #1: Targeting higher-volume keywords can actually result in traffic drops

I’ve stood on stage and given this advice. I have recommended this stuff to clients. Probably you have too. You know that process where you do some keyword research and you find that there’s one particular way of searching for whatever it is that you offer that has more search volume than the way that you’re talking about it on your website right now, so higher search volume for a particular way of phrasing?

You make the recommendation, “Let’s talk about this stuff on our website the way that people are searching for it. Let’s put this kind of phrasing in our title and elsewhere on our pages.” I’ve made those recommendations. You’ve probably made those recommendations. They don’t always work. We’ve seen a few times now actually of testing this kind of process and seeing what are actually dramatic drops.

We saw up to 20-plus-percent drops in organic traffic after updating meta information in titles and so forth to target the more commonly-searched-for variant. Various different reasons for this. Maybe you end up with a worse click-through rate from the search results. So maybe you rank where you used to, but get a worse click-through rate. Maybe you improve your ranking for the higher volume target term and you move up a little bit, but you move down for the other one and the new one is more competitive.

So yes, you’ve moved up a little bit, but you’re still out of the running, and so it’s a net loss. Or maybe you end up ranking for fewer variations of key phrases on these pages. However it happens, you can’t be certain that just putting the higher-volume keyword phrasing on your pages is going to perform better. So that’s surprising result number one. Surprising result number two is possibly not that surprising, but pretty important I think.

Result #2: 30–40% of common tech audit recommendations make no difference

So this is that we see as many as 30% or 40% of the common recommendations in a classic tech audit make no difference. You do all of this work auditing the website. You follow SEO best practices. You find a thing that, in theory, makes the website better. You go and make the change. You test it.

Nothing, flatlines. You get the same performance as the forecast, as if you had made no change. This is a big deal because it’s making these kinds of recommendations that damages trust with engineers and product teams. You’re constantly asking them to do stuff. They feel like it’s pointless. They do all this stuff, and there’s no difference. That is what burns authority with engineering teams too often.

This is one of the reasons why we built the platform is that we can then take our 20 recommendations and hypotheses, test them all, find the 5 or 6 that move the needle, only go to the engineering team to build those ones, and that builds so much trust and relationship over time, and they get to work on stuff that moves the needle on the product side.

So the big deal there is really be a bit skeptical about some of this stuff. The best practices, at the limit, probably make a difference. If everything else is equal and you make that one tiny, little tweak to the alt attribute or a particular image somewhere deep on the page, if everything else had been equal, maybe that would have made the difference.

But is it going to move you up in a competitive ranking environment? That’s what we need to be skeptical about.

Result #3: Many lessons don’t generalize

So surprising result number three is: How many lessons do not generalize? We’ve seen this broadly across different sections on the same website, even different industries. Some of this is about the competitive dynamics of the industry.

Some of it is probably just the complexity of the ranking algorithm these days. But we see this in particular with things like this. Who’s seen SEO text on a category page? Those kind of you’ve got all of your products, and then somebody says, “You know what? We need 200 or 250 words that mention our key phrase a bunch of times down at the bottom of the page.” Sometimes, helpfully, your engineers will even put this in an SEO-text div for you.

So we see this pretty often, and we’ve tested removing it. We said, “You know what? No users are looking at this. We know that overstuffing the keyword on the page can be a negative ranking signal. I wonder if we’ll do better if we just cut that div.” So we remove it, and the first time we did it, plus 6% result. This was a good thing.

The pages are better without it. They’re now ranking better. We’re getting better performance. So we say, “You know what? We’ve learnt this lesson. You should remove this really low-quality text from the bottom of your category pages.” But then we tested it on another site, and we see there’s a drop, a small one admittedly, but it was helping on these particular pages.

So I think what that’s just telling us is we need to be testing these recommendations every time. We need to be trying to build testing into our core methodologies, and I think this trend is only going to increase and continue, because the more complex the ranking algorithms get, the more machine learning is baked into it and it’s not as deterministic as it used to be, and the more competitive the markets get, so the narrower the gap between you and your competitors, the less stable all this stuff is, the smaller differences there will be, and the bigger opportunity there will be for something that works in one place to be null or negative in another.

So I hope I have inspired you to check out some SEO A/B testing. We’re going to link to some of the resources that describe how you do it, how you can do it yourself, and how you can build a program around this as well as some other of our case studies and lessons that we’ve learnt. But I hope you enjoyed this journey on surprising results from SEO A/B tests.

Resources:

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM NewsComments Off

SEO "Dinosaur" Tactics That You Should Retire – Whiteboard Friday

Posted by randfish

It’s tough to admit it, but many of us still practice outdated SEO tactics in the belief that they still have a great deal of positive influence. In this week’s Whiteboard Friday, Rand gently sets us straight and offers up a series of replacement activities that will go much farther toward moving the needle. Share your own tips and favorites in the comments!

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to go back in time to the prehistoric era and talk about a bunch of “dinosaur” tactics, things that SEOs still do, many of us still do, and we probably shouldn’t.

We need to replace and retire a lot of these tactics. So I’ve got five tactics, but there’s a lot more, and in fact I’d loved to hear from some of you on some of yours.

Dino Tactic #1: AdWords/Keyword Planner-based keyword research

But the first one we’ll start with is something we’ve talked about a few times here — AdWords and Keyword Planner-based keyword research. So you know there’s a bunch of problems with the metrics in there, but I still see a lot of folks starting their keyword research there and then expanding into other tools.

Replace it with clickstream data-driven tools with Difficulty and CTR %

My suggestion would be start with a broader set if you possibly can. If you have the budget, replace this with something that is driven by clickstream data, like Ahrefs or SEMrush or Keyword Explorer. Even Google Search Suggest and related searches plus Google Trends tend to be better at capturing more of this.

Why it doesn’t work

I think is just because AdWords hides so many keywords that they don’t think are commercially relevant. It’s too inaccurate, especially the volume data. If you’re actually creating an AdWords campaign, the volume data gets slightly better in terms of its granularity, but we found it is still highly inaccurate as compared as to when you actually run that campaign.

It’s too imprecise, and it lacks a bunch of critical metrics, including difficulty and click-through rate percentage, which you’ve got to know in order to prioritize keywords effectively.

Dino Tactic #2: Subdomains and separate domains for SERP domination

Next up, subdomains and separate domains for SERP domination. So classically, if you wanted to own the first page of Google search results for a branded query or an unbranded query, maybe you just want to try and totally dominate, it used to be the case that one of the ways to do this was to add in a bunch of subdomains to your website or register some separate domains so that you’d be able to control that top 10.

Why it doesn’t work

What has happened recently, though, is that Google has started giving priority to multiple subpages in a single SERP from a single domain. You can see this for example with Yelp on virtually any restaurant-related searches, or with LinkedIn on a lot of business topic and job-related searches.

You can see it with Quora on a bunch of question style searches, where they’ll come up for all of them, or Stack Overflow, where they come up for a lot of engineering and development-related questions.

Replace it with barnacle SEO and subfolder hosted content

So one of the better ways to do this nowadays is with barnacle SEO and subfolder hosted content, meaning you don’t have to put your content on a separate subdomain in order to rank multiple times in the same SERP.

Barnacle SEO also super handy because Google is giving a lot of benefit to some of these websites that host content you can create or generate and profiles you can create and generate. That’s a really good way to go. This is mostly just because of this shift from the subdomains being the way to get into SERPs multiple times to individual pages being that path.

Dino Tactic #3: Prioritizing number one rankings over other traffic-driving SEO techniques

Third, prioritizing number one rankings over other traffic-driving SEO techniques. This is probably one of the most common “dinosaur” tactics I see, where a lot of folks who are familiar with the SEO world from maybe having used consultants or agencies or brought it in-house 10, 15, 20 years ago are still obsessed with that number one organic ranking over everything else.

Replace it with SERP feature SEO (especially featured snippets) and long-tail targeting

In fact, that’s often a pretty poor ROI investment compared to things like SERP features, especially the featured snippet, which is getting more and more popular. It’s used in voice search. It oftentimes doesn’t need to come from the number one ranking result in the SERP. It can come number three, number four, or number seven.

It can even be the result that brings back the featured snippet at the very top. Its click-through rate is often higher than number one, meaning SERP features a big way to go. This is not the only one, too. Image SEO, doing local SEO when the local pack appears, doing news SEO, potentially having a Twitter profile that can rank in those results when Google shows tweets.

And, of course, long-tail targeting, meaning going after other keywords that are not as competitive, where you don’t need to compete against as many folks in order to get that number one ranking spot, and often, in aggregate, long tail can be more than ranking number one for that “money” keyword, that primary keyword that you’re going after.

Why it doesn’t work

Why is this happening? Well, it’s because SERP features are biasing the click-through rate such that number one just isn’t worth what it used to be, and the long tail is often just higher ROI per hour spent.

Dino Tactic #4: Moving up rankings with link building alone

Fourth, moving up the rankings on link building alone. Again, I see a lot of people do this, where they’re ranking number 5, number 10, number 20, and they think, “Okay, I’m ranking in the first couple of pages of Google. My next step is link build my way to the top.”

Why it no longer works on its own

Granted, historically, back in the dinosaur era, dinosaur era of being 2011, this totally worked. This was “the” path to get higher rankings. Once you were sort of in the consideration set, links would get you most of the way up to the top. But today, not the case.

Replace it with searcher task accomplishment, UX optimization, content upgrades, and brand growth

Instead I’m going to suggest you retire that and replace it with searcher task accomplishment, which we’ve seen a bunch of people invest in optimization there and springboard their site, even with worse links, not as high DA, all of that kind of stuff. UX optimization, getting the user experience down and nailing the format of the content so that it better serves searchers.

Content upgrades, improving the actual content on the page, and brand growth, associating your brand more with the topic or the keyword. Why is this happening? Well, because links alone it feels like today are just not enough. They’re still a powerful ranking factor. We can’t ignore them entirely certainly.

But if you want to unseat higher ranked pages, these types of investments are often much easier to make and more fruitful.

Dino Tactic #5: Obsessing about keyword placement in certain tags/areas

All right, number five. Last but not least, obsessing about keyword placement in certain tags and certain areas. For example, spending inordinate amounts of time and energy making sure that the H1 and H2, the headline tags, can contain keywords, making sure that the URL contains the keywords in exactly the format that you want with the hyphens, repeating text a certain number of times in the content, making sure that headlines and titles are structured in certain ways.

Why it (kind of) doesn’t work

It’s not that this doesn’t work. Certainly there’s a bare minimum. We’ve got to have our keyword used in the title. We definitely want it in the headline. If that’s not in an H1 tag, I think we can live with that. I think that’s absolutely fine. Instead I would urge you to move some of that same obsession that you had with perfecting those tags, getting the last 0.01% of value out of those into related keywords and related topics, making sure that the body content uses and explains the subjects, the topics, the words and phrases that Google knows searchers associate with a given topic.

My favorite example of this is if you’re trying to rank for “New York neighborhoods” and you have a page that doesn’t include the word Brooklyn or Manhattan or Bronx or Queens or Staten Island, your chances of ranking are much, much worse, and you can get all the links and the perfect keyword targeting in your H1, all of that stuff, but if you are not using those neighborhood terms that Google clearly can associate with the topic, with the searcher’s query, you’re probably not going to rank.

Replace it with obsessing over related keywords and topics

This is true no matter what you’re trying to rank for. I don’t care if it’s blue shoes or men’s watches or B2B SaaS products. Google cares a lot more about whether the content solves the searcher’s query. Related topics, related keywords are often correlated with big rankings improvements when we see folks undertake them.

I was talking to an SEO a few weeks ago who did this. They just audited across their site, found the 5 to 10 terms that they felt they were missing from the content, added those into the content intelligently, adding them to the content in such a way that they were actually descriptive and useful, and then they saw rankings shoot up with nothing else, no other work. Really, really impressive stuff.

So take some of these dino tactics, try retiring them and replacing them with some of these modern ones, and see if your results don’t come out better too. Look forward to your thoughts on other dino tactics in the comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM NewsComments Off

Spectator to Partner: Turn Your Clients into SEO Allies – Whiteboard Friday

Posted by KameronJenkins

Are your clients your allies in SEO, or are they passive spectators? Could they even be inadvertently working against you? A better understanding of expectations, goals, and strategy by everyone involved can improve your client relations, provide extra clarity, and reduce the number of times you’re asked to “just SEO a site.” In today’s Whiteboard Friday, Kameron Jenkins outlines tactics you should know for getting clients and bosses excited about the SEO journey, as well as the risks involved in passivity.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, everyone, and welcome to this week’s edition of Whiteboard Friday. I am Kameron Jenkins, and I’m the SEO Wordsmith here at Moz. Today I’m going to be talking with you about how to turn your clients from spectators, passive spectators to someone who is proactively interested and an ally in your SEO journey.

So if you’ve ever heard someone come to you, maybe it’s a client or maybe you’re in-house and this is your boss saying this, and they say, “Just SEO my site,” then this is definitely for you. A lot of times it can be really hard as an SEO to work on a site if you really aren’t familiar with the business, what that client is doing, what they’re all about, what their goals are. So I’m going to share with you some tactics for getting your clients and your boss excited about SEO and excited about the work that you’re doing and some risks that can happen when you don’t do that.

Tactics

So let’s dive right in. All right, first we’re going to talk about tactics.

1. Share news

The first tactic is to share news. In the SEO industry, things are changing all the time, so it’s actually a really great tactic to keep yourself informed, but also to share that news with the client. So here’s an example. Google My Business is now experimenting with a new video format for their post feature. So one thing that you can do is say, “Hey, client, I hear that Google is experimenting with this new format. They’re using videos now. Would you like to try it?”

So that’s really cool because it shows them that you’re on top of things. It shows them that you’re the expert and you’re keeping your finger on the pulse of the industry. It also tells them that they’re going to be a part of this new, cutting-edge technology, and that can get them really, really excited about the SEO work you’re doing. So make sure to share news. I think that can be really, really valuable.

2. Outline your work

The next tip is to outline your work. This one seems really simple, but there is so much to say for telling a client what you’re going to do, doing it, and then telling them that you did it. It’s amazing what can happen when you just communicate with a client more. There have been plenty of situations where maybe I did less tangible work for a client one week, but because I talk to them more, they were more inclined to be happy with me and excited about the work I was doing.

It’s also cool because when you tell a client ahead of time what you’re going to do, it gives them time to get excited about, “Ooh, I can’t wait to see what he or she is going to do next.” So that’s a really good tip for getting your clients excited about SEO.

3. Report results

Another thing is to report on your results. So, as SEOs, it can be really easy to say, hey, I added this page or I fixed these things or I updated this.

But if we detach it from the actual results, it doesn’t really matter how much a client likes you or how much your boss likes you, there’s always a risk that they could pull the plug on SEO because they just don’t see the value that’s coming from it. So that’s an unfortunate reality, but there are tons of ways that you can show the value of SEO. One example is, “Hey, client, remember that page that we identified that was ranking on page two. We improved it. We made all of those updates we talked about, and now it’s ranking on page one. So that’s really exciting. We’re seeing a lot of new traffic come from it.I’m wondering, are you seeing new calls, new leads, an uptick in any of those things as a result of that?”

So that’s really good because it shows them what you did, the results from that, and then it kind of connects it to, “Hey, are you seeing any revenue, are you seeing new clients, new customers,” things like that. So they’re more inclined to see that what you’re doing is making a real, tangible impact on actual revenue and their actual business goals.

4. Acknowledge and guide their ideas

This one is really, really important. It can be hard sometimes to marry best practices and customer service. So what I mean by that is there’s one end of the pendulum where you are really focused on best practices. This is right. This is wrong. I know my SEO stuff. So when a client comes to you and they say, “Hey, can we try this?” and you go, “No, that’s not best practices,”it can kind of shut them down. It doesn’t get them involved in the SEO process. In fact, it just kind of makes them recoil and maybe they don’t want to talk to you, and that’s the exact opposite of what we want here. On the other end of that spectrum though, you have clients who say, “Hey, I really want to try this.I saw this article. I’m interested in this thing. Can you do it for my website?”

Maybe it’s not the greatest idea SEO-wise. You’re the SEO expert, and you see that and you go, “Mm, that’s actually kind of scary. I don’t think I want to do that.” But because you’re so focused on pleasing your client, you maybe do it anyway. So that’s the opposite of what we want as well. We want to have a “no, but” mentality. So an example of that could be your client emails in and says, “Hey, I want to try this new thing.”

You go, “Hey, I really like where your head is at. I like that you’re thinking about things this way. I’m so glad you shared this with me. I tried this related thing before, and I think that would be actually a really good idea to employ on your website.” So kind of shifting the conversation, but still bringing them along with you for that journey and guiding them to the correct conclusions. So that’s another way to get them invested without shying them away from the SEO process.

Risks

So now that we’ve talked about those tactics, we’re going to move on to the risks. These are things that could happen if you don’t get your clients excited and invested in the SEO journey.

1. SEO becomes a checklist

When you don’t know your client well enough to know what they’re doing in the real world, what they’re all about, the risk becomes you have to kind of just do site health stuff, so fiddling with meta tags, maybe you’re changing some paragraphs around, maybe you’re changing H1s, fixing 404s, things like that, things that are just objectively, “I can make this change, and I know it’s good for site health.”

But it’s not proactive. It’s not actually doing any SEO strategies. It’s just cleanup work. If you just focus on cleanup work, that’s really not an SEO strategy. That’s just making sure your site isn’t broken. As we all know, you need so much more than that to make sure that your client’s site is ranking. So that’s a risk.

If you don’t know your clients, if they’re not talking to you, or they’re not excited about SEO, then really all you’re left to do is fiddle with kind of technical stuff. As good as that can be to do, our jobs are way more fun than that. So communicate with your clients. Get them on board so that you can do proactive stuff and not just fiddling with little stuff.

2. SEO conflicts with business goals

So another risk is that SEO can conflict with business goals.

So say that you’re an SEO. Your client is not talking to you. They’re not really excited about stuff that you’re doing. But you decide to move forward with proactive strategies anyway. So say I’m an SEO, and I identify this keyword. My client has this keyword. This is a related keyword. It can bring in a lot of good traffic. I’ve identified this good opportunity. All of the pages that are ranking on page one, they’re not even that good. I could totally do better. So I’m going to proactively go, I’m going to build this page of content and put it on my client’s site. Then what happens when they see that page of content and they go, “We don’t even do that. We don’t offer that product. We don’t offer that service.”

Oops. So that’s really bad. What can happen is that, yes, you’re being proactive, and that’s great. But if you don’t actually know what your client is doing, because they’re not communicating with you, they’re not really excited, you risk misaligning with their business goals and misrepresenting them. So that’s a definite risk.

3. You miss out on PR opportunities

Another thing, you miss out on PR opportunities. So again, if your client is not talking to you, they’re not excited enough to share what they’re doing in the real world with you, you miss out on news like, “Hey, we’re sponsoring this event,”or, “Hey, I was the featured expert on last night’s news.”

Those are all really, really good things that SEOs look for. We crave that information. We can totally use that to capitalize on it for SEO value. If we’re not getting that from our clients, then we miss out on all those really, really cool PR opportunities. So a definite risk. We want those PR opportunities. We want to be able to use them.

4. Client controls the conversation

Next up, client controls the conversation. That’s a definite risk that can happen. So if a client is not talking to you, a reason could be they don’t really trust you yet. When they don’t trust you, they tend to start to dictate. So maybe our client emails in.

A good example of this is, “Hey, add these 10 backlinks to my website.” Or, “Hey, I need these five pages, and I need them now.” Maybe they’re not even actually bad suggestions. It’s just the fact that the client is asking you to do that. So this is kind of tricky, because you want to communicate with your client. It’s good that they’re emailing in, but they’re the ones at that point that are dictating the strategy. Whereas they should be communicating their vision, so hey, as a business owner, as a website owner, “This is my vision. This is my goal, and this is what I want.”

As the SEO professional, you’re receiving that information and taking it and making it into an SEO strategy that can actually be really, really beneficial for the client. So there’s a huge difference between just being a task monkey and kind of transforming their vision into an SEO strategy that can really, really work for them. So that’s a definite risk that can happen.

Excitement + partnership = better SEO campaigns

There’s a lot of different things that can happen. These are just some examples of tactics that you can use and risks. If you have any examples of things that have worked for you in the past, I would love to hear about them. It’s really good to information share. Success stories where maybe you got your client or your boss really bought into SEO, more so than just, “Hey, I’m spending money on it.”

But, “Hey, I’m your partner in this. I’m your ally, and I’m going to give you all the information because I know that it’s going to be mutually beneficial for us.” So at the end here, excitement, partner, better SEO campaigns. This is going to be I believe a recipe for success to get your clients and your boss on board. Thanks again so much for watching this edition of Whiteboard Friday, and come back next week for another one.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM NewsComments Off

Advert