Tag Archive | "Whiteboard"

How to Get Into Google News – Whiteboard Friday

Posted by Polemic

Today we’re tackling a question that many of us have asked over the years: how do you increase your chances of getting your content into Google News? We’re delighted to welcome renowned SEO specialist Barry Adams to share the framework you need to have in place in order to have a chance of appearing in that much-coveted Google News carousel.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. I’m Barry Adams. I’m a technical SEO consultant at Polemic Digital and a specialist in news SEO. Today we’re going to be talking about how to get into Google News. I get a lot of questions from a lot of people about Google News and specifically how you get a website into Google News, because it’s a really great source of traffic for websites. Once you’re in the Google News Index, you can appear in the top stories carousel in Google search results, and that can send a lot of traffic your way.

How do you get into Google News’ manually curated index?

So how do you get into Google News? How do you go about getting your website into Google News’ manual index so that you can get that top stories traffic for yourself? Well, it’s not always as easy as it appears. You have to jump through quite a few hoops before you get into Google News.

1. Have a dedicated news website

First of all, you have to have a dedicated news website. You have to keep in mind when you apply to be included in Google News, there’s a team of Googlers who will manually review your website to decide whether or not you’re worthy of being in the News index. That is a manual process, and your website has to be a dedicated news website.

I get a lot of questions from people asking whether the news section or blog on their site could be included in Google News. The answer tends to be no. Google doesn’t want sites in the index that aren’t entirely about news, i.e. commercial websites that merely have a news section. They want dedicated news websites, sites whose sole purpose is to provide news and content on specific topics and specific niches.

So that’s the first hurdle and probably the most important one. If you can’t clear that hurdle, you shouldn’t even try getting into Google News.

2. Meet technical requirements

There are also a lot of other aspects to getting into Google News. You have to jump through, like I said, quite a few hoops, and some technical requirements are very important to know as well.

Have static, unique URLs.

Google wants your articles and your section pages to have static, unique URLs, so that an article or a section is always on the same URL and Google can crawl and recrawl it there without having to deal with redirects or anything else. Content with dynamically generated URLs does not tend to work well with Google News. So keep that in mind, and make sure that your content, both your articles and your section pages, lives on fixed URLs that tend not to change over time.

Have your content in plain HTML.

It also helps to have all your content in plain HTML. When Google News indexes your content, it’s all about speed; it tries to index articles as fast as possible. So any content that requires client-side JavaScript or other scripting to render tends not to work for Google News. Google has a two-stage indexing process, where the first stage is based on the HTML source code and the second stage is based on a complete render of the page, including executing JavaScript.

For Google News, that doesn’t work. If your content relies on JavaScript execution, it will never be seen by Google News, which only uses the first stage of indexing, based purely on the HTML source code. So keep your JavaScript to a minimum and make sure the full content of your articles is present in the HTML source code, without requiring any JavaScript to render.

Have clean code.

It also helps to have clean code. By clean code, I mean that the article content in the HTML source should be one continuous block of code from the headline all the way to the end. That tends to result in the best and most efficient indexing in Google News. I’ve seen many examples where websites put things in the middle of the article code, like related articles, video carousels, or photo galleries, and that can really mess up how Google News indexes the content. So having clean code, and making sure the article is one continuous block of easily understood HTML, tends to work best for Google News.

3. Optional (but more or less mandatory) technical considerations

There are also quite a few other things that are technically optional, but I see them as pretty much mandatory, because they really help get your content picked up in Google News very fast and make sure you get that top stories carousel position as fast as possible, which is where you will get most of your news traffic from.

Have a news-specific XML sitemap.

First is the news-specific XML sitemap. Google says this is optional but recommended, and I agree with them on that. Having a news-specific XML sitemap that lists the articles you’ve published in the last 48 hours, up to a maximum of 1,000 articles, is absolutely necessary. I think this is Google News’ primary discovery mechanism when it crawls your website to find new articles.

So that news-specific XML sitemap is absolutely crucial, and you want to make sure you have that in place before you submit your site to Google News.
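For illustration only (the domain, publication name, and dates here are placeholders), a minimal news sitemap entry generally looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example-news-site.com/politics/example-article</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2019-01-15T08:00:00+00:00</news:publication_date>
      <news:title>Example Article Headline</news:title>
    </news:news>
  </url>
  <!-- ...one <url> entry per article published in the last 48 hours -->
</urlset>
```

The key discipline is the rolling window: prune entries older than 48 hours and keep the file under 1,000 articles.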

Mark up articles with NewsArticle structured data.

I also think it’s very important to mark up your articles with NewsArticle structured data. It can be plain Article structured data, or even the more specific types Google has been introducing, like AnalysisNewsArticle and OpinionNewsArticle, for specific kinds of articles.

But Article or NewsArticle markup on your article pages is pretty much mandatory. Your likelihood of getting into the top stories carousel is much improved if you have that markup implemented on your article pages.
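As a sketch (the names, dates, and URLs below are placeholders), NewsArticle markup in JSON-LD might look like this on an article page:

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example Article Headline",
  "datePublished": "2019-01-15T08:00:00+00:00",
  "dateModified": "2019-01-15T09:30:00+00:00",
  "author": { "@type": "Person", "name": "Jane Reporter" },
  "publisher": {
    "@type": "Organization",
    "name": "Example News",
    "logo": { "@type": "ImageObject", "url": "https://www.example-news-site.com/logo.png" }
  },
  "image": "https://www.example-news-site.com/images/example-article.jpg",
  "mainEntityOfPage": "https://www.example-news-site.com/politics/example-article"
}
```

That block would sit in a `<script type="application/ld+json">` tag in the article's HTML.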

Helpful-to-have extras:

Also, like I said, this is a manually curated index, so there are a few extra hoops you want to jump through to make sure that when a Googler reviews your website, it ticks all the boxes and appears to be a trustworthy, genuine news website.

A. Multiple authors

Having multiple authors contribute to your website is hugely valuable, hugely important, and it does tend to elevate you above all the other blogs and small sites that are out there and makes it a bit more likely that the Googler reviewing your site will press that Approve button.

B. Daily updates

Daily updates are definitely necessary. You don’t want just one news post every couple of days; ideally, you’ll publish multiple new articles every single day, and they should be unique. You can have some syndicated content on there, from feeds such as AP or Reuters, but the majority of your content needs to be your own unique work. You don’t want to rely too much on syndicated articles to fill your website with news content.

C. Mostly unique content

Try to write as much unique content as you possibly can. There isn’t really a hard ratio for this, but generally speaking, I recommend that my clients keep at least 70% of their content unique, written and published themselves, with a maximum of 30% syndicated content from external sources.

D. Specialized niche/topic

It really helps to have a specialized niche or a specialized topic that you focus on as a news website. There are plenty of news sites out there that are general news and try to do everything, and Google News doesn’t really need many more of those. What Google is interested in is niche websites on specific topics, specific areas that can provide in-depth reporting on those specific industries or topics. So if you have a very niche topic or a niche industry that you cover with your news, it does tend to improve your chances of getting into that News Index and getting that top stories carousel traffic.

So that, in a nutshell, is how you get into Google News. It might appear to be quite simple, but, like I said, quite a few hoops for you to jump through, a few technical things you have to implement on your website as well. But if you tick all those boxes, you can get so much traffic from the top stories carousel, and the rest is profit. Thank you very much.

This has been my Whiteboard Friday.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News

On-Page SEO for 2019 – Whiteboard Friday

Posted by BritneyMuller

Whew! We made it through another year, and it seems like we’re past due for taking a close look at the health of our on-page SEO practices. What better way to hit the ground running than with a checklist? In today’s Whiteboard Friday, the fabulous Britney Muller shares her best tips for doing effective on-page SEO in 2019.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we’re going over all things on-page SEO, and I’ve divided it into three different sections:

  1. How are crawlers and Googlebot crawling through your site and your web pages?
  2. What is the UX of your on-page content?
  3. What is the value of your on-page content?

So let’s just jump right in, shall we?

Crawler/bot-accessible

☑ Meta robots tag allows crawling

Making sure your meta robots tag allows crawling is essential. If it’s blocking Googlebot, your page will never appear in search. You want to make sure that’s all squared away.
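As a quick sketch of what to look for in the page’s `<head>`:

```html
<!-- Allows crawling and indexing; this is also the default when no tag is present -->
<meta name="robots" content="index, follow">

<!-- By contrast, this would keep the page out of search results entirely -->
<meta name="robots" content="noindex, nofollow">
```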

☑ Robots.txt doesn’t disallow crawling

Let’s say this is the page you’re trying to get to rank in search engines: you want to make sure you’re not disallowing its URL in your robots.txt.
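A hypothetical robots.txt, sticking with the chakra stones example used below, to show the difference between a harmless rule and a page-killing one:

```
# https://www.example-yoga-site.com/robots.txt
User-agent: *
Disallow: /admin/                # fine: keeps a private area out of crawls
# Disallow: /chakra-stones/      # this rule would block the very page you want to rank

Sitemap: https://www.example-yoga-site.com/sitemap.xml
```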

☑ URL is included in sitemap

Similarly, you want to make sure that the URL is in your sitemap.

☑ Schema markup

You also want to add any relevant schema markup that you can. This is essentially spoon-feeding search engines what your page and your content are about.

☑ Internal links pointing to your page with natural anchor text

So let’s say I am trying to rank for chakra stones. Maybe I’m on a yoga website and I want to make sure that I have other internal pages linking to chakra stones with the anchor text “chakra crystals” or “chakra stones” and making sure that I’m showing Google that this is indeed an internally linked page and it’s important and we want to give it some weight.

☑ HTTPS – SSL

You want to make sure the page is served securely over HTTPS, as Google takes that into consideration as well.

User experience

☑ Meets Web Content Accessibility Guidelines

Does it meet Web Content Accessibility Guidelines? Definitely look into that and make sure you check all the boxes.

☑ Responsive mobile design with same content and links

Is it responsive for mobile? Super important with the mobile-first indexing.
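One baseline check here, for illustration: a responsive page should serve the same content and links on mobile, and its `<head>` should include the viewport meta tag so mobile browsers render at device width:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```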

☑ Clear CTA

Is there one clear call to action? A lot of pages miss this. So, for this page, maybe I would have a big “Buy Chakra Crystals Here” button or link. That would be a clear CTA. It’s important to have.

☑ Multimedia: Evaluate SERP and add desired media

Are you providing other desired media types? Are there images and video and different forms of content on your page?

☑ Page speed: utilize CDNs, compress images, use reliable hosting

Are you checking the page speed? Are you using CDNs? Are you compressing your images? You want to check all of that.

☑ Integrate social sharing buttons

It’s the easiest thing. Make sure that people can easily share your content.

Content and value

This is where it gets really fun and strategic too.

☑ Unique, high-quality content

Are you providing high-quality content? So if you go to Google and you search “chakra stones” and you take a look at all of those results, are you including all of that good content into your page? Then are you making it even better? Because that should be the goal.

☑ Optimize for intent: Evaluate SERP and PPC, note which SERP features show up

You want to also optimize for intent. So you want to evaluate that SERP. If that search result page is showing tons of images or maybe videos, you should be incorporating that into your page as well, because clearly that’s what people are looking for.

You also want to evaluate the PPC. They have done so much testing on what converts and what doesn’t. So it’s silly not to take that into consideration when optimizing your page.

☑ Title tags and meta descriptions

What are those titles? What are those descriptions? What’s working? Title tags and meta descriptions are still so important. They’re the first impression for many of your visitors in Google. Are you enticing a click? Are you making that an enticing call to action to your site?

☑ Header tags

H1, H2, and H3 header tags are still super important. You want to make sure that the title of your page is the H1, and so forth down the hierarchy. It’s worth checking on all of that.
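As a sketch, using the hypothetical chakra stones page from earlier, that hierarchy would look something like:

```html
<h1>Chakra Stones: A Complete Guide</h1>      <!-- one H1: the page title -->
<h2>What Are Chakra Stones?</h2>              <!-- major sections -->
<h2>How to Use Chakra Stones</h2>
  <h3>Choosing a Stone for Each Chakra</h3>   <!-- subsections nest under their H2 -->
```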

☑ Optimize images: compress, title file names, add alt text

Images are the biggest source of page bloat and a major drag on page speed. So you want to make sure that your images are compressed and optimized, keeping your page fast and easily accessible to your users.

☑ Review for freshness

You want to review for freshness. We want to make sure that this is up-to-date content. Maybe take a look at your site’s popular content from the last year or two and update it. This should be a continual rinse-and-repeat process: you want to keep updating the content on your site.

☑ Include commonly asked questions

It’s such an easy thing to do, but it’s commonly overlooked. AnswerThePublic does a great job of surfacing questions. Moz Keyword Explorer has a really great filter that provides some of the most commonly asked questions for a keyword term. I highly suggest you check that out and start to incorporate some of that.


These help you target featured snippets. So if you’re incorporating some of that, not only do you get the extra traffic, but you also find opportunities to win featured snippets, which is great. You’re expanding your real estate in search. People Also Ask (PAA) boxes are also a great way to find commonly asked questions for a particular keyword.

☑ Add summaries

Summaries are also hidden gems. We see Google seeking out summaries for content all of the time. They are providing summaries in featured snippets and in different SERP features to help sort of distill information for users. So if you can do that, not only will you make your content more easily scannable, but you’re also making it more accessible for search, which is great.

☑ TF-IDF (term frequency-inverse document frequency)

TF-IDF stands for “term frequency-inverse document frequency.” It sounds a little intimidating, but it’s actually pretty simple: how often is “chakra stones” mentioned on this particular page (term frequency), weighted down by how many other documents mention that term too (inverse document frequency)? It’s basically a calculation of how relevant, and how distinctive, the term “chakra stones” is to your page. Really cool, and commonly used in search. So if you pay attention to this on your page, it will help you in the long term.
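A toy illustration of the idea (the three-document corpus here is made up, and this is not how Google computes anything):

```python
import math

def tf_idf(term, doc, corpus):
    """Toy TF-IDF: how often a term appears in one document,
    weighted down by how many documents in the corpus contain it."""
    tf = doc.count(term) / len(doc)                      # term frequency
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / docs_with_term)         # inverse document frequency
    return tf * idf

corpus = [
    ["chakra", "stones", "guide", "chakra", "healing"],
    ["yoga", "mat", "reviews"],
    ["meditation", "basics", "and", "chakra", "tips"],
]

# "chakra" is frequent in the first doc but appears in 2 of 3 docs,
# so the IDF factor dampens its score; a term unique to one page scores higher.
score = tf_idf("chakra", corpus[0], corpus)
print(round(score, 3))  # 0.162
```

The intuition for on-page SEO: a term that is prominent on your page but common everywhere else carries less distinguishing weight.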

☑ LSI (latent semantic indexing) for relevance

Similarly, LSI (or LSA, as it’s sometimes referred to) is latent semantic indexing, and it’s also about relevance. The idea is that if I’m talking about chakra stones, I should also incorporate the other topics that are commonly related to that topic. Relevant.

☑ Flesch-Kincaid Readability Test

What is the readability of this page? The easier it is to read, the better, but you just want to keep an eye on it in general.
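The checklist names the Flesch-Kincaid test; as a sketch, the closely related Flesch Reading Ease formula works from just three counts (the word, sentence, and syllable totals below are made-up examples):

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Flesch Reading Ease: roughly 0-100, where higher means easier to read."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

# A short, plain passage: 100 words across 8 sentences, 130 syllables total.
score = flesch_reading_ease(100, 8, 130)
print(round(score, 1))  # 84.2 -- comfortably in the "easy to read" range
```

Long sentences and polysyllabic words both drag the score down, which matches the advice above: simpler is easier to read.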

Bonus tip!

One final tip, which Kameron Jenkins put on Twitter and which I love so much (Kameron is a world-class writer, one of the best I’ve ever had the privilege of working with): find the top three ranking URLs for your target keyword.

So if I were to put “chakra stones” into Google and pull the top three URLs, I’d put them into Moz Keyword Explorer, see what those three URLs are specifically ranking for, and look at which keywords they commonly rank for. Then I’d use those keywords to optimize my page even better. It’s genius, and it’s very similar to some of the relevance work we were talking about earlier.


So definitely try some of this stuff out. I hope this helps. I really look forward to any of your comments or questions down below in the comments section.

Thank you so much for joining me on this edition of Whiteboard Friday. I look forward to seeing you all again soon, so thanks. Have a good one.

Video transcription by Speechpad.com


The SEO Elevator Pitch – Whiteboard Friday

Posted by KameronJenkins

What is it you do again?

It’s a question every SEO has had to answer at some point, whether to your family members over the holidays or to the developer who will eventually implement your suggestions. If you don’t have a solid elevator pitch for describing your job, this is the Whiteboard Friday for you! Learn how to craft a concise, succinct description of life as an SEO without jargon, policing, or acting like a superhero.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey guys, welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz. Today we’re going to be talking about creating an SEO elevator pitch, what is it, why we need one, and what kind of prompted this whole idea for an SEO elevator pitch.

So essentially, a couple of weeks ago, I was on Twitter and I saw John Mueller. He tweeted, “Hey, I meet with a lot of developers, and a lot of times they don’t really know what SEOs do.” He was genuinely asking. He was asking, “Hey, SEO community, how do you describe what you do?” I’m scrolling through, and I’m seeing a lot of different answers, and all of them I’m resonating with.

They’re all things that I would probably say myself. But it’s just interesting how many different answers there were to the question, “What do SEOs do, and what value do they provide?” So I thought to myself, “Why is that? Why do we have so many different explanations for what SEO is and what we do?” And I thought it might be a good idea to craft an elevator pitch, for myself and maybe for other SEOs who don’t already have one ready.

What is an SEO elevator pitch?

Now, if you’re not familiar with the concept of an elevator pitch, it’s basically — I have a definition here — a succinct and persuasive speech that communicates your unique value as an SEO. It’s called an elevator pitch because it should take about the length of time it takes to ride an elevator with someone. You want to be able to quickly and concisely answer someone’s question when they ask you, “Oh, SEO, what is that? I think I’ve heard of that before. What do you do?”

Why is this so hard?

So let’s dive right in. So I mentioned, in the beginning, how there are so many different answers to this “what do you say you do here” type question. I think it’s hard to kind of come up with a concise explanation for a few different reasons. So I wanted to dive into that a little bit first.

1. Lots of specialties within SEO

So number one, there are lots of specialties within SEO.

As the industry has advanced over the last two plus decades, it has become very diverse, and there are lots of different facets in SEO. I found myself on quite a rabbit trail. I was on LinkedIn and I was kind of browsing SEO job descriptions. I wanted to see basically: What is it that people are looking for in an SEO?

How do they describe it? What are the characteristics? So basically, I found a lot of different things, but I found a few themes that emerged. So there are your content-focused SEOs, and those are people that are your keyword research aficionados. There are the people that write search engine optimized content to drive traffic to your website. You have your link builders, people that focus almost exclusively on that.

You have your local SEOs, and you have your analysts. You have your tech SEOs, people that either work on a dev team or closely with a dev team. So I think that’s okay though. There are lots of different facets within SEO, and I think that’s awesome. That’s, to me, a sign of maturity in our industry. So when there are a lot of different specialties within SEO, I think it’s right and good for all of our elevator pitches to differ.

So if you have a specialty within SEO, it can be different. It should kind of cater toward the unique brand of SEO that you do, and that’s okay.

2. Different audiences

Number two, there are different audiences. We’re not always going to be talking to the same kind of person. So maybe you’re talking to your boss or a client. To me, those are more revenue-focused conversations.

They want to know: What’s the value of what you do? How does it affect my bottom line? How does it help me run my business and stay afloat and stay profitable? If you’re talking to a developer, that’s going to be a slightly different conversation. So I think it’s okay if we kind of tweak our elevator pitch to make it a little bit more palatable for the people that we’re talking to.

3. Algorithm maturity

Three, why this is hard: there are, obviously, a lot of changes to the algorithm all the time, and as it matures, the SEO’s job can look completely different than it did last year. I think that’s a reality we have to live with, but I still think it’s important, even though things are changing all the time, to have a baseline pitch we give people when they ask us what it is we do.

So that’s why it’s hard. That’s what your elevator pitch is.

My elevator pitch: SEO is marketing, with search engines

Then, by way of example, I thought I’d just give you my SEO elevator pitch. Maybe it will spark your creativity. Maybe it will give you some ideas. Maybe you already have one, and that’s okay. But the point is not to use mine.

The point is essentially to kind of take you through what mine looks like, hopefully get your creative juices flowing, and you can create your own. So let’s dive right into my pitch.

So my pitch is SEO is marketing, just with search engines. So we have the funnel here — awareness, consideration, and decision.

Awareness: Rank and attract clicks for informational queries.

First of all, I think it’s important to note that SEO can help you rank and attract clicks for informational queries.

Consideration: Rank and attract clicks for evaluation queries.

So when your audience is searching for information (they want to solve their pain points; they’re not ready to buy, just researching), we’re meeting them there with content that brings them to the site, informs them, and makes them familiar with our brand. Those are great assisted conversions. Then: rank and attract clicks for evaluation queries. When your audience is starting to compare their options, you want to be there, and we can do that with SEO.

Decision: Rank, attract clicks, and promote conversion for bottom-funnel queries

At the decision phase, you can rank, attract clicks, and promote conversions for bottom-of-funnel queries. When people are in their “I want to buy” stage, SEO can meet them there. So I think it’s important to realize that SEO isn’t just a cost center, and it isn’t just a bottom-of-funnel thing. I’ve heard that in a lot of places, and I think it’s important to draw attention to the fact that SEO is integrated throughout your marketing funnel, not relegated to one stage or another.

But how?

We talked about ranking, attracting clicks, and promoting conversions. That’s the what.

But how do we do it? So this is how I explain it. I think really, for me, there are two sides to the SEO’s coin. We have driving, and we have supporting.

1. Driving

So on the driving side, I would say something like this. When someone searches a phrase or a keyword in Google, I make sure the business’ website shows up in the non-ad results. That’s important because a lot of people are like, “Oh, do you bid on keywords?”

We’re like, “No, no, that’s PPC.” So I always just throw in “non-ad” because people understand that. So I do that through content that answers people’s questions, links that help search engines find my content and show signs of authority and popularity of my content, and accessibility. So that’s kind of your technical foundation.

You’re making sure that your website is crawlable and that it’s indexed the way that you want it to be indexed. When people get there, it works, on mobile and on desktop, and it’s fast. So I think these are really the three big pillars of driving SEO — content, links, and making sure your website is technically sound. So that’s how I describe the driving, the proactive side of SEO.

2. Supporting

Then two, we have supporting, and I think this side is underrated; it’s often seen as kind of an interruption to our jobs.

But I think it’s important to actually call it what it is. It’s a big part of what we do. So I think we should embrace it as SEOs.

A. Be the Google Magic 8-ball

For one, we can serve as the Google Magic 8-Ball. When people come to us in our organization and say, “Hey, I’m going to make this change, or I’m thinking about making this change. Is this going to be good or bad for SEO?”

I think it’s great that people are asking that question. Always be available and always make yourself ready to answer those types of questions for people. So I think on the reactionary side we can be that kind of person that helps guide people and understand what is going to affect your organic search presence.

B. Assist marketing

Two, we can assist marketing. So on this side of the coin, we’re driving.

We can drive our own marketing strategies. As SEOs, we can see how SEO can drive all phases of the funnel. But it’s important to note that we’re not the only people in our organization. Often SEOs don’t even live in the marketing department; or maybe they do, and they report to a marketing lead. There are other initiatives that your marketing lead could be investigating.

Maybe they say, “Hey, we’ve just done some market research, and here’s this plan.” It could be our job as SEOs to take that plan, take that strategy and translate it into something digital. I think that’s a really important value that SEOs can add. We can actually assist marketing as well as drive our own efforts.

C. Fix mistakes

Then number three here, I know this is another one that kind of makes people cringe, but we are here to fix mistakes when they happen and train people so that they don’t happen again. So maybe we come in on a Monday morning and we’re ready to face the week, and we see that traffic has taken a nosedive or something. We go, “Oh, no,” and we dive in.

We try to see what happened. But I think that’s really important. It’s our job or it’s part of our job to kind of dive in, diagnose what happened, and not only that but support and be there to help fix it or guide the fixes, and then train and educate and make sure that people know what it is that happened and how it shouldn’t happen again.

You’re there to help train them and guide them. I think that’s another really important way that we can support as SEOs. So that’s essentially how I describe it.

3 tips for coming up with your own pitch

Before I go, I just wanted to mention some tips when you’re coming up with your own SEO elevator pitch. I think it’s really important to just kind of stay away from certain language when you’re crafting your own “this is what I do” speech.

So the three tips I have are:

1. Stay away from jargon.

If you’re giving an SEO elevator pitch, it’s to people that don’t know what SEO is. So try to avoid jargon. I know it’s really easy as SEOs. I find myself doing it all the time. There are things that I don’t think are jargon.

But then I take a couple steps back and I realize, oh yeah, that’s not layman’s terms. So stay away from jargon if at all possible. You’re not going to benefit anyone by confusing them.

2. Avoid policing.

It can be easy as SEOs, I’ve found (and I’ve found myself in this trap a couple of times), to act as traffic cops waiting around the corner, ready to wag our finger at people when they make a mistake.

So avoid any language that makes it sound like the SEOs are just the police waiting to kind of punish people for wrongdoing. We are there to help fix mistakes, but it’s in a guiding and educating and supporting, kind of collaborative manner and not like a policing type of manner. Number three, I would say is kind of similar, but a little different.

3. Avoid Supermanning.

I call this Supermanning because it’s the type of language that makes it sound like SEOs are here to swoop in and save the day when something goes wrong. We do. We’re superheroes a lot of times. There are things that happen and thank goodness there was an SEO there to help diagnose and fix that.

But I would avoid any kind of pitch that makes it sound like your entire job is just to kind of save people. There are other people in your organization that are super smart and talented at what they do. They probably wouldn’t like it if you made it sound like you were there to help them all the time. So I just think that’s important to keep in mind. Don’t make it seem like you’re the police waiting to wag your finger at them or you’re the superhero that needs to save everyone from their mistakes.

So yeah, that’s my SEO elevator pitch. That’s why I think it’s important to have one. If you’ve kind of crafted your own SEO elevator pitch, I would love to hear it, and I’m sure it would be great for other SEOs to hear it as well. It’s great to information share. So drop that in the comments if you feel comfortable doing that. If you don’t have one, hopefully this helps. So yeah, that’s it for this week’s Whiteboard Friday, and come back again next week for another one.

Thanks, everybody.

Video transcription by Speechpad.com


3 Big Lessons from Interviewing John Mueller at SearchLove London – Whiteboard Friday

Posted by willcritchlow

When you’ve got one of Google’s most helpful and empathetic voices willing to answer your most pressing SEO questions, what do you ask? Will Critchlow recently had the honor of interviewing Google’s John Mueller at SearchLove London, and in this week’s edition of Whiteboard Friday he shares his best lessons from that session, covering the concept of Domain Authority, the great subdomain versus subfolder debate, and a view into the technical workings of noindex/nofollow.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, Whiteboard Friday fans. I’m Will Critchlow from Distilled, and I found myself in Seattle and wanted to record another Whiteboard Friday video to talk through some things I learned when I got to sit down with John Mueller from Google at our SearchLove London conference.

So I got to interview John on stage, and, as many of you may know, John is a webmaster relations guy at Google and really a point of contact for many of us in the industry when there are technical questions or questions about how Google is treating different things. If you followed some of the stuff that I’ve written and talked about in the past, you’ll know that I’ve always been a little bit suspicious of some of the official lines that come out of Google and felt like either we don’t get the full story or we haven’t been able to drill in deep enough and really figure out what’s going on.

I was under no illusions that I might be able to completely fix this in one go, but I did want to grill John on a couple of specific things where I felt like we hadn’t asked things clearly enough or gotten the full story. Today I wanted to run through a few things that I learned when John and I sat down together. A little side note: I found it really fascinating doing this kind of interview. I sat on stage in a kind of journalistic setting, which I had never done before. Maybe I’ll do a follow-up Whiteboard Friday one day on things I learned about how to run interviews.

1. Does Google have a “Domain Authority” concept?

But the first thing that I wanted to quiz John about was this domain authority idea. So here we are on Moz. Moz has a proprietary metric called Domain Authority, or DA. I feel like when, as an industry, we’ve asked Google, and John in particular, about this kind of thing in the past, asking does Google have a concept of domain authority, it’s gotten bundled up with him having an easy way out of answering: he could just say, “No, no, that’s a proprietary Moz metric. We don’t have that.”

I felt like that had got a bit confusing, because our suspicion is that there is some kind of an authority or a trust metric that Google has and holds at a domain level. We think that’s true, but we felt like they had always been able to wriggle out of answering the question. So I said to John, “Okay, I am not asking you do you use Moz’s domain authority metric in your ranking factors. Like we know that isn’t the case. But do you have something a little bit like it?”

Yes, Google has metrics that map into similar things

John said yes. He said yes, they have metrics that, his exact quote was, “map into similar things.” My way of phrasing this was: this is stuff that is at the domain level, it’s based on things like link authority, and it is something that is used to understand performance or to rank content across an entire domain. John said yes, they have something similar to that.

New content inherits those metrics

They use it in particular when they discover new content on an existing domain. New content, in some sense, can inherit some of the authority from the domain, and this is part of the reason why we figured they must have something like this, because we’ve seen identical content perform differently on different sites. We know that there’s something to this. So yes, John confirmed that until a piece of content has been around long enough to accumulate its own link metrics and usage metrics, in the intervening time it can inherit some of this stuff from the domain.

Not wholly link-based

He did also just confirm that it’s not just link-based. This is not just a domain-level PageRank type thing.

2. Subdomains versus subfolders

This led me into the second thing that I really wanted to get out of him, which was — and when I raised this, I got kind of an eye roll, “Are we really going down this rabbit hole” — the subdomain versus subfolder question. You might have seen me talk about this. You might have seen people like Rand talk about this, where we’ve seen cases and we have case studies of moving blog.example.com to example.com/blog and changing nothing else and getting an uplift.

We know something must be going on, and yet the official line out of Google has for a very long time been: “We don’t treat these things differently. There is nothing special about subfolders. We’re perfectly happy with subdomains. Do whatever is right for your business.” We’ve had this kind of back-and-forth a few times. The way I put it to John was I said, “We have seen these case studies. How would you explain this?”

They try to figure out what belongs to the site

To his credit, John said, “Yes, we’ve seen them as well.” So he said, yes, Google has also seen these things. He acknowledged this is true. He acknowledged that it happens. The way he explained it connects back into this Domain Authority thing in my mind, which is to say that the way they think about it is: Are these pages on this subdomain part of the same website as things on the main domain?

That’s kind of the main question. They try and figure out, as he put it, “what belongs to this site.” We all know of sites where subdomains are entirely different sites. If you think about a blogspot.com or a WordPress.com domain, subdomains might be owned and managed by entirely different people, and there would be no reason for that authority to pass across. So what Google is trying to figure out is: “Is this subdomain part of this main site?”

Sometimes this includes subdomains and sometimes not

He said sometimes they determine that it is, and sometimes they determine that it is not. If it is part of the site, in their estimation, then they will treat it as equivalent to a subfolder. This, for me, pretty much closes this loop. I think we understand each other now, which is Google is saying, in these certain circumstances, they will be treated identically, but there are circumstances where it can be treated differently.

My recommendation stays what it’s always been, which is 100% if you’re starting from the outset, put it on a subfolder. There’s no upside to the subdomain. Why would you risk the fact that Google might treat it as a separate site? If it is currently on a subdomain, then it’s a little trickier to make that case. I would personally be arguing for the integration and for making that move.

If it’s treated as part of the site, a subdomain is equivalent to a subfolder

But unfortunately, but somewhat predictably, I couldn’t tie John down to any particular way of telling if this is the case. If your content is currently on a subdomain, there isn’t really any way of telling if Google is treating it differently, which is a shame, but it’s somewhat predictable. But at least we understand each other now, and I think we’ve kind of got to the root of the confusion. These case studies are real. This is a real thing. Certainly in certain circumstances moving from the subdomain to the subfolder can improve performance.

3. Noindex’s impact on nofollow

The third thing that I want to talk about is a little bit more geeked out and technical, and also, in some sense, it leads to some bigger-picture lessons and thinking. A little while ago John kind of caught us out by talking about how, if you have a page that you noindex and keep it that way for a long time, Google will eventually treat that as equivalent to a noindex, nofollow.

In the long-run, a noindex page’s links effectively become nofollow

In other words, the links off that page, even if you’ve marked it noindex, follow, will effectively be nofollowed. We found that a little bit confusing and surprising. I certainly had assumed it didn’t work that way, simply because the noindex, follow directive exists, and the existence of that directive seems to suggest that the links ought to keep being followed.
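For reference, these directives are set in a robots meta tag (or an equivalent X-Robots-Tag header). A minimal sketch of the two states being discussed here:

```html
<!-- What many of us assumed: page stays out of the index, links still followed -->
<meta name="robots" content="noindex, follow">

<!-- What a long-lived noindex is effectively treated as, per John -->
<meta name="robots" content="noindex, nofollow">
```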

It’s been this way for a long time

It wasn’t really so much about the specifics of this, but more: How did we not know this? How did this come about? John talked about how, firstly, it has been this way for a long time. I think he was making the point that none of you all noticed, so how big a deal can this really be? I put it back to him that this is kind of a subtle thing and very hard to test, very hard to extract out the different confounding factors that might be going on.

I’m not surprised that, as an industry, we missed it. But the point being it’s been this way for a long time, and Google’s view and certainly John’s view was that this hadn’t been hidden from us so much as the people who knew this hadn’t realized that they needed to tell anyone. The actual engineers working on the search algorithm, they had a curse of knowledge.

The curse of knowledge: engineers didn’t realize webmasters had the wrong idea

They knew it worked this way, and they had never realized that webmasters didn’t know that or thought any differently. This was one of the things that I was kind of trying to push to John a little more was kind of saying, “More of this, please. Give us more access to the engineers. Give us more insight into their way of thinking. Get them to answer more questions, because then out of that we’ll spot the stuff that we can be like, ‘Oh, hey, that thing there, that was something I didn’t know.’ Then we can drill deeper into that.”

That led us into a little bit of a conversation about how John operates when he doesn’t know the answer, and so there were some bits and pieces that were new to me at least about how this works. John said he himself is generally not attending search quality meetings. The way he works is largely off his knowledge and knowledge base type of content, but he has access to engineers.

They’re not dedicated to the webmaster relations operation. He’s just going around the organization, finding individual Google engineers to answer these questions. It was somewhat interesting to me at least to find that out. I think hopefully, over time, we can generally push and say, “Let’s look for those engineers. John, bring them to the front whenever they want to be visible, because they’re able to answer these kinds of questions that might just be that curse of knowledge that they knew this all along and we as marketers hadn’t figured out this was how things worked.”

That was my quick run-through of some of the things that I learned when I interviewed John. We’ll link over to more resources and transcripts and so forth. But it’s been a blast. Take care.


Content Comprehensiveness – Whiteboard Friday

Posted by KameronJenkins

When Google says they prefer comprehensive, complete content, what does that really mean? In this week’s episode of Whiteboard Friday, Kameron Jenkins explores actionable ways to translate the demands of the search engines into valuable, quality content that should help you rank.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, guys. Welcome to this week’s edition of Whiteboard Friday. My name is Kameron Jenkins, and I work here at Moz.

Today we’re going to be talking about the quality of content comprehensiveness and what that means and why sometimes it can be confusing. I want to use an example scenario of a conversation that tends to go on between SEOs and Google. So here we go.

An SEO usually says something like, “Okay, Google, you say you want to rank high-quality content. But what does that really mean? What is high quality, because we need more specifics than that.”

Then Google goes, “Okay, high quality is something that’s comprehensive and complete. Yeah, it’s really comprehensive.” SEOs go, “Well, wait. What does that even mean?”

That’s kind of what this was born out of. Just kind of an explanation of what is comprehensive, what does Google mean when they say that, and how that differs depending on the query.

Here we have an example page, and I’ll kind of walk you through it. It’s just going to serve to demonstrate why, when Google says “comprehensive,” that can mean something different for an e-commerce page than it would for a history-of-soccer page. It’s really going to differ depending on the query, because people want all sorts of different kinds of things. Their intent is going to be different depending on what they’re searching in Google, so the criteria for comprehensiveness are going to be different too. Hopefully, by way of example, we’ll be able to walk you through what comprehensiveness looks like for this one particular query. So let’s just dive in.

1. Intent

All right. So first I’m going to talk about intent. I have here a Complete Guide to Buying a House. This is the query I used as an example. Before we dive in, even before we look into keyword research tools or anything like that, I think it’s really important to just like let the query sit with you for a little bit. So “guide to buying a house,” okay, I’m going to think about that and think about what the searcher probably wanted based on the query.

So first of all, I noticed “guide.” The word “guide” to me makes it sound like someone wants something very complete, very thorough. They don’t just want quick tips. They don’t want a quick bullet list. This can be longer, because someone is searching for a comprehensive guide.

“To buying a house,” that’s a process. That’s not like an add-to-cart like Amazon. It’s a step-by-step. There are multiple phases to that type of process. It’s really important to realize here that they’re probably looking for something a little lengthier and something that is maybe a step-by-step process.

And then, just looking at the query, “guide to buying a house,” people are probably searching that if they’ve never bought a house before. So if they’ve never bought a house before, it’s just good to remember that your audience is in a phase where they have no idea what they’re doing. It’s important to understand your audience and understand that this is something they’re going to need very, very comprehensive, start-to-finish information on.

2. Implications

Two, implications. This is again also before we get into any keyword research tools. By implications, I mean: what is going to be the effect on someone after reading this? So the implications here, a guide to buying a house, that is a big financial decision. That’s a big financial purchase. It’s going to affect people’s finances and happiness and well-being, and Google actually has a name for that. In their Quality Rater Guidelines, they call that YMYL, which stands for “Your Money or Your Life.”

Those types of pages are held to a really high standard, and rightfully so. If someone reads this, they’re going to get advice about how to spend their money. It’s important for us, as SEOs and writers crafting these types of pages, to understand that these are going to be held to a really high standard. I think what that could look like on the page is, because they’re making a big purchase like this, it might be a good sign of trustworthiness to maybe have some expert quotes in here. Maybe you kind of sprinkle those throughout your page. Maybe you actually have it written by an expert author instead of just Joe Schmoe blogger. Those are just some ideas for making a page really trustworthy, and I think that’s a key to comprehensiveness.

3. Subtopics

Number three here we have subtopics. There are two ways that I’ll walk you through finding subtopics to fit within your umbrella topic. I’m going to use Moz Keyword Explorer as an example of this.

Use Keyword Explorer to reveal subtopics

In Moz Keyword Explorer, you can search for different keywords and related keywords two different ways. You can type in a query. So you can type in something like “buy a house” or “home buying” or something like that. You start with your main topic, and what you’ll get as a result is a bunch of suggested keywords that you can also incorporate on your page, terms that are related to the term that you searched. This is going to be really great, because you’re going to start to notice themes emerge. Some of the themes I noticed were people tend to search for “home buying calculator,” like a can-I-afford-it type of calculator. A lot of people search financial-related things obviously, bad credit. I filed for bankruptcy, can I still buy a house? You’ll start to see subthemes emerge.

Then I also wanted to mention that, in Moz Keyword Explorer, you can also search by URL. What I might do is query my term that I’m trying to target on my page. I’m going to pick the top three URLs that are ranking. You pop them into Keyword Explorer, and you can compare them and you can see the areas of most overlap. So what you’ll get essentially is a list of keywords that the top ranking pages for that term also rank for. That’s going to be a really good way to mine some extra keyword ideas for your page to make it more comprehensive.
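That overlap step is essentially set intersection. Here’s a minimal sketch in Python; the URLs and keyword lists are placeholders standing in for a real Keyword Explorer export, not actual data:

```python
# Keywords each top-ranking URL also ranks for (placeholder data,
# standing in for a Moz Keyword Explorer export of the top 3 results).
rankings = {
    "site-a.com/buying-a-house": {"home buying calculator", "buy a house", "mortgage pre approval"},
    "site-b.com/home-buying-guide": {"home buying calculator", "buy a house", "closing costs"},
    "site-c.com/first-home": {"buy a house", "home buying calculator", "down payment"},
}

# Keywords that every top-ranking page has in common: the strongest
# candidates to cover on your own page.
overlap = set.intersection(*rankings.values())
print(sorted(overlap))  # ['buy a house', 'home buying calculator']
```

The keywords outside the intersection (closing costs, down payment, and so on) are still worth a look as subtopic ideas; the overlap just tells you what you almost certainly need.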

4. Questions

Then here we go. We have step four. After we’ve come up with some subtopics, I think it’s also a really good idea to mine questions and try to find what questions our audience is actually asking. So, for these, I like to use Answer the Public and Keywords Everywhere. Those are two really great tools that I kind of like to use in tandem.

Use Answer the Public to mine questions

Answer the Public, if you’ve never used it, is a really fun tool. You can put in a keyword, and you get a huge list. Depending on how vague your query is, you might get a ton of ideas. If your query is really specific, you might not get as many keyword ideas back. But it’s a really great way to type in a keyword, like “buying a house” or “buy a house” or “home buying” or something like that, and get a whole, big, long list of questions that your audience is asking. People that want to know how to buy a house, they’re also asking these questions.

I think a comprehensive page will answer those questions. But it can be a little bit overwhelming. There’s going to be probably a lot of questions potentially to answer. So how do you prioritize and choose which questions are the best to address on your page?

Use Keywords Everywhere to highlight keywords on a page

That’s where the Keywords Everywhere plug-in comes in handy. I use it in Chrome. You can have it highlight the keywords on the page. I think I have mine set to highlight anything that’s searched 50 or more times a month. That’s a really good way to gauge, just right off the bat you can see, okay, now there are these 10 instead of these 100 questions to potentially answer on my page.

So examples of questions here, I have questions like: Can I afford this? Is now the right time to buy? So you can kind of fit those into your page and answer those questions.
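The prioritization step above, keeping only questions searched 50 or more times a month, is a simple filter. A sketch with invented volumes, since the real numbers would come from the Keywords Everywhere overlay:

```python
# Question keywords with hypothetical monthly search volumes
# (in practice these come from a tool like Keywords Everywhere).
questions = {
    "can i afford to buy a house": 320,
    "is now a good time to buy a house": 170,
    "can i buy a house after bankruptcy": 90,
    "can i buy a house with a llama": 10,
}

MIN_MONTHLY_VOLUME = 50  # the threshold mentioned in the transcript

worth_answering = [q for q, vol in questions.items() if vol >= MIN_MONTHLY_VOLUME]
print(worth_answering)
```

That cuts the hundred-question brainstorm down to the handful actually worth a section on the page.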

5. Trends

Then finally here I have trends. I think this is a really commonly missed step. It’s important to remember that a lot of terms have seasonality attached to them. So what I did with this query, I queried “buy a house,” and I wanted to see if there were any trends for home buying-type of research queries in Google Trends. I zoomed out to five years to see if I could see year-over-year if there were any trends that emerged.

That was totally the case. When people are searching “buy a house,” it’s at its peak kind of around January into spring, and then in the summer it starts to dive, and then it’s at its lowest during the holidays. That kind of shows you that people are researching at the beginning of the year. They’re kind of probably moving into their house during the summertime, and then during the holidays they’ve had all the time to move in and now they’re just enjoying the holidays. That’s kind of the trend flow that it follows. That’s really key information, if you’re going to build a comprehensive page, to kind of understand that there’s seasonality attached with your term.
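If you export the Trends data (Google Trends offers a CSV download), finding the peak and trough months takes one line each. The interest values below are invented to mirror the pattern described, not real Trends output:

```python
# Hypothetical Google Trends interest-over-time values for "buy a house",
# shaped like the pattern described: peak around January, trough in the holidays.
interest = {
    "Jan": 95, "Feb": 92, "Mar": 88, "Apr": 85, "May": 80, "Jun": 72,
    "Jul": 65, "Aug": 60, "Sep": 55, "Oct": 50, "Nov": 42, "Dec": 38,
}

peak = max(interest, key=interest.get)    # month with the highest interest
trough = min(interest, key=interest.get)  # month with the lowest interest
print(f"Peak: {peak}, trough: {trough}")  # Peak: Jan, trough: Dec
```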

Because I know now that there’s seasonality with my term, I can incorporate information like what are the pros and cons of buying in peak season versus off-season for buying a house. Maybe what’s the best time of year to buy. Those are, again, other ideas for things that you can incorporate on your page to make it more comprehensive.

This page is not comprehensive. I didn’t have enough room to fit some things. So you don’t just stop at this phase. If you’re really building a comprehensive page on this topic, don’t stop where I stopped. But this is kind of just an example of how to go about thinking through what Google means when they say make a page comprehensive. It’s going to mean something different depending on your query and just keep that in mind. Just think about the query, think about what your audience wanted based on what they searched, and you’ll be off to a great start building a comprehensive page.

I hope that was helpful. If you have any ideas for building your own comprehensive page, how you do that, maybe how it differs in different industries that you’ve worked in, pop it in the comments. That would be really good for us to share that information. Come back again next week for another edition of Whiteboard Friday.


What SEOs Can Learn from AdWords – Whiteboard Friday

Posted by DiTomaso

Organic and paid search aren’t always at odds; there are times when there’s benefit in knowing how they work together. Taking the time to learn the ins and outs of AdWords can improve your rankings and on-site experience. In today’s edition of Whiteboard Friday, our fabulous guest host Dana DiTomaso explains how SEOs can improve their game by taking cues from paid search.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, my name is Dana DiTomaso. I’m President and Partner at Kick Point, and one of the things that we do at Kick Point is we do both SEO and paid. One of the things that’s really useful is when SEO and paid work together. But what’s even better is when SEOs can learn from paid to make their stuff better.

One of the things that is great about AdWords or Google Ads — whenever you’re watching this, it may be called one thing or the other — is that you can learn a lot from what has a high click-through rate, what performs well in paid, and paid is way faster than waiting for Google to catch up to the awesome title tags you’ve written or the new link building that you’ve done to see how it’s going to perform. So I’m going to talk about four things today that you can learn from AdWords, and really these are easy things to get into in AdWords.

Don’t be intimidated by the interface. You can probably just get in there and look at it yourself, or talk to your AdWords person. I bet they’d be really excited that you know what a callout extension is. So we’re going to start up here.

1. Negative keywords

The first thing is negative keywords. Negative keywords, obviously really important. You don’t want to show up for things that you shouldn’t be showing up for.

Often when we need to take over an AdWords account, there aren’t a lot of negative keywords. But if it’s a well-managed account, there are probably lots of negatives that have been added there over time. What you want to look at is if there’s poor word association. So in your industry, cheap, free, jobs, and then things like reviews and coupons, if these are really popular search phrases, then maybe this is something you need to create content for or you need to think about how your service is presented in your industry.

Then what you can do to change that is to see if there’s something different that you can do to present this kind of information. What are the kinds of things your business doesn’t want? Are you definitely not saying these things in the content of your website? Or is there a way that you can present the opposite opinion to what people might be searching for, for example? So think about that from a content perspective.

2. Title tags and meta descriptions

Then the next thing are title tags and meta descriptions. Title tags and meta descriptions should never be a write it once and forget it kind of thing. If you’re an on-it sort of SEO, you probably go in every once in a while and try to tweak those title tags and meta descriptions. But the problem is that sometimes there are just some that aren’t performing. So go into Google Search Console, find the title tags that have low click-through rate and high rankings, and then think about what you can do to test out new ones.

Then run an AdWords campaign and test out those title tags in the title of the ad. Test out new ad copy — that would be your meta descriptions — and see what actually brings a higher click-through rate. Then whichever one does, ta-da, that’s your new title tags and your meta descriptions. Then add those in and then watch your click-through rate increase or decrease.
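Picking the winner from that kind of ad test comes down to comparing click-through rates. A minimal sketch with hypothetical variants and numbers:

```python
# Hypothetical results from an ad test of two title/description variants.
variants = [
    {"title": "Guide to Buying a House", "clicks": 120, "impressions": 4000},
    {"title": "Buying a House: Step-by-Step Guide", "clicks": 210, "impressions": 4100},
]

# CTR = clicks / impressions for each variant.
for v in variants:
    v["ctr"] = v["clicks"] / v["impressions"]

winner = max(variants, key=lambda v: v["ctr"])
print(winner["title"])  # the variant worth promoting to your title tag
```

In practice you’d also want enough impressions per variant for the difference to be statistically meaningful, not just numerically larger.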

Make sure to watch those rankings, because obviously title tag changes can have an impact on your rankings. But if it’s something that’s keyword rich, that’s great. I personally like playing with meta descriptions, because I feel like meta descriptions have a bigger impact on that click-through rate than title tags do, and it’s something really important to think about how are we making this unique so people want to click on us. The very best meta description I’ve ever seen in my life was for an SEO company, and they were ranking number one.

They were obviously very confident in this ranking, because it said, “The people above me paid. The people below me aren’t as good as me. Hire me for your SEO.” I’m like, “That’s a good meta description.” So think about what you can do to bring that brand voice and your personality into those titles and meta descriptions, and test it out with ads first to see what’s going to resonate with your audience. Don’t just think about click-through rate for these ads.

Make sure that you’re thinking about conversion rate. If you have a really long sales cycle, make sure those leads that you’re getting are good, because what you don’t want to have happen is have an ad that people click on like crazy, they convert like crazy, and then the customers are just a total trash fire. You really want to make sure you’re driving valuable business through this kind of testing. So this might be a bit more of a longer-term piece for you.

3. Word combinations

The third thing you can look at are word combinations.

So if you’re not super familiar with AdWords, you may not be familiar with the idea of broad match modifier. In AdWords we have broad match keywords, “recipes,” for example, and then anything related to the word “recipe” can show up. But you could put a phrase in quotes. You could say “chili recipes.” Then if someone searches, “I would like chili recipes,” your ad would come up, because the phrase appears intact.

If they search “chili crockpot recipes,” it would not come up, because there’s a word in the middle of the phrase. Now if you had +chili +recipes, then any search containing both “chili” and “recipes,” in any order, would come up, which can be really useful. If you have a lot of different keyword combinations and you don’t have time for that, you can use broad match modifier to capture a lot of them. But then you have to have a good negative keyword list, speaking as an AdWords person for a second.
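The distinction between these match types can be sketched as two small predicates. This is a simplification of real AdWords matching, which also handles close variants like plurals and misspellings, and the queries here are made up:

```python
def phrase_match(keyword: str, query: str) -> bool:
    """Phrase match: the keyword must appear as a contiguous phrase, in order."""
    return f" {keyword} " in f" {query} "

def broad_match_modifier(keyword: str, query: str) -> bool:
    """BMM (+chili +recipes): every keyword term must appear, in any order."""
    terms = query.split()
    return all(word in terms for word in keyword.split())

print(phrase_match("chili recipes", "i would like chili recipes"))      # True
print(phrase_match("chili recipes", "chili crockpot recipes"))          # False
print(broad_match_modifier("chili recipes", "chili crockpot recipes"))  # True
```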

Now one of the things that can really come out of broad match modifier are a lot of great, new content ideas. If you look at the keywords that people had impressions from or clicks from as a result of these broad match modifier keywords, you can find the strangest phrasing that people come up with. There are lots of crazy things that people type into Google. We all know this, especially if it’s voice search and it’s obviously voice search.

One of the fun things to do is look and see if anybody has “okay Google” and then the search phrase, because they said “okay Google” twice and then Google searched “okay Google” plus the phrase. That’s always fun to pick up. But you can also pick up lots of different content ideas, and this can help you modify poorly performing content for example. Maybe you’re just not saying the thing in the way in which your audience is saying it.

AdWords gives you totally accurate data on what your customers are thinking and feeling and saying and searching. So why not use that kind of data? So definitely check out broad match modifier stuff and see what you can do to make that better.

4. Extensions

Then the fourth thing is extensions. So extensions are those little snippets that can show up under an ad.

You should always have all of the extensions loaded in, and then maybe Google picks some, maybe they won’t, but at least they’re there as an option. Now one thing that’s great is sitelink and callout extensions. Sitelinks are the little links like “free trial” that people click on, or “find out more information” or “menu” or whatever it might be, while callouts are short, non-clickable snippets alongside them. Testing the language in those extensions can help you with your call-to-action buttons.

Especially if you’re thinking about things like people want to download a white paper, well, what’s the best way to phrase that? What do you want to say for things like a submit button for your newsletter or for a contact form? Those little, tiny pieces, that are called micro-copy, what can you do by taking your highest performing callout extensions and then using those as your call-to-action copy on your website?

This is really going to improve your lead click-through rate. You’re going to improve the way people feel about you, and you’re going to have that really nice consistency between the language that you see in your advertising and the language that you have on your website, because one thing you really want to avoid as an SEO is to get into that silo where this is SEO and this is AdWords and the two of you aren’t talking to each other at all and the copy just feels completely disjointed between the paid side and the organic side.

It should all be working together. So by taking the time to understand AdWords a little bit, getting to know it, getting to know what you can do with it, and then using some of that information in your SEO work, you can improve your on-site experience as well as rankings, and your paid person is probably going to appreciate that you talked to them for a little bit.

Thanks.


YouTube SEO: Top Factors to Invest In – Whiteboard Friday

Posted by randfish

If you have an audience on YouTube, are you doing everything you can to reach them? Inspired by a large-scale study from Justin Briggs, Rand covers the top factors to invest in when it comes to YouTube SEO in this week’s episode of Whiteboard Friday.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about YouTube SEO. I was lucky enough to be speaking at the SearchLove conference down in San Diego a little while ago, and Justin Briggs was there presenting on YouTube SEO and on a very large-scale study he had conducted covering, I think, 100,000 different video rankings in YouTube’s search engine, as well as the performance of many thousands of channels and individual videos on YouTube.

Justin came up with some fascinating results. I’ve called them out here @JustinBriggs on Twitter, and his website is Briggsby.com. You can find this study, including an immense amount of data, there. But I thought I would try and sum up some of the most important points that he brought up and some of the conclusions he came to in his research. I do urge you to check out the full study, especially if you’re doing YouTube SEO.

5 crucial elements for video ranking success

So first off, there are some crucial elements for video ranking success. Now video ranking success, what do we mean by that? We mean if you perform a search query in YouTube for a specific keyword, and not necessarily a branded one, what are the things that will come up? So sort of like the same thing we talk about when we talk about Google success ranking factors, these are success factors for YouTube. That doesn’t necessarily mean that these are the things that will get you the most possible views. In fact, some of them work the other way.

1. Video views and watch time

First off, video views and watch time. So it turns out these are both very well correlated and, in Justin’s opinion, probably causal with higher rankings. So if you have a video and you’re competing against a competitor’s video and you get more views and a greater amount of watch time on average per view — so that’s how many people make it through a greater proportion of the video itself — you tend to do better than your competitors.

2. Keyword matching the searcher’s query in the title

Number two, keyword matching is still more important, we think, on YouTube than it is in classic Google search. That’s not to say it’s not important in classic Google, but in YouTube it’s an even bigger factor. Essentially what Justin’s data showed is that video titles exactly matching the keyword phrase tended to outperform partial matches by a little bit, and partial matches outperformed titles with none or only some of the keywords by a considerable margin.

So if you’re trying to rank your video for what pandas eat and your video is called “What Pandas Eat,” that’s going to do much better than, for example, “Panda Consumption Habits” or “Panda Food Choices.” So describe and name your video in the same way that searchers are searching, and you can get intel on how searchers are using YouTube.

You can also use the data that comes back from Google keyword searches: if videos appear at the top of Google’s results for a keyword, that means there’s probably a lot of demand for it on YouTube as well.

3. Shorter titles (<50 characters) with keyword-rich descriptions

Next up, shorter titles, less than 50 characters, with keyword-rich descriptions between 200 and 350 words tended to perform best in this dataset.

So if you’re looking for guidelines around how big should I make my YouTube title, how big should I make my description, that’s generally probably some best practices. If you leak over a little bit, it’s not a huge deal. The curve doesn’t fall off dramatically. But certainly staying around there is a good idea.

4. Keyword tags

Number four, keyword tags. So YouTube will let you apply keyword tags to a video.

This is something that used to exist in Google SEO decades ago with the meta keywords tag. It still does exist in YouTube. These keyword tags seem to matter a little for rankings, but they seem to matter more for the recommended videos. So those recommended videos are sort of what appear on the right-hand side of the video player if you’re in a desktop view or below the video on a mobile player.

Those recommended videos are also kind of what plays next when you keep watching. So both of those figure prominently into earning you more views, which can then help your rankings, of course. So use keyword tags in two-to-three-word phrases; the videos that Justin’s dataset saw performing best were those with 31 to 40 unique tags, which is a pretty hefty number.

That means folks are going through and they’re taking their “What Pandas Eat” and they’re tagging it with pandas, zoo animals, mammals, and they might even be tagging it with marsupials — though pandas aren’t actually marsupials — but those kinds of things. So they’re adding a lot of different tags on there, 31 to 40, and those tended to do the best.

So if you’re worried that adding too many keyword tags can hurt you, maybe it can, but not up until you get to a pretty high limit here.
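As a rough way to apply these numbers (along with the title and description guidelines above), you could sketch a small metadata checker. To be clear, the thresholds below are just the correlations from Justin’s dataset, not YouTube rules, and the function is my own invention for illustration:

```python
def check_video_metadata(title, description, tags):
    """Flag metadata falling outside the ranges that Briggs's study
    found correlated with better YouTube rankings. These are dataset
    correlations, not official YouTube limits."""
    warnings = []
    if len(title) >= 50:
        warnings.append("title is %d chars; under 50 performed best" % len(title))
    word_count = len(description.split())
    if not 200 <= word_count <= 350:
        warnings.append("description is %d words; 200-350 performed best" % word_count)
    unique_tags = {t.strip().lower() for t in tags if t.strip()}
    if not 31 <= len(unique_tags) <= 40:
        warnings.append("%d unique tags; 31-40 performed best" % len(unique_tags))
    return warnings
```

Running it over a video like “What Pandas Eat” with a 250-word description and 35 tags would come back clean; a 60-character title with a two-line description and one tag would raise all three warnings.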

5. Certain video lengths perform and rank well

Number five, the videos that perform best — I like that this correlates with how Whiteboard Fridays do well as well — 10 to 16 minutes in length tend to do best in the rankings. Under two minutes in length tend to be very disliked by YouTube’s audience. They don’t perform well. Four to six minutes get the most views. So it depends on what you’re optimizing for. At Whiteboard Friday, we’re trying to convey information and make it useful and interesting and valuable. So we would probably try and stick to 10 to 16 minutes. But if we had a promotional video, for example, for a new product that we were launching, we might try and aim for a four to six minute video to get the most views, the most amplification, the most awareness that we possibly could.

3 takeaways of interest

Three other takeaways of interest that I just found potentially valuable.

Older videos do better on average, but new videos get a boost

One is older videos on average tend to do better in the rankings, but new videos get a boost when they initially come out. So in the dataset, Justin created a great graph that looks like this — zero to two weeks after a video is published, two to six weeks, six to twelve weeks, and after a year, and there are a few other ones in here.

But you can see the slope of this curve follows this concept that there’s a fresh boost right here in those first two to six weeks, and it’s strongest in the first zero to two weeks. So if you are publishing regularly and you sort of have that like, “Oh, this video didn’t hit. Let me try again. This video didn’t hit. Oh, this one got it. This nailed what my audience was looking for. This was really powerful.” That seems to do quite well.

Channels help boost their videos

Channels are something Justin looked deeply into. I haven’t covered it much here, but he looked into channel optimization a lot. Channels do help boost their individual videos, with things like subscribers who comment and like and have a higher watch time on average than videos that are disconnected from subscribers. He noted that about 1,000 or more subscriptions is a really good target to start to benefit from the metrics that a good subscriber base can bring. These tend to have a positive impact on views and also on rankings, although whether that’s causal or merely correlated is hard to say.

Embeds and links are correlated, but unsure if causal

Again on the correlation but not causation, embeds and links. So the study looked at the rankings, higher rankings up here and lower rankings down there, versus embeds.

Videos that received more embeds (that is, they were embedded on more websites) did tend to perform better. But through experimentation, we’re not quite clear whether we can prove that embedding a video a lot increases its rankings. So it could just be that as something ranks well and gets picked up a lot, many people embed it, rather than many embeds leading to better rankings.

All right, everyone, if you’re producing video, which I probably recommend that you do if video is ranking in the SERPs that you care about or if your audience is on YouTube, hopefully this will be helpful, and I urge you to check out Justin’s research. We’ll see you again next week for another edition of Whiteboard Friday. Take care.


The Difference Between URL Structure and Information Architecture – Whiteboard Friday

Posted by willcritchlow

Questions about URL structure and information architecture are easy to get confused, but it’s an important distinction to maintain. IA tends to be more impactful than URL decisions alone, but advice given around IA often defaults to suggestions on how to best structure your URLs. In this Whiteboard Friday, Will Critchlow helps us distinguish between the two disparate topics and shares some guiding questions to ask about each.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hi, everyone. Welcome to a British Whiteboard Friday. My name is Will Critchlow. I’m one of the founders of Distilled, and I wanted to go back to some basics today. I wanted to cover a little bit of the difference between URL structure and information architecture, because I see these two concepts unfortunately mixed up a little bit too often when people are talking about advice that they want to give.

I’m thinking here particularly from an SEO perspective. So there is a much broader study of information architecture. But here we’re thinking really about: What do the search engines care about, and what do users care about when they’re searching? So we’ll link some basics about things like what is URL structure, but we’re essentially talking here about the path, right, the bit that comes after the domain www.example.com/whatever-comes-next.

There’s a couple of main ways of structuring your URL. You can have a subfolder type of structure, or a much flatter structure where everything is collapsed into one level. There are pros and cons to the different ways of doing this stuff, and there’s a ton of advice. You’re generally trading off competing considerations: in general, it’s better to have shorter URLs than longer ones, but it’s also better, on average, to have your keyword in the URL than not.

These are in tension. So there’s a little bit of art that goes into structuring good URLs. But too often I see people, when they’re really trying to give information architecture advice, ending up talking about URL structure, and I want to just kind of tease those things apart so that we know what we’re talking about.

So I think the confusion arises because both of them can involve questions around which pages exist on my website and what hierarchies are there between pages and groups of pages.

URL questions

So what pages exist is clearly a URL question at some level. Literally if I go to /shoes/womens, is that a 200 status? Is that a page that returns things on my website? That is, at its basics, a URL question. But zoom out a little bit and say what are the set of pages, what are the groups of pages that exist on my website, and that is an information architecture question, and, in particular, how they’re structured and how those hierarchies come together is an information architecture question.

But it’s muddied by the fact that there are hierarchy questions in the URL. So when you’re thinking about your red women’s shoes subcategory page on an e-commerce site, for example, you could structure that in a flat way like this or in a subfolder structure. That’s just a pure URL question. But it gets muddied with the information architecture questions, which we’ll come on to.

I think probably one of the key ones that comes up is: Where do your detail-level pages sit? So on an e-commerce site, imagine a product page. You could have just /product-slug. Ideally that would have some kind of descriptive keywords in it, rather than just being an anonymous number. But you can have it just in the root like this, or you can put it in a subfolder, the category it lives in.

So if this is a pair of red women’s shoes, then you could have it in /shoes/women/red slug, for example. There are pros and cons of both of these. I’m not going to get deep into it, but in general the point is you can make any of these decisions about your URLs independent of your information architecture questions.
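To make that independence concrete, here’s a toy sketch of the two URL shapes for that same red women’s shoes product. The slugs and paths are invented for illustration; the point is that the same page, in the same place in the site’s hierarchy, can take either shape:

```python
def flat_url(product_slug):
    # Flat structure: every product lives in the root.
    return "/" + product_slug

def subfolder_url(product_slug, categories):
    # Subfolder structure: the product is nested under its category path.
    return "/" + "/".join(categories + [product_slug])

# The same detail-level page, two valid URL decisions:
flat_url("red-womens-shoes")                            # "/red-womens-shoes"
subfolder_url("red-womens-shoes", ["shoes", "womens"])  # "/shoes/womens/red-womens-shoes"
```

Whichever shape you pick, the information architecture questions below are unchanged.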

Information architecture questions

Let’s talk about the information architecture, because these are actually, in general, the more impactful questions for your search performance. So these are things like, as I said at the beginning, it’s essentially what pages exist and what are their hierarchies.

  • How many levels of category and subcategory should we have on our website?
  • What do we do in our faceted navigation?
  • Do we go two levels deep?
  • Do we go three levels deep?
  • Do we allow all those pages to be crawled and indexed?
  • How do we link between things?
  • How do we link between the sibling products that are in the same category or subcategory?
  • How do we link back up the structure to the parent subcategory or category?
  • Crucially, how do we build good link paths out from the big, important pages on our website, like our homepage or major category pages?
  • What’s the link path that you can follow by clicking multiple links from there to get to detail level for every product on your website?

Those kind of questions are really impactful. They make a big difference, on an SEO front, both in terms of crawl depth, so literally a search engine spider coming in and saying, “I need to discover all these pages, all these detail-level pages on your website.” So what’s the click depth and crawl path out from those major pages?
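The click-depth part of that question can be audited with a simple breadth-first traversal over your internal link graph. This is a minimal sketch, assuming you’ve already crawled the site into a mapping of each URL to the URLs it links to (the URLs here are made up):

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first walk of an internal link graph. Returns the minimum
    number of clicks from `start` to every reachable page, which is a
    rough proxy for how deep a crawler has to go to find each page.
    `links` maps each URL to the list of URLs it links to."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Homepage -> category -> subcategory -> product: the product sits
# three clicks from the homepage in this toy graph.
links = {
    "/": ["/shoes"],
    "/shoes": ["/shoes/womens"],
    "/shoes/womens": ["/product-1"],
}
click_depths(links)  # {"/": 0, "/shoes": 1, "/shoes/womens": 2, "/product-1": 3}
```

Pages that never appear in the result, or that only appear at a very large depth, are the ones your linking structure is failing.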

Think about link authority and your link paths

It’s also a big factor in a link authority sense. Your internal linking structure is how your PageRank and other link metrics get distributed out around your website, and so it’s really critical that you have these great linking paths down into the products, between important products, and between categories and back up the hierarchy. How do we build the best link paths from our important pages down to our detail-level pages and back up?

Make your IA decisions before your URL structure decisions

After you have made whatever IA decisions you like, then you can independently choose your preferred URLs for each page type.

These are SEO information architecture questions, and the critical thing to realize is that you can make all of your information architecture decisions — which pages exist, which subcategories we’re going to have indexed, how we link between sibling products, all of this linking stuff — and then, independently of whatever decisions we made, we can choose any of the URL structures we like for what those actual pages’ paths are, what the URLs are for those pages.

We need to not get those muddied, and I see that getting muddied too often. People talk about these decisions as if they’re information architecture questions, and they make them first, when actually you should be making these decisions first and then picking the best, like I said, it’s a bit more art than science sometimes to making the decision between longer URLs, more descriptive URLs, or shorter URL paths.

So I hope that’s been a helpful intro to a basic topic. I’ve written a bunch of this stuff up in a blog post, and we’ll link to that. But yeah, I’ve enjoyed this Whiteboard Friday. I hope you have too. See you soon.


How Do Sessions Work in Google Analytics? – Whiteboard Friday

Posted by Tom.Capper

One of these sessions is not like the other. Google Analytics data is used to support tons of important work, ranging from our everyday marketing reporting all the way to investment decisions. To that end, it’s integral that we’re aware of just how that data works.

In this week’s edition of Whiteboard Friday, we welcome Tom Capper to explain how the sessions metric in Google Analytics works, several ways that it can have unexpected results, and as a bonus, how sessions affect the time on page metric (and why you should rethink using time on page for reporting).

How do sessions work in Google Analytics?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hello, Moz fans, and welcome to another edition of Whiteboard Friday. I am Tom Capper. I am a consultant at Distilled, and today I’m going to be talking to you about how sessions work in Google Analytics. Obviously, all of us use Google Analytics. Pretty much all of us use Google Analytics in our day-to-day work.

Data from the platform is used these days in everything from investment decisions to press reporting to the actual marketing that we use it for. So it’s important to understand the basic building blocks of these platforms. Up here I’ve got the absolute basics. So in the blue squares I’ve got hits being sent to Google Analytics.

So when you first put Google Analytics on your site, you get that bit of tracking code, you put it on every page, and what that means is when someone loads the page, it sends a page view. So those are the ones I’ve marked P. So we’ve got page view and page view and so on as you’re going around the site. I’ve also got events with an E and transactions with a T. Those are two other hit types that you might have added.

The job of Google Analytics is to take all this hit data that you’re sending it and try and bring it together into something that actually makes sense as sessions. So they’re grouped into sessions that I’ve put in black, and then if you have multiple sessions from the same browser, then that would be a user that I’ve marked in pink. The issue here is it’s kind of arbitrary how you divide these up.

These eight hits could be one long session. They could be eight tiny ones or anything in between. So I want to talk today about the different ways that Google Analytics will actually split up those hit types into sessions. So over here I’ve got some examples I’m going to go through. But first I’m going to go through a real-world example of a brick-and-mortar store, because I think that’s what they’re trying to emulate, and it kind of makes more sense with that context.

Brick-and-mortar example

So in this example, say a supermarket, we enter via passing trade. That’s going to be our source. Then our entrance is the lobby of the supermarket as we walk in. We go from there to the beer aisle to the cashier, or at least I do. So that’s one big, long session with the source of passing trade. That makes sense.

In the case of a brick-and-mortar store, it’s not too difficult to divide that up and decide how many sessions are going on here. There’s not really any ambiguity. In the case of websites, when you have people leaving their keyboard for a while or leaving the computer on while they go on holiday or just having the same computer over a period of time, it becomes harder to divide things up, because you don’t know when people are actually coming and going.

So what they’ve tried to do is in the very basic case something quite similar: arrive by Google, category page, product page, checkout. Great. We’ve got one long session, and the source is Google. Okay, so what are the different ways that that might go wrong or that that might get divided up?

Several things that can change the meaning of a session

1. Time zone

The first and possibly most annoying one, although it doesn’t tend to be a huge issue for most sites, is that whatever time zone you’ve set in your Google Analytics settings, midnight in that time zone can break up a session. So say we’ve got midnight here. This is 12:00 at night, and we happen to be browsing. We’re doing some shopping quite late.

Because Google Analytics won’t allow a session to have two dates, this is going to be one session with the source Google, and this is going to be one session and the source will be this page. So this is a self-referral unless you’ve chosen to exclude that in your settings. So not necessarily hugely helpful.

2. Half-hour cutoff for “coffee breaks”

Another thing that can happen is you might go and make a cup of coffee. So ideally, if you went and had a cup of coffee while you’re in Tesco or a supermarket that’s popular in whatever country you’re from, you might want to consider that one long session. Google has made the executive decision that we’re actually going to have a cutoff of half an hour by default.

If you leave for half an hour, then again you’ve got two sessions. One, the category page is the landing page and the source of Google, and one in this case where the blog is the landing page, and this would be another self-referral, because when you come back after your coffee break, you’re going to click through from here to here. This time period, the 30 minutes, that is actually adjustable in your settings, but most people do just leave it as it is, and there isn’t really an obvious number that would make this always correct either. It’s kind of, like I said earlier, an arbitrary distinction.
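Those first two rules, the midnight boundary and the 30-minute inactivity timeout, can be sketched as a small sessionizer. This is a simplification of what Google Analytics actually does (real GA also starts a new session when the traffic source changes, as the next section covers), and the hit timestamps are assumed to already be in the view’s time zone:

```python
from datetime import datetime, timedelta

def sessionize(hits, timeout=timedelta(minutes=30)):
    """Group a chronologically sorted list of hit timestamps into
    sessions the way GA does by default: a new session starts after
    30 minutes of inactivity or when the date changes (midnight in
    the view's time zone). A sketch, not GA's full logic."""
    sessions = []
    for hit in hits:
        last = sessions[-1][-1] if sessions else None
        if (last is not None
                and hit - last < timeout
                and hit.date() == last.date()):
            sessions[-1].append(hit)      # continue the current session
        else:
            sessions.append([hit])        # timeout or midnight: new session
    return sessions

# Shopping late: 23:50 and 23:55 stay together, but the 00:05 hit
# lands on a new date, so it opens a second session.
late = [datetime(2019, 1, 1, 23, 50),
        datetime(2019, 1, 1, 23, 55),
        datetime(2019, 1, 2, 0, 5)]
len(sessionize(late))  # 2
```

A 40-minute coffee break between two hits splits them the same way, because the gap exceeds the 30-minute timeout.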

3. Leaving the site and coming back

The next issue I want to talk about is if you leave the site and come back. So obviously it makes sense that if you enter the site from Google, browse for a bit, and then enter again from Bing, you might want to count that as two different sessions with two different sources. However, where this gets a little murky is with things like external payment providers.

If you had to click through from the category page to PayPal to the checkout, then unless PayPal is excluded from your referral list, this would be two sessions: one with an entrance from Google, and one with an entrance at the checkout. The last issue I want to talk about is not necessarily a way that sessions are divided, but a quirk of how they are.

4. Return direct sessions

If you were to enter by Google to the category page, go on holiday and then use a bookmark or something or just type in the URL to come back, then obviously this is going to be two different sessions. You would hope that it would be one session from Google and one session from direct. That would make sense, right?

But instead, what actually happens is that, because Google Analytics, in most of its reports, uses last non-direct click attribution, we pass that source all the way through over here, so you’ve got two sessions from Google. Again, you can change this timeout period. So those are some ways that sessions work that you might not expect.

As a bonus, I want to give you some extra information about how this affects a certain metric, mainly because I want to persuade you to stop using it, and that metric is time on page.

Bonus: Three scenarios where this affects time on page

So I’ve got three different scenarios here that I want to talk you through, and we’ll see how the time on page metric works out.

I want you to bear in mind that, basically, because Google Analytics really has very little data to work with typically, they only know that you’ve landed on a page, and that sent a page view and then potentially nothing else. If you were to have a single page visit to a site, or a bounce in other words, then they don’t know whether you were on that page for 10 seconds or the rest of your life.

They’ve got no further data to work with. So what they do is they say, “Okay, we’re not going to include that in our average time on page metrics.” So we’ve got the formula of time divided by (views minus exits). However, this fudge has some really unfortunate consequences. So let’s talk through these scenarios.

Example 1: Intuitive time on page = actual time on page

In the first scenario, I arrive on the page. It sends a page view. Great. Ten seconds later I trigger some kind of event that the site has added. Twenty seconds later I click through to the next page on the site. In this case, everything is working as intended in a sense, because there’s a next page on the site, so Google Analytics has that extra data of another page view 20 seconds after the first one. So they know that I was on here for 20 seconds.

In this case, the intuitive time on page is 20 seconds, and the actual time on page is also 20 seconds. Great.

Example 2: Intuitive time on page is higher than measured time on page

However, let’s think about this next example. We’ve got a page view, event 10 seconds later, except this time instead of clicking somewhere else on the site, I’m going to just leave altogether. So there’s no data available, but Google Analytics knows we’re here for 10 seconds.

So the intuitive time on page here is still 20 seconds. That’s how long I actually spent looking at the page. But the measured time or the reported time is going to be 10 seconds.

Example 3: Measured time on page is zero

The last example, I browse for 20 seconds. I leave. I haven’t triggered an event. So we’ve got an intuitive time on page of 20 seconds and an actual time on page or a measured time on page of 0.

The interesting bit is when we then come to calculate the average time on page for this page that appeared here, here, and here. You would initially hope it would be 20 seconds, because that’s how long we actually spent. Your next guess, looking at the reported data Google Analytics actually has for how long we were on these pages, would be the average of those three numbers: 10 seconds.

So that would make some sense. What they actually do, because of this formula, is end up with 30 seconds. You’ve got the total time here, which is 30, divided by the number of views, 3, minus the 2 exits. Thirty divided by (3 minus 2) is 30 divided by 1, so we’ve got 30 seconds as the average across these 3 sessions.

Well, the average across these three page views, rather. That’s longer than any of them individually, and it doesn’t make any sense given the constituent data. So that’s just one final tip: please don’t use average time on page as a reporting metric.
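To see where that 30 comes from, here is the formula from earlier, time divided by (views minus exits), applied to the three scenarios in a few lines of Python. This is a sketch of the calculation, not Google Analytics’ actual internals:

```python
def avg_time_on_page(page_views):
    """GA's average-time-on-page fudge: total measured time divided by
    (views minus exits). Each entry is (measured_seconds, is_exit),
    where measured_seconds is the time GA could attribute to the view
    (0 when the visitor left without triggering any later hit) and
    is_exit marks views with no following pageview."""
    total_time = sum(seconds for seconds, is_exit in page_views)
    views = len(page_views)
    exits = sum(1 for seconds, is_exit in page_views if is_exit)
    return total_time / (views - exits)

# The three scenarios: 20s then a click-through (measured 20s, not an
# exit), 10s event then leaving (measured 10s, exit), and leaving with
# no event at all (measured 0s, exit).
scenarios = [(20, False), (10, True), (0, True)]
avg_time_on_page(scenarios)  # 30 / (3 - 2) = 30.0 seconds
```

Thirty seconds, even though no individual visit was measured at more than 20, which is exactly why the metric is so misleading.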

I hope that’s all been useful to you. I’d love to hear what you think in the comments below. Thanks.


Log File Analysis 101 – Whiteboard Friday

Posted by BritneyMuller

Log file analysis can provide some of the most detailed insights about what Googlebot is doing on your site, but it can be an intimidating subject. In this week’s Whiteboard Friday, Britney Muller breaks down log file analysis to make it a little more accessible to SEOs everywhere.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we’re going over all things log file analysis, which is so incredibly important because it really tells you the ins and outs of what Googlebot is doing on your sites.

So I’m going to walk you through the three primary areas, the first being the types of logs that you might see from a particular site, what that looks like, what that information means. The second being how to analyze that data and how to get insights, and then the third being how to use that to optimize your pages and your site.

For a primer on what log file analysis is and its application in SEO, check out our article: How to Use Server Log Analysis for Technical SEO

1. Types

So let’s get right into it. There are three primary types of logs, the main one being Apache. But you’ll also see W3C and Elastic Load Balancing logs, which you might see a lot with things like Kibana. You’ll also likely come across some custom log files. So for those larger sites, that’s not uncommon. I know Moz has a custom log file system. Fastly is a custom type of setup. So just be aware that those are out there.

Log data

So what are you going to see in these logs? The data that comes in is primarily in these colored ones here.

So you will hopefully for sure see:

  • the request server IP;
  • the timestamp, meaning the date and time that this request was made;
  • the URL requested, so what page are they visiting;
  • the HTTP status code, was it a 200, did it resolve, was it a 301 redirect;
  • the user agent; for us SEOs, we’re just looking at the Googlebot user agents.

So log files traditionally house all data, all visits from individuals and traffic, but we want to analyze the Googlebot traffic. Method (Get/Post), and then time taken, client IP, and the referrer are sometimes included. So what this looks like, it’s kind of like glibbery gloop.

It’s a word I just made up, and it just looks like that. It’s just like bleh. What is that? It looks crazy. It’s a new language. But essentially you’ll likely see that IP, so that red IP address, that timestamp, which will commonly look like that, that method (get/post), which I don’t completely understand or necessarily need to use in some of the analysis, but it’s good to be aware of all these things, the URL requested, that status code, all of these things here.
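To make the glibbery gloop concrete, here is roughly what an Apache combined-format line looks like and how the fields above can be pulled out with a regular expression. The IP and URL are invented, and real log formats vary, so treat the pattern as a starting point rather than a universal parser:

```python
import re

# A rough pattern for an Apache combined-log-format line.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

# A made-up example line of the kind you'd see in an access log:
line = ('66.249.66.1 - - [04/Mar/2019:10:15:32 +0000] '
        '"GET /blog/ HTTP/1.1" 200 5316 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

hit = LOG_PATTERN.match(line).groupdict()
# hit["ip"], hit["timestamp"], hit["method"], hit["url"], hit["status"],
# hit["referrer"], and hit["user_agent"] are now separate fields, and
# checking for "Googlebot" in hit["user_agent"] filters to bot traffic.
```

Once every line is a dictionary like this, counting top pages, top folders, and status codes is just aggregation.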

2. Analyzing

So what are you going to do with that data? How do we use it? So there’s a number of tools that are really great for doing some of the heavy lifting for you. Screaming Frog Log File Analyzer is great. I’ve used it a lot. I really, really like it. But you have to have your log files in a specific type of format for them to use it.

Splunk is also a great resource. Sumo Logic and I know there’s a bunch of others. If you’re working with really large sites, like I have in the past, you’re going to run into problems here because it’s not going to be in a common log file. So what you can do is to manually do some of this yourself, which I know sounds a little bit crazy.

Manual Excel analysis

But hang in there. Trust me, it’s fun and super interesting. So what I’ve done in the past is I will import a CSV log file into Excel, and I will use the Text Import Wizard and you can basically delineate what the separators are for this craziness. So whether it be a space or a comma or a quote, you can sort of break those up so that each of those live within their own columns. I wouldn’t worry about having extra blank columns, but you can separate those. From there, what you would do is just create pivot tables. So I can link to a resource on how you can easily do that.

Top pages

But essentially what you can look at in Excel is: Okay, what are the top pages that Googlebot hits by frequency? What are those top pages by the number of times it’s requested?

Top folders

You can also look at the top folder requests, which is really interesting and really important. On top of that, you can also look into: What are the most common Googlebot types that are hitting your site? Is it Googlebot mobile? Is it Googlebot images? Are they hitting the correct resources? Super important. You can also do a pivot table with status codes and look at that. I like to apply some of these purple things to the top pages and top folders reports. So now you’re getting some insights into: Okay, how did some of these top pages resolve? What are the top folders looking like?

You can also do that for Googlebot IPs. This is the best hack I have found with log file analysis. I will create a pivot table just with Googlebot IPs, this right here. So I will usually get, sometimes it’s a bunch of them, but I’ll get all the unique ones, and then I can go to the terminal on my computer, which is available on most standard computers.

I tried to draw it. It looks like that. But all you do is type in "host" followed by the IP address. Run that in your terminal with each IP address, and you'll see it resolve to a google.com or googlebot.com hostname. That verifies that it's indeed Googlebot and not some other crawler spoofing Google. That's something these tools tend to take care of automatically, but there are ways to do it manually too, which is just good to be aware of.
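If you'd rather script that `host` check than run it one IP at a time, here's a sketch of the idea: do the reverse lookup, check the hostname belongs to a Google-owned domain, then forward-confirm the name resolves back to the same IP so a spoofer can't just fake the PTR record. The example hostnames below are illustrative:

```python
import socket

# Domains a genuine Googlebot reverse-DNS name should end in.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname):
    """Check that a reverse-DNS hostname ends in a Google-owned domain."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip):
    """Reverse lookup (like `host <ip>`), then forward-confirm the hostname
    resolves back to the same IP, so the PTR record alone can't be spoofed."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]          # reverse DNS
    except socket.herror:
        return False
    if not hostname_is_google(hostname):
        return False
    forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward confirmation
    return ip in forward_ips

print(hostname_is_google("crawl-66-249-66-1.googlebot.com"))  # → True
print(hostname_is_google("fake.example.com"))                 # → False
```

`verify_googlebot` needs a network connection to do the actual lookups; the suffix check on its own is enough to filter an already-resolved hostname column.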

3. Optimize pages and crawl budget

All right, so how do you optimize for this data and really start to enhance your crawl budget? When I say "crawl budget," I primarily just mean the number of times Googlebot is coming to your site and the number of pages it typically crawls. So what does that crawl budget look like, and how can you make it more efficient?

  • Server error awareness: So server error awareness is a really important one. It’s good to keep an eye on an increase in 500 errors on some of your pages.
  • 404s: Valid? Referrer?: Another thing to take a look at is all the 404s that Googlebot is finding. It's so important to see: Okay, is that 404 a valid one? Does that page genuinely not exist? Or is it a page that should exist but no longer does, and that you could maybe fix? If there is an error there, or if the 404 shouldn't be there, what is the referrer? How is Googlebot finding that URL, and how can you start to clean some of those things up?
  • Isolate 301s and fix frequently hit 301 chains: 301s, so a lot of questions about 301s in these log files. The best trick that I've sort of discovered, and I know other people have discovered, is to isolate and fix the most frequently hit 301 chains. You can do that in a pivot table. It's actually a lot easier to do this when you've paired it up with crawl data, because now you have some more insights into that chain. What you can do is look at the most frequently hit 301s and see: Are there any easy, quick fixes for that chain? Is there something you can remove and quickly resolve so it's just one or two hops?
  • Mobile first: You can keep an eye on mobile-first indexing. If your site has gone mobile-first, you can dig into the logs and evaluate what that looks like. Interestingly, Googlebot is still going to identify itself as the compatible Googlebot/2.1; however, it's going to have all of the mobile indications before that parenthetical. I'm sure these tools can detect that automatically, but if you're doing some of this manually, it's good to be aware of what that looks like.
  • Missed content: So what’s really important is to take a look at: What’s Googlebot finding and crawling, and what are they just completely missing? So the easiest way to do that is to cross-compare with your site map. It’s a really great way to take a look at what might be missed and why and how can you maybe reprioritize that data in the site map or integrate it into navigation if at all possible.
  • Compare frequency of hits to traffic: This was an awesome tip I got on Twitter, and I can’t remember who said it. They said compare frequency of Googlebot hits to traffic. I thought that was brilliant, because one, not only do you see a potential correlation, but you can also see where you might want to increase crawl traffic or crawls on a specific, high-traffic page. Really interesting to kind of take a look at that.
  • URL parameters: Take a look at if Googlebot is hitting any URLs with the parameter strings. You don’t want that. It’s typically just duplicate content or something that can be assigned in Google Search Console with the parameter section. So any e-commerce out there, definitely check that out and kind of get that all straightened out.
  • Evaluate days, weeks, months: You can evaluate days, weeks, and months that it’s hit. So is there a spike every Wednesday? Is there a spike every month? It’s kind of interesting to know, not totally critical.
  • Evaluate speed and external resources: You can evaluate the speed of the requests and if there’s any external resources that can potentially be cleaned up and speed up the crawling process a bit.
  • Optimize navigation and internal links: You also want to optimize that navigation, like I said earlier, and use meta noindex where appropriate.
  • Meta noindex and robots.txt disallow: So if there are things that you don’t want in the index and if there are things that you don’t want to be crawled from your robots.txt, you can add all those things and start to help some of this stuff out as well.
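To make the 301-chain fix above concrete, here's a small sketch that collapses a hypothetical redirect map so every URL points straight at its final destination in a single hop (the paths are made up for illustration):

```python
# Hypothetical redirect map pulled from the 301 rows in your logs or crawl data.
redirects = {
    "/old-page": "/newer-page",
    "/newer-page": "/final-page",  # a two-hop chain Googlebot keeps hitting
    "/seasonal": "/final-page",
}

def resolve_chain(start, redirects, max_hops=10):
    """Follow a 301 chain to its final destination, guarding against loops."""
    seen, current = {start}, start
    while current in redirects and len(seen) <= max_hops:
        current = redirects[current]
        if current in seen:  # redirect loop detected
            return None
        seen.add(current)
    return current

# Collapse every entry so each URL 301s straight to its final target in one hop.
flattened = {src: resolve_chain(src, redirects) for src in redirects}
print(flattened["/old-page"])  # → /final-page
```

The `flattened` map is what you'd push back into your redirect rules, so the most frequently hit chains become single hops.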

Reevaluate

Lastly, it’s really helpful to connect the crawl data with some of this data. So if you’re using something like Screaming Frog or DeepCrawl, they allow these integrations with different server log files, and it gives you more insight. From there, you just want to reevaluate. So you want to kind of continue this cycle over and over again.

You want to look at what’s going on, have some of your efforts worked, is it being cleaned up, and go from there. So I hope this helps. I know it was a lot, but I want it to be sort of a broad overview of log file analysis. I look forward to all of your questions and comments below. I will see you again soon on another Whiteboard Friday. Thanks.

Video transcription by Speechpad.com


