Diagnosing Why a Site’s Set of Pages May Be Ranking Poorly – Whiteboard Friday

Posted by randfish

Your rankings have dropped and you don’t know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It’s an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process for diagnosing what caused your rankings to drop, why it happened, and how to start the recovery process.

Diagnosing why a site's pages may be ranking poorly

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about diagnosing a site and specifically a section of a site’s pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we’ve got a fairly extensive process here, so let’s get started.

Step 1: Uncover the problem

First off, our first step is uncovering the problem, or finding out whether there actually is a problem. This matters especially if you have a larger website. If we’re talking about a site that’s 20 or 30 or even a couple hundred pages, this is not a big issue. But many websites that SEOs are working on these days are thousands, tens of thousands, hundreds of thousands of pages. So what I like to urge folks to do is to

A. Treat different site sections as unique segments for investigation. You should look at them individually.

A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I’m going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let’s say I have a website that’s dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I’m ranking for fewer of them, and it’s my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.

B. Check traffic from search over time.

So I go to my Google Analytics or whatever analytics you’re using, and you might see something like, okay, I’m going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that’s a big traffic drop. We fell off a cliff there for these particular pages.

This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It’s going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it’s not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
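This kind of segmentation is easy to sketch in a few lines. The snippet below, which is just an illustration (the field names and session counts are invented, not from any real analytics export), groups organic sessions by week and top-level subfolder so a cliff in one section can’t hide inside a healthy sitewide total:

```python
from collections import defaultdict

# Toy stand-in for an analytics export (field names are assumptions):
# (ISO week, landing page, organic sessions) tuples.
rows = [
    ("2018-W01", "/cities/denver", 500),
    ("2018-W01", "/posts/hello", 300),
    ("2018-W02", "/cities/denver", 120),
    ("2018-W02", "/posts/hello", 310),
]

def section_of(path):
    """Top-level subfolder of a landing page, e.g. /cities/denver -> /cities."""
    parts = path.strip("/").split("/")
    return "/" + parts[0] if parts and parts[0] else "/"

# Sessions per (week, section). A drop in one section stands out here
# even when the sitewide total looks flat or rising.
traffic = defaultdict(int)
for week, path, sessions in rows:
    traffic[(week, section_of(path))] += sessions

for (week, section), sessions in sorted(traffic.items()):
    print(week, section, sessions)
```

In this toy data, /cities falls from 500 to 120 sessions week over week while /posts holds steady, which is exactly the pattern a sitewide chart would smooth over.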

From there, I’m going to next urge you to use Google Trends. Why? Why would I go to Google Trends? Because what I want you to do is I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the course of time where you have lost traffic, it’s almost certainly something you’ve done, not something searchers are doing. But if you see that traffic has declined, for example, maybe you were ranking really well for population data from 2015. It turns out people are now looking for population data for 2016 or ’17 or ’18. Maybe that is part of the problem, that search demand has fallen and your curve matches that.

C. Perform some diagnostic queries or use your rank tracking data if you have it on these types of things.

This is one of the reasons I like to rank track for even these types of queries that don’t get a lot of traffic.

1. Target keywords. In this case, it might be “Denver population growth,” maybe that’s one of your keywords. You would see, “Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?”

2. Check brand name plus target keyword. So, in this case, it would be my site plus the above here plus “Denver population growth,” so My Site or MySite.com Denver population growth. If you’re not ranking for that, that’s usually an indication of a more serious problem, potentially a penalty or some type of dampening that’s happening around your brand name or around your website.

3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter. It could be only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page and search for it, word for word, in Google, not in quotes. I do not want to use quotes here, and I want to see how it performs. This might be several lines of text here.

4. Look for a 10 to 20-word text string with quotes. So those same lines of text, but searched in Google in quotes. If I’m not ranking for the unquoted version but I am for the quoted one, I might surmise this is probably not duplicate content. It’s probably something to do with my content quality or maybe my link profile, or Google has penalized or dampened me in some way.

5. site: urlstring/ So I would search for “site:MySite.com/cities/Denver.” I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it’s been a month. I wonder why they haven’t come back. Maybe there’s some sort of crawl issue, robots.txt issue, meta robots issue, something. I’m preventing Google from potentially getting there. Or maybe they can’t get there at all, and this results in zero results. That means Google hasn’t even indexed the page. Now we have another type of problem.
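Since these five searches follow a fixed pattern, you can generate them mechanically for any page you’re diagnosing. Here’s a small sketch (the function name and inputs are made up for illustration):

```python
# Sketch: build the five diagnostic searches above for one page.
# All inputs (brand, keyword, snippet, URL) are hypothetical examples.
def diagnostic_queries(brand, keyword, snippet, url):
    return [
        keyword,                  # 1. target keyword
        f"{brand} {keyword}",     # 2. brand name + target keyword
        snippet,                  # 3. on-page text string, no quotes
        f'"{snippet}"',           # 4. same string, in quotes
        f"site:{url}",            # 5. indexation check
    ]

for q in diagnostic_queries(
    "MySite.com",
    "Denver population growth",
    "Denver grew faster than any other large city in the region",
    "MySite.com/cities/denver",
):
    print(q)
```

Feeding the output into your rank tracker, or just running the searches by hand, covers all five checks in order.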

D. Check your tools.

1. Google Search Console. I would start there, especially in the site issues section.

2. Check your rank tracker or whatever tool you’re using, whether that’s Moz or something else.

3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you’ve run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.

4. Site uptime. So I might check Pingdom or other things that alert me to, “Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen,” those types of things.

Step 2: Offer hypothesis for falling rankings/traffic

Okay, you’ve done your diagnostics. Now it’s time to offer some hypotheses. So now that we understand which problem I might have, I want to understand what could be resulting in that problem. There are basically two situations you can have: either rankings have stayed stable or gone up but traffic has fallen, or rankings and traffic have both fallen.

A. If rankings are up, but traffic is down…

In those cases, these are the five things that are most typically to blame.

1. New SERP features. There’s a bunch of featured snippets that have entered the “population growth for cities” search results, and so now number one is not what number one used to be. If you don’t get that featured snippet, you’re losing out to one of your competitors.

2. Lower search demand. Like we talked about in Google Trends. I’m looking at search demand, and there are just not as many people searching as there used to be.

3. Brand or reputation issues. I’m ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They’re choosing someone else actively because of reputation issues.

4. Snippet problems. I’m ranking in the same place I used to be, but I’m no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.

5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they’ve clicked in the past or where they’re located. Location is often a big cause for this.

So for many SEOs for many years, “SEO consultant” resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now “SEO consultant” results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.

B. If rankings and traffic are down…

If you’re seeing that rankings have fallen and traffic has fallen in conjunction, there’s a bunch of other things that are probably going on that are not necessarily these things. A few of these could be responsible still, like snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it’s things like this:

1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you’re providing just isn’t good enough.

3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it’s just not serving the new searcher intent.

4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you’re not solving the searcher’s query as well. Your user interface, your UX is not as good. Your keyword targeting isn’t as good as theirs. Your content quality and the unique value you provide isn’t as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.

5. Technical issues. So if I saw from over here that the crawl was the problem, I wasn’t getting indexed, or Google hasn’t updated my pages in a long time, I might look into accessibility things, maybe speed, maybe I’m having problems like letting Googlebot in, HTTPS problems, or indexable content, maybe Google can’t see the content on my page anymore because I made some change in the technology of how it’s displayed, or crawlability, internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff.

Maybe at the server level, someone on the tech ops team of my website decided, “Oh, there’s this really problematic bot coming from Mountain View that’s costing us a bunch of bandwidth. Let’s block bots from Mountain View.” No, don’t do that. Bad. Those kinds of technical issues can happen.

6. Spam and penalties. We’ll talk a little bit more about how to diagnose those in a second.

7. CTR, engagement, or pogo-sticking issues. There could be click-through rate issues or engagement issues, meaning pogo sticking, like people are coming to your site, but they are clicking back because they weren’t satisfied by your results, maybe because their expectations have changed or market issues have changed.
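One of the technical checks above, whether robots.txt is accidentally locking Googlebot out of a whole section, is easy to script. Here’s a minimal sketch using Python’s standard robotparser, with a made-up robots.txt that blocks the /cities/ section the way a well-meaning ops change might:

```python
from urllib import robotparser

# Hypothetical robots.txt that (perhaps unintentionally) blocks
# Googlebot from an entire /cities/ section.
rules = """\
User-agent: Googlebot
Disallow: /cities/

User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is shut out of /cities/ but can still reach /posts/.
print(rp.can_fetch("Googlebot", "/cities/denver"))
print(rp.can_fetch("Googlebot", "/posts/hello"))
```

Running this kind of check against your live robots.txt for a sample of URLs from the affected section is a quick way to rule crawlability in or out as the culprit.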

Step 3: Make fixes and observe results

All right. Next and last in this process, what we’re going to do is make some fixes and observe the results. Hopefully, we’ve been able to correctly diagnose and form some wise hypotheses about what’s going wrong, and now we’re going to try and resolve them.

A. On-page and technical issues should solve after a new crawl + index.

So on-page and technical issues, if we’re fixing those, they should usually resolve, especially on small sections of sites, pretty fast. As soon as Google has crawled and indexed the page, you should generally see performance improve. But this can take a few weeks if we’re talking about a large section on a site, many thousands of pages, because Google has to crawl and index all of them to get the new sense that things are fixed and traffic is coming in. Since it’s long tail to many different pages, you’re not going to see that instant traffic gain and rise as fast.

B. Link issues and spam penalty problems can take months to show results.

Look, if you have a weaker link profile than your competitors, growing it can take months or even years to fix. Penalty problems and spam problems, same thing. Google can sometimes take a long time. You’ve seen a lot of spam experts on Twitter saying, “Oh, well, all my clients who had issues over the last nine months suddenly are ranking better today,” because Google made some fix in their latest index rollout or their algorithm changed, and it’s sort of, okay, well, we’ll reward the people for all the fixes that they’ve made. Sometimes that happens in batches that take months.

C. Fixing a small number of pages in a section that’s performing poorly might not show results very quickly.

For example, let’s say you go and you fix /cities/Milwaukee. You determine from your diagnostics that the problem is a content quality issue. So you go and you update these pages. They have new content that serves the searchers much better. You’ve tested it. People really love it. You fixed two cities, Milwaukee and Denver, to test it out. But you’ve left 5,000 other cities pages untouched.

Sometimes Google will sort of be like, “No, you know what? We still think your cities pages, as a whole, don’t do a good job solving this query. So even though these two that you’ve updated do a better job, we’re not necessarily going to rank them, because we sort of think of your site as this whole section and we grade it as a section or apply some grades as a section.” That is a real thing that we’ve observed happening in Google’s results.

Because of this, one of the things that I would urge you to do is if you’re seeing good results from the people you’re testing it with and you’re pretty confident, I would roll out the changes to a significant subset, 30%, 50%, 70% of the pages rather than doing only a tiny, tiny sample.

D. Sometimes when you encounter these issues, a remove and replace strategy works better than simply upgrading old URLs.

So if Google has decided /cities, your /cities section is just awful, has all sorts of problems, not performing well on a bunch of different vectors, you might take your /cities section and actually 301 redirect them to a new URL, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that location section, such that Google now goes, “Ah, we have something new to judge. Let’s see how these location pages on MySite.com perform versus the old cities pages.”
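The mechanical part of that strategy is just a prefix-preserving 301 map. Here’s a sketch in Python (in a real site this logic would live in your server config or application routing; the /location/ prefix mirrors the example above):

```python
# Sketch of remove-and-replace: 301 every old /cities/ URL to its
# counterpart under the new /location/ section, so link equity carries
# over while Google judges the rebuilt pages fresh.
OLD_PREFIX = "/cities/"
NEW_PREFIX = "/location/"

def redirect_for(path):
    """Return (status, path) for an incoming request path."""
    if path.startswith(OLD_PREFIX):
        return 301, NEW_PREFIX + path[len(OLD_PREFIX):]
    return 200, path

print(redirect_for("/cities/denver"))  # -> (301, '/location/denver')
print(redirect_for("/posts/hello"))    # -> (200, '/posts/hello')
```

Keeping the trailing slug identical between old and new URLs makes the redirect rule a one-liner and avoids any many-to-one redirects that could look like consolidation rather than a rebuild.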

So I know we’ve covered a ton today and there are a lot of diagnostic issues that we haven’t necessarily dug deep into, but I hope this can help you if you’re encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Moz Blog

Posted in IM News

Back to the basics: Beyond ranking factors

Ranking factors are important, but is that really what you should be focusing on? Columnist Garrett Mehrguth believes marketers need to first turn their attention to design, audience research, content and attribution.

Please visit Search Engine Land for the full article.


Troubleshooting Local Ranking Failures [Updated for 2018]

Posted by MiriamEllis

I love a mystery… especially a local search ranking mystery I can solve for someone.

Now, the truth is, some ranking puzzles are so complex, they can only be solved by a formal competitive audit. But there are many others that can be cleared up by spending 15 minutes or less going through an organized 10-point checklist of the commonest problems that can cause a business to rank lower than the owner thinks it should. By zipping through the following checklist, there’s a good chance you’ll be able to find one or more obvious “whodunits” contributing to poor Google local pack visibility for a given search.

Since I wrote the original version of this post in 2014, so much has changed. Branding, tools, tactics — things are really different in 2018. Definitely time for a complete overhaul, with the goal of making you a super sleuth for your forum friends, clients, agency teammates, or executive superiors.

Let’s emulate the Stratemeyer Syndicate, which earned lasting fame by hitting on a simple formula for surfacing and solving mysteries in a most enjoyable way.

Before we break out our magnifying glass, it’s critical to stress one very important thing. The local rankings I see from an office in North Beach, San Francisco are not the rankings you see while roaming around Golden Gate park in the same city. The rankings your client in Des Moines sees for things in his town are not the same rankings you see from your apartment in Albuquerque when you look at Des Moines results. With the user having become the centroid of search for true local searches, it is no mystery at all that we see different results when we are different places, and it is no cause for concern.

And now that we’ve gotten that out of the way and are in the proper detective spirit, let’s dive into how to solve for each item on our checklist!

☑ Google updates/bugs

The first thing to ask if a business experiences a sudden change in rankings is whether Google has done something. Search Engine Land strikes me as the fastest reporter of Google updates, with MozCast offering an ongoing weather report of changes in the SERPs. Also, check out the Moz Google Algo Change history list and the Moz Blog for some of the most in-depth strategic coverage of updates, penalties, and filters.

For local-specific bugs (or even just suspected tests), check out the Local Search Forum, the Google My Business forum, and Mike Blumenthal’s blog. See if the effects being described match the weirdness you are seeing in your local packs. If so, it’s a matter of fixing a problematic practice (like iffy link building) that has been caught in an update, waiting to see how the update plays out, or waiting for Google to fix a bug or turn a dial down to normalize results.

*Pro tip: Don’t make the mistake of thinking organic updates have nothing to do with local SEO. Crack detectives know organic and local are closely connected.

☑ Eligibility to list and rank

When a business owner wants to know why he isn’t ranking well locally, always ask these four questions:

  1. Does the business have a real address? (Not a PO box, virtual office, or a string of employees’ houses!)
  2. Does the business make face-to-face contact with its customers?
  3. What city is the business in?
  4. What is the exact keyword phrase they are hoping to rank for?

If the answer is “no” to either of the first two questions, the business isn’t eligible for a Google My Business listing. And while spam does flow through Google, a lack of eligibility could well be the key to a lack of rankings.

For the third question, you need to know the city the business is in so that you can see if it’s likely to rank for the search phrase cited in the fourth question. For example, a plumber with a street address in Sugar Land, TX should not expect to rank for “plumber Dallas TX.” If a business lacks a physical location in a given city, it’s atypical for it to rank for queries that stem from or relate to that locale. It’s amazing just how often this simple fact solves local pack mysteries.

☑ Guideline spam

To be an ace local sleuth, you must commit to memory the guidelines for representing your business on Google so that you can quickly spot violations. Common acts of spam include:

  • Keyword stuffing the business name field
  • Improper wording of the business name field
  • Creating listings for ineligible locations, departments, or people
  • Category spam
  • Incorrect phone number implementation
  • Incorrect website URL implementation
  • Review guideline violations

If any of the above conundrums are new to you, definitely spend 10 minutes reading the guidelines. Make flash cards, if necessary, to test yourself on your spam awareness until you can instantly detect glaring errors. With this enhanced perception, you’ll be able to see problems that may possibly be leading to lowered rankings, or even… suspensions!

☑ Suspensions

There are two key things to look for here when a local business owner comes to you with a ranking woe:

  1. If the listing was formerly verified, but has mysteriously become unverified, you should suspect a soft suspension. Soft suspensions might occur around something like a report of keyword-stuffing the GMB business name field. Oddly, however, there is little anecdotal evidence to support the idea that soft suspensions cause ranking drops. Nevertheless, it’s important to spot the un-verification clue and tell the owner to stop breaking guidelines. It’s possible that the listing may lose reviews or images during this type of suspension, but in most cases, the owner should be able to re-verify his listing. Just remember: a soft suspension is not a likely cause of low local pack rankings.
  2. If the listing’s rankings totally disappear and you can’t even find the listing via a branded search, it’s time to suspect a hard suspension. Hard suspensions can result from a listing falling afoul of a Google guideline or new update, a Google employee, or just a member of the public who has reported the business for something like an ineligible location. If the hard suspension is deserved, as in the case of creating a listing at a fake address, then there’s nothing you can do about it. But, if a hard suspension results from a mistake, I recommend taking it to the Google My Business forum to plead for help. Be prepared to prove that you are 100% guideline-compliant and eligible in hopes of getting your listing reinstated with its authority and reviews intact.

☑ Duplicates

Notorious for their ability to divide ranking strength, duplicate listings are at their worst when there is more than one verified listing representing a single entity. If you encounter a business that seems like it should be ranking better than it is for a given search, always check for duplicates.

The quickest way to do this is to get all present and past NAP (name, address, phone) from the business and plug it into the free Moz Check Listing tool. Pay particular attention to any GMB duplicates the tool surfaces. Then:

  1. If the entity is a brick-and-mortar business or service area business, and the NAP exactly matches between the duplicates, contact Google to ask them to merge the listings. If the NAP doesn’t match and represents a typo or error on the duplicate, use the “suggest an edit” link in Google Maps, set the “yes/no” toggle to “yes,” and then select the radio button for “never existed.”
  2. If the duplicates represent partners in a multi-practitioner business, Google won’t simply delete them. Things get quite complicated in this scenario, and if you discover practitioner duplicates, tread carefully. There are half a dozen nuances here, including whether you’re dealing with actual duplicates, whether they represent current or past staffers, whether they are claimed or unclaimed, and even whether a past partner is deceased. There isn’t perfect industry agreement on the handling of all of the ins-and-outs of practitioner listings. Given this, I would advise an affected business to read up thoroughly before making a move in any direction.
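The core of duplicate detection is NAP normalization: strip punctuation, case, and phone formatting, then compare. Here’s a deliberately crude sketch of that idea (real listing tools like Check Listing do far more sophisticated matching):

```python
import re

# Sketch: normalize NAP (name, address, phone) records so near-identical
# variants surface as likely duplicates. These rules are intentionally
# simplistic and for illustration only.
def normalize_phone(phone):
    """Keep only digits, then the last 10 (drops country code, formatting)."""
    return re.sub(r"\D", "", phone)[-10:]

def normalize(text):
    """Lowercase and strip punctuation so 'Joe's' and 'Joes' compare equal."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()

def nap_key(name, address, phone):
    return (normalize(name), normalize(address), normalize_phone(phone))

a = nap_key("Joe's Plumbing", "123 Main St.", "(512) 555-0100")
b = nap_key("Joes Plumbing", "123 main st", "512-555-0100")
print(a == b)  # identical keys after normalization: likely duplicates
```

A real matcher would also handle abbreviation variants (St./Street, Ste./Suite) and fuzzy name matching, but even this level of normalization catches the most common duplicate patterns.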

☑ Missing/inaccurate listings

While you’ve got Moz Check Listing fired up, pay attention to anything it tells you about missing or inaccurate listings. The tool will show you how accurate and complete your listings are on the major local business data aggregators, plus other important platforms like Google My Business, Facebook, Factual, Yelp, and more. Why does this matter?

  1. Google can pull information from anywhere on the web and plunk it into your Google My Business listing.
  2. While no one can quantify the exact degree to which citation/listing consistency directly impacts Google local rankings for every possible search query, it has been a top 5 ranking factor in the annual Local Search Ranking Factors survey as far back as I can remember. Recently, I’ve seen some industry discussion as to whether citations still matter, with some practitioners claiming they can’t see the difference they make. I believe that conclusion may stem from working mainly in ultra-competitive markets where everyone has already got their citations in near-perfect order, forcing practitioners to look for differentiation tactics beyond the basics. But without those basics, you’re missing table stakes in the game.
  3. Indirectly, listing absence or inconsistency impacts local rankings in that it undermines the quest for good local KPIs as well as organic authority. Every lost or misdirected consumer represents a failure to have someone click-for-directions, click-to-call, click-to-your website, or find your website at all. Online and offline traffic, conversions, reputation, and even organic authority all hang in the balance of active citation management.

☑ Lack of organic authority

Full website or competitive audits are not the work of a minute. They really take time, and deep delving. But, at a glance, you can access some quick metrics to let you know whether a business’ lack of achievement on the organic side of things could be holding them back in the local packs. Get yourself the free MozBar SEO toolbar and try this:

  1. Turn the MozBar on by clicking the little “M” at the top of your browser so that it is blue.
  2. Perform your search and look at the first few pages of the organic results, ignoring anything from major directory sites like Yelp (they aren’t competing with you for local pack rankings, eh?).
  3. Note down the Page Authority, Domain Authority, and link counts for each of the businesses coming up on the first 3 pages of the organic results.
  4. Finally, bring up the website of the business you’re investigating. If you see that the top competitors have Domain Authorities of 50 and links numbering in the hundreds or thousands, whereas your target site is well below in these metrics, chances are good that organic authority is playing a strong role in lack of local search visibility. How do we know this is true? Do some local searches and note just how often the businesses that make it into the 3-pack or the top of the local finder view have correlating high organic rankings.
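Once you’ve noted those metrics down, the comparison itself is simple arithmetic. A quick sketch (every number below is invented for illustration; in practice you’d plug in the figures you read off the MozBar):

```python
# Sketch: compare your site's authority metrics against the organic
# competitors noted from the first few SERP pages. All values are made up.
competitors = [
    {"site": "rival-a.com", "da": 54, "links": 1800},
    {"site": "rival-b.com", "da": 61, "links": 3200},
    {"site": "rival-c.com", "da": 48, "links": 950},
]
mine = {"site": "mysite.com", "da": 22, "links": 40}

avg_da = sum(c["da"] for c in competitors) / len(competitors)
avg_links = sum(c["links"] for c in competitors) / len(competitors)

print(f"Competitor average: DA {avg_da:.0f}, links {avg_links:.0f}")
print(f"My site: DA {mine['da']}, links {mine['links']}")

# A rough rule of thumb (an assumption, not a Moz threshold): a site far
# below the competitive average likely has an organic authority problem.
if mine["da"] < 0.6 * avg_da:
    print("Large authority gap: organic weakness likely limits local pack visibility.")
```

The 60% cutoff is arbitrary; the point is simply to quantify the gap you’re eyeballing so you can track it as link building and content work progress.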

Where organic authority is poor, a business has a big job of work ahead. They need to focus on content dev + link building + social outreach to begin building up their brand in the minds of consumers and the “RankBrain” of Google.

One other element needs to be mentioned here, and that’s the concept of how time affects authority. When you’re talking to a business with a ranking problem, it’s very important to ascertain whether they just launched their website or just built their local business listings last week, or even just a few months ago. Typically, if they have, the fruits of their efforts have yet to fully materialize. That being said, it’s not a given that a new business will have little authority. Large brands have marketing departments which exist solely to build tremendous awareness of new assets before they even launch. It’s important to keep that in mind, while also realizing that if the business is smaller, building authority will likely represent a longer haul.

☑ Possum effect

Where local rankings are absent, always ask:

“Are there any other businesses in your building or even on your street that share your Google category?”

If the answer is “yes,” search for the business’ desired keyword phrase and look at the local finder view in Google Maps. Note which companies are ranking. Then begin to zoom in on the map, level by level, noting changes in the local finder as you go. If, a few levels in, the business you’re advising suddenly appears on the map and in the local finder, chances are good it’s the Possum filter that’s causing their apparent invisibility at the automatic zoom level.

Google Possum rolled out in September 2016, and its observable effects included a geographic diversification of the local results, filtering out many listings that share a category and are in close proximity to one another. Then, about one year later, Google initiated the Hawk update, which appears to have tightened the radius of Possum, with the result that while many businesses in the same building are still being filtered out, a number of nearby neighbors have reappeared at the automatic zoom level of the results.

If your sleuthing turns up a brand that is being impacted by Possum/Hawk, the only surefire way to beat the filter is to put in the necessary work to become the most authoritative answer for the desired search phrase. It’s important to remember that filters are the norm in Google’s local results, and have long been observed impacting listings that share an address, share a phone number, etc. If it’s vital for a particular listing to outrank all others that possess shared characteristics, then authority must be built around it in every possible way to make it one of the most dominant results.

☑ Local Service Ads effect

The question you ask here is:

“Is yours a service-area business?”

And if the answer is “yes,” then brace yourself for ongoing results disruption in the coming year.

Google’s Local Service Ads (formerly Home Service Ads) make Google the middleman between consumers and service providers, and in the 2+ years since first early testing, they’ve caused some pretty startling disruption to local search results.

Suffice it to say, rollout to an ever-increasing number of cities and categories hasn’t been for the faint of heart, and I would hazard a guess that Google’s recent re-brand of this program signifies their intention to move beyond the traditional SAB market. One possible benefit of Google getting into this type of lead gen is that it could decrease spam, but I’m not sold on this, given that fake locations have ended up qualifying for LSA inclusion. While I honor Google’s need to be profitable, I share some of the qualms business owners have expressed about the potential impacts of this venture.

Since I can’t offer a solid prediction of what precise form these impacts will take in the coming months, the best I can do here is to recommend that if an SAB experiences a ranking change/loss, the first thing to look for is whether LSA has come to town. If so, alteration of the SERPs may be unavoidable, and the only strategy left for overcoming vanished visibility may be to pay for it… by qualifying for the program.

☑ GMB neglect

Sometimes, a lack of competitive rankings can simply be chalked up to a lack of effort. If a business wonders why they’re not doing better in the local packs, pull up their GMB listing and do a quick evaluation of:

  • Verification status – While you can rank without verifying, lack of verification is a hallmark of listing neglect.
  • Basic accuracy – If NAP or map markers are incorrect, it’s a sure sign of neglect.
  • Category choices – Wrong categories make right rankings impossible.
  • Image optimization – Every business needs a good set of the most professional, persuasive photos it can acquire, and should even consider periodic new photo shoots for seasonal freshness; imagery impacts KPIs, which are believed to impact rank.
  • Review count, sentiment and management – Too few reviews, low ratings, and lack of responses = utter neglect of this core rank/reputation-driver.
  • Hours of operation – If they’re blank or incorrect, conversions are being missed.
  • Main URL choice – Does the GMB listing point to a strong, authoritative website page or a weak one?
  • Additional URL choices – If menus, bookings, reservations, or placing orders are part of the business model, Google supports a variety of optional URLs that should be explored.
  • Google Posts – Early-days testing indicates that regular posting may impact rank.
  • Google Questions and Answers – Pre-populate with best FAQs and actively manage incoming questions.

There is literally no business, large or small, with a local footprint that can afford to neglect its Google My Business listing. And while some fixes and practices move the ranking needle more than others, the increasing number of consumer actions that take place within Google is reason enough to put active GMB management at the top of your list.

Closing the case

The Hardy Boys never went anywhere without their handy kit of detection tools. Their father was so confident in their utter preparedness that he even let them chase down gangs in Hong Kong and dictators in the Guyanas (which, on second thought, doesn’t seem terribly wise). But I have that kind of confidence in you. I hope my troubleshooting checklist is one you’ll bookmark and share to be prepared for the local ranking mysteries awaiting you and your digital marketing colleagues in 2018. Happy sleuthing!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog

Posted in IM News | Comments Off

SearchCap: Local ranking factors, keyword bidding & more

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Local ranking factors, keyword bidding & more appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


SearchCap: Google Shopping ad updates, SEO ranking factors & nofollow links

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Shopping ad updates, SEO ranking factors & nofollow links appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Search Buzz Video Recap: Google Algorithm Update & Mobile First Index Tests, Sentiment Ranking Factors & Danny Sullivan Joins Google

This week in search, I covered a largish Google search algorithm ranking update over last weekend. Also, we are noticing huge shifts in the mobile search results…

Search Engine Roundtable


SEO ranking factors: What’s important, what’s not

This week, Google celebrated its 19th birthday. A lot has changed in nearly two decades. Rather than relying primarily on PageRank to evaluate the quality of web pages, Google now uses a whole array of techniques to suggest a wide range of content in response to queries, from simple direct answers…

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Search Buzz Video Recap: Google Search Ranking Changes, Top Ranking Signals & Dynamic Algorithms

First, I am offline, so this video and post were all created and produced on Wednesday; things may have transpired between then and now that I’ll have to catch up on this Monday. Google did some algorithmic search ranking updates this week…

Search Engine Roundtable


Understanding and Harnessing the Flow of Link Equity to Maximize SEO Ranking Opportunity – Whiteboard Friday

Posted by randfish

How does the flow of link equity work these days, and how can you harness its potential to help improve your rankings? Whether you’re in need of a refresher or you’ve always wanted a firmer grasp of the concept, this week’s Whiteboard Friday is required watching. Rand covers the basic principles of link equity, outlines common flow issues your site might be encountering, and provides a series of action items to ensure your site is riding the right currents.

Link equity flow

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about understanding and harnessing link equity flow, primarily internal link equity flow, so that you can get better rankings and execute on your SEO. A big thank you to William Chou, @WChouWMX on Twitter, for suggesting this topic. If you have a topic or something that you would like to see on Whiteboard Friday, tweet at me. We’ll add it to the list.

Principles of link equity

So some principles of link equity first to be aware of before we dive into some examples.

1. External links generally give more ranking value and potential ranking boosts than internal links.

That is not to say, though, that internal links provide no link equity. In fact, many pages that earn few or no external links can still rank well if the domain itself is well linked to and the page has links from other good, important pages on that domain. But if a page is orphaned, or if a domain has no links at all, it becomes extremely difficult to rank.

2. Well-linked-to pages, both internal and external, pass more link equity than those that are poorly linked to.

I think this makes intuitive sense to all of us who have understood the concept of PageRank over the years. Basically, if a page accrues many links, especially from other important pages, that page’s ability to pass its link equity to other pages, to boost their ranking ability, is stronger than if the page is poorly linked to or not linked to at all.

3. Pages with fewer links tend to pass more equity to their targets than pages with more links.

Again, going off the old concept of PageRank: if you have a page with hundreds or thousands of links on it, each of those links receives a much smaller fraction of the link equity the page can pass than if the page held only a few links. Now, this doesn’t scale perfectly. It’s not the case that if you trim a high link-earning page down to a single link pointing at one particular page on your site, you suddenly get tremendously more benefit than if you kept your normal navigation on that page, linking to your homepage, About page, and products page. But if you have a page with hundreds of links in a row and you instead make that page link only to the most important, most valuable places, you’ll get more equity, more rank-boosting ability, out of it.
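The classic PageRank intuition behind this principle can be sketched in a few lines of Python. This is a toy model, not Google's live algorithm; the damping factor and the even split of equity among outlinks come from the original PageRank formulation, and real systems are far more nuanced:

```python
# Toy illustration of principle 3: a page's passable equity is divided
# among its outlinks, so each link on a 100-link page carries far less
# than each link on a 4-link page.

def equity_per_link(page_equity, num_links, damping=0.85):
    """Equity flowing through each individual link on a page."""
    return damping * page_equity / num_links

sparse = equity_per_link(1.0, 4)    # a page with only 4 links
dense = equity_per_link(1.0, 100)   # a page with 100 links
print(f"per-link equity: {sparse:.4f} vs {dense:.4f}")  # 0.2125 vs 0.0085
```

Each of the 4 links passes 25 times more than each of the 100, but remember the caveat above: this arithmetic doesn't scale perfectly in practice, so don't strip pages down to a single link.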

4. Hacks and tricks like “nofollow” are often ineffective at shaping the flow of link equity.

Using rel="nofollow", or embedding links in remotely executed JavaScript so that browsers and visitors can see them but Google is unlikely to see or follow them, to shape the flow of your link equity is generally (a) a poor use of your time, because it doesn’t affect things that much; the old-school PageRank algorithm is not that hugely important anymore. And (b) Google is often pretty good at interpreting and discounting these tricks. So it tends not to be worth your time at all.

5. Redirects and canonicalization lose a small amount of link equity. Non-ideal ones like 302s, JS redirects, etc. may lose more than 301, rel=canonical, etc.

So if I have a 301 or a rel=canonical from one page to another, it will cost you a very small amount of link equity. But more costly would be using non-ideal types of redirects or canonicalization methods, like a JavaScript-based redirect or a 302 or a 307 instead of a 301. If you’re going to do a redirect or canonicalization, 301s or rel=canonicals are the way to go.
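If you want to audit which redirect types a site is actually serving, here's a minimal sketch in Python's standard library. The helper names and output strings are my own, not a standard tool, and the classification simply mirrors the advice above:

```python
import http.client
from urllib.parse import urlparse

# Hedged sketch: classify redirect status codes per the advice above.
# 301 is the ideal permanent redirect; 302/303/307 are the non-ideal
# temporary variants that may lose more link equity.

def classify_status(status):
    if status == 301:
        return "ideal: 301 permanent redirect"
    if status in (302, 303, 307):
        return "non-ideal: temporary redirect"
    return "not a redirect"

def check_redirect(url):
    """Request a URL without following redirects and classify the response."""
    parts = urlparse(url)
    conn_class = (http.client.HTTPSConnection if parts.scheme == "https"
                  else http.client.HTTPConnection)
    conn = conn_class(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status, classify_status(status)
```

Running `check_redirect` against an old URL you've redirected will tell you at a glance whether it's serving the ideal 301 or a temporary code worth fixing.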

So keeping in mind these principles, let’s talk through three of the most common link equity flow issues that we see websites facing.

Common link equity flow issues

A. A few pages on a large site get all the external links:

You have a relatively large site, let’s say thousands to tens of thousands, maybe even hundreds of thousands of pages, and only a few of those pages are earning any substantial quantity of external links. I have highlighted those in pink. So these pages are pointing to these pink ones. But on this website you have other pages, pages like these purple ones, where you essentially are wanting to earn link equity, because you know that you need to rank for these terms and pages that these purple ones are targeting, but they’re not getting the external links that these pink pages are. In these cases, it’s important to try a few things.

  1. We want to identify the most important non-link earning pages, these purple ones. We’ve got to figure out what these actually are. What are the pages that you wish would rank that are not yet ranking for their terms and phrases that they’re targeting?
  2. We want to optimize our internal links from these pink pages to these purple ones. So in an ideal world, we would say, “Aha, these pages are very strong. They’ve earned a lot of link equity.” You could use Open Site Explorer and look at Top Pages, or Ahrefs or any of our other competitors and look at your pages, the ones that have earned the most links and the most link equity. Then you could say, “Hey, can I find some relevance between these two or some user stories where someone who reaches this page needs something over here, and thus I’m going to create a link to and from there?” That’s a great way to pass equity.
  3. Retrofitting and republishing. So what I mean by this is essentially I’m going to take these pages, these purple ones that I want to be earning links, that are not doing well yet, and consider reworking their content, taking the lessons that I have learned from the pink pages, the ones that have earned link equity, that have earned external links and saying, “What did these guys do right that we haven’t done right on these guys, and what could we do to fix that situation?” Then I’m going to republish and restart a marketing, a link building campaign to try and get those links.
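The triage in steps 1 and 2 above can be sketched as a simple filter over a link-metrics export. The field names and the threshold below are hypothetical; in practice you'd feed in a CSV from Open Site Explorer, Ahrefs, or a similar tool:

```python
# Hedged sketch: split a site's pages into strong link-earners (the
# "pink" pages) and priority targets lacking external links (the
# "purple" pages). Field names and the threshold of 50 linking
# domains are illustrative, not from any specific tool's export.

def triage_pages(pages, strong_threshold=50):
    strong, needs_internal_links = [], []
    for page in pages:
        if page["linking_domains"] >= strong_threshold:
            strong.append(page["url"])
        elif page["is_ranking_target"]:
            needs_internal_links.append(page["url"])
    return strong, needs_internal_links

export = [
    {"url": "/guide", "linking_domains": 240, "is_ranking_target": False},
    {"url": "/cities/boston", "linking_domains": 2, "is_ranking_target": True},
    {"url": "/about", "linking_domains": 5, "is_ranking_target": False},
]
pink, purple = triage_pages(export)
# pink == ["/guide"]; purple == ["/cities/boston"]
```

From there, the manual work is finding relevant spots on each "pink" page to link to each "purple" one.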

B. Only the homepage of a smaller site gets any external links.

This time we’re dealing with a small site, a very, very small site, 5 pages, 10 pages, maybe even up to 50 pages, but generally a very small site. Often a lot of small businesses, a lot of local businesses have this type of presence, and only the homepage gets any link equity at all. So what do we do in those cases? There’s not a whole lot to spread around. The homepage can only link to so many places. We have to serve users first. If we don’t, we’re definitely going to fall in the search engine rankings.

So in this case, where the pink link earner is the homepage, there are two things we can do:

  1. Make sure that the homepage is targeting and serves the most critical keyword targets. So we have some keyword targets that we know we want to go after. If there’s one phrase in particular that’s very important, rather than having the homepage target our brand, we could consider having the homepage target that specific query. Many times small businesses and small websites will make this mistake where they say, “Oh, our most important keyword, we’ll make that this page. We’ll try and rank it. We’ll link to it from the homepage.” That is generally not nearly as effective as making a homepage target that searcher intent. If it can fit with the user journey as well, that’s one of the best ways you can go.
  2. Consider some new pages for content, like essentially saying, “Hey, I recognize that these other pages, maybe they’re About and my Terms of Service and some of my products and services and whatnot, and they’re just not that link-worthy. They don’t deserve links. They’re not the type of pages that would naturally earn links.” So we might need to consider what are two or three types of pages or pages that we could produce, pieces of content that could earn those links, and think about it this way. You know who the people who are already linking to you are. It’s these folks. I have just made up some domains here. But the folks who are already linking to your homepage, those are likely to be the kinds of people who will link to your internal pages as well. So I would think about them as link targets and say, “What would I be pretty confident that they would link to, if only they knew that it existed on our website?” That’s going to give you a lot of success. Then I would check out some of our link building sections here on Whiteboard Friday and across the Moz Blog for more tips.

C. Mid-long tail KW-targeting pages are hidden or minimized by the site’s nav/IA.

So this is essentially where I have a large site, and I have pages that are targeting keywords that don’t get a ton of volume, but they’re still important. They could really boost the value that we get from our website, because they’re hyper-targeted to good customers for us. In this case, one of the challenges is they’re hidden by your information architecture. So your top-level navigation and maybe even your secondary-level navigation just doesn’t link to them. So they’re just buried deep down in the website, under a whole bunch of other stuff. In these cases, there are some really good solutions.

  1. Find semantic and user intent relationships. So a semantic relationship means the same words appear on both pages. Let’s say one of these pages here is targeting the word “toothpaste,” for example, and I find that, oh, you know what, this page over here, which is well linked to in our navigation, mentions the word “toothpaste,” but it doesn’t link over here yet. I’m going to go create those links. That’s a semantic relationship. A user intent relationship would be, hey, this page over here talks about oral health. Well, oral health and toothpaste are actually pretty relevant. Let me make sure that I can create that user journey, because I know that people who’ve read about oral health on our website probably also later want to read about toothpaste, at least some of them. So let’s make that relationship also happen between those two pages. That would be a user intent type of relationship. You’re going to find those relationships between your pages that earn external links, your well-linked-to internal pages, and these long-tail pages that you’re trying to target. Then you’re going to create those new links.
  2. Try and leverage the top-level category pages that you already have. If you have a top-level navigation and it links to whatever it is — home, products, services, About Us, Contact, the usual types of things — it’s those pages that are extremely well linked to already internally where you can add in content links to those long-tail pages and potentially benefit.
  3. Consider new top-level or second-level pages. If you’re having trouble adding them to these existing pages (they already have too many links, there’s no user story that makes good sense, it’s too weird to jam them in, or maybe your engineering or web dev team thinks it’s ridiculous to try), consider creating new top-level pages. So essentially saying, “Hey, I want to add a page to our top-level navigation that is called whatever it is, Additional Resources or Resources for the Curious or whatever.” In this case, in my oral health and dentistry example, potentially I want an oral health page that is linked to from the top-level navigation. Then you get to use that new top-level page to link down and flow the link equity to all these different pages that you care about and that are currently getting buried in your navigation system.
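The semantic-relationship hunt in step 1 is easy to automate at a basic level: scan each page's text for the keyword and check whether it already links to the target page. A minimal sketch, assuming pages are available as (url, html) pairs from a crawl of your own site:

```python
from html.parser import HTMLParser

# Hedged sketch of the "semantic relationship" idea: find pages that
# mention a target keyword (e.g. "toothpaste") but don't yet link to
# the page targeting it.

class LinkCollector(HTMLParser):
    """Collects outgoing hrefs and visible text from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = set()
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

    def handle_data(self, data):
        self.text.append(data)

def link_opportunities(pages, keyword, target_url):
    """Return URLs of pages mentioning `keyword` without linking to `target_url`."""
    opportunities = []
    for url, html in pages:
        parser = LinkCollector()
        parser.feed(html)
        body = " ".join(parser.text).lower()
        if keyword.lower() in body and target_url not in parser.links:
            opportunities.append(url)
    return opportunities

pages = [
    ("/oral-health", "<p>Good oral health starts with toothpaste.</p>"),
    ("/floss", '<p>Toothpaste helps too. <a href="/toothpaste">See our guide</a></p>'),
]
print(link_opportunities(pages, "toothpaste", "/toothpaste"))  # ['/oral-health']
```

Each URL this returns is a candidate spot for a new internal link to the long-tail page you're trying to lift.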

All right, everyone. Hope you’ve enjoyed this edition of Whiteboard Friday. Give us your tips in the comments for how you’ve seen link equity flow, and the benefits or drawbacks you’ve seen in trying to control and optimize that flow. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Moz Blog


Google Algorithm & Ranking Update Chatter

Starting over the weekend, mostly Saturday and Sunday, August 19th and 20th, there was some chatter in the webmaster channels around fluctuations in the Google search results. The chatter was intense for a day or so and has largely died down since then…

Search Engine Roundtable
