Tag Archive | "Updated"

Troubleshooting Local Ranking Failures [Updated for 2018]

Posted by MiriamEllis

I love a mystery… especially a local search ranking mystery I can solve for someone.

Now, the truth is, some ranking puzzles are so complex, they can only be solved by a formal competitive audit. But there are many others that can be cleared up by spending 15 minutes or less going through an organized 10-point checklist of the commonest problems that can cause a business to rank lower than the owner thinks it should. By zipping through the following checklist, there’s a good chance you’ll be able to find one or more obvious “whodunits” contributing to poor Google local pack visibility for a given search.

Since I wrote the original version of this post in 2014, so much has changed. Branding, tools, tactics — things are really different in 2018. Definitely time for a complete overhaul, with the goal of making you a super sleuth for your forum friends, clients, agency teammates, or executive superiors.

Let’s emulate the Stratemeyer Syndicate, which earned lasting fame by hitting on a simple formula for surfacing and solving mysteries in a most enjoyable way.

Before we break out our magnifying glass, it’s critical to stress one very important thing. The local rankings I see from an office in North Beach, San Francisco are not the rankings you see while roaming around Golden Gate park in the same city. The rankings your client in Des Moines sees for things in his town are not the same rankings you see from your apartment in Albuquerque when you look at Des Moines results. With the user having become the centroid of search for true local searches, it is no mystery at all that we see different results when we are different places, and it is no cause for concern.

And now that we’ve gotten that out of the way and are in the proper detective spirit, let’s dive into how to solve for each item on our checklist!

☑ Google updates/bugs

The first thing to ask if a business experiences a sudden change in rankings is whether Google has done something. Search Engine Land strikes me as the fastest reporter of Google updates, with MozCast offering an ongoing weather report of changes in the SERPs. Also, check out the Moz Google Algo Change history list and the Moz Blog for some of the most in-depth strategic coverage of updates, penalties, and filters.

For local-specific bugs (or even just suspected tests), check out the Local Search Forum, the Google My Business forum, and Mike Blumenthal’s blog. See if the effects being described match the weirdness you are seeing in your local packs. If so, it’s a matter of fixing a problematic practice (like iffy link building) that has been caught in an update, waiting to see how the update plays out, or waiting for Google to fix a bug or turn a dial down to normalize results.

Pro tip: Don’t make the mistake of thinking organic updates have nothing to do with local SEO. Crack detectives know organic and local are closely connected.

☑ Eligibility to list and rank

When a business owner wants to know why he isn’t ranking well locally, always ask these four questions:

  1. Does the business have a real address? (Not a PO box, virtual office, or a string of employees’ houses!)
  2. Does the business make face-to-face contact with its customers?
  3. What city is the business in?
  4. What is the exact keyword phrase they are hoping to rank for?

If the answer is “no” to either of the first two questions, the business isn’t eligible for a Google My Business listing. And while spam does flow through Google, a lack of eligibility could well be the key to a lack of rankings.

For the third question, you need to know the city the business is in so that you can see if it’s likely to rank for the search phrase cited in the fourth question. For example, a plumber with a street address in Sugar Land, TX should not expect to rank for “plumber Dallas TX.” If a business lacks a physical location in a given city, it’s atypical for it to rank for queries that stem from or relate to that locale. It’s amazing just how often this simple fact solves local pack mysteries.

☑ Guideline spam

To be an ace local sleuth, you must commit to memory the guidelines for representing your business on Google so that you can quickly spot violations. Common acts of spam include:

  • Keyword stuffing the business name field
  • Improper wording of the business name field
  • Creating listings for ineligible locations, departments, or people
  • Category spam
  • Incorrect phone number implementation
  • Incorrect website URL implementation
  • Review guideline violations

If any of the above conundrums are new to you, definitely spend 10 minutes reading the guidelines. Make flash cards, if necessary, to test yourself on your spam awareness until you can instantly detect glaring errors. With this enhanced perception, you’ll be able to see problems that may possibly be leading to lowered rankings, or even… suspensions!

☑ Suspensions

There are two key things to look for here when a local business owner comes to you with a ranking woe:

  1. If the listing was formerly verified, but has mysteriously become unverified, you should suspect a soft suspension. Soft suspensions might occur around something like a report of keyword-stuffing the GMB business name field. Oddly, however, there is little anecdotal evidence to support the idea that soft suspensions cause ranking drops. Nevertheless, it’s important to spot the un-verification clue and tell the owner to stop breaking guidelines. It’s possible that the listing may lose reviews or images during this type of suspension, but in most cases, the owner should be able to re-verify his listing. Just remember: a soft suspension is not a likely cause of low local pack rankings.
  2. If the listing’s rankings totally disappear and you can’t even find the listing via a branded search, it’s time to suspect a hard suspension. Hard suspensions can result from a listing falling afoul of a Google guideline or new update, a Google employee, or just a member of the public who has reported the business for something like an ineligible location. If the hard suspension is deserved, as in the case of creating a listing at a fake address, then there’s nothing you can do about it. But, if a hard suspension results from a mistake, I recommend taking it to the Google My Business forum to plead for help. Be prepared to prove that you are 100% guideline-compliant and eligible in hopes of getting your listing reinstated with its authority and reviews intact.

☑ Duplicates

Notorious for their ability to divide ranking strength, duplicate listings are at their worst when there is more than one verified listing representing a single entity. If you encounter a business that seems like it should be ranking better than it is for a given search, always check for duplicates.

The quickest way to do this is to get all present and past NAP (name, address, phone) from the business and plug it into the free Moz Check Listing tool. Pay particular attention to any GMB duplicates the tool surfaces. Then:

  1. If the entity is a brick-and-mortar business or service area business, and the NAP exactly matches between the duplicates, contact Google to ask them to merge the listings. If the NAP doesn’t match and represents a typo or error on the duplicate, use the “suggest an edit” link in Google Maps to toggle the “yes/no” toggle to “yes,” and then select the radio button for “never existed.”
  2. If the duplicates represent partners in a multi-practitioner business, Google won’t simply delete them. Things get quite complicated in this scenario, and if you discover practitioner duplicates, tread carefully. There are half a dozen nuances here, including whether you’re dealing with actual duplicates, whether they represent current or past staffers, whether they are claimed or unclaimed, and even whether a past partner is deceased. There isn’t perfect industry agreement on the handling of all of the ins-and-outs of practitioner listings. Given this, I would advise an affected business to read all of the following before making a move in any direction:

☑ Missing/inaccurate listings

While you’ve got Moz Check Listing fired up, pay attention to anything it tells you about missing or inaccurate listings. The tool will show you how accurate and complete your listings are on the major local business data aggregators, plus other important platforms like Google My Business, Facebook, Factual, Yelp, and more. Why does this matter?

  1. Google can pull information from anywhere on the web and plunk it into your Google My Business listing.
  2. While no one can quantify the exact degree to which citation/listing consistency directly impacts Google local rankings for every possible search query, it has been a top 5 ranking factor in the annual Local Search Ranking Factors survey as far back as I can remember. Recently, I’ve seen some industry discussion as to whether citations still matter, with some practitioners claiming they can’t see the difference they make. I believe that conclusion may stem from working mainly in ultra-competitive markets where everyone has already got their citations in near-perfect order, forcing practitioners to look for differentiation tactics beyond the basics. But without those basics, you’re missing table stakes in the game.
  3. Indirectly, listing absence or inconsistency impacts local rankings in that it undermines the quest for good local KPIs as well as organic authority. Every lost or misdirected consumer represents a failure to have someone click-for-directions, click-to-call, click-to-your website, or find your website at all. Online and offline traffic, conversions, reputation, and even organic authority all hang in the balance of active citation management.

☑ Lack of organic authority

Full website or competitive audits are not the work of a minute. They really take time, and deep delving. But, at a glance, you can access some quick metrics to let you know whether a business’ lack of achievement on the organic side of things could be holding them back in the local packs. Get yourself the free MozBar SEO toolbar and try this:

  1. Turn the MozBar on by clicking the little “M” at the top of your browser so that it is blue.
  2. Perform your search and look at the first few pages of the organic results, ignoring anything from major directory sites like Yelp (they aren’t competing with you for local pack rankings, eh?).
  3. Note down the Page Authority, Domain Authority, and link counts for each of the businesses coming up on the first 3 pages of the organic results.
  4. Finally, bring up the website of the business you’re investigating. If you see that the top competitors have Domain Authorities of 50 and links numbering in the hundreds or thousands, whereas your target site is well below in these metrics, chances are good that organic authority is playing a strong role in lack of local search visibility. How do we know this is true? Do some local searches and note just how often the businesses that make it into the 3-pack or the top of the local finder view have correlating high organic rankings. A quick way to sanity-check the authority gap is sketched just below this list.
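To make that comparison concrete, here is a minimal Python sketch of the benchmarking step. The domains, Domain Authority scores, link counts, and thresholds are all hypothetical placeholders; swap in the numbers you actually noted down from the MozBar.

```python
# Minimal sketch (hypothetical numbers): compare a target site's authority
# metrics against the competitors noted down from the MozBar.
from statistics import median

competitors = [
    # (domain, Domain Authority, linking domains) -- example values only
    ("competitor-a.com", 52, 310),
    ("competitor-b.com", 47, 180),
    ("competitor-c.com", 55, 940),
]

target_da, target_links = 21, 12   # the business you're investigating (example values)

median_da = median(da for _, da, _ in competitors)
median_links = median(links for _, _, links in competitors)

# The 60%/20% cutoffs are arbitrary rules of thumb for this sketch, not Moz guidance.
if target_da < 0.6 * median_da or target_links < 0.2 * median_links:
    print("Large authority gap: organic weakness is a likely factor in poor local pack visibility.")
else:
    print("Authority looks roughly competitive; keep working down the checklist.")
```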

Where organic authority is poor, a business has a big job of work ahead. They need to focus on content dev + link building + social outreach to begin building up their brand in the minds of consumers and the “RankBrain” of Google.

One other element needs to be mentioned here, and that’s the concept of how time affects authority. When you’re talking to a business with a ranking problem, it’s very important to ascertain whether they just launched their website or just built their local business listings last week, or even just a few months ago. Typically, if they have, the fruits of their efforts have yet to fully materialize. That being said, it’s not a given that a new business will have little authority. Large brands have marketing departments which exist solely to build tremendous awareness of new assets before they even launch. It’s important to keep that in mind, while also realizing that if the business is smaller, building authority will likely represent a longer haul.

☑ Possum effect

Where local rankings are absent, always ask:

“Are there any other businesses in your building or even on your street that share your Google category?”

If the answer is “yes,” search for the business’ desired keyword phrase and look at the local finder view in Google Maps. Note which companies are ranking. Then begin to zoom in on the map, level by level, noting changes in the local finder as you go. If, a few levels in, the business you’re advising suddenly appears on the map and in the local finder, chances are good it’s the Possum filter that’s causing their apparent invisibility at the automatic zoom level.

Google Possum rolled out in September 2016, and its observable effects included a geographic diversification of the local results, filtering out many listings that share a category and are in close proximity to one another. Then, about one year later, Google initiated the Hawk update, which appears to have tightened the radius of Possum, with the result that while many businesses in the same building are still being filtered out, a number of nearby neighbors have reappeared at the automatic zoom level of the results.

If your sleuthing turns up a brand that is being impacted by Possum/Hawk, the only surefire way to beat the filter is to put in the necessary work to become the most authoritative answer for the desired search phrase. It’s important to remember that filters are the norm in Google’s local results, and have long been observed impacting listings that share an address, share a phone number, etc. If it’s vital for a particular listing to outrank all others that possess shared characteristics, then authority must be built around it in every possible way to make it one of the most dominant results.

☑ Local Service Ads effect

The question you ask here is:

“Is yours a service-area business?”

And if the answer is “yes,” then brace yourself for ongoing results disruption in the coming year.

Google’s Local Service Ads (formerly Home Service Ads) make Google the middleman between consumers and service providers, and in the 2+ years since first early testing, they’ve caused some pretty startling things to happen to local search results. These have included:

Suffice it to say, rollout to an ever-increasing number of cities and categories hasn’t been for the faint of heart, and I would hazard a guess that Google’s recent re-brand of this program signifies their intention to move beyond the traditional SAB market. One possible benefit of Google getting into this type of lead gen is that it could decrease spam, but I’m not sold on this, given that fake locations have ended up qualifying for LSA inclusion. While I honor Google’s need to be profitable, I share some of the qualms business owners have expressed about the potential impacts of this venture.

Since I can’t offer a solid prediction of what precise form these impacts will take in the coming months, the best I can do here is to recommend that if an SAB experiences a ranking change/loss, the first thing to look for is whether LSA has come to town. If so, alteration of the SERPs may be unavoidable, and the only strategy left for overcoming vanished visibility may be to pay for it… by qualifying for the program.

☑ GMB neglect

Sometimes, a lack of competitive rankings can simply be chalked up to a lack of effort. If a business wonders why they’re not doing better in the local packs, pull up their GMB listing and do a quick evaluation of:

  • Verification status – While you can rank without verifying, lack of verification is a hallmark of listing neglect.
  • Basic accuracy – If NAP or map markers are incorrect, it’s a sure sign of neglect.
  • Category choices – Wrong categories make right rankings impossible.
  • Image optimization – Every business needs a good set of the most professional, persuasive photos it can acquire, and should even consider periodic new photo shoots for seasonal freshness; imagery impacts KPIs, which are believed to impact rank.
  • Review count, sentiment and management – Too few reviews, low ratings, and lack of responses = utter neglect of this core rank/reputation-driver.
  • Hours of operation – If they’re blank or incorrect, conversions are being missed.
  • Main URL choice – Does the GMB listing point to a strong, authoritative website page or a weak one?
  • Additional URL choices – If menus, bookings, reservations, or placing orders is part of the business model, a variety of optional URLs are supported by Google and should be explored.
  • Google Posts – Early-days testing indicates that regular posting may impact rank.
  • Google Questions and Answers – Pre-populate with best FAQs and actively manage incoming questions.

There is literally no business, large or small, with a local footprint that can afford to neglect its Google My Business listing. And while some fixes and practices move the ranking needle more than others, the increasing number of consumer actions that take place within Google is reason enough to put active GMB management at the top of your list.

Closing the case

The Hardy Boys never went anywhere without their handy kit of detection tools. Their father was so confident in their utter preparedness that he even let them chase down gangs in Hong Kong and dictators in the Guyanas (which, on second thought, doesn’t seem terribly wise.) But I have that kind of confidence in you. I hope my troubleshooting checklist is one you’ll bookmark and share to be prepared for the local ranking mysteries awaiting you and your digital marketing colleagues in 2018. Happy sleuthing!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog


How to Do a Content Audit [Updated for 2017]

Posted by Everett

This guide provides instructions on how to do a content audit using examples and screenshots from Screaming Frog, URL Profiler, Google Analytics (GA), and Excel, as those seem to be the most widely used and versatile tools for performing content audits.

{Expand for more background}


What is a content audit?

A content audit for the purpose of SEO includes a full inventory of all indexable content on a domain, which is then analyzed using performance metrics from a variety of sources to determine which content to keep as-is, which to improve, and which to remove or consolidate.

What is the purpose of a content audit?

A content audit can have many purposes and desired outcomes. In terms of SEO, it is often used to determine the following:

  • How to escape a content-related search engine ranking filter or penalty
  • Content that requires copywriting/editing for improved quality
  • Content that needs to be updated and made more current
  • Content that should be consolidated due to overlapping topics
  • Content that should be removed from the site
  • The best way to prioritize the editing or removal of content
  • Content gap opportunities
  • Which content is ranking for which keywords
  • Which content should be ranking for which keywords
  • The strongest pages on a domain and how to leverage them
  • Undiscovered content marketing opportunities
  • Due diligence when buying/selling websites or onboarding new clients

While each of these desired outcomes and insights is a valuable result of a content audit, I would define the overall “purpose” of one as:

The purpose of a content audit for SEO is to improve the perceived trust and quality of a domain, while optimizing crawl budget and the flow of PageRank (PR) and other ranking signals throughout the site.

Often, but not always, a big part of achieving these goals involves the removal of low-quality content from search engine indexes. I’ve been told people hate this word, but I prefer the “pruning” analogy to describe the concept.

How & why “pruning” works

{Expand for more on pruning}

How to do a content audit

Just like anything in SEO, from technical and on-page changes to site migrations, things can go horribly wrong when content audits aren’t conducted properly. The most common example would be removing URLs that have external links because link metrics weren’t analyzed as part of the audit. Another common mistake is confusing removal from search engine indexes with removal from the website.

Content audits start with taking an inventory of all content available for indexation by search engines. This content is then analyzed against a variety of metrics and given one of three “Action” determinations. The “Details” of each Action are then expanded upon.

The combinations of “Action” (WHAT to do) and “Details” (HOW, and sometimes why, to do it) are as varied as the strategies, sites, and tactics themselves. Below are a few hypothetical examples:

You now have a basic overview of how to perform a content audit. More specific instructions can be found below.

The process can be roughly split into three distinct phases:

  1. Inventory & audit
  2. Analysis & recommendations
  3. Summary & reporting

The inventory & audit phase

Taking an inventory of all content, and related metrics, begins with crawling the site.

One difference between crawling for content audits and technical audits:

Technical SEO audit crawls are concerned with all crawlable content (among other things).

Content audit crawls for the purpose of SEO are concerned with all indexable content.

{Expand for more on crawlable vs. indexable content}

All of this is changing rapidly, though. URLs as the unique identifier in Google’s index are probably going away. Yes, we’ll still have URLs, but not everything requires them. So far, the words “content” and “URL” have been mostly interchangeable. But some URLs contain an entire application’s worth of content. How to do a content audit in that world is something we’ll have to figure out soon, but only after Google figures out how to organize the web’s information in that same world. From the looks of things, we still have a year or two.

Until then, the process below should handle most situations.

Step 1: Crawl all indexable URLs

A good place to start on most websites is a full Screaming Frog crawl. However, some indexable content might be missed this way. It is not recommended that you rely on a crawler as the source for all indexable URLs.

In addition to the crawler, collect URLs from Google Analytics, Google Webmaster Tools, XML Sitemaps, and, if possible, from an internal database, such as an export of all product and category URLs on an eCommerce website. These can then be crawled in “list mode” separately, then added to your main list of URLs and deduplicated to produce a more comprehensive list of indexable URLs.

Some URLs found via GA, XML sitemaps, and other non-crawl sources may not actually be “indexable.” These should be excluded. One strategy that works here is to combine and deduplicate all of the URL “lists,” and then perform a crawl in list mode. Once crawled, remove all URLs with robots meta or X-Robots noindex tags, as well as any URL returning error codes and those that are blocked by the robots.txt file, etc. At this point, you can safely add these URLs to the file containing indexable URLs from the crawl. Once again, deduplicate the list.
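If you would rather script that combine-and-deduplicate step than do it by hand, here is a minimal Python/pandas sketch. The file names and column headers are hypothetical; match them to your actual Screaming Frog, GA, and sitemap exports.

```python
# Minimal sketch: combine URL lists from several exports and deduplicate them.
# File names and column headers are hypothetical; adjust to your own exports.
import pandas as pd

crawl = pd.read_csv("screaming_frog_internal_html.csv")["Address"]
sitemap = pd.read_csv("xml_sitemap_urls.csv")["URL"]
# GA exports landing-page paths, so prepend the domain to make them comparable.
analytics = "https://www.example.com" + pd.read_csv("ga_landing_pages.csv")["Landing Page"].str.strip()

all_urls = (
    pd.concat([crawl, sitemap, analytics])
    .dropna()
    .str.strip()
    .str.rstrip("/")   # treat /page and /page/ as one URL; drop this line if they differ on your site
    .drop_duplicates()
    .sort_values()
)

all_urls.to_csv("urls_for_list_mode_crawl.csv", index=False, header=["URL"])
print(f"{len(all_urls)} unique URLs ready for a list-mode crawl")
```

The resulting file is what you feed into the list-mode crawl described above before removing noindexed, blocked, or error-returning URLs.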

Crawling roadblocks & new technologies

Crawling very large websites

First and foremost, you do not need to crawl every URL on the site. Be concerned with indexable content. This is not a technical SEO audit.

{Expand for more about crawling very large websites}

Crawling dynamic mobile sites

This refers to a specific type of mobile setup in which there are two code bases (one for mobile and one for desktop) but only one URL. Thus, the content of a single URL may vary significantly depending on which type of device is visiting that URL. In such cases, you will essentially be performing two separate content audits. Proceed as usual for the desktop version. Below are instructions for crawling the mobile version.

{Expand for more on crawling dynamic websites}

Crawling and rendering JavaScript

One of the many technical issues SEOs have been increasingly dealing with over the last couple of years is the proliferation of websites built on JavaScript frameworks and libraries like React.js, Ember.js, and Angular.js.

{Expand for more on crawling Javascript websites}

Step 2: Gather additional metrics

Most crawlers will give you the URL and various on-page metrics and data, such as the titles, descriptions, meta tags, and word count. In addition to these, you’ll want to know about internal and external links, traffic, content uniqueness, and much more in order to make fully informed recommendations during the analysis portion of the content audit project.

Your process may vary, but we generally try to pull in everything we need using as few sources as possible. URL Profiler is a great resource for this purpose, as it works well with Screaming Frog and integrates easily with all of the APIs we need.

Once the Screaming Frog scan is complete (crawling only indexable content), export the “Internal All” file, which can then be used as the seed list in URL Profiler (combined with any additional indexable URLs found outside of the crawl via GSC, GA, and elsewhere).

This is what my URL Profiler settings look like for a typical content audit of a small- or medium-sized site. Also, under “Accounts” I have connected via API keys to Moz and SEMrush.

Once URL Profiler is finished, you should end up with something like this:

Screaming Frog and URL Profiler: Between these two tools and the APIs they connect with, you may not need anything else at all in order to see the metrics below for every indexable URL on the domain.

The risk of getting analytics data from a third-party tool

We’ve noticed odd data mismatches and sampled data when using the method above on large, high-traffic websites. Our internal process involves exporting these reports directly from Google Analytics, sometimes incorporating Analytics Canvas to get the full, unsampled data from GA. Then VLookups are used in the spreadsheet to combine the data, with URL being the unique identifier.
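For those who prefer pandas to VLOOKUP, here is a minimal sketch of that join, keyed on the URL. The file names, example domain, and GA column headers are hypothetical; adjust them to your own exports.

```python
# Minimal sketch: join GA metrics onto the URL inventory, using URL as the key.
import pandas as pd

inventory = pd.read_csv("url_profiler_output.csv")    # one row per indexable URL, with a "URL" column
ga = pd.read_csv("ga_organic_landing_pages.csv")      # exported directly from Google Analytics

# GA exports landing-page paths, so rebuild full URLs to match the inventory.
ga["URL"] = "https://www.example.com" + ga["Landing Page"].str.strip()

merged = inventory.merge(
    ga[["URL", "Sessions", "Revenue"]],
    on="URL",
    how="left",   # keep every inventoried URL, even those GA reports no traffic for
)
merged[["Sessions", "Revenue"]] = merged[["Sessions", "Revenue"]].fillna(0)
merged.to_csv("content_audit_dashboard_raw.csv", index=False)
```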

Metrics to pull for each URL:

  • Indexed or not?
    • If crawlers are set up properly, all URLs should be “indexable.”
    • A non-indexed URL is often a sign of an uncrawled or low-quality page.
  • Content uniqueness
    • Copyscape, Siteliner, and now URL Profiler can provide this data.
  • Traffic from organic search
    • Typically 90 days
    • Keep a consistent timeframe across all metrics.
  • Revenue and/or conversions
    • You could view this by “total,” or by segmenting to show only revenue from organic search on a per-page basis.
  • Publish date
    • If you can get this into Google Analytics as a custom dimension prior to fetching the GA data, it will help you discover stale content.
  • Internal links
    • Content audits provide the perfect opportunity to tighten up your internal linking strategy by ensuring the most important pages have the most internal links.
  • External links
  • Landing pages resulting in low time-on-site
    • Take this one with a grain of salt. If visitors found what they want because the content was good, that’s not a bad metric. A better proxy for this would be scroll depth, but that would probably require setting up a scroll-tracking “event.”
  • Landing pages resulting in Low Pages-Per-Visit
    • Just like with Time-On-Site, sometimes visitors find what they’re looking for on a single page. This is often true for high-quality content.
  • Response code
    • Typically, only URLs that return a 200 (OK) response code are indexable. You may not require this metric in the final data if that’s the case on your domain.
  • Canonical tag
    • Typically only URLs with a self-referencing rel=“canonical” tag should be considered “indexable.” You may not require this metric in the final data if that’s the case on your domain.
  • Page speed and mobile-friendliness

Before you begin analyzing the data, be sure to drastically improve your mental health and the performance of your machine by taking the opportunity to get rid of any data you don’t need. Here are a few things you might consider deleting right away (after making a copy of the full data set, of course).

Things you don’t need when analyzing the data

{Expand for more on removing unnecessary data}

Hopefully by now you’ve made a significant dent in reducing the overall size of the file and time it takes to apply formatting and formula changes to the spreadsheet. It’s time to start diving into the data.

The analysis & recommendations phase

Here’s where the fun really begins. In a large organization, it’s tempting to have a junior SEO do all of the data-gathering up to this point. I find it useful to perform the crawl myself, as the process can be highly informative.

Step 3: Put it all into a dashboard

Even after removing unnecessary data, performance could still be a major issue, especially if working in Google Sheets. I prefer to do all of this in Excel, and only upload into Google Sheets once it’s ready for the client. If Excel is running slow, consider splitting up the URLs by directory or some other factor in order to work with multiple, smaller spreadsheets.
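If you do need to split things up, a few lines of Python can partition the raw audit file by top-level directory before anything is opened in Excel or Sheets. The file and column names continue the hypothetical examples used earlier.

```python
# Minimal sketch: split a large audit export into smaller per-directory CSVs.
import pandas as pd
from urllib.parse import urlparse

df = pd.read_csv("content_audit_dashboard_raw.csv")
# Use the first path segment ("/blog/...", "/products/...") as the section name.
df["section"] = df["URL"].map(lambda u: urlparse(u).path.strip("/").split("/")[0] or "root")

for section, part in df.groupby("section"):
    part.drop(columns="section").to_csv(f"audit_{section}.csv", index=False)
```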

Creating a dashboard can be as easy as adding two columns to the spreadsheet. The first new column, “Action,” should be limited to three options, as shown below. This makes filtering and sorting data much easier. The “Details” column can contain freeform text to provide more detailed instructions for implementation.

Use Data Validation and a drop-down selector to limit Action options.
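If you build the dashboard file programmatically rather than by hand, the same drop-down can be added with openpyxl. This is a sketch only: the workbook name is hypothetical, and the three Action labels (Keep, Improve, Remove) simply mirror the keep/improve/remove split described above.

```python
# Minimal sketch: add an "Action" drop-down (data validation) to an Excel dashboard.
from openpyxl import load_workbook
from openpyxl.worksheet.datavalidation import DataValidation

wb = load_workbook("content_audit_dashboard.xlsx")
ws = wb.active

# Limit column B ("Action") to the three options used throughout the audit.
dv = DataValidation(type="list", formula1='"Keep,Improve,Remove"', allow_blank=True)
dv.errorTitle = "Invalid Action"
dv.error = "Choose Keep, Improve, or Remove."
ws.add_data_validation(dv)
dv.add(f"B2:B{ws.max_row}")   # assumes row 1 holds headers and Action lives in column B

wb.save("content_audit_dashboard.xlsx")
```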

Step 4: Work the content audit dashboard

All of the data you need should now be right in front of you. This step can’t be turned into a repeatable process for every content audit. From here on the actual step-by-step process becomes much more open to interpretation and your own experience. You may do some of them and not others. You may do them a little differently. That’s all fine, as long as you’re working toward the goal of determining what to do, if anything, for each piece of content on the website.

A good place to start would be to look for any content-related issues that might cause an algorithmic filter or manual penalty to be applied, thereby dragging down your rankings.

Causes of content-related penalties

These typically fall under three major categories: quality, duplication, and relevancy. Each category can be further broken down into a variety of issues, which are detailed below.

{Expand to learn more about quality, duplication, and relevancy issues}

It helps to sort the data in various ways to see what’s going on. Below are a few different things to look for if you’re having trouble getting started.

{Expand to learn more about what to look for}

Taking the hatchet to bloated websites

For big sites, it’s best to use a hatchet-based approach as much as possible, and finish up with a scalpel in the end. Otherwise, you’ll spend way too much time on the project, which eats into the ROI.

This is not a process that can be documented step-by-step. For the purpose of illustration, however, below are a few different examples of hatchet approaches and when to consider using them.

{Expand for examples of hatchet approaches}

As you can see from the many examples above, sorting by “Page Type” can be quite handy when applying the same Action and Details to an entire section of the website.
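A hatchet pass like this is also easy to script: assign a default Action and Details to whole sections by URL pattern, then refine individual rows by hand afterwards. The patterns, labels, and file names below are hypothetical.

```python
# Minimal sketch: bulk-assign Action/Details by URL pattern (the "hatchet"),
# leaving individual rows for manual refinement (the "scalpel").
import pandas as pd

df = pd.read_csv("content_audit_dashboard_raw.csv")

def bulk_action(url: str) -> tuple[str, str]:
    if "/tag/" in url or "/search/" in url:
        return "Remove", "Thin archive page: noindex and remove from the XML sitemap."
    if "/product/" in url:
        return "Improve", "Rewrite the manufacturer description with unique copy."
    return "Keep", "No change needed at this time."

df[["Action", "Details"]] = df["URL"].apply(lambda u: pd.Series(bulk_action(u)))
df.to_csv("content_audit_dashboard.csv", index=False)
print(df["Action"].value_counts())
```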

After all of the tool set-up, data gathering, data cleanup, and analysis across dozens of metrics, what matters in the end is the Action to take and the Details that go with it.

URL, Action, and Details: These three columns will be used by someone to implement your recommendations. Be clear and concise in your instructions, and don’t make decisions without reviewing all of the wonderful data-points you’ve collected.

Here is a sample content audit spreadsheet to use as a template, or for ideas. It includes a few extra tabs specific to the way we used to do content audits at Inflow.


As Razvan Gavrilas pointed out in his post on Cognitive SEO from 2015, without doing the research above you risk pruning valuable content from search engine indexes. Be bold, but make highly informed decisions:

Content audits allow SEOs to make informed decisions on which content to keep indexed “as-is,” which content to improve, and which to remove.

The reporting phase

The content audit dashboard is exactly what we need internally: a spreadsheet crammed with data that can be sliced and diced in so many useful ways that we can always go back to it for more insight and ideas. Some clients appreciate that as well, but most are going to find the greater benefit in our final content audit report, which includes a high-level overview of our recommendations.

Counting actions from Column B

It is useful to count the quantity of each Action along with total organic search traffic and/or revenue for each URL. This will help you (and the client) identify important metrics, such as total organic traffic for pages marked to be pruned. It will also make the final report much easier to build.
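As a rough illustration, that counting takes only a few lines of pandas once the dashboard is filled in; the file and column names again follow the hypothetical examples above.

```python
# Minimal sketch: count pages per Action and total the organic traffic and
# revenue sitting in each bucket, ready to drop into the final report.
import pandas as pd

df = pd.read_csv("content_audit_dashboard.csv")

summary = (
    df.groupby("Action")
      .agg(pages=("URL", "count"),
           organic_sessions=("Sessions", "sum"),
           revenue=("Revenue", "sum"))
      .sort_values("pages", ascending=False)
)
print(summary)   # e.g., how much organic traffic sits on pages marked "Remove"
```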

Step 5: Writing up the report

Your written analysis and recommendations should be delivered at the same time as the audit dashboard. The report summarizes the findings, recommendations, and next steps from the audit, and should start with an executive summary.

Here is a real example of an executive summary from one of Inflow’s content audit strategies:

As a result of our comprehensive content audit, we are recommending the following, which will be covered in more detail below:

Removal of about 624 pages from Google’s index by deletion or consolidation:

  • 203 Pages were marked for Removal with a 404 error (no redirect needed)
  • 110 Pages were marked for Removal with a 301 redirect to another page
  • 311 Pages were marked for Consolidation of content into other pages
    • Followed by a redirect to the page into which they were consolidated

Rewriting or improving of 668 pages

  • 605 Product Pages to be rewritten due to their use of manufacturer product descriptions (duplicate content); these are prioritized from first to last within the Content Audit.
  • 63 “Other” pages to be rewritten due to low-quality or duplicate content.

Keeping 226 pages as-is

  • No rewriting or improvements needed

These changes reflect an immediate need to “improve or remove” content in order to avoid an obvious content-based penalty from Google (e.g. Panda) due to thin, low-quality and duplicate content, especially concerning Representative and Dealers pages with some added risk from Style pages.

The content strategy should end with recommended next steps, including action items for the consultant and the client. Below is a real example from one of our documents.

We recommend the following three projects in order of their urgency and/or potential ROI for the site:

Project 1: Remove or consolidate all pages marked as “Remove”. Detailed instructions for each URL can be found in the “Details” column of the Content Audit Dashboard.

Project 2: Copywriting to improve/rewrite content on Style pages. Ensure unique, robust content and proper keyword targeting.

Project 3: Improve/rewrite all remaining pages marked as “Improve” in the Content Audit Dashboard. Detailed instructions for each URL can be found in the “Details” column.

Content audit resources & further reading

Understanding Mobile-First Indexing and the Long-Term Impact on SEO by Cindy Krum
This thought-provoking post raises the question: How will we perform content inventories without URLs? It helps to know Google is dealing with the exact same problem on a much, much larger scale.

Here is a spreadsheet template to help you calculate revenue and traffic changes before and after updating content.

Expanding the Horizons of eCommerce Content Strategy by Dan Kern of Inflow
An epic post about content strategies for eCommerce businesses, which includes several good examples of content on different types of pages targeted toward various stages in the buying cycle.

The Content Inventory is Your Friend by Kristina Halvorson on BrainTraffic
Praise for the life-changing powers of a good content audit inventory.


Everything You Need to Perform Content Audits

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog


Updated Google My Business guidelines disallows virtual offices as service-area businesses

Recently updated Google guidelines clarify how virtual offices are handled for Google My Business listings.

The post Updated Google My Business guidelines disallows virtual offices as service-area businesses appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Google Updated Their Algorithm For “Did The Holocaust Happen?” Controversy?

Over the past week or so, Google has been getting some heat around various anti-Semitic and racist sites ranking well in their search results. One of those queries getting a ton of attention is [did the holocaust happen]…

Search Engine Roundtable


Updated: Google: Penguin Can Discount All Your Links, Good Or Bad

Gary Illyes from Google sent us this statement:

When speaking yesterday, a statement I made about manual actions was phrased in a way that sounded like I was talking about Penguin — that was incorrect and I apologize for the confusion…

Search Engine Roundtable


Google Updated Quality Raters Guidelines Last Week

Google updated the quality raters guidelines PDF on March 28, 2016. The full version of the quality raters guidelines was first introduced in November 2015, and Google now keeps it live and updated for all to see…

Search Engine Roundtable


SearchCap: Google Maps App Revised, Social Links In Knowledge Graph, Google Structured Data Updated

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Maps App Revised, Social Links In Knowledge Graph, Google Structured Data Updated appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Google News Reader Is Updated For Android & iOS, With Customized Settings & Added Desktop Features

Google is updating the look and feel of its News reader on Android and iOS devices, as well as adding a few new features. According to the announcement, the new Google News reader for Android and iOS offers three new customized settings: the option to change the theme from light to dark, reduce or…

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


MyCopyblogger: 14 Free Ebooks and an Updated 20-Part Internet Marketing Course



About the Author: Brian Clark is founder of Copyblogger and CEO of Copyblogger Media.




How to Build a Business with Premium Content (Updated for 2012)


It’s that time again. We’re going to be launching Teaching Sells to a brand new group of members.

If you haven’t done this with us before, let me tell you what it’s all about.

Basically, I spend the next four weeks or so writing my little fingers to the bone, giving you a ton of awesome content you can use right now to start creating something epic.

Then we tell you a little about our flagship course, Teaching Sells, or as I affectionately call it, “Brian’s Brain in a Box.” (But you know, less messy than that.)

Teaching Sells is our comprehensive course about how to build a business online.

Not a “money-getting system” or some kind of “turn the internet into your personal ATM” course. There are plenty of other folks who will try to sell you that.

Teaching Sells is a course about business. How it works online, how to decide on your topic, how to find customers, how to build your site — the whole thing. Start to finish.

This is the stuff we think the Small Business Administration should be teaching you … but they aren’t.

It’s the stuff we think marketing majors should learn in college … but they don’t.

Yes, I’m a fangirl

You know those people who crash servers during launches by hitting “Refresh” until they get to order?

That was me, way back in 2007 when the Teaching Sells course was first launched. That was before I worked with Brian, before Copyblogger Media was formed, before I wrote a single Copyblogger post.

Teaching Sells was what let me make the jump into working for myself. So I think it’s pretty darned cool. But, you know, I’m kind of biased.

Eventually, I came onboard

I was fortunate enough to eventually join Brian Clark and Tony Clark in making Teaching Sells even better — adding content formats, making the material easier to digest, simplifying the course structure, and more good stuff.

That’s one reason I tend to pour my heart into these launches … I care about this course, and I’ve seen what it can do for people.

I also know how much you (and I) like awesome free stuff. So we’ve made kind of a tradition of delivering lots of great free content during our launches.

You can get all the good stuff by signing up for the Teaching Sells email list. We’ll also let you know when the course is available, so if this is the right fit for you, you’ll be able to join us.

Either way, over the next few weeks, we’ll be talking about:

  • If only you had started earlier … can real people still build a business online from scratch?
  • How you’re making it harder than it has to be (and how to knock that off and start making real progress)
  • The best ways to make money with content in 2012
  • Blogging and the zombie apocalypse
  • The key element to creating a successful 21st century business
  • Why advertising doesn’t work for bloggers (and what does)
  • What buyers will and won’t pay for when it comes to content
  • What to do if you’re not a great content creator yourself (this one is cool)

We’re also going to give away our handy-dandy 20-step road map to creating a serious, thriving business online, and we’ll throw in a live webinar to let you know more about how that works.

There’s all that, and then anything else I come up with over the next four weeks. If you see me griping on Twitter about how little sleep I’m getting, you’ll know why.

So what do you do now?

  1. Go sign up for the Teaching Sells list. (If you’re signed up already, you’ll get all the awesomeness automatically.)
  2. Let your email provider know we’re good guys by adding us to your approved email senders, so your content doesn’t get stuck in a spam filter.

Then hang on tight, and I’ll be talking more with you in the comfort of your email box.

Whether or not you think Teaching Sells would be a good fit, go ahead and sign up for all the free goodies. We don’t mind a bit — in fact, we love it when people put our free stuff to good use. That’s why Copyblogger is here for you every day, right?

Looking forward to seeing you there!

About the Author: Sonia Simone is co-founder and CMO of Copyblogger Media and a co-creator of the updated Teaching Sells.


Oh yeah, and of course, we’re not going to do anything weird with your email address. We’ll send you lots of valuable content and some information about Teaching Sells, and that’s it. We loathe spam and we know you do, too.


