Tag Archive | "Recovering"

Video: Lily Ray on recovering after a Google core update

Here are tips from Lily Ray on how to fix a site that was hit by a Google core algorithm update.

Please visit Search Engine Land for the full article.


Posted in IM News | Comments Off

Recovering Your Organic Search Traffic from a Web Migration Gone Wrong

Posted by Aleyda

[Estimated read time: 9 minutes]

I know you would never change a URL without identifying where to 301-redirect it and making sure that the links, XML sitemaps, and/or canonical tags are also updated. But if you’ve been doing SEO for a while, I bet you’ve also had a few clients — even big ones — coming to you after they’ve tried to do structural web changes or migrations of any type without taking SEO best practices into consideration.

Whenever this happens, your new client comes to you for help in an “emergency” situation, and the required SEO analysis has two defining characteristics:

  1. You need to prioritize:
    Your client is likely very nervous about the situation, and you won’t have time to do a full audit right away. You’ll need to focus on identifying what wasn’t done during the migration so that the fundamental causes of the traffic loss are fixed first; then you can move on with the rest.
  2. You might not have all the data:
    You might have only the basics — like Google Analytics & Google Search Console — and the information that the client shares with you about the steps they took when doing the changes. There are usually no previous rankings, crawls, or access to logs. You’ll need to make the most out of these two fairly easy-to-get data sources, new crawls that you can do yourself, and third-party “historical” ranking data. In this analysis we’ll work from this existing situation as a “worst-case scenario,” so anything extra that you can get will be an added benefit.

How can you make the most out of your time and basic data access to identify what went wrong and fix it — ASAP?

Let’s go through the steps for a “minimum viable” web migration validation to identify the critical issues to fix:

1. Verify that the web migration is the cause of the traffic loss.

To start, it’s key to:

  • Obtain from the client a list of the specific changes and actions taken during the migration, so you can identify the steps that were likely missed and prioritize validating them during the analysis.
  • Check that the timing of the traffic loss coincides with the migration to confirm it was actually the cause, or identify other factors that coincided with it, which you can take into consideration later during the full audit and analysis.

Screenshot: traffic dropping shortly after a web migration.

To identify this, compare the before and after across traffic sources, devices, and the migrated areas of your site (if not all of them changed), etc.

Use the “Why My Web Traffic Dropped” checklist to quickly verify that the loss has nothing to do with, for example, incorrect Google Analytics settings after the migration or a Google update happening at the same time.

Screenshot from Google Analytics of web traffic dropping.

I’ve had situations where the organic search traffic loss coincided not only with the web migration but also with the date of a Phantom update (and the site had the type of characteristics that update targeted).

Screenshot: Traffic loss after web migration and Google algo update.

If this is the case, you can’t expect to regain all the traffic after fixing the migration-related issues. There will be further analysis and implementations needed to fix the other causes of traffic loss.

2. Identify the pages that dropped the most in traffic, conversions, & rankings.

Once you verify that the traffic loss is due (completely or partially) to the web migration, then the next step is to focus your efforts on analyzing and identifying the issues in those areas that were hit the most from a traffic, conversions, & rankings perspective. You can do this by comparing organic search traffic per page before and after the migration in Google Analytics:

Screenshot: comparing organic search traffic per page before and after the migration in Google Analytics.

Select those that previously had the highest levels of traffic & conversions and that lost the highest percentages of traffic.

You can also do something similar with those pages with the highest impressions, clicks, & positions that have also had the greatest negative changes from the Google Search Console “Search Analytics” report:

Screenshot: the Google Search Console "Search Analytics" report.

After gathering this data, consolidate all of these pages (and related metrics) in an Excel spreadsheet. Here you’ll have the most critical pages that have lost the most from the migration.

Pages and related metrics consolidated in an Excel sheet
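If you’d rather script this consolidation than do it by hand, here’s a minimal Python sketch that flags the pages that mattered before the migration and lost the largest share of their traffic. The page paths, session counts, and thresholds are all illustrative assumptions, not data from the article:

```python
# Hypothetical sketch: consolidate before/after organic sessions per page
# (e.g. exported from Google Analytics) and flag the biggest losers.

def traffic_drops(before, after, min_sessions=100, min_drop_pct=30.0):
    """Return pages sorted by % sessions lost, limited to pages that
    mattered before the migration (>= min_sessions) and dropped hard."""
    drops = []
    for page, old in before.items():
        new = after.get(page, 0)
        if old < min_sessions:
            continue  # ignore pages that never had meaningful traffic
        drop_pct = (old - new) / old * 100
        if drop_pct >= min_drop_pct:
            drops.append((page, old, new, round(drop_pct, 1)))
    return sorted(drops, key=lambda row: row[3], reverse=True)

before = {"/red-shoes/": 5400, "/blue-shoes/": 3200, "/about/": 90}
after = {"/red-shoes/": 700, "/blue-shoes/": 2900, "/about/": 80}
for row in traffic_drops(before, after):
    print(row)
```

Tuning `min_sessions` keeps low-traffic noise out of your “critical pages” list, so the client’s attention stays on the URLs that actually drove business.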

3. Identify the keywords these pages were ranking for and start monitoring them.

In most cases the issues will be technical (though sometimes they may be due to structural content issues). However, it’s important to identify the keywords these pages had been ranking for that lost visibility post-migration, start tracking them, and verify their improvement after the issues are fixed.

Screenshot: identifying which keywords the page was ranking for.

This can be done by gathering data from tools with historical keyword ranking features — like SEMrush, Sistrix, or SearchMetrics — that also show you which pages have lost rankings during a specific period of time.

This can be a bit time-consuming, so you can also use URL Profiler to discover the keywords these pages were ranking for in the past. It easily connects with your Google Search Console “Search Analytics” data via API to obtain their queries from the last 3 months.

Connecting URL Profiler to Google Search Console
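If you want to do the same grouping yourself from an export of that “Search Analytics” data, a small sketch like this gets you a keywords-per-page list sorted by clicks. The row shape (query, page, clicks) and the sample values are assumptions for illustration, not the tool’s actual format:

```python
# Sketch: group Search Analytics rows (query, page, clicks) by page,
# so each critical URL gets the keywords it was ranking for.
from collections import defaultdict

def keywords_per_page(rows, critical_pages):
    by_page = defaultdict(list)
    for query, page, clicks in rows:
        if page in critical_pages:
            by_page[page].append((query, clicks))
    # highest-traffic keywords first, so monitoring starts with what matters
    for page in by_page:
        by_page[page].sort(key=lambda kc: kc[1], reverse=True)
    return dict(by_page)

rows = [
    ("red running shoes", "/red-shoes/", 320),
    ("buy red shoes", "/red-shoes/", 510),
    ("company history", "/about/", 12),
]
print(keywords_per_page(rows, {"/red-shoes/"}))
```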

As a result, you’ll have your keyword data and selected critical pages to assess in one spreadsheet:

Keyword data and selected critical pages to assess in one spreadsheet.

Now you can start tracking these keywords with your favorite keyword monitoring tool. You can even track the entire SERPs for your keywords with a tool like SERPwoo.

4. Crawl both the list of pages with traffic drops & the full website to identify issues and gaps.

Now you can crawl the list of pages you’ve identified using the “list mode” of an SEO crawler like Screaming Frog, then crawl your site with the “crawler mode,” comparing the issues in the pages that lost traffic versus the new, currently linked ones.

Uploading a list into Screaming Frog
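Conceptually, what the list-mode crawl gives you is each old URL’s current HTTP status. Here’s a quick sketch of how you’d bucket those results so the broken pages jump out; the URLs and statuses below are illustrative stand-ins for a real crawl export:

```python
# Sketch: bucket previously-trafficked URLs by their current HTTP status.
def bucket_by_status(crawl_results):
    buckets = {"ok": [], "redirect": [], "client_error": [], "server_error": []}
    for url, status in crawl_results:
        if 200 <= status < 300:
            buckets["ok"].append(url)
        elif 300 <= status < 400:
            buckets["redirect"].append(url)
        elif 400 <= status < 500:
            buckets["client_error"].append(url)
        else:
            buckets["server_error"].append(url)
    return buckets

results = [("/old-page/", 404), ("/red-shoes/", 301), ("/blue-shoes/", 200)]
print(bucket_by_status(results))
```

The `client_error` bucket (404s and friends) is usually where the quickest wins hide: those are pages that lost traffic and currently lead nowhere.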

You can also integrate your site crawl with Google Analytics to identify gaps (Screaming Frog and DeepCrawl have this feature) and verify crawling, indexation, and even structural content-related issues that might have been caused by the migration. The following are some of the fundamentals I recommend you look at, answering these questions:

Verifying against various issues your site may have.

A.) Which pages aren’t in the web crawl (because they’re not linked anymore) but were receiving organic search traffic?

Do these coincide with the pages that have lost traffic, rankings, & conversions? Have these pages been replaced? If so, why haven’t they been 301-redirected towards their new versions? Do it.

B.) Is the protocol inconsistent in the crawls?

Especially if the migration was from one version to the other (like HTTP to HTTPS), verify whether there are pages still being crawled with their HTTP version because links or XML sitemaps were not updated… then make sure to fix it.
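A quick way to script this check, assuming you already have the list of crawled URLs (the hostname and paths below are made up for illustration):

```python
# Sketch: after an HTTP-to-HTTPS migration, flag any crawled URLs that
# still use the old protocol on your own host.
from urllib.parse import urlparse

def insecure_links(crawled_urls, host="www.example.com"):
    return [u for u in crawled_urls
            if urlparse(u).scheme == "http" and urlparse(u).netloc == host]

urls = [
    "https://www.example.com/red-shoes/",
    "http://www.example.com/blue-shoes/",   # stale link, never updated
    "https://cdn.example.net/logo.png",
]
print(insecure_links(urls))
```

Any URL this surfaces points back to a link or sitemap entry that was never updated to HTTPS.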

C.) Are canonicalized pages pointing towards non-relevant URLs?

Check whether the canonical tags of the migrated pages are still pointing to the old URLs, or if the canonical tags were changed and are suddenly pointing to non-relevant URLs (such as the home page, as in the example below). Make sure to update them to point to their relevant, original URL if this is the case.

A page's source code with canonicalization errors.
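If you need to spot-check canonicals outside your crawler, a stdlib-only sketch like this extracts the tag and compares it to the URL you expect. The HTML snippet and URLs are invented for illustration:

```python
# Sketch: extract rel="canonical" from a page's HTML and flag a mismatch
# against the URL the page is supposed to canonicalize to.
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_mismatch(html, expected):
    p = CanonicalParser()
    p.feed(html)
    return p.canonical is not None and p.canonical != expected

html = '<head><link rel="canonical" href="https://example.com/"></head>'
# Canonical points at the home page instead of the migrated URL:
print(canonical_mismatch(html, "https://example.com/red-shoes/"))
```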

D.) Are the pages with traffic loss now blocked via robots.txt or otherwise non-indexable?

If so, why? Unblock all pages that should be crawled, indexed, and ranking as well as they were before.
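Python’s standard library can run this check against your new robots.txt directly; the disallow rule and page paths below are illustrative:

```python
# Sketch: check whether the lost-traffic URLs are blocked by the
# post-migration robots.txt.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /products/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

pages = ["/products/red-shoes/", "/blog/migration-notes/"]
blocked = [p for p in pages if not rp.can_fetch("*", p)]
print(blocked)
```

In a real audit you’d feed in the full list of critical pages from step 2 and diff the result against the old robots.txt.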

E.) Verify whether the redirect logic is correct.

Just because the pages were redirected doesn’t mean that those redirects were correct. Identify these types of issues by asking the following questions:

  • Are the redirects going to relevant new page-versions of the old ones?
    Verify if the redirects are going to the correct page destination that features similar content and has the same purpose as the one redirected. If they’re not, make sure to update the redirects.
  • Are there any 302 redirects that should become 301s (as the moves are permanent, not temporary)?
    Update them.
  • Are there any redirect loops that might be interfering with search crawlers reaching the final page destination?
    Update those as well.

    Especially if you have an independent mobile site version (under an “m” subdomain, for example), you’ll want to verify their redirect logic specifically versus the desktop one.

Checking redirect logic.

  • Are there redirects going towards non-indexable, canonicalized, redirected, or error pages?
    Prioritize fixing these.

    To facilitate this analysis, you can use OnPage.org’s “Redirects by Status Code” report.

OnPage.org's Redirects by Status Code report

  • Why are these redirected pages still being crawled?
    Update the links and XML sitemaps still pointing to pages that now redirect elsewhere, so that they point directly to the final page to crawl, index, and rank.

  • Are there duplicate content issues among the lost traffic pages?
    The configuration of redirects, canonicals, noindexation, or pagination might have changed; as a result, these pages might now feature content that’s identified as duplicate, which should be fixed.
Duplicate content issues shown on OnPage.org.
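The redirect-logic questions above (chains, loops, wrong destinations) all boil down to walking a redirect map. Here’s a small sketch, with a made-up map, that reports each URL’s final destination, hop count, and whether a loop was hit:

```python
# Sketch: walk a redirect map (old URL -> target) to surface chains and
# loops that keep crawlers from reaching the final destination.
def resolve(url, redirects, max_hops=10):
    """Follow redirects; return (final_url, hops, is_loop)."""
    seen, hops = {url}, 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            return url, hops, True   # loop (or an endless chain)
        seen.add(url)
    return url, hops, False

redirects = {
    "/old-a/": "/old-b/",
    "/old-b/": "/new-b/",      # chain: /old-a/ should 301 straight to /new-b/
    "/old-x/": "/old-y/",
    "/old-y/": "/old-x/",      # loop: crawlers never reach a real page
}
print(resolve("/old-a/", redirects))
print(resolve("/old-x/", redirects))
```

Any result with more than one hop is a chain to flatten; any result flagged as a loop is a fix to prioritize.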

5. It’s time to implement fixes for the identified issues.

Once you ask these questions and update the configuration of your lost traffic pages as mentioned above, it’s important to:

  1. Update all of your internal links to go to the final URL destinations directly.
  2. Update all of your XML sitemaps to eliminate the old URLs, leaving only the new ones, and resubmit them to Google Search Console.
  3. Verify whether there are any external links still going to non-existent pages that should now redirect. This way, in the future and with more time, you can perform outreach to the most authoritative sites linking to them so they can be updated.
  4. Submit your lost traffic pages to be recrawled with the Google Search Console “Fetch as Google” section.

After resubmitting, start monitoring the search crawlers’ behavior through your web logs (you can use the Screaming Frog Log Analyzer), as well as your pages’ indexation, rankings, & traffic trends. You should start seeing a positive move after a few days:

Regaining numbers after implementing the fixes.
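If you don’t have a log analyzer handy, even a few lines of Python can confirm Googlebot is recrawling the fixed pages. This sketch assumes the common combined log format; the log lines themselves are made up:

```python
# Sketch: count Googlebot hits per URL from access-log lines to confirm
# the resubmitted pages are being recrawled.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" (\d{3}) .*Googlebot')

def googlebot_hits(log_lines):
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits

logs = [
    '66.249.66.1 - - [10/May/2016:10:01:02 +0000] "GET /red-shoes/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/May/2016:10:02:10 +0000] "GET /red-shoes/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [10/May/2016:10:03:00 +0000] "GET /red-shoes/ HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(logs))
```

Note that a serious analysis should also verify the crawler’s IP range, since anyone can spoof the Googlebot user agent.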

Remember that if the migration involved drastic changes (if you’ve migrated to another domain, for example), it’s natural to see a short-term rankings and traffic loss, even if everything is now correctly implemented and the new domain has higher authority. Take this into consideration; as long as the change improved the former optimization status, the mid- to long-term results should be positive.

In the short term results dip, but as time goes on they rise again.

As you can see above, you can recover from this type of situation if you make sure to prioritize and fix the issues with negative effects before moving on to change anything else that’s not directly related. Once you’ve done this and see a positive trend, you can then begin a full SEO audit and process to improve what you’ve migrated, maximizing the optimization and results of the new web structure.

I hope this helps you have a quicker, easier web migration disaster recovery!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog


Kobe Bryant Gradually Recovering

Kobe Bryant is gradually getting closer to returning to the court with his teammates. It has been a little over seven months since Bryant had surgery to repair his Achilles’ tendon, and he is hoping to be able to play soon.

“I feel like I’m ahead of schedule,” Bryant said in an interview for NBA TV on Friday. “If there was a playoff game tonight, I’d play. I’d play. I don’t know how effective I’d be, but I would play.” He continued, “The fadeaway still works, the ballhandling, being able to post. Those are things that I can do right now. But it’s not the playoffs, thank god.”

Bryant has been lightly practicing his “tippy toe” shots, but he and his coach, Mike D’Antoni, say neither of them knows exactly when he will be playing again. Bryant says he doesn’t want to set an official date. He wants to make sure he is 100 percent before he takes the floor again.

“Maybe a little bit more than tippy-toe,” D’Antoni said of Bryant’s shots. “I look out every once in awhile. I haven’t heard back that he’s ready to practice or anything like that. So, he’s just progressing. I think it’s better than yesterday, but I don’t know yet.”

“It’s tough because once I’ve set that as a target then I’m hell-bent at doing it at all costs, even to the detriment of the damn Achilles,” Bryant said. “I try to just stay in the moment and really try to listen to my body. The biggest thing is I have not done anything athletically for six months, seven months. You got to get your body back in shape. And doing that, if I was healthy — completely healthy — you have that much time off and get back in shape and your knee is going to ache, your ankle is going to hurt, your back is going to be out. So you got to go through your progressions as you normally would over the course of a summer.”

Image via Twitter



Are Some Sites Recovering From The Google Panda Update?

It would appear that some of the victims of Google’s Panda algorithm update are starting to see at least slight recoveries after using some elbow grease. A couple examples of sites that have gained some attention for upswings in traffic post-Panda, after getting hit hard by the update, are DaniWeb and One Way Furniture.

Have you seen any recovery in search traffic since Panda hit? Let us know.

DaniWeb Sees an Uptick in Traffic Post-Panda

DaniWeb is an IT discussion community site. It’s a place where people can go to discuss issues related to hardware, software, software development, web development, Internet marketing, etc. This is exactly the kind of site that can actually provide great value to a searcher. I can’t tell you how many times I’ve had some kind of frustrating software issue only to find the solution after a Google search pointing me to a discussion forum with people openly discussing the pros, cons, and merits of a given solution or idea. The very fact that it is a discussion forum means it is a potentially great place for different angles and ideas to any given topic, with the ongoing possibility of added value. More information means you can make better informed decisions.

Sure, there is no guarantee that all of the information is good information, but that’s the beauty of discussion. There is often someone there to shoot down the bad. The point is, many searchers or search enthusiasts might take issue with a site like Daniweb being demoted in search because of an algorithm change that was designed to crack down on shallow and lesser-quality content.

The good news for DaniWeb, and anybody who finds it to be a helpful resource, is that since being hit by the update it is starting to bounce back. To what extent remains to be seen. Time will tell, but Dani Horowitz, who runs the site, recently revealed a Google Analytics graph showing an upswing:

Daniweb traffic Panda and Post-panda

“The graph indicates a slight dip towards the end of February when just the US was affected by Panda, and then a huge dip when Panda went global,” she says. “However, you can see that over the past couple of weeks, traffic has been on the upswing, increasing day after day. We’re not yet near where we were before Panda, but there definitely is hope that we will get back there soon.”

DaniWeb has recovered from Google Panda … Sorta http://bit.ly/liGYiT

She is careful to note, “Many algorithm changes have already gone into effect between when Panda was first rolled out and today. Therefore, I can’t say without a doubt that our upswing is directly related to us being un-Pandalized in Google’s eyes and not due to another algorithm change that was released. In fact, in all honesty, that’s probably what it is.”

Still, it should serve as a reminder that Panda isn’t everything. Don’t forget: Google has over 200 ranking signals.

One Way Furniture Slowly Climbs Back Up

If you’re a regular reader of WebProNews or have been following the Panda news, you may recall earlier this month when NPR ran a story about a furniture store called One Way Furniture that had been feeling the wrath of the Panda, mainly due to its use of unoriginal product descriptions, which the e-commerce site was drawing from manufacturer listings.

Internet Retailer Senior Editor Allison Enright spoke with One Way Furniture CEO Mitch Lieberman this week (hat tip to SEW), and he said that the site is slowly climbing back up in the search rankings. “It’s been extremely challenging, but exciting, too,” he is quoted as saying. “Even in a downturn like this, it is exciting to see the effects of what you are doing to get you back to where you were.”

How They Are Doing It

So great, these sites are evidently working their way back into Google’s good graces. How does that help you? Luckily, they’ve shared some information about the things they’ve been doing, which appear to have led to the new rise in traffic.

“In a nutshell, I’ve worked on removing duplicate content, making use of the canonical tag and better use of 301 redirects, and adding the noindex meta tag to SERP-like pages and tag clouds,” says Horowitz. “I’ve also done a lot of work on page load times. Interestingly enough, I’ve discovered that the number of pages crawled per day has NOT decreased in tandem with Panda (surprisingly), but it HAS been directly affected by our page load times.”

Look at the correlation between DaniWeb’s pages crawled per day and time spent downloading a page:

Pages Crawled vs load time from daniweb
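To put a number on that relationship with your own crawl-stats export, you can run a Pearson correlation over the two series; the figures below are invented purely for illustration:

```python
# Sketch: correlate average page load time with pages crawled per day.
# A strongly negative r suggests slower pages are throttling crawl rate.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

load_time_ms = [420, 560, 610, 800, 950]             # avg download time per page
pages_crawled = [90000, 72000, 65000, 41000, 30000]  # Googlebot pages/day
r = pearson(load_time_ms, pages_crawled)
print(round(r, 2))  # strongly negative for this made-up data
```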

“I guess it also goes without saying that it’s also important to constantly build backlinks,” says Horowitz. “Like many other content sites out there, we are constantly scraped on a regular basis. A lot of other sites out there syndicate our RSS feeds. It is entirely possible/plausible that Google’s Panda algorithm [appropriately] hit all of the low quality sites that were just syndicating and linking back to us (with no unique content of their own), ultimately discrediting half of the sites in our backlink portfolio, killing our traffic indirectly. Therefore, it isn’t that we got flagged by Panda’s algorithm, but rather that we just need to work on building up more backlinks.”

According to Internet Retailer, Lieberman fired the firm he was previously using to get inbound links and hired a new one. He also hired some new copywriters to write original product descriptions aimed at being “friendly to search engines.” Enright writes:

“For example, a bar stool that previously used a manufacturer-supplied bullet list of details as its product description now has a five-sentence description that details how it can complement a bar set-up, links to bar accessories and sets the tone by mentioning alcoholic beverages, all of which makes it more SEO-friendly, Lieberman says. “We decided to change it all up,” he says. “What we’re seeing now is what is good for customers and what they see on the site is also good for Google.”

OneWayFurniture.com is also slimming down content that causes pages to load more slowly because this also affects how Google interprets the quality of a web page. “We’re focused on the basics, the structure of the site and on doing things that are not going to affect us negatively,” Lieberman says.

More Things You Can Do to Recover from Panda

In addition to the things discussed by Horowitz and Lieberman, there are plenty of other things to consider in your own SEO strategy that might just help you bounce back if you were negatively impacted by the Panda update.

First off, simply check up on your basic SEO practices. Just because you got hit by the Panda update doesn’t mean there aren’t other, totally unrelated things you could be doing much better. Remember – over 200 signals. They’re not all Panda-related.

You should also keep up to date on future changes. Read Google’s webmaster blog and its new search blog. Follow Google’s search team on Twitter. Read the search blogs. Frequent the forums. Google makes changes every day. Stay in the loop. Something that has worked for years might suddenly stop working one day, and it might not get the kind of attention a major update like Panda gets.

Panda doesn’t like thin content, so bulk it up. Dr. Peter J. Meyers, President of User Effect, lays out seven types of “thin” content and discusses how to fatten them up here.

Some have simply been relying more heavily on professional SEO tools and services. SEOmoz founder Rand Fishkin said in a recent interview with GeekWire, “I can’t be sure about correlation-causation, but it seems like that’s [Panda] actually been a positive thing for us. The more Google talks about their ranking algorithm, how it changes how people have to keep up, the more people go and look for SEO information, and lots of times find us, which is a good thing.”

You may need to increase your SEO budget. Like search strategist Jason Acidre says on Blogging Google at Technorati, “This just shows how imperative it is to treat SEO as a long-term and ongoing business investment, seeing that Google’s search algorithm is constantly improving its capability to return high-quality websites to be displayed as results to their users worldwide. As the biggest search engine in the world is requiring more quality content and natural web popularity from each website who desires to be on the top of their search results, it would certainly require quality-driven campaigns and massive fixes on their websites, which of course will necessitate them to upsize their budgets to acquire help from topnotch SEO professionals.”

“Authority websites that were affected by this recent Google update are losing money by the day,” he adds. “They are in need of high quality service providers who can actually meet their needs, and in order to get the kind of quality that can be seen genuinely useful by both users and search engines, they’ll probably need to make a much expensive investment on content management and link development, as this campaign would require massive work and hours to really materialize.”

Set up alerts for the SEO elements of your site, so you’re constantly up to speed on just what’s going on. Arpana Tiwari, the Sr. SEO Manager of Become Inc., has some interesting ideas about this.

We all know that Google loves local these days. Local content even appeared to benefit from the Panda update to some extent. If you have anything of value to offer in terms of local-based content, it might not be a bad idea to consider it. Obviously quality is still a major factor. The content must have value.

Then of course there’s Google’s own “guidance”. Don’t forget the 23 questions Google laid out as “questions that one could use to assess the ‘quality’ of a page or an article”.

The silver lining here for Panda victims is that there is hope of recovering search visibility in Google. Nobody said it’s going to be easy, and for some victims it will be harder than for others. Let’s not discount the fact that many of the victims were victimized for a reason. Google’s goal is to improve quality, and much of what was negatively impacted was indeed lackluster in that department.

Serious businesses will continue to play by Google’s rules, because today, Google is still the top traffic source on the web. It’s simply a vital part of Internet marketing, and the Internet itself is a much more significant part of the marketing landscape than it has ever been before.

Impacted by Panda? What are some things you’ve done to aid your recovery? Share in the comments.

