Tag Archive | "Penguin"

Do We Still Need to Disavow in the Era of Penguin 4.0?

Posted by MarieHaynes

It has now been six months since the launch of Penguin 4.0. In my opinion, Penguin 4.0 was awesome. It took ages for Google to release this update, but when they did, it was much fairer than previous versions of Penguin. Previous versions of Penguin would suppress entire sites if the algorithm thought that you’d engaged in manipulative link building. Even if a site did a thorough link cleanup, the suppression would remain until Google re-ran the Penguin algorithm and recognized the cleanup efforts. It was not uncommon to see situations like this:

I saw many businesses endure looooooong periods of suppression, sometimes lasting years!

According to Google spokesperson Gary Illyes, the new version of Penguin that was released in September of 2016 no longer suppresses sites:

Now, instead of causing a sitewide demotion when Penguin detects spam, they’ll simply devalue that spam so that it can’t help improve a site’s rankings.

I’m guessing that it took a lot of brainpower to figure out how to do this. Google now has enough trust in their ability to find and devalue spam that they are comfortable removing the punitive aspect of Penguin. That’s impressive.

This change brings up a question that I am asked several times a week now:

If Penguin is able to devalue spam, is there any reason to disavow links anymore?

I’ve been asked this enough times now that I figured it was a good idea to write an article on my answer to this question.

A brief refresher: What is the disavow tool?

The disavow tool was given to us in October of 2012.

You can use it by uploading a file to Google that contains a list of either URLs or domains. Then, as Google crawls the web, if they come across a URL or domain that is in your disavow file, they won’t use links from that page in their calculations of PageRank for your site. Those links also won’t be used by the Penguin algorithm when it decides whether your site has been involved in webspam.
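For reference, the disavow file itself is just a plain text (UTF-8) file with one entry per line: lines starting with # are comments, a domain: line disavows every link from that domain, and a bare URL disavows links from that single page. Here's a minimal example (the domains are made up):

    # Cleaned up after our link audit
    # Disavow every link from this low-quality directory
    domain:spammy-directory-example.com
    # Disavow links from one specific page only
    http://blog-example.net/spammy-guest-post.html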

For sites that were affected by Penguin in the past, the disavow tool was an integral part of getting the suppression lifted off the site. It was essentially a way of saying to Google, “Hey… in the past we made some bad links to our site. But we don’t want you to use those links in your calculations.” Ideally, it would be best to remove bad links from the web, but that’s not always possible. The disavow tool was, in my opinion, super important for any site that was hit by Penguin.

For more in-depth information on using the disavow tool, see this Moz post: https://moz.com/blog/guide-to-googles-disavow-tool

What does Google say about using the disavow tool now?

It wasn’t long after the release of Penguin 4.0 that people started asking Google whether the disavow tool was still necessary. After all, if Google can just devalue spam links on their own, why should I have to disavow them?

Here are some replies from Google employees:

Now, the conspiracy theorists out there will say, “Of course Google wants you to disavow! They need that data to machine-learn for Penguin!”

Google has said that Penguin is not a machine learning algorithm:

And even if they ARE using disavow data for some kind of machine learning training set, really, does it matter? In my opinion, if Google is saying that we should still be using the disavow tool, I don’t think they’re trying to trick us. I think it still has a real purpose.

Three reasons why I still recommend using the disavow tool

There are three main reasons why I still recommend disavowing. However, I don’t recommend it in as many cases as I used to.

1) Manual actions still exist

You do NOT want to risk getting a manual unnatural links penalty. I have written on Moz before about the many cases I’ve seen where a manual unnatural links penalty was devastating to the long-term health of a site.

Google employee Gary Illyes commented during a podcast that, when a Google webspam team member looks at your site’s links, they can often see labels next to the links. He said the following:

If the manual actions team is reviewing a site for whatever reason, and they see that most of the links are labeled as Penguin Real-Time affected, then they might decide to take a much deeper look on the site… and then maybe apply a manual action on the site because of the links.

In other words, if you have an unnatural link profile and you leave it up to Penguin to devalue your links rather than disavowing, then you’re at risk for getting a manual action.

Of course, if you actually do have a manual action, then you’ll need to use the disavow tool as part of your cleanup efforts along with manual link removal.

2) There are other algorithms that use links

Link quality has always been important to Google. I believe that Penguin is just one way in which Google fights against unnatural links algorithmically. One example of another algorithm that likely uses links is the Payday Loans algorithm. This algorithm isn’t just for payday loans sites; it also affects sites in many high-competition verticals.

Bill Slawski recently posted this interesting article with his thoughts on a recent Google patent. In one place, the patent describes a situation where a resource has a large number of links pointing to it but receives a disproportionately small amount of traffic. In cases like that, the page being linked to might actually be demoted in rankings.

Now, that’s just a patent, so it doesn’t mean for sure that there’s actually an algorithm behind this… but there could be! Makes you think, right?
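To make the patent’s core idea concrete, here is a tiny, purely hypothetical Python sketch of the kind of check such an algorithm might run. Nothing here comes from Google; the function name, thresholds, and numbers are all invented for illustration:

    # Hypothetical sketch of the link-vs-traffic disproportion idea from
    # the patent. All names and thresholds are invented for illustration.
    def looks_disproportionate(inbound_links: int, monthly_visits: int,
                               min_links: int = 1000,
                               max_visits_per_link: float = 0.1) -> bool:
        """Flag a page with many inbound links but very little traffic."""
        if inbound_links < min_links:
            return False  # too few links to judge either way
        return monthly_visits / inbound_links < max_visits_per_link

    # 5,000 links but only 40 visits a month looks disproportionate...
    print(looks_disproportionate(5000, 40))    # True
    # ...while 5,000 links with 2,000 visits a month does not.
    print(looks_disproportionate(5000, 2000))  # False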

Google is always trying to fight against link spam and Penguin is just one of the ways in which they do this. If there are links that are potentially causing my link profile to look spammy to Google, then I don’t want them to count in any calculations that Google is making.

3) Can we trust that Penguin is able to devalue all spam pointing to our site?

The official announcement from Google on Penguin is here. Here’s what it says about devaluing as opposed to demoting:

“Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.”

This statement is not clear to me. I have questions:

  • When Google says they are “adjusting ranking,” could that also be negative adjustments?
  • Can Penguin possibly demote rankings for certain pages rather than affecting the whole site?
  • Can Penguin possibly demote rankings for certain keywords rather than affecting the whole site?

As posted above, we received some clarification on this from Google employees in a Facebook post (and again via tweets) telling us that Penguin 4.0 doesn’t penalize, but rather devalues spam. However, these are not official statements from Google. They could mean that we never have to worry about any link pointing to our site ever again. Or they could simply mean that there’s less need to worry than there was previously.

Personally, if my business relies on Google organic rankings in order to succeed, I’m a little leery about putting all of my trust in this algorithm’s ability to ignore unnatural links and not let them hurt me.

Who should be disavowing?

While I do still recommend use of the disavow tool, I only recommend it in the following situations:

  1. For sites that have made links for SEO purposes on a large scale – If you or an SEO company acting on your behalf made links in low-quality directories, low-quality article sites, bookmark sites, or as comment spam, then these need to be cleaned up. Here’s more information on what makes a link a low-quality link. You can also run links past my disavow blacklist if you’re not sure whether they’re good ones or not. Low-quality links like these are probably being devalued by Penguin, but they’re the type of link that could lead to a manual unnatural links penalty if the webspam team happens to review your site and the links haven’t been disavowed.
  2. For sites that previously had a manual action for unnatural links – I’ve found that if a site has enough of a spam problem to get an unnatural links penalty, then that site usually ends up collecting more spam links over the years. Sometimes this is because low-quality directories pop up and scrape info from other low-quality directories. Sometimes it’s because old automated link-generating processes keep on running. And sometimes I don’t have an explanation, but spammy links just keep appearing. In most cases, sites that have a history of collecting unnatural links tend to continue to collect them. If this is the case for you, then it’s best to disavow those on a regular basis (either monthly or quarterly) so that you can avoid getting another manual action.
  3. For sites under obvious negative SEO attacks – The key here is the word “obvious.” I do believe that in most cases, Google is able to figure out that spam links pointed at a site are links to be ignored. However, at SMX West this year, Gary Illyes said that the algorithm can potentially make mistakes:
    If you have a bunch of pharma and porn links pointing at your site, it’s not a bad idea to disavow them, but actually in most cases I just ignore these. Where I do recommend disavowing for negative SEO attacks is when the links pointing at your site contain anchors for keywords for which you want to rank. If it’s possible that a webspam team member could look at your link profile and think that there are a lot of links there that exist just for SEO reasons, then you want to be sure that those are cleaned up.

Who does NOT need to disavow?

If you look at your links and notice some “weird” links that you can’t explain, don’t panic!

Every site gets strange links, and often quite a few of them. If you haven’t been involved in manipulative SEO, you probably do not need to be disavowing links.

When Google takes action either manually or algorithmically against a site for unnatural linking, it’s because the site has been actively trying to manipulate Google rankings on a large scale. If you made a couple of directory links in the past, you’re not going to get a penalty.

You also don’t need to disavow just because you notice sitewide links pointing to you. It can look scary to see in Google Search Console that one site is linking to you thousands of times, especially if that link is keyword-anchored. However, Google knows that this is a sitewide link and not thousands of individual links. If you made the link yourself in order to help your rankings, then sure, go ahead and disavow it. But if it just appeared, it’s probably nothing to worry about.

Borderline cases

There are some cases where it can be difficult to decide whether or not to disavow. I sometimes have trouble advising on cases where a company has hired a medium- to high-quality SEO firm that’s done a lot of link building — rather than link earning — for them.

Here’s an example of a case that would be difficult:

Let’s say you’ve been getting most of your links by guest posting. These guest posts are not on low-quality sites that exist just to post articles, but rather on sites that real humans read. Are those good links?

According to Google, if you’re guest posting primarily for the sake of getting links, then these are unnatural links. Here’s a quote from Google employee John Mueller:

“Think about whether or not this is a link that would be on your site if it weren’t for your actions…When it comes to guest blogging it’s a situation where you are placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a link building point of view. It can make sense to guest blog on other people’s sites to drive some traffic to your site… but you should use a nofollow.”

If you have a small number of guest posts, Google is unlikely to go after you. But what if a webspam team member looks at your links and sees that you have a very large number of links built via guest posting efforts? That makes me uncomfortable.

You could consider disavowing those links to avoid getting a manual action. It’s quite possible, though, that those links are actually helping your site. Disavowing them could cause you to drop in rankings.

This article could easily turn into a discussion on the benefits and risks of guest posting if we had the space and time. My point in mentioning this is to say that some disavow decisions are tough.

In general, my rule of thumb is that you should use the disavow file if you have a good number of links that look like you made them with SEO as your primary goal.

Should you be auditing your disavow file?

I do believe that some sites could benefit from pruning their disavow file. However, I have yet to see a report from anyone who has done this and seen a benefit that we can reasonably attribute to recovering the PageRank that flows through those links.

If you have used your disavow file in the past in an effort to remove a manual action or recover from a Penguin hit, then there’s a good possibility that you were overly aggressive in your disavow efforts. I’ve worked on some manual penalties that were really difficult to get removed, and we likely disavowed more links than were necessary. In cases like those, we could go through our disavow files and remove the domains that were questionable disavow decisions.

It’s not always easy to do this, though, especially if you’ve done the correct thing and have disavowed on the domain level. If this is the case, you won’t have actual URLs in your disavow file to review. It’s hard to make reavowing decisions without seeing the actual link in question.

Here’s a process you can use to audit your disavow file. It gets a little technical, but if you want to give it a try, here it is:

(Note: Many of these steps are explained in greater detail and with pictures here.)

  • Download your disavow file from Google: https://www.google.com/webmasters/tools/disavow-links-main
  • Get a list of your links from Google Search Console. (It’s not a bad idea to also get links from other sources, as well.)
  • On your CSV of links, make a column for domains. You can extract the domain by using this formula, assuming your URLs are in Column B:

    =LEFT(B1,FIND("/",B1,9)-1)

    You can then use Find and Replace to replace http://, https://, and www. with blanks. Now you have a list of domains.

  • On your disavow file, get a list of domains you’ve disavowed by replacing domain: with blanks. (This is assuming you have disavowed on the domain level and not the URL level.)
  • Put your new list of disavowed domains on the second sheet of your links spreadsheet and fill Column B down with “disavowed”.
  • Now, on the links list, we’re going to use a VLOOKUP to figure out which of our current live links are ones that we’ve previously disavowed. In this formula, your domains are in the first column of each spreadsheet and I’ve used 1000 as the total number of domains in my disavow list. Here goes:

    =VLOOKUP(A1,Sheet2!$A$1:$B$1000,2,FALSE)

  • Now you can take the domains that are in your disavow file and audit those URLs.

What we’re looking for here are links that we disavowed just to be safe, but that are probably actually OK.
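If you’d rather script this step than wrangle spreadsheet formulas, here’s a minimal Python sketch of the same cross-check. The file names ("disavow.txt" and "links.txt") are assumptions: the first is your disavow file as downloaded from Google, the second a plain list of your current linking URLs, one per line:

    # Cross-check current links against a disavow file and list the
    # disavowed domains that still have live links (reavow candidates).
    # "disavow.txt" and "links.txt" are assumed file names; adjust as needed.
    from urllib.parse import urlparse

    def normalize(domain):
        domain = domain.lower()
        return domain[4:] if domain.startswith("www.") else domain

    # Disavow files use "domain:example.com" lines, full URLs, and "#" comments.
    disavowed = set()
    with open("disavow.txt", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line.startswith("domain:"):
                disavowed.add(normalize(line[len("domain:"):]))

    # One linking URL per line, e.g. exported from Google Search Console.
    candidates = {}
    with open("links.txt", encoding="utf-8") as f:
        for line in f:
            url = line.strip()
            if url:
                domain = normalize(urlparse(url).netloc)
                if domain in disavowed:
                    candidates.setdefault(domain, []).append(url)

    # Review each of these by hand before editing the disavow file.
    for domain, urls in sorted(candidates.items()):
        print(f"{domain} ({len(urls)} live links), e.g. {urls[0]}")

As with the spreadsheet approach, this only surfaces candidates; the actual reavow decision still needs a human look at each link.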

Note: Just as in regular link auditing work, do not make decisions based on blanket metrics. While some of these metrics can help us make decisions, you do not want to base your decision for reavowing solely on Domain Authority, spam score, or some other metric. Rather, you want to look at each domain and think, “If a webspam team member looked at this link, would they think it only exists for SEO reasons, or does it have a valid purpose outside of SEO?”

Let’s say we’ve gone through the links in our disavow file and have found 20 links that we’d like to reavow. We would then go back to the disavow file that we downloaded from Google and remove the lines that say “domain:example.com” for each of those domains which we want to reavow.

Upload your disavow file to Google again. This will overwrite your old file. At some point in the future Google should start counting the links that you’ve removed from the file again. However, there are a few things to note:

  • Matt Cutts from Google mentioned in a video that reavowing a link takes “a lot longer” than disavowing. They built a lag function into the tool to try to stop spammers from reverse-engineering the algorithm.
  • Matt Cutts also said in the same video that a reavowed link may not carry the same weight it once did.

If this whole process of reavowing sounds too complicated, you can hire me to do the work for you. I might be willing to do the work at a discount if you allow me to use your site (anonymously) as a case study to show whether reavowing had any discernible benefit.

Conclusions

Should we still be using the disavow tool? In some cases, the answer to this is yes. If you have links that are obviously there for mostly SEO reasons, then it’s best to disavow these so that they don’t cause you to get a manual action in the future. Also, we want to be sure that Google isn’t using these links in any sort of algorithmic calculations that take link quality into account. Remember, it’s not just Penguin that uses links.

I think that it is unlikely that filing a disavow will cause a site to see a big improvement in rankings, unless the site is using it to recover from a sitewide manual action. Others will disagree with me, however. In fact, a recent Moz blog post showed a possible recovery from an algorithmic suppression shortly after a site filed a disavow. I think that, in this case, the recovery may have been due to a big algorithm change that SEOs call Fred that happened at the same time, rather than the filing of a disavow file.

In reality, though, no one outside of Google knows for sure how effective the disavow tool is now. We know that Google says we should still use it if we find unnatural links pointing to our site. As such, my advice is that if you have unnatural links, you should still be disavowing.

I’d love to hear what you think. Please do leave a comment below!



Moz Blog


Google Penguin 4.0 Rollback & Reversals? Some Think So.

As you know, Google finally pushed out Penguin 4.0 in late September and the recoveries and declines were fully rolled out in the first two weeks of October…


Search Engine Roundtable


Search Buzz Video Recap: Google Interstitials, Penguin Droppings, HTTPS, AdWords & More

This week I cover more on the Google mobile interstitial ad penalty. I also discuss cases of Google Penguin 4.0 declines in rankings. Google still uses HTTPS in the URL as the only qualifier for the ranking boost. Google said sitewide ranking scores are current state…


Search Engine Roundtable


Updated: Google: Penguin Can Discount All Your Links, Good Or Bad

Gary Illyes from Google sent us this statement:

When speaking yesterday, a statement I made about manual actions was phrased in a way that sounded like I was talking about Penguin — that was incorrect and I apologize for the confusion…


Search Engine Roundtable


Search Buzz Video Recap: Google Mobile Index, Penguin Destruction, Machine Learning & More

This week in search, Google said they are launching their mobile-only index within months, with the desktop index becoming a secondary index. Google Penguin 4.0 is completely live now. Gary Illyes said Penguin can act to discount all your links…


Search Engine Roundtable


Why Didn’t You Recover from Penguin?

Posted by Dr-Pete

After almost a two-year wait, the latest Penguin update rolled out in late September and into early October. This roll-out is unusual in many ways, and it only now seems to be settling down. In the past couple of weeks, we’ve seen many reports of recoveries from previous Penguin demotions, but this post is about those who were left behind. What if you didn’t recover from Penguin?

I’m going to work my way from unlikely, borderline conspiracy theories to difficult truths. Theories #1 and #2 might make you feel better, but, unfortunately, the truth is more likely in #4 or #5.


1. There is no Penguin

Then you’ll see that it is not the spoon that bends, it is only yourself. Ok, this is the closest I’ll get to full-on conspiracy theory. What if this new Penguin is a ruse, and Google did nothing or rolled out something else? We can’t know anything 100% without peering into the source code, but I’m 99% confident this isn’t the case. Interpreting Google often means reading between the lines, but I don’t know of any recent confirmed announcement that ended up being patently false.

Google representatives are confirming details about the new Penguin both publicly and privately, and algorithm flux matches the general timeline. Perhaps more importantly, we’re seeing many anecdotal reports of Penguin recoveries, such as:

Given the severity of Penguin demotions and the known and infrequent update timelines, these reports are unlikely to be coincidences. Some of these reports are also coming from reliable sources, like Marie Haynes (above) and Glenn Gabe (below), who closely track sites hit by Penguin.


2. Penguin is still rolling out

This Penguin update has been unusual in many ways. It’s probably best not to even call it “Penguin 4.0” (yes, I realize I keep calling it that). The new, “real-time” Penguin is not simply an update to Penguins 1–3. It replaces them and works very differently.

Because real-time Penguin is so different, the roll-out was broken up into a couple of phases. I believe that the new code went live roughly in line with Google’s announced date of September 23rd. It might have happened a day or two before that, but probably not weeks before. This new code, though, was the kinder, gentler Penguin, which devalues bad links.

For this new code to fully take effect, the entire link graph had to be refreshed, and this takes time, especially for deeper links. So, the impact of the initial roll-out may have taken a few days to fully kick in. In terms of algorithm flux, the brunt of the initial release hit MozCast around September 27th. Now that the new Penguin is real-time, we’ll be feeling its impact continuously, although that impact will be unnoticeable for the vast majority of sites on the vast majority of days.

In addition, Google has rolled back previous Penguin demotions. This happened after the new code launched, but we don’t have an exact timeline. This process also took days, possibly a week or more. We saw additional algorithm spikes around October 2nd and 6th, although the entire period showed sustained flux.

On October 7th, Gary Illyes from Google said that the Penguin roll-out was in the “final stage” (presumably, the removal of demotions) and would take a “few more days”. As of this writing, it’s been five more days.

My best guess is that 95%+ of previous Penguin demotions have been removed at this point. There’s a chance you’re in the lucky 5% remaining, but I wouldn’t hold my breath.


3. You didn’t cut nearly deep enough

During the few previous Penguin updates, it was assumed that sites didn’t recover because they simply hadn’t cut deep enough. In other words, site owners and SEOs had tried to surgically remove or disavow a limited number of bad links, but those links were either not the suspect links or were just the tip of the iceberg.

I think it’s true that many people were probably trying to keep as many links as possible, and were hesitant to make the deep cuts Penguin required. However, this entire argument is misleading and possibly self-destructive, because this isn’t how the new Penguin works.

Theoretically, the new Penguin should only devalue bad links, and its impact will be felt on a more “granular” (in Google’s own words) level. In other words, your entire site won’t be demoted because of a few or even a lot of bad links, at least not by Penguin. Should you continue to clean up your link profile? Possibly. Will cutting deeper help you recover from Penguin down the road? Probably not.


4. Without bad links, you’d have no links at all

Here’s the more likely problem, and it’s a cousin of #3. Your link profile is so bad that there is practically no difference between “demotion” and “devaluation.” It’s quite possible that your past Penguin demotion was lifted, but your links were so heavily devalued that you saw no ranking recovery. There was simply no link equity left to provide SEO benefit.

In this case, continuing to prune those bad links isn’t going to help you. You need to build new quality signals and authoritative links. The good news is that you shouldn’t have to wait months or years now to see the positive impact of new links. The bad news is that building high-quality links is a long, difficult road. If it were easy, you probably wouldn’t have taken shortcuts in the first place.


5. Your problem was never Penguin

This is the explanation no one wants to hear, but I think it’s more common than most of us think. We’re obsessed with the confirmed update animals, especially Penguin and Panda, but these are only a few of the hundreds of animals in the Google Zoo.

There were algorithmic link demotions before Penguin, and there are still parts of the algorithm that look for and act on bad links. Given the power that links still hold over ranking, this should come as no surprise. The new Penguin isn’t a free pass on all past link-building sins.

In addition, there are still manual actions. These should (hopefully) show up in Google Search Console, but Google will act on bad links manually where it’s warranted.

It’s also possible that you have a very different algorithmic problem in play or any of a number of technical SEO issues. That diagnostic is well beyond the scope of this blog post, but I’ll offer this advice — dig deeper. If you haven’t recovered from Penguin, maybe you’ve got different or bigger problems.



Moz Blog


Penguin 4.0: How the Real-Time Penguin-in-the-Core-Alg Model Changes SEO – Whiteboard Friday

Posted by randfish

The dust is finally beginning to settle after the long-awaited rollout of Penguin 4.0. Now that our aquatic avian friend is a real-time part of the core Google algorithm, we’ve got some changes to get used to. In today’s Whiteboard Friday, Rand explains Penguin’s past, present, and future, offers his analysis of the rollout so far, and gives advice for going forward (hint: never link spam).


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week, it is all about Google Penguin. So Google Penguin is an algorithm that’s been with us for a few years now, designed to combat link spam specifically. After many, many years of saying this was coming, Penguin 4.0 rolled out on Friday, September 23rd. It is now real-time in Google’s algorithm, Google’s core algorithm, which means that it’s constantly updating.

So there are a bunch of changes. What we’re going to talk about today is what Penguin 1.0 to 3.x looked like and how that’s changed as we’ve moved to the Penguin 4.0 model. Then we’ll cover a little bit of what the rollout has looked like and how it’s affecting folks’ sites and specifically some recommendations. Thankfully, we don’t have a ton.

Penguin 1.0–3.x

But it’s important to understand, when people ask you about Penguin and about the penalties that used to come from Penguin, what things looked like back in the day…

  • Penguin 1.0 to 3.x, it used to run intermittently. So every few months, Google would collect a bunch of information, they’d run the algorithm, and then they’d release it out in the wild. It would now be in the search results. When that rollout happened, that was the only time, pretty much the only time that penalties from Penguin specifically would be given to websites or removed.

    This meant that a lot of the time you had this slow process where, if you got penalized by Penguin because you did some sketchy link building, you went through the whole process of getting that penalty lifted, and Google said, “Fine, you’re in good shape. The next time Penguin comes out, your penalty is lifted.” You could wait months. You could wait six months or more before that penalty got lifted. So a lot of fear here and a lot of slowness on Google’s side.

  • Penguin also penalized sitewide, much like Panda. Maybe only a handful of pages on the whole domain got bad links pointing to them, but old Penguin did not care. Penguin would hit the entire website.

    It would basically say, “No, you’re spamming to those pages, I’m burying your whole domain. Every page on your site is penalized and will not be able to rank well.” Those sorts of penalties are very, very tough for a lot of websites. That, in fact, might be changing a little bit with the new Penguin algorithm.

  • Old Penguin did not require a reconsideration request process, though manual penalties and, some SEOs believed, Penguin penalties, too, did lift often in conjunction with disavowing old links, proving to Google that you had gone through the process of trying to get those links removed.

    It wasn’t often enough to just say, “I’ve disavowed them.” You had to tell Google, “Hey, I tried to contact the site where I bought the links or I tried to contact the private blog network, but I couldn’t get them to take it down or I did get them to take it down or they blackmailed me and forced me to pay them to take it down.” Sometimes people did pay and Google said that was bad, but then sometimes would lift the penalties and sometimes they told them, “Okay, you don’t have to pay the extortionist and we’ll lift the penalty anyway.” Very manual process here.

  • Penguin 1.0 to 3.x was really designed to remove the impact of link spam on search results, but doing it in a somewhat weird way. They were doing it basically through penalties that affected entire websites that had tried to manipulate the results and by creating this fear that if I got bad links, I would be potentially subject to Penguin for a long period.

I have a theory here. It’s a personal theory. I don’t want you to hold me to it. I believe that Google specifically went through this process in order to collect a tremendous amount of information on sketchy links and bad links through the disavow file process. Once they had a ginormous database of what sketchy and spammy bad links looked like, that they knew webmasters had manually reviewed and had submitted through the disavowal file and thought could harm their sites and were paid for or just links that were not editorially acquired, they could then machine learn against that giant database. Once they’ve acquired enough disavowals, great. Everything else is gravy. But they needed to get that huge sample set. They needed it not to just be things that they, Google, could identify but things that all of us distributed across the hundreds of millions of websites on the planet could identify. Using those disavowal files, Google can now make Penguin more real-time.

Penguin 4.0+

So challenges here, this is too slow. It hurt too much to have that long process. So in the new Penguin 4.0 and going forward, this runs as part of the core algorithm, meaning…

  • As soon as Google crawls and indexes a site and is able to update that in their databases, that site’s penalty is either lifted or incurred. So this means that if you get sketchy links, you don’t have to wait for Penguin to come out. You could get hurt tomorrow.
  • Penguin does not necessarily any longer penalize an entire domain. It still might. It could be the case that if lots of pages on a domain are getting sketchy links or some substantive portion or Google thinks you’re just too sketchy, they could penalize you.

Remember, Penguin is not the only algorithm that can penalize websites for getting bad links. There are manual spam penalties, and there are other forms of spam penalties too. Penguin is not alone here. But it may be simply taking the pages that earn those bad links and discounting those links or using different signals, weighting different signals to rank those pages or search results that have lots of pages with sketchy links in them.

  • It is also the case — and this is not 100% confirmed yet — that some early discussion between Google’s representatives and folks in the webmaster and SEO community has revealed that Penguin 4.0 and beyond may no longer require the full disavow and reconsideration request process.

That’s not to say that if you incur a penalty, you should not go through this. But it may not be the case that’s the only way to get a penalty lifted, especially in two cases — no fault cases, meaning you did not get those links, they just happened to come to you, or specifically negative SEO cases.

I want to bring up Marie Haynes, who does phenomenally good work around spam penalties, along with folks like Sha Menz and Alan Bleiweiss. All three of them have been concentrating on Google penalties, along with many, many other SEOs and webmasters. Marie wrote an excellent blog post detailing a number of case studies, including a negative SEO case study where the link penalty had been lifted on the domain. You can see her results of that. She’s got some nice visual graphs showing the keyword rankings changing after Penguin’s rollout. I urge you to read it, and we’ll make sure to link to it in the transcript of this video.

  • Penguin 4.0 is a little bit different from Penguin 1.0 to 3 in that it’s still designed to remove the impact of spam links on search results, but it’s doing it by not counting those links in the core algo and/or by less strongly weighting links in search results where many folks are earning spammy links.

So, for example, your PPC, your porn, your pills, your casino searches, those types of queries may be places where Google says, “You know what? We don’t want to interpret, because all these folks have nasty links pointing to them, we are going to weight links less. We’re going to weight other signals higher.” Maybe it’s engagement and content and query interpretation models and non-link signals that are offsite, all those kinds of things, clickstream data, whatever they’ve got. “We’re going to push down the value of either these specific links or all links in the algo as we weight them on these types of results.”

Penguin 4.0 rollout

So this is what we know so far. We definitely will keep learning more about Penguin as we have more experience with it. We also have some information on the rollout.

  • Started on Friday, September 23rd, few people noticed any changes.

In fact, the first few days were pretty slow, which makes sense. It fits with what Google said about the rollout being real-time and them needing time to crawl and index and then refresh all this data. So until it rolls out across the full web and Google’s crawled and indexed all the pages, gone through processing, we’re not going to get there. So little effect that same day, but…

  • More SERP flux started three to five days after, that next Monday, Tuesday, Wednesday. We saw very hot temperatures starting that next week in MozCast, and Dr. Pete has been detailing those on Twitter.
  • As far as SEOs noticing, yes, a little bit.

So I asked the same poll on Twitter twice, once on September 27th and once on October 3rd, so about a week apart. Here is the data we got. “Nope, nothing yet” went from 76% to 72%, so a little more than a quarter of SEOs have noticed some changes.

A lot of folks noticing rankings went up. Moz itself, in fact, benefitted from this. Why is that the case? Well, any time a penalty rolls out to a lot of other websites, bad stuff gets pushed down and those of us who have not been spamming move up in the rankings. Of course, in the SEO world, which is where Moz operates, there are plenty of folks getting sketchy links and trying things out. So they were higher in the rankings, they moved down, and Moz moved up. We saw a very nice traffic boost. Thank you, Google, for rolling out Penguin. That makes our Audience Development team’s metrics look real good.

Four percent and then six percent said they saw a site or page under their control get penalized, and two percent and then one percent said they saw a penalty lifted. So penalties lifted are still pretty light, but there are some penalties coming in. There are a few of those. Then there’s the nice benefit that if you don’t link spam, you do not get penalized. Every time Google improves on the Penguin algorithm, every time they improve on any link spam algorithm, those of us who don’t spam benefit.

It’s an awesome thing, right? Instead of cheering against Google, which you do if you’re a link spammer and you’re very nervous, you get to cheer for Google. Certainly Penguin 4.0 is a good time to cheer for Google. It’s brought a lot of traffic to a lot of good websites and pushed a lot of sketchy links down. We will see what happens with disavows and reconsideration requests in the future.

All right, everyone, thanks for joining. Look forward to hearing about your experiences with Penguin. We’ll see you next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Moz Blog


Only 12% Said They Saw Ranking Improvements After Google Penguin 4.0

As you know, Google began rolling out Penguin 4.0 on September 23rd – we’ve been writing about Penguin a lot recently so catch up here. But on the Tuesday after, we reported that there was minimal impact seen by anyone in the SEO community…


Search Engine Roundtable


SearchCap: Penguin & link building, PPC leads & social

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Penguin & link building, PPC leads & social appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Gary Illyes: Google Penguin 4.0 Was One Of Our Nicest Launches

Did you know how proud Google is of the Penguin 4.0 real-time launch? Very proud.
Gary Illyes from Google said…


Search Engine Roundtable

