Tag Archive | "Disavow"

Search Buzz Video Recap: Google Search Console, Image Search, Mobile-First, Disavow & Much More

This week, we covered how Google said the old Search Console may sunset in March and specifically which features are going away. Some European Google searchers were seeing…


Search Engine Roundtable

Posted in IM News | Comments Off

Do We Still Need to Disavow in the Era of Penguin 4.0?

Posted by MarieHaynes

It has now been six months since the launch of Penguin 4.0. In my opinion, Penguin 4.0 was awesome. It took ages for Google to release this update, but when they did, it was much more fair than previous versions of Penguin. Previous versions of Penguin would cause entire sites to be suppressed if the algorithm thought that you’d engaged in manipulative link building. Even if a site did a thorough link cleanup, the suppression would remain present until Google re-ran the Penguin algorithm and recognized your cleanup efforts. It was not uncommon to see situations like this:

I saw many businesses that had looooooong periods of time of suppression — even years!

According to Google spokesperson Gary Illyes, the new version of Penguin that was released in September of 2016 no longer suppresses sites:

Now, instead of causing a sitewide demotion when Penguin detects spam, they’ll simply devalue that spam so that it can’t help improve a site’s rankings.

I’m guessing that it took a lot of brainpower to figure out how to do this. Google now has enough trust in their ability to find and devalue spam that they are comfortable removing the punitive aspect of Penguin. That’s impressive.

This change brings up a question that I am asked several times a week now:

If Penguin is able to devalue spam, is there any reason to disavow links any more?

I’ve been asked this enough times now that I figured it was a good idea to write an article on my answer to this question.

A brief refresher: What is the disavow tool?

The disavow tool was given to us in October of 2012.

You can use it by uploading a file to Google that contains a list of either URLs or domains. Then, as Google crawls the web, if they come across a URL or domain that is in your disavow file, they won’t use links from that page in their calculations of PageRank for your site. Those links also won’t be used by the Penguin algorithm when it decides whether your site has been involved in webspam.
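For concreteness, the file itself is plain text, with one entry per line: either a full URL or a domain: directive, and lines starting with # treated as comments. The domains below are placeholders, not real examples:

```
# Links acquired from a low-quality directory network
domain:spammy-directory.example
domain:article-farm.example

# A single page to disavow without disavowing its whole domain
http://forum.example/profile/12345
```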

For sites that were affected by Penguin in the past, the disavow tool was an integral part of getting the suppression lifted off the site. It was essentially a way of saying to Google, “Hey… in the past we made some bad links to our site. But we don’t want you to use those links in your calculations.” Ideally, it would be best to remove bad links from the web, but that’s not always possible. The disavow tool was, in my opinion, super important for any site that was hit by Penguin.

For more in-depth information on using the disavow tool, see this Moz post: https://moz.com/blog/guide-to-googles-disavow-tool

What does Google say about using the disavow tool now?

It wasn’t long after the release of Penguin 4.0 that people started asking Google whether the disavow tool was still necessary. After all, if Google can just devalue spam links on their own, why should I have to disavow them?

Here are some replies from Google employees:

Now, the conspiracy theorists out there will say, “Of course Google wants you to disavow! They need that data to machine-learn for Penguin!”

Google has said that Penguin is not a machine learning algorithm:

And even if they ARE using disavow data for some kind of machine learning training set, really, does it matter? In my opinion, if Google is saying that we should be still using the disavow tool, I don’t think they’re trying to trick us. I think it still has a real purpose.

Three reasons why I still recommend using the disavow tool

There are three main reasons why I still recommend disavowing. However, I don’t recommend it in as many cases as I used to.

1) Manual actions still exist

You do NOT want to risk getting a manual unnatural links penalty. I have documented on Moz before about the many cases I’ve seen where a manual unnatural links penalty was devastating to the long-term health of a site.

Google employee Gary Illyes commented during a podcast that, when a Google webspam team member looks at your site’s links, they can often see labels next to the links. He said the following:

If the manual actions team is reviewing a site for whatever reason, and they see that most of the links are labeled as Penguin Real-Time affected, then they might decide to take a much deeper look on the site… and then maybe apply a manual action on the site because of the links.

In other words, if you have an unnatural link profile and you leave it up to Penguin to devalue your links rather than disavowing, then you’re at risk for getting a manual action.

Of course, if you actually do have a manual action, then you’ll need to use the disavow tool as part of your cleanup efforts along with manual link removal.

2) There are other algorithms that use links

Link quality has always been important to Google. I believe that Penguin is just one way in which Google fights against unnatural links algorithmically. One example of another algorithm that likely uses links is the Payday Loans algorithm. This algorithm isn’t just for payday loans sites; it also affects sites in many high-competition verticals.

Bill Slawski recently posted this interesting article on his thoughts about a recent patent filed by Google. In one place, the patent talks about a situation where a resource may have a large number of links pointing to it but there is a disproportionate amount of traffic. In cases like that, the page being linked to might actually be demoted in rankings.

Now, that’s just a patent, so it doesn’t mean for sure that there’s actually an algorithm behind this… but there could be! Makes you think, right?

Google is always trying to fight against link spam and Penguin is just one of the ways in which they do this. If there are links that are potentially causing my link profile to look spammy to Google, then I don’t want them to count in any calculations that Google is making.

3) Can we trust that Penguin is able to devalue all spam pointing to our site?

The official announcement from Google on Penguin is here. Here’s what it says about devaluing as opposed to demoting:

“Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.”

This statement is not clear to me. I have questions:

  • When Google says they are “adjusting ranking,” could that also be negative adjustments?
  • Can Penguin possibly demote rankings for certain pages rather than affecting the whole site?
  • Can Penguin possibly demote rankings for certain keywords rather than affecting the whole site?

As posted above, we received some clarification on this from Google employees in a Facebook post (and again via tweets) telling us that Penguin 4.0 doesn’t penalize, but rather devalues spam. However, these are not official statements from Google. These statements may mean that we never have to worry about any link pointing to our site ever again. Perhaps? Or they could mean that there’s less need to worry than there was previously.

Personally, if my business relies on Google organic rankings in order to succeed, I’m a little leery about putting all of my trust in this algorithm’s ability to ignore unnatural links and not let them hurt me.

Who should be disavowing?

While I do still recommend use of the disavow tool, I only recommend it in the following situations:

  1. For sites that have made links for SEO purposes on a large scale – If you or an SEO company on your behalf made links in low-quality directories, low-quality article sites, bookmark sites, or as comment spam, then these need to be cleaned up. Here’s more information on what makes a link a low-quality link. You can also run a link past my disavow blacklist if you’re not sure whether it’s a good one. Low-quality links like these are probably being devalued by Penguin, but they’re the type of link that could lead to a manual unnatural links penalty if your site happens to get a manual review by the webspam team and the links haven’t been disavowed.
  2. For sites that previously had a manual action for unnatural links – I’ve found that if a site has enough of a spam problem to get an unnatural links penalty, then that site usually ends up collecting more spam links over the years. Sometimes this is because low-quality directories pop up and scrape info from other low-quality directories. Sometimes it’s because old automated link-generating processes keep on running. And sometimes I don’t have an explanation, but spammy links just keep appearing. In most cases, sites that have a history of collecting unnatural links tend to continue to collect them. If this is the case for you, then it’s best to disavow those on a regular basis (either monthly or quarterly) so that you can avoid getting another manual action.
  3. For sites under obvious negative SEO attacks – The key here is the word “obvious.” I do believe that in most cases, Google is able to figure out that spam links pointed at a site are links to be ignored. However, at SMX West this year, Gary Illyes said that the algorithm can potentially make mistakes:
    If you have a bunch of pharma and porn links pointing at your site, it’s not a bad idea to disavow them, but actually in most cases I just ignore these. Where I do recommend disavowing for negative SEO attacks is when the links pointing at your site contain anchors for keywords for which you want to rank. If it’s possible that a webspam team member could look at your link profile and think that there are a lot of links there that exist just for SEO reasons, then you want to be sure that those are cleaned up.

Who does NOT need to disavow?

If you look at your links and notice some “weird” links that you can’t explain, don’t panic!

Every site gets strange links, and often quite a few of them. If you haven’t been involved in manipulative SEO, you probably do not need to be disavowing links.

When Google takes action either manually or algorithmically against a site for unnatural linking, it’s because the site has been actively trying to manipulate Google rankings on a large scale. If you made a couple of directory links in the past, you’re not going to get a penalty.

You also don’t need to disavow just because you notice sitewide links pointing to you. It can look scary to see in Google Search Console that one site is linking to you thousands of times, especially if that link is keyword-anchored. However, Google knows that this is a sitewide link and not thousands of individual links. If you made the link yourself in order to help your rankings, then sure, go ahead and disavow it. But if it just appeared, it’s probably nothing to worry about.

Borderline cases

There are some cases where it can be difficult to decide whether or not to disavow. I sometimes have trouble advising on cases where a company has hired a medium- to high-quality SEO firm that’s done a lot of link building — rather than link earning — for them.

Here’s an example of a case that would be difficult:

Let’s say you’ve been getting most of your links by guest posting. These guest posts are not on low-quality sites that exist just to post articles, but rather on sites that real humans read. Are those good links?

According to Google, if you’re guest posting primarily for the sake of getting links, then these are unnatural links. Here’s a quote from Google employee John Mueller:

“Think about whether or not this is a link that would be on your site if it weren’t for your actions…When it comes to guest blogging it’s a situation where you are placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a link building point of view. It can make sense to guest blog on other people’s sites to drive some traffic to your site… but you should use a nofollow.”

If you have a small number of guest posts, Google is unlikely to go after you. But what if a webspam team member looks at your links and sees that you have a very large number of links built via guest posting efforts? That makes me uncomfortable.

You could consider disavowing those links to avoid getting a manual action. It’s quite possible, though, that those links are actually helping your site. Disavowing them could cause you to drop in rankings.

This article could easily turn into a discussion on the benefits and risks of guest posting if we had the space and time. My point in mentioning this is to say that some disavow decisions are tough.

In general, my rule of thumb is that you should use the disavow file if you have a good number of links that look like you made them with SEO as your primary goal.

Should you be auditing your disavow file?

I do believe that some sites could benefit from pruning their disavow file. However, I have yet to see any reports from anyone who has claimed to have done this and seen benefit that we can reasonably attribute to the recovery of PageRank that flows through those links.

If you have used your disavow file in the past in an effort to remove a manual action or recover from a Penguin hit, then there’s a good possibility that you were overly aggressive in your disavow efforts. I know I’ve had some manual penalties that were really difficult to remove and we likely disavowed more links than were necessary. In cases like those, we could go through our disavow files and remove the domains that were questionable disavow decisions.

It’s not always easy to do this, though, especially if you’ve done the correct thing and have disavowed on the domain level. If this is the case, you won’t have actual URLs in your disavow file to review. It’s hard to make reavowing decisions without seeing the actual link in question.

Here’s a process you can use to audit your disavow file. It gets a little technical, but if you want to give it a try, here it is:

(Note: Many of these steps are explained in greater detail and with pictures here.)

  • Download your disavow file from Google: https://www.google.com/webmasters/tools/disavow-links-main
  • Get a list of your links from Google Search Console. (It’s not a bad idea to also get links from other sources, as well.)
  • On your CSV of links, make a column for domains. You can extract the domain by using this formula, assuming your URLs are in Column B:

    =LEFT(B1,FIND("/",B1,9)-1)

    You can then use Find and Replace to replace http://, https://, and www. with blanks. Now you have a list of domains.

  • On your disavow file, get a list of domains you’ve disavowed by replacing domain: with blanks. (This is assuming you have disavowed on the domain level and not the URL level.)
  • Put your new list of disavowed domains on the second sheet of your links spreadsheet and fill Column B down with “disavowed”.
  • Now, on the links list, we’re going to use a VLOOKUP to figure out which of our current live links are ones that we’ve previously disavowed. In this formula, your domains are in the first column of each spreadsheet and I’ve used 1000 as the total number of domains in my disavow list. Here goes:

    =VLOOKUP(A1,Sheet2!$A$1:$B$1000,2,FALSE)

  • Now you can take the domains that are in your disavow file and audit those URLs.

What we’re looking for here are URLs where we had disavowed them just to be safe, but in reality, they are probably OK links.
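If you prefer scripting to spreadsheets, the same cross-reference can be sketched in Python. This is a minimal sketch rather than an exact reproduction of the process above: the function names are my own, and it assumes your links are already loaded as a plain list of URLs and that your disavow file uses domain: entries.

```python
from urllib.parse import urlparse

def extract_domain(url):
    """Equivalent of the LEFT/FIND spreadsheet formula: pull the bare
    domain out of a URL, dropping the scheme and any www. prefix."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def load_disavowed_domains(path):
    """Collect every domain: entry from a disavow file, skipping
    comments and URL-level entries."""
    domains = set()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("domain:"):
                domains.add(line[len("domain:"):].lower())
    return domains

def links_to_review(live_links, disavowed):
    """Return live links whose domain is currently disavowed -- these
    are the candidates to audit for possible reavowing."""
    return [url for url in live_links if extract_domain(url) in disavowed]
```

Run links_to_review over your Search Console export and you get the same candidate list the VLOOKUP produces, without the 1,000-row range baked into the formula.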

Note: Just as in regular link auditing work, do not make decisions based on blanket metrics. While some of these metrics can help us make decisions, you do not want to base your decision for reavowing solely on Domain Authority, spam score, or some other metric. Rather, you want to look at each domain and think, “If a webspam team member looked at this link, would they think it only exists for SEO reasons, or does it have a valid purpose outside of SEO?”

Let’s say we’ve gone through the links in our disavow file and have found 20 links that we’d like to reavow. We would then go back to the disavow file that we downloaded from Google and remove the lines that say “domain:example.com” for each of those domains which we want to reavow.

Upload your disavow file to Google again. This will overwrite your old file. At some point in the future Google should start counting the links that you’ve removed from the file again. However, there are a few things to note:

  • Matt Cutts from Google mentioned in a video that reavowing a link takes “a lot longer” than disavowing. They built a lag function into the tool to try to stop spammers from reverse-engineering the algorithm.
  • Matt Cutts also said in the same video that a reavowed link may not carry the same weight it once did.

If this whole process of reavowing sounds too complicated, you can hire me to do the work for you. I might be willing to do the work at a discount if you allow me to use your site (anonymously) as a case study to show whether reavowing had any discernible benefit.

Conclusions

Should we still be using the disavow tool? In some cases, the answer to this is yes. If you have links that are obviously there for mostly SEO reasons, then it’s best to disavow these so that they don’t cause you to get a manual action in the future. Also, we want to be sure that Google isn’t using these links in any sort of algorithmic calculations that take link quality into account. Remember, it’s not just Penguin that uses links.

I think that it is unlikely that filing a disavow will cause a site to see a big improvement in rankings, unless the site is using it to recover from a sitewide manual action. Others will disagree with me, however. In fact, a recent Moz blog post showed a possible recovery from an algorithmic suppression shortly after a site filed a disavow. I think that, in this case, the recovery may have been due to a big algorithm change that SEOs call Fred that happened at the same time, rather than the filing of a disavow file.

In reality, though, no one outside of Google knows for sure how effective the disavow tool is now. We know that Google says we should still use it if we find unnatural links pointing to our site. As such, my advice is that if you have unnatural links, you should still be disavowing.

I’d love to hear what you think. Please do leave a comment below!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Discussion: Can You Disavow out of Penguin? – Whiteboard Friday

Posted by josh_bachynski

Penguin is back at the forefront of many marketers’ minds now that the third iteration of the algorithm update has been released, and a rumor has begun circulating that you can weasel your way out of a Penguin penalty by simply submitting a disavow file. In today’s Whiteboard Friday, Josh Bachynski breaks down that argument and starts a realistic discussion to find the answer. While we (and, as you’ll see, Josh) don’t have definitive answers, we hope you’ll join in with your thoughts in the comments!

For reference, here’s a still of this week’s whiteboard!

(Whiteboard still: "Can You Disavow Out of Penguin?")

Video transcription

Hi. Welcome to Whiteboard Friday, and I’m your guest host for this week, Josh Bachynski. This week I’d like to talk about whether or not it is true that you can disavow your way out of the dreaded Penguin algorithm.

So there is a hypothesis going around the SEO community that it is possible that you can just use a disavow file to get out of Penguin. Now, for those of you who don’t know, the disavow file is a feature that Google implemented a couple years ago where you can upload your spammy links into a file, very similar to robots.txt, and they will apparently remove those links out of your link graph or have them not count against you or something along those lines.

However, the hypothesis is — and Google has confirmed this through both John Mueller and Matt Cutts — that apparently if someone sends you bad links or you make bad links, however the links showed up, you can just disavow those links, put them in your disavow file, and this will help you get out of the Penguin algorithm.

So this is the hypothesis, and not only does Google claim this is the case, but many SEOs claim this is the case as well. In fact, they go so far as to claim to have succeeded doing this for themselves or for clients, that they have just taken links and put them in the disavow file and those clients, on a Penguin refresh, have been saved from the terrible Penguin.

It would be a large problem if this was not the case, because, as I mentioned, it is possible for people to simply buy a Fiverr blast — I don’t want to list off too many options to give you negative SEO ideas — but you could imagine scenarios where it’s pretty easy to build these spammy links pointing at sites and possibly get Google to notice them and then to milk those sites when Penguin 3 comes around or Penguin 4, the next iteration of Penguin.

It would be very good if the disavow file worked. Personally, I’d like the disavow file to work if I could prove that it did. It is a problem, in a lot of ways, that it’s not, which I’ll get to in a second.

However, if this is true, then we should be able to find recoveries with no link loss. For example, if you can just disavow your way out of Penguin, then we should be able to find sites that escaped on the Penguin date but deleted no links and, as far as we can tell, had no link loss whatsoever. That way, we can know that it was just the disavow file and not some combination of deleting links and disavowing, or something else entirely. So if the hypothesis is true, then by the scientific method we should be able to find exemplars of it.

When the last Penguin, 3.0, was released on October 18th, I asked the people who claimed to have recovered using only a disavow file to send me examples. I said, "Fine. Send me your URL, and I’d like to check it." Altogether, I tested over 12 sites that claimed both to have recovered from Penguin on that date and to have used only the disavow file to do so.

However, I found something rather striking, that every single one that I checked, they all had link loss. In either Majestic or Ahrefs or using the Moz tools, I found that they all had links that they lost a few months prior to the release of Penguin.

Now whether they deleted the links and just lied to me, or whether they forgot they deleted the links, or whether the links just dropped off the link graph because, of course, web pages on the Internet change. For all we know, these could have been just scraper sites scraping them, giving them links that they didn’t even want, and those sites just disappear. However the links were lost, the links were lost.

So, what does that tell us? Well, unfortunately, it tells us that I cannot confirm the hypothesis. After 12 tries, the hypothesis that you can just use the disavow file to escape Penguin, I was not able to confirm that hypothesis. The examples, the evidence that people sent me trying to prove this hypothesis proved to be false. So I say myth busted or at the very least myth not confirmed. I was not able to confirm it after 12 plus tries to do so.

At best, all I can say after doing the testing is this (and I just want to add this note in now): if anybody seeing this video claims to have recovered from Penguin solely through the disavow file, without deleting or losing any links, please, by all means, send it to me. As I said, it’d be lovely. It’d be wonderful if that’s the way it worked, because then if someone launched a negative SEO attack against you, all you’d have to do is watch your backlinks on a daily basis and disavow any that seem suspect.

But as I said, I could not confirm that’s the way it works. At best, all I can confirm is that, one, deletion or loss of links still apparently has to be involved in some way, and two, this experiment has absolutely no bearing whatsoever on the manual penalty process, which the disavow file may or may not help with. I’m not talking about that for this Whiteboard Friday.

The question then you’ll ask me is, "Josh, why, why, why, oh why, do people perpetuate this myth?" Well, I’m afraid there are a number of plausible reasons why they might perpetuate this myth, both Google and other SEOs. One is because it’s easy. An SEO who knows half of what they’re doing can get a list of links, put them in a disavow file, and give it to a client in about five minutes to upload. In fact, there are programs that will do it for you very quickly. Are they selling snake oil? I don’t know, but I could not prove that the disavow file helped in any way, shape, or form in getting out of Penguin.

Two, there is another reason why Google might possibly perpetuate the myth (I’m just putting it out there for your consideration, and as far as I can tell it is a myth) that the disavow file will help you escape from Penguin: you’re feeding their machine learning. Every link you put in there, it’s entirely possible they can run through their algorithms, which Matt Cutts admitted at SMX Advanced 2013 they might just think of doing at some point in the future, so they can tell which links are bad and spammy.

And finally, propaganda. People are very afraid of negative SEO, with good reason. Whether or not it works or not, it definitely is a scary concept, and so it would be very reassuring for Google to tell people that, “Hey, we have this nifty disavow file. So if you get scared, if you see some suspect links pointing to you, all you have to do is put them in your disavow file, and you don’t have to worry about it at all whatsoever.”

However, I’d love that to be true, but I was not able to prove that being the case. So I’m going to say that I think the myth is busted. If anybody has any counter evidence to send to me, by all means I am all ears to look at it. All I need to do is plug it into a Majestic SEO or Ahrefs and see if there are any deleted links before the last Penguin release and say, “No, you lost links, and so we cannot say that it is the disavow alone.”

To confirm that hypothesis, I would have to see no links lost in Majestic and no links lost in Ahrefs whatsoever, and, of course, I’d have to see an uptick on a declared Penguin date for me to say, “Well, jeez, the evidence looks like they have released on Penguin, and they had no deleted links.” Then I’ll take your word for it that you submitted a disavow file, because, of course, I can’t see that. Only the site owner can see that, or you can give me your login whatever. You can trust me.

Until that time, I’m saying the myth is busted. The disavow file alone does not help you escape from Penguin; maybe in combination with deleting links, I’m not sure. I’m saying the disavow file is, unfortunately, the opiate of the masses. It is a safe myth we believe in because it makes us feel warm and snuggly at night. But I’m afraid that, after scientific testing, I cannot prove that that is the case.

I’ve come away from that with two more suggestions that I would recommend. One, I would stop paying for it. I would stop buying it. I would stop paying people to simply make you a disavow file and upload it. I would tell SEOs to stop selling that as a service alone. Of course, in conjunction with other services, fine. But that as a tactic alone, that’s not going to do anything at all, because the evidence, so far that I’ve seen, doesn’t suggest that it will.

Furthermore, as a more general point, it might be a good idea to stop selling and stop buying from-the-hip SEO, where SEOs sell services based merely on hearsay, and, as much as we can in our industry, move toward science-based or data-based SEO.

If anyone recommends any service to you or any suggestion or any SEO tactic to you, the first question you should ask is, “Where did you come by this information? Do you have any data to prove that this is a good thing to do?”

That is my Whiteboard Friday for this week. If you have any questions at all, or you want to e-mail me, yell at me, or contradict me, by all means do so. And please send me more sites I can test that didn’t delete any links but did see an uptick on the Penguin date. Join in the comments below, or e-mail me at JoshBachynski@gmail.com with that or any other questions. With that, I bid you adieu, and we’ll see you again next time. Bye-bye.

Video transcription by Speechpad.com



Moz Blog

Posted in IM News | Comments Off

Disavow & Link Removal: Understanding Google

Fear Sells

Few SEOs took notice when Matt Cutts mentioned on TWIG that “breaking their spirits” was essential to stopping spammers. But that single piece of information adds layers of insight around things like:

  • duplicity on user privacy on organic versus AdWords
  • benefit of the doubt for big brands versus absolute apathy toward smaller entities
  • the importance of identity versus total wipeouts of those who are clipped
  • mixed messaging on how to use disavow & the general fear around links

From Growth to No Growth

Some people internalize failure when growth slows or stops. One can’t raise venture capital and keep selling the dream of the growth story unless the blame is internalized. If one understands that another dominant entity (monopoly) is intentionally subverting the market then a feel good belief in the story of unlimited growth flames out.

Most of the growth in the search channel is being absorbed by Google. In RKG’s Q4 report they mentioned that mobile ad clicks were up over 100% for the year & mobile organic clicks were only up 28%.

Investing in Fear

There’s a saying in investing that “genius is declining interest rates” but when the rates reverse the cost of that additional leverage surfaces. Risks from years ago that didn’t really matter suddenly do.

The same is true with SEO. A buddy of mine mentioned getting a bad link example from Google where the link was in place longer than Google has been in existence. Risk can arbitrarily be added after the fact to any SEO activity. Over time Google can keep shifting the norms of what is acceptable. So long as they are fighting off WordPress hackers and other major issues they are kept busy, but when they catch up on that stuff they can then focus on efforts to shift white to gray and gray to black – forcing people to abandon techniques which offered a predictable positive ROI.

Defunding SEO is an essential & virtuous goal.

Hiding data (and then giving crumbs of it back to profile webmasters) is one way of doing it, but adding layers of risk is another. What panda did to content was add a latent risk to content where the cost of that risk in many cases vastly exceeded the cost of the content itself. What penguin did to links was the same thing: make the latent risk much larger than the upfront cost.

As Google dials up their weighting on domain authority, many smaller sites which competed on legacy relevancy metrics like anchor text slide down the result set. When they fall, many of those site owners think they were penalized (even if their slide was primarily driven by a reweighting of factors rather than an actual penalty). Since there is such rampant fearmongering about links, they start there. Nearly every widely used form of link building has, at some point, been labeled spam by Google engineers.

  • Paid links? Spam.
  • Reciprocal links? Spam.
  • Blog comments? Spam.
  • Forum profile links? Spam.
  • Integrated newspaper ads? Spam.
  • Article databases? Spam.
  • Designed by credit links? Spam.
  • Press releases? Spam.
  • Web 2.0 profile & social links? Spam.
  • Web directories? Spam.
  • Widgets? Spam.
  • Infographics? Spam.
  • Guest posts? Spam.

It doesn’t make things any easier when Google sends out examples of spam links which are sites the webmaster has already disavowed or sites which Google explicitly recommended in their webmaster guidelines, like DMOZ.

It is quite the contradiction: Google suggests we should be aggressive marketers everywhere EXCEPT in SEO, where basically any form of link building is far too risky.

It’s a strange world where when it comes to social media, Google is all promote promote promote. Or even in paid search, buy ads, buy ads, buy ads. But when it comes to organic listings, it’s just sit back and hope it works, and really don’t actively go out and build links, even though those are so important. – Danny Sullivan

Google is in no way a passive observer of the web. Rather they actively seek to distribute fear and propaganda in order to take advantage of the experiment effect.

They can find and discredit the obvious, but most on their “spam list” done “well” are ones they can’t detect. So, it’s easier to have webmasters provide you a list (disavows), scare the ones that aren’t crap sites providing the links into submission and damn those building the links as “examples” – dragging them into town square for a public hanging to serve as a warning to anyone who dare disobey the dictatorship. – Sugarrae

This propaganda is so effective that email spammers promoting “SEO solutions” are now shifting their pitches from “grow your business with SEO” to “recover your lost traffic.”

Where Do Profits Come From?

I saw Rand tweet this out a few days ago…

… and thought “wow, that couldn’t possibly be any less correct.”

When ecosystems are stable you can create processes which are profitable & pay for themselves over the longer term.

I very frequently get the question: ‘what’s going to change in the next 10 years?’ And that is a very interesting question; it’s a very common one. I almost never get the question: ‘what’s not going to change in the next 10 years?’ And I submit to you that that second question is actually the more important of the two – because you can build a business strategy around the things that are stable in time….in our retail business, we know that customers want low prices and I know that’s going to be true 10 years from now. They want fast delivery, they want vast selection. It’s impossible to imagine a future 10 years from now where a customer comes up and says, ‘Jeff I love Amazon, I just wish the prices were a little higher [or] I love Amazon, I just wish you’d deliver a little more slowly.’ Impossible. And so the effort we put into those things, spinning those things up, we know the energy we put into it today will still be paying off dividends for our customers 10 years from now. When you have something that you know is true, even over the long-term, you can afford to put a lot of energy into it. – Jeff Bezos at re: Invent, November, 2012

When ecosystems are unstable, anything approaching boilerplate has an outsized risk added by the dominant market participant. The quicker your strategy can be done at scale or in the third world, the quicker Google shifts it from a positive to a negative ranking signal. It becomes much harder to train entry level employees on the basics when some of the starter work they did in years past now causes penalties. It becomes much harder to manage client relationships when their traffic spikes up and down, especially if Google sends out rounds of warnings they later semi-retract.

What’s more, anything that is vastly beyond boilerplate tends to require a deeper integration and a higher level of investment – making it take longer to pay back. But the budgets for such engagement dry up when the ecosystem itself is less stable. Imagine the sales pitch, “I realize we are off 35% this year, but if we increase the budget 500% we should be in a good spot a half-decade from now.”

All great consultants aim to do more than the bare minimum in order to give their clients a sustainable competitive advantage, but by removing things which are scalable and low risk Google basically prices out the bottom 90% to 95% of the market. Small businesses which hire an SEO are almost guaranteed to get screwed because Google has made delivering said services unprofitable, particularly on a risk-adjusted basis.

Being an entrepreneur is hard. Today Google & Amazon are giants, but it wasn’t always that way. Add enough risk and those streams of investment in innovation disappear. Tomorrow’s Amazon or Google of other markets may die a premature death. You can’t see what isn’t there until you look back from the future – just like the answering machine AT&T held back from public view for decades.

Meanwhile, the Google Venture backed companies keep on keeping on – they are protected.

When ad agencies complain about the talent gap, what they are really complaining about is paying people what they are worth. But as the barrier to entry in search increases, independent players die, leaving more SEOs to chase fewer corporate jobs at lower wages. Even companies servicing fortune 500s are struggling.

On an individual basis, creating value and being fairly compensated for the value you create are not the same thing. Look no further than companies like Google & Apple which engage in flagrantly illegal anti-employee cartel agreements. These companies “partnered” with their direct competitors to screw their own employees. Even if you are on a winning team it does not mean that you will be a winner after you back out higher living costs and such illegal employer agreements.

This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead? – William K Black

Meanwhile, complaints about the above sorts of inequality or other forms of asset stripping are pitched as being aligned with Nazi Germany’s treatment of Jews. Obviously we need more H-1B visas to further drive down wages even as graduates are underemployed with a mountain of debt.

A Disavow For Any (& Every) Problem

Removing links is perhaps the single biggest growth area in SEO.

Just this week I got an unsolicited email from an SEO listing directory:

We feel you may qualify for a Top position among our soon to be launched Link Cleaning Services Category and we would like to learn more about Search Marketing Info. Due to the demand for link cleaning services we’re poised to launch the link cleaning category. I took a few minutes to review your profile and felt you may qualify. Do you have time to talk this Monday or Tuesday?

Most of the people I interact with tend to skew toward the more experienced end of the market. Some of the folks who join our site do so after their traffic falls off. In some cases the issues look intimately tied to Panda & the sites with hundreds of thousands of pages may have only a couple dozen inbound links. In spite of having few inbound links, and in spite of our telling them the problem looks clearly aligned with Panda, some people presume the issue is links & that they still need to do a disavow file.

Why do they make that presumption? It’s the fear message Google has been selling nonstop for years.

Punishing people is much different, and dramatic, from not rewarding. And it feeds into the increasing fear that people might get punished for anything. – Danny Sullivan

What happens when Google hands out free all-you-can-eat gummy bear laxatives to children at the public swimming pool? A tragedy of the commons.

Rather than questioning or countering the fear stuff, the role of the SEO industry has largely been to act as lap dogs, syndicating & amplifying the fear.

  • link tool vendors want to sell proprietary clean up data
  • SEO consultants want to tell you that they are the best and if you work with someone else there is a high risk hidden in the low price
  • marketers who crap on SEO to promote other relabeled terms want to sell you on the new term and paint the picture that SEO is a self-limiting label & a backward looking view of marketing
  • paid search consultants want to enhance the perception that SEO is unreliable and not worthy of your attention or investment

Even entities with a 9 figure valuation (and thus plenty of resources to invest in a competent consultant) may be incorrectly attributing SEO performance problems to links.

A friend recently sent me a link removal request from Buy Domains referring to a post which linked to them.

On the face of it, this is pretty absurd, no? A company whose entire business is trading in domain names asks that a reference to it be removed from a fairly credible webpage recommending it.

The big problem for Buy Domains is not backlinks. They may have had an issue with some of the backlinks from PPC park pages in the past, but now those run through a redirect and are nofollowed.

Their big issue is that they have less than great engagement metrics (as do most marketplace sites other than eBay & Amazon which are not tied to physical stores). That typically won’t work if the entity has limited brand awareness coupled with having nearly 5 million pages in Google’s index.

They not only have pages for each individual domain name, but they link to their internal search results from their blog posts & those search pages are indexed. Here’s part of a recent blog post:

And here are examples of the thin listing sorts of pages which Panda was designed in part to whack. These pages were among the millions indexed in Google.

A marketplace with millions of pages that doesn’t have broad consumer awareness is likely to get nailed by Panda. And the websites linking to it are likely to end up in disavow files, not because they did anything wrong but because Google is excellent at nurturing fear.

What a Manual Penalty Looks Like

Expedia saw a 25% decline in search visibility due to an unnatural links penalty, causing their stock to fall 6.4%. Both Google & Expedia declined to comment. The eventual Expedia undoing appears to have stemmed from Hacker News feedback & coverage of an outing story on an SEO blog that certainly sounded like an extortion attempt. USA Today asked if the Expedia campaign was a negative SEO attack.

While Expedia’s stock drop was anything but trivial, they will likely recover within a week to a month.

Smaller players can wait and wait and wait and wait … and wait.

Manual penalties are no joke, especially if you are a small entity with no political influence. The impact of them can be absolutely devastating. Such penalties are widespread too.

In Google’s busting bad advertising practices post they highlighted having zero tolerance, banning more than 270,000 advertisers, removing more than 250,000 publisher accounts, and disapproving more than 3,000,000 applications to join their ad network. All that was in 2013 & Susan Wojcicki mentioned Google having 2,000,000 sites in their display ad network. That would mean that something like 12% of their business partners were churned last year alone.

If Google’s churn is that aggressive on their own partners (where Google has an economic incentive for the relationship) imagine how much broader the churn is among the broader web. In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & they get about 5,000 reconsideration request messages each week, so over 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time.
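As a back-of-the-envelope sanity check on the figures above (all inputs are the numbers quoted in the post, not independently verified):

```python
# Publisher churn: 250,000 accounts removed vs ~2,000,000 network sites
publishers_removed = 250_000
network_sites = 2_000_000
churn = publishers_removed / network_sites

# Reply rate: ~5,000 reconsideration requests per week vs ~400,000
# manual actions per month (52 weeks / 12 months ≈ 4.33 weeks/month)
requests_per_month = 5_000 * 52 / 12
reply_rate = requests_per_month / 400_000

print(f"partner churn: {churn:.1%}")   # partner churn: 12.5%
print(f"reply rate: {reply_rate:.1%}") # reply rate: 5.4%
```

Depending on how you convert weeks to months, the reply rate lands around 5%, consistent with the claim that roughly 95% of notified sites never reply.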

The Disavow Threat

Originally when disavow was launched it was pitched as something to be used with extreme caution:

This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.
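For context, a disavow file is just a plain-text list uploaded through the disavow tool: one URL or `domain:` directive per line, with `#` marking comment lines. A hypothetical example (all domains here are placeholders):

```text
# disavow.txt - uploaded via Google's disavow links tool
# Lines starting with "#" are ignored.

# Disavow a single URL:
http://spam.example.com/bad-page.html

# Disavow every link from an entire domain:
domain:shadyseo.example.com
domain:link-network.example.net
```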

Recently Matt Cutts has encouraged broader usage. He has one video discussing proactively disavowing bad links as they come in & another where he mentioned how a large company disavowed 100% of the backlinks that came in for a year.

The idea of proactively monitoring your backlink profile is quickly becoming mainstream – yet another recurring fixed cost center in SEO with no upside to the client (unless you can convince the client that SEO is unstable and they should be afraid – which would ultimately retard their long-term investment in SEO).

The harshness of manual actions & algorithms like Penguin drives companies to desperation, acting irrationally out of fear.

People are investing to undo past investments. It’s sort of like riding a stock down 60%, locking in the losses by selling it, and then using the remaining 40% of the money to buy put options or short sell the very same stock. :D

Some companies are so desperate to get links removed that they “subscribe” sites which linked to them organically to streams of spam emails asking that the links be removed.

Some go so far that they not only email you on and on, but they created dedicated pages on their site claiming that the email was real.

What’s so risky about the above is that many webmasters will remove links sight unseen, even when asked from an anonymous Gmail account. Mix in the above sort of “this message is real” stuff and how easy would it be for a competitor to target all your quality backlinks with a “please remove my links” message? Further, how easy would it be for a competitor aware of such a campaign to drop a few hundred dollars on Fiverr or XRumer or other similar link sources, building up your spam links while removing your quality links?

A lot of the “remove my link” messages are based around lying to the people who are linking & telling them that the outbound link is harming them as well: “As these links are harmful to both yours and our business after penguin2.0 update, we would greatly appreciate it if you would delete these backlinks from your website.”

Here’s the problem though. Even if you spend your resources and remove the links, people will still likely add your site to their disavow file. I saw a YouTube recording of an SEO conference where four well-known SEO consultants said that even if the links are removed, “go ahead and disavow anyhow” – so there is absolutely no upside for publishers in removing links.

How Aggregate Disavow Data Could Be Used

Recovery is by no means guaranteed. In fact, of the people who go to the trouble of removing many links & creating a disavow file, only 15% claim to have seen any benefit.

The other 85% who weren’t sure of any benefit may not only have wasted their time, but may also have moved some of their other projects closer toward being penalized.

Let’s look at the process:

  • For the disavow to work you also have to have some links removed.

    • Some of the links that are removed may not have been the ones that hurt you in Google, thus removing them could further lower your rank.
    • Some of the links you have removed may be the ones that hurt you in Google, while also being ones that helped you in Bing.
    • The Bing & Yahoo! Search traffic hit comes immediately, whereas the Google recovery only comes later (if at all).
  • Many forms of profit (from client services or from running a network of sites) come from systematization. If you view everything that is systematized or scalable as spam, then you are not only disavowing to try to recover your penalized site, but you are sending co-citation disavow data to Google which could lead them to torch other sites connected to those same sources.
    • If you run a network of sites & use the same sources across your network and/or cross link around your network, you may be torching your own network.
    • If you primarily do client services & disavow the same links you previously built for past clients, what happens to the reputation of your firm when dozens or hundreds of past clients get penalized? What happens if a discussion forum thread on Google Groups or elsewhere starts up where your company gets named & then a tsunami of pile on stuff fills out in the thread? Might that be brand destroying?

The disavow and review process is not about recovery, but is about collecting data and distributing pain in a game of one-way transparency. Matt has warned that people shouldn’t lie to Google…

…however Google routinely offers useless non-information in their responses.

Some Google webmaster messages leave a bit to be desired.

Recovery is uncommon. Your first response from Google might take a month or more. If you work for a week or two on cleanup and the response then takes a month, the penalty has already lasted at least six weeks. And that first response might be something like this:

Reconsideration request for site.com: Site violates Google’s quality guidelines

We received a reconsideration request from a site owner for site.com/.

We’ve reviewed your site and we believe that site.com/ still violates our quality guidelines. In order to preserve the quality of our search engine, pages from site.com/ may not appear or may not rank as highly in Google’s search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines.

For more specific information about the status of your site, visit the Manual Actions page in Webmaster Tools. From there, you may request reconsideration of your site again when you believe your site no longer violates the quality guidelines.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum.

Absolutely useless.

Zero useful information whatsoever.

As people are unsuccessful in the recovery process they cut deeper and deeper. Some people have removed over 90% of their profile without recovering & been nearly a half-year into the (12-step) “recovery” process before even getting a single example of a bad link from Google. In some cases the bad links Google identified were obviously created by third-party scraper sites & were not in Google’s original sample of links to look at (so even if you looked at every single link they showed you & cleaned up 100% of issues, you would still be screwed).

Another issue with aggregate disavow data is that there is a lot of ignorance in the SEO industry in general, and people who try to do things cheap (essentially free) at scale have an outsized footprint in the aggregate data. For instance, our site’s profile links are nofollowed & our profiles are not indexed by Google. In spite of this, examples like the one below are associated with not 1 but 3 separate profiles for a single site.

Our site only has about 20,000 to 25,000 unique linking domains. However over the years we have had well over a million registered user profiles. If only 2% of the registered user profiles were ignorant spammers who spammed our profile pages and then later added our site to a disavow file, we would have more people voting *against* our site than we have voting for it. And that wouldn’t be because we did anything wrong, but rather because Google is fostering an environment of mixed messaging, fear & widespread ignorance.
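The arithmetic behind this scenario is easy to sketch (1.25 million is an assumed stand-in for “well over a million”; the other figures are the post’s own):

```python
# How quickly spammed-then-disavowed profiles can outweigh a
# site's genuine link profile in aggregate disavow data.
registered_profiles = 1_250_000            # "well over a million" (assumed)
linking_domains_low = 20_000               # lower end of the 20k-25k range
spam_rate = 0.02                           # "if only 2% ... were ignorant spammers"

spammy_disavowers = registered_profiles * spam_rate
print(spammy_disavowers)                         # 25000.0
print(spammy_disavowers >= linking_domains_low)  # True
```

In other words, a tiny fraction of abusive registrants can generate as many implicit “votes against” a site as it has unique linking domains voting for it.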

And if we are ever penalized, the hundreds of scraper sites built off scraping our RSS feed would make the recovery process absolutely brutal.

Another problem with Google saying “you haven’t cut out enough bone marrow yet” while suggesting that virtually any/every type of link is spam is that there will be many other forms of false positives in the aggregate data.

I know some companies specializing in link recovery which base aspects of their disavows on a site’s ranking footprint. If you get a manual penalty, a Panda penalty, or your site gets hacked, then such firms may re-confirm that your site deserves to be penalized (on a nearly automated basis, with little to no thought) based on the fact that it is already penalized. Good luck recovering from that as Google folds aggregate disavow data into justifications for further penalties.

Responsibility

All large ecosystems are gamed. We see it with app ratings & reviews, stealth video marketing, advertising, malware installs, and of course paid links.

Historically in search there has been the view that you are responsible for what you have done, but not the actions of others. The alternate roadmap would lead to this sort of insanity:

Our system has noticed that in the last week you received 240 spam emails. As a result, your email account has been temporarily suspended. Please contact the spammers, and once you have proof they have unsubscribed you from their spam databases, we will reconsider reopening your email account.

As Google has closed down their own ecosystem, they allow their own $0 editorial to rank front & center even if it is pure spam, but third parties are now held to a higher standard – you could be held liable for the actions of others.

At the extreme, one of Google’s self-promotional automated email spam messages sent a guy to jail. In spite of such issues, Google remains unfazed, adding a setting which allows anyone on Google+ to email other members.

Ask Google if they should be held liable for the actions of third parties and they will tell you to go to hell. Their approach to copyright remains fuzzy, they keep hosting more third party content on their own sites, and even when that content has been deemed illegal they scream that it undermines their first amendment rights if they are made to proactively filter:

Finally, they claimed they were defending free speech. But it’s the courts which said the pictures were illegal and should not be shown, so the issue is the rule of law, not freedom of speech.

the non-technical management, particularly in the legal department, seems to be irrational to the point of becoming adolescent. It’s almost as if they refuse to do something entirely sensible, and which would save them and others time and trouble, for no better reason than that someone asked them to.

Monopolies with nearly unlimited resources shall be held liable for nothing.

Individuals with limited resources shall be liable for the behavior of third parties.

Google Duplicity (beta).

Torching a Competitor

As people have become more acclimated toward link penalties, a variety of tools have been created to help make sorting through the bad ones easier.

“There have been a few tools coming out on the market since the first Penguin – but I have to say that LinkRisk wins right now for me on ease of use and intuitive accuracy. They can cut the time it takes to analyse and root out your bad links from days to minutes…” – Dixon Jones

But as more tools have been created for sorting out bad links & more tools created to automate sending link removal emails, two things have happened:

  • Google is demanding more links be removed to allow for recovery

  • people are becoming less responsive to link removal requests as they get bombarded with them
    • Some of these tools keep bombarding people over and over again weekly until the link is removed or the emails go to the spam bin
    • to many people the link removal emails are the new link request emails ;)
    • one highly trusted publisher who participates in our forums stated they filtered the word “disavow” to automatically go to their trash bin
    • on WebmasterWorld a member decided it was easier to delete their site than deal with the deluge of link removal spam emails

The problem with Google rewarding negative signals is there are false positives and it is far cheaper to kill a business than it is to build one. The technically savvy teenager who created the original version of the software used in the Target PoS attack sold the code for only $2,000.

There have been some idiotic articles like this one on The Awl suggesting that comment spamming is now dead as spammers run for the hills, but that couldn’t be further from the truth. Some (not particularly popular) blogs are getting hundreds to thousands of spam comments daily & WordPress can have trouble even backing up the database (unless the comment spam is regularly deleted) as the database can quickly grow to a million records.

The spam continues but the targets change. A lot of these comments are now pointed at YouTube videos rather than ordinary websites.

As Google keeps leaning into negative signals, one can expect a greater share of spam links to be created for negative SEO purposes.

Maybe this maternity jeans comment spam is tied to the site owner, but if they didn’t do it, how do they prove it?

Once again, I’ll reiterate Bill Black

This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead? – William K Black

The cost of “an academic test” can be as low as $5. You know you might be in trouble when you see fiverr.com/conversations/theirusername in your referrers:

Our site was hit with negative SEO. We have manually collected about 24,000 bad links for our disavow file (so far). It probably cost the perp $5 on Fiverr to point these links at our site. Do you want to know how bad that sucks? I’ll tell you. A LOT!! Google should be sued enmass by web masters for wasting our time with this “bad link” nonsense. For a company with so many Ph.D’s on staff, I can’t believe how utterly stupid they are

Or, worse yet, you might see SAPE in your referrers

And if the attempt to get you torched fails, they can try & try again. The cost of failure is essentially zero. They can keep pouring on the fuel until the fire erupts.

Even Matt Cutts complains about website hacking, but that doesn’t mean you are free of risk if someone else links to your site from hacked blogs. I’ve been forwarded unnatural link messages from Google which came about after a person’s site was added in on a SAPE hack by a third party in an attempt to conceal who the beneficial target was. When in doubt, Google may choose to blame all parties in a scorched-earth strategy.

If you get one of those manual penalties, you’re screwed.

Even if you are not responsible for such links, and even if you respond on the same day, and even if Google believes you, you are still likely penalized for AT LEAST a month. Most likely Google will presume you are a liar and you will spend at least a second month in the penalty box. To recover you might have to waste days (weeks?) of your life & remove some of your organic links to show that you have gone through sufficient pain to appease the abusive market monopoly.

As bad as the above is, it is just the tip of the iceberg.

  • People can redirect torched websites.
  • People can link to you from spam link networks which rotate links across sites, so you can’t possibly remove or even disavow all the link sources.
  • People can order you a subscription of those rotating spam links from hacked sites, where new spam links appear daily. Google mentioned discovering 9,500 malicious sites daily & surely the number has only increased from there.
  • People can tie any/all of the above with cloaking links or rel=canonical messages to GoogleBot & then potentially chain that through further redirects cloaked to GoogleBot.
  • And on and on … the possibilities are endless.

Extortion

Another thing this link removal fiasco subsidizes is various layers of extortion.

Not only are there the harassing emails threatening to add sites to disavow lists if they don’t remove the links, but some companies quickly escalate from there. I’ve seen hosting abuse complaints, lawyer threat letters, and one friend was actually sued in court (and the people who sued him had placed the link themselves!)

Google created a URL removal tool which allows webmasters to remove pages from third party websites. How long until that is coupled with DDoS attacks? Once effective with removing one page, a competitor might decide to remove another.

Another approach to get links removed is to offer payment. But payment itself might encourage the creation of further spammy links as link networks look to replace their old cashflow with new sources.

The recent Expedia fiasco started as an extortion attempt: “If I wanted him to not publish it, he would ‘sell the post to the highest bidder.’”

Another nasty issue here is articles like this one on Link Research Tools, where they not only highlight client lists of particular firms, but then state which URLs have not yet been penalized followed by “most likely not yet visible.” So long as that sort of “publishing” is acceptable in the SEO industry, you can bet that some people will hire the SEOs nearly guaranteeing a penalty to work on their competitor’s sites, while having an employee write a “case study” for Link Research Tools. Is this the sort of bullshit we really want to promote?

Some folks are now engaging in overt extortion:

I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn’t pay him £10 per month to NOT do this.

Branding / Rebranding / Starting Over

Sites that are overly literal in branding likely have no chance at redemption. That triple hyphenated domain name in a market that is seen as spammy has zero chance of recovery.

Even being a generic unbranded site in a YMYL category can make you be seen as spam. The remote rater documents stated that the following site was spam…

… even though the spammiest thing on it was the stuff advertised in the AdSense ads:

For many (most?) people who receive a manual link penalty or are hit by Penguin it is going to be cheaper to start over than to clean up.

At the very minimum it can make sense to lay groundwork for a new project immediately just in case the old site can’t recover or takes nearly a year to recover. However, even if you figure out the technical bits, as soon as you have any level of success (or as soon as you connect your projects together in any way) you once again become a target.

And you can’t really invest in higher level branding functions unless you think the site is going to be around for many years to earn off the sunk cost.

Succeeding at SEO is not only about building rank while managing cashflow and staying unpenalized, but it is also about participating in markets where you are not marginalized due to Google inserting their own vertical search properties.

Even companies which are large and well funded may not succeed with a rebrand if Google comes after their vertical from the top down.

Hope & Despair

If you are a large partner affiliated with Google, hope is on your side & you can monetize the link graph: “By ensuring that our clients are pointing their links to maximize their revenue, we’re not only helping them earn more money, but we’re also stimulating the link economy.”

You have every reason to be Excited, as old projects like Excite or Merchant Circle can be relaunched again and again.

Even smaller players with the right employer or investor connections are exempt from these arbitrary risks.

You can even be an SEO and start a vertical directory knowing you will do well if you can get that Google Ventures investment, even as other similar vertical directories were torched by Panda.

For most other players in that same ecosystem, the above tailwind is a headwind. Don’t expect much 1 on 1 help in webmaster tools.

In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & they get about 5,000 reconsideration request messages each week, so over 95% of the sites which receive notification never reply. Many of those who reply are wasting their time. How many confirmed Penguin 1.0 recoveries are you aware of?

Even if a recovery is deserved, it does not mean one will happen, as errors do happen. And on the off chance recovery happens, recovery does not mean a full restoration of rankings.

There are many things we can learn from Google’s messages, but probably the most important is this:

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way – in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only. – Charles Dickens, A Tale of Two Cities

Categories: 

SEO Book

Posted in IM NewsComments Off

Google: Google’s Disavow Tool Won’t Damage Others

A WebmasterWorld thread links to a blog post on sno.pe covering some questions its author asked a Googler about the Google disavow tool…


Search Engine Roundtable

Posted in IM NewsComments Off

The Google and Bing Disavow tools


It’s big news that Google now has a Disavow tool. However, to understand why, we first need to take a quick look at the history behind it.

The history

Back in April (2012) Google released the Penguin algorithm update. This focused on trying to identify which sites were likely to be spam, or of low quality, by examining their backlinks.

The result was that sites could now be damaged by low quality links. Webmasters quickly realized that by pointing large numbers of poor quality links at a competitor, they could damage its rankings. This became known as ‘negative SEO’.

How effective negative SEO is remains controversial. Tests have had an impact on some sites, while others have remained untouched. Most likely, the number of poor quality links it takes to do damage depends on the strength of the site’s backlink profile. Once a site gains a high enough level of authority, it probably can no longer be significantly affected by poor quality links.

Google is also getting better at assessing the proper value of links. As a result, webmasters have found that an increasing proportion of their backlinks are having a negative effect. Techniques such as placing links in directories used to improve your backlink profile; now this technique, and others like it, are potentially damaging.

The Disavow tool

Webmasters have traditionally had very limited tools at their disposal for removing links. The only ways were to physically remove a link, either by having the page it sits on taken down, or by breaking it by removing the page it points to.

Asking webmasters to remove links pointing to your site is known as a link removal request. These have had very mixed success and some webmasters have even started charging to remove links. Where larger numbers of links are causing problems it can be easier to just remove the page which they link to. This of course loses all the value from any good links which point to that page as well.

What was needed was a way to communicate with the search engines directly about links pointing to your site. This would allow webmasters to disassociate themselves from links which they felt were harmful. The search engines could then just discount these links from the site.

Bing saw that webmasters needed a way to communicate which links they didn’t want, and in June released its Disavow links tool as part of the Bing Webmaster Tools suite. However, Bing only holds a small share of the market (around 15%, against Google’s 67%, US figures), meaning it has been a long wait for Google to release its own Disavow tool.

Google and Bing Disavow tools

How it works

The Disavow tools rely on the webmaster having a Google Webmaster Tools or Bing Webmaster Tools account and manually checking and requesting the removal of links. The exact process differs slightly in both tools.

Google regard this as a safety net, to be used only once all other options have been exhausted:

“If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page.”

Bing, however, see it as more of a catch-all and don’t emphasize trying the more traditional routes first:

“Using the Disavow Links tool, you can easily and quickly alert Bing about links you don’t trust …”

“These signals help us understand when you find links pointing to your content that you want to distance yourself from for any reason”

Google does allow you to upload URLs in bulk, which can make the process significantly quicker than Bing’s, depending on the level of accuracy you are looking for. Both tools let you mark an entry as a domain, disavowing all links from that site as a whole.
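As a concrete illustration, Google’s disavow file is a plain text file with one entry per line: lines starting with # are comments, a bare URL disavows links from that single page, and the domain: prefix disavows every link from the site. The syntax below follows Google’s documented format; the domains themselves are invented examples:

```
# Contacted owner of spamdomain1.example on 1/6/2013 asking for
# link removal; no response, so disavowing the whole site.
domain:spamdomain1.example

# Disavow links from these individual pages only.
http://spamdomain2.example/page/containing-link.html
http://spamdomain2.example/another-page.html
```

The finished file is then uploaded through the Disavow links page in Google Webmaster Tools.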

Next steps

Checking your site’s backlink profile is definitely something that’s worth doing. If you don’t have a good grasp of what your backlink profile is and how it affects you, now is the time to get one. In short, you need to watch these link building videos right now and get up to speed.

If you want more information on exactly how to identify those low value links then a future Wordtracker Academy post will show you exactly how to do just that. Some quick pointers however are:

  • Find the low value sites which link to you lots of times.
  • Be on the lookout for sites with adult content which link to you.
  • Use both the Google and Bing tools.
  • Remember, disavowing links sends a negative signal about that site so don’t be overzealous.
  • Don’t worry about links from low value pages that sit on domains with good overall value.
  • Don’t expect quick results, both tools take a while to have an effect.
  • Make it a quarterly activity, don’t think that once is enough.
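The first pointer above lends itself to a quick sanity check before any future Academy post arrives. As a rough sketch (the source_url column name and the threshold of 50 are assumptions for illustration, not anything the tools mandate), you could tally links per referring domain from a backlink export and surface the domains that link to you an unusually large number of times:

```python
from collections import Counter
from urllib.parse import urlparse

def count_linking_domains(rows, url_field="source_url"):
    """Count links per referring domain. `rows` can be any iterable of
    dicts, e.g. csv.DictReader over a backlink export."""
    counts = Counter()
    for row in rows:
        domain = urlparse(row[url_field]).netloc.lower()
        if domain:
            counts[domain] += 1
    return counts

def flag_heavy_linkers(counts, threshold=50):
    """Domains sending an unusually high number of links deserve a manual look."""
    return sorted(d for d, n in counts.items() if n >= threshold)

# Toy data standing in for a real export:
rows = [{"source_url": f"http://spammy-directory.example/page{i}"} for i in range(60)]
rows.append({"source_url": "http://news.example/story"})
print(flag_heavy_linkers(count_linking_domains(rows)))
```

Anything this flags is only a candidate for review, not an automatic disavow entry; remember the warning above about being overzealous.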

Wordtracker Blog

Posted in IM NewsComments Off

Poll: Did You Use The Google Disavow Tool?

The Google Disavow Link Tool launched exactly 30 days ago, and many webmasters had been eagerly awaiting it.

So I wanted to know…




Search Engine Roundtable

Posted in IM NewsComments Off

Q&A With Google’s Matt Cutts On How To Use The Link Disavow Tool

It’s been almost two weeks since Google launched its link disavowal tool. Some have been busy diving in and using it, but others have had more detailed questions about it. We’ve got some answers, from the head of Google’s web spam team, Matt Cutts. Question: How do people know…



Please visit Search Engine Land for the full article.




Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM NewsComments Off

Google Disavow Tool

Google launched a disavow links tool. Webmasters who want to tell Google which links they don’t want counted can now do so by uploading a list of links in Google Webmaster Tools.

If you haven’t received an “unnatural link” alert from Google, you don’t really need to use this tool. And even if you have received notification, Google are quick to point out that you may wish to pursue other avenues, such as approaching the site owner, first.

Webmasters have met with mixed success following this approach, of course. It’s difficult to imagine many webmasters going to that trouble and expense when they can now upload a txt file to Google.

Careful, Now

The disavow tool is a loaded gun.

If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.

Could the use of the tool be seen as an admission of guilt? Matt gives examples of “bad” webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.

Some webmasters have been victims of negative SEO. Some webmasters have had scrapers and autogen sites that steal their content, and then link back. There are legitimate reasons to disavow links. Hopefully, Google makes an effort to make such a distinction.

One wonders why Google don’t simply discount the links they already deem to be “bad”. Why the need for the webmaster to jump through hoops? The webmaster is still left to guess which links are “bad”, of course.

Not only is it difficult working out the links that may be a problem, it can be difficult getting a view of the entire link graph. There are various third party tools, including Google’s own Webmaster Central, but they aren’t exhaustive.

Matt mentioned that the link notification emails will provide examples of problem links; however, that list won’t be exhaustive. He also mentioned that you should pay attention to the more recent links, presumably because if you haven’t received notification up until now, older links weren’t the problem. The issue with that assumption is that links which were once good can, over time, become bad:

  • That donation where you helped a good cause & were later mortified that “online casino” and “discount cheap viagra” followed your course for purely altruistic reasons.
  • That clever comment on a well-linked PR7 page that is looking to cure erectile dysfunction 20 different ways in the comments.
  • Links from sources that were considered fine years ago & were later repositioned as spam (article banks anyone?)
  • Links from sites that were fine, but a number of other webmasters disavowed, turning a site that originally passed the sniff test into one that earns a second review revealing a sour stench.

This could all get rather painful if webmasters start taking out links they perceive to be a problem, but aren’t. I imagine a few feet will get blasted off in the process.

Webmasters Asked, Google Gaveth

Webmasters have been demanding such a tool since the un-natural notifications started appearing. There is no question that removing established links can be as hard, if not harder, than getting the links in the first place. Generally speaking, the cheaper the link was to get, the higher the cost of removal (relative to the original purchase price). If you are renting text link ads for $50 a month, you can get them removed simply by not paying. But if you did a bulk submission to 5,000 high PR SEO friendly directories… best of luck with that!

It is time consuming. First, there’s the overhead of working out which links to remove, as Google doesn’t specify them. Once a webmaster has made a list of the links she thinks might be a problem, she then needs to go through the tedious task of contacting each site and requesting that a link be taken down.

Even with the best will in the world, this is an overhead for the linking site, too. A legitimate site may wish to verify the identity of the person requesting the delink, as the delink request could come from a malicious competitor. Once identity has been established, the site owner must go to the trouble of making the change on their site.

This is not a big deal if a site owner only receives one request, but what if they receive multiple requests per day? It may not be unreasonable for a site owner to charge for the time taken to make the change, as such a change incurs a time cost. If the webmaster who has incurred a penalty has to remove many links, from multiple sites, then such costs could quickly mount. Taken to the (il)logical extremes, this link removal stuff is a big business. Not only are there a number of link removal services on the market, but one of our members was actually sued for linking to a site (when the person who was suing them paid to place the link!)

What’s In It For Google?

Webmasters now face the prisoner’s dilemma and are doing Google’s job for them.

It’s hard to imagine this data not finding its way to the manual reviewers. If there are multiple instances of webmasters reporting paid links from a certain site, then Google have more than enough justification to take it out. This would be a cunning way around the “how do we know if a link is paid?” problem.

Webmasters will likely incorporate bad-link checking into their daily activities. Monitoring inbound links wasn’t something you had to do in the past: links were either good, or they didn’t matter, as they didn’t affect ranking anyway. Now, webmasters may feel compelled to avoid an unnatural links warning by meticulously monitoring their inbound links and reporting anything that looks odd. Google haven’t been clear on whether they would take such action as a result – Matt suggests they just reclassify the link & see it as a strong suggestion to treat it as though it carried a nofollow attribute – but no doubt there will be clarification as the tool beds in. Google has long used a tiered index structure & enough complaints might lower the tier of a page or site, block its ability to pass trust, or cause the site to be directly penalized.

This is also a way of reaffirming “the law”, as Google sees it. In many instances, it is no fault of the webmaster that rogue sites link up, yet the webmaster will feel compelled to jump through Google’s hoops. Google sets the rules of the game. If you want to play, then you play by their rules, and recognize their authority. Matt Cutts suggested:

we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business.

Left unsaid in the above is most people don’t have access to aggregate link data while they surf the web, most modern systems of justice are based on the presumption of innocence rather than guilt, and most rational people don’t presume that a site that is linked to is somehow shady simply for being linked to.

If the KKK links to Matt’s blog tomorrow that doesn’t imply anything about Matt. And when Google gets featured in an InfoWars article it doesn’t mean that Google desires that link or coverage. Many sketchy sites link to Adobe (for their flash player) or sites like Disney & Google for people who are not old enough to view them or such. Those links do not indicate anything negative about the sites being linked into. However, as stated above, search is Google’s monopoly to do with as they please.

On the positive side, if Google really do want sites to conform to certain patterns, and will reward them for doing so by letting them out of jail, then this is yet another way to clean up the SERPs. They get the webmaster on side and that webmaster doing link classification work for them for free.

Who, Not What

For a decade, search was driven largely by meritocracy. What you did was far more important than who you were. It was much less corrupt than the physical world. But as Google chases brand ad dollars, that view of the search landscape is no longer relevant.

Large companies can likely safely ignore much of the fear-first approach to search regulation, and when things blow up they can shift the blame onto a rogue anonymous contractor of sorts. Smaller webmasters, meanwhile, walk on eggshells.

When the government wanted to regulate copyright issues, Google claimed it would be too expensive and would kill innovation at small start-ups. Google then drafted their own copyright policy from which they themselves are exempt. And now small businesses not only need to bear that cost but also need to police their link profiles, even as competitors can use Fiverr, ScrapeBox, splog link networks & various other sources to drip a constant stream of low cost sludge in their direction.

Now more than ever, status is important.

Gotchas

No doubt you’ve thought of a few. A couple thoughts – not that we advocate them, but realize they will happen:

  • Intentionally build spam links to yourself & then disavow them (in order to make your profile look larger than it is & to ensure that competitor who follows everything you do – but lacks access to your disavow data – walks into a penalty).
  • Find sites that link to competitors and leave loads of comments for the competitor on them, hoping that the competitor blocks the domain as a whole.
  • Find sites that link to competitors & buy links from them into a variety of other websites & then disavow from multiple accounts.
  • Get a competitor some link warnings & watch them push to get some of their own clean “unauthorized” links removed.
  • The webmaster who parts on poor terms burning the bridge behind them, or leaving a backdoor so that they may do so at any time.

If a malicious webmaster wanted to get a target site in the bad books, they could post obvious comment spam – pointing at their site, and other sites. If this activity doesn’t result in an unnatural linking notification, then all good. It’s a test of how Google values that domain. If it does result in an unnatural link notification, the webmaster could then disavow links from that site. Other webmasters will likely do the same. Result: the target site may get taken out.

To avoid this sort of hit, pay close attention to your comment moderation.

Please add your own to the comments! :) Gotchas, that is, not rogue links.

Further opinions @ searchengineland and seoroundtable.

Categories: 

SEO Book.com

Periodic updates and adjustments by Google remind us how important it is to DIVERSIFY. Online marketers and SEO resellers benefit from a highly diversified online marketing program. HubShout’s SEO reseller program includes SEO, PPC, Social, Email and a wide variety of online marketing channels that help protect their clients from Google’s frequent updates–and penalties. In this webinar, we talk about using social media, email, PPC and offline channels to diversify your lead flow. Diversification of online marketing also helps SEO resellers improve their lead flow and bring in more revenue. This webinar is part of the SEO reseller training series offered through HubShout’s white label SEO reseller program.

Posted in IM NewsComments Off

