Tag Archive | "Removal"

Search Buzz Video Recap: Google Bugs, Navigation Removal, AMP Updates, OMG I Don't Know & More

This week, we covered the ongoing Google bugs, this time affecting Google News indexing, Search Console, and other Google services. Google is also unaware of a…

Search Engine Roundtable

Posted in IM News | Comments Off

SearchCap: Google hack removal, Allo goes goodbye & responsive search ads

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Q&A: Lost Your Anonymous Google Reviews? The Scoop on Removal and Moving Forward

Posted by MiriamEllis

Did you recently notice a minor or major drop in your Google review count, and then realize that some of your actual reviews had gone missing, too? Read on to see if your experience of review removal was part of the action Google took in late May surrounding anonymous reviews.

Q: What happened?

A: As nearly as I can pinpoint it, Google began discounting reviews left by “A Google User” from total review counts around May 23, 2018. For a brief period, these anonymous reviews were still visible, but were then removed from display. I haven’t seen any official announcement about this, to date, and it remains unclear as to whether all reviews designated as being from “A Google User” have been removed, or whether some still remain. I haven’t been able to discover a single one since the update.

Q: How do I know if I was affected by this action?

A: If, prior to my estimated date, you had reviews that had been left by profiles marked “A Google User,” and these reviews are now gone, that’s the diagnostic of why your total review count has dropped.

Q: The reviews I’ve lost weren’t from “A Google User” profiles. What happened?

A: If you’ve lost reviews from non-anonymous profiles, it’s time to investigate other causes of removal. These could include:

  • Having paid for or incentivized reviews, either directly or via an unethical marketer
  • Reviews stemming from a review station/kiosk at your business
  • Getting too many reviews at once
  • URLs, prohibited language, or other objectionable content in the body of reviews
  • Reviewing yourself, or having employees (past or present) do so
  • Reviews left from the same IP address as your business (as in the case of free on-site Wi-Fi)
  • The use of review strategies/software that prohibit negative reviews or selectively solicit positive reviews
  • Any other violation of Google’s review guidelines
  • A Google bug, in which case, check the GMB forum for reports of similar review loss, and wait a few days to see if your reviews return; if not, you can take the time to post about your issue in the GMB forum, but chances are not good that removed reviews will be reinstated

Q: Is anonymous review removal a bug or a test?

A: One month later, these reviews remain absent. This is not a bug, and seems unlikely to be a test.

Q: Could my missing anonymous reviews come back?

A: Never say “never” with Google. From their inception, Google review counts have been wonky, and have been afflicted by various bugs. There have been cases in which reviews have vanished and reappeared. But, in this case, I don’t believe these types of reviews will return. This is most likely an action on Google’s part with the intention of improving their review corpus, which is, unfortunately, plagued with spam.

Q: What were the origins of “A Google User” reviews?

A: Reviews designated by this language came from a variety of scenarios, but are chiefly fallout from Google’s rollout of Google+ and then its subsequent detachment from local. As Mike Blumenthal explains:

As recently as 2016, Google required users to log in as G+ users to leave a review. When they transitioned away from +, they allowed users several choices as to whether to delete their reviews or to create a name. Many users did not make that transition. For the users that chose not to give their name and make that transition, Google identified them as “A Google User”… Also, certain devices like the old BlackBerrys could leave a review but not a name. Also, users left + and may have changed profiles at Google, abandoning their old profiles. Needless to say, there were many ways that these reviews became from “A Google User.”

Q: Is the removal of anonymous reviews a positive or negative thing? What’s Google trying to do here?

A: Whether this action has worked out well or poorly for you likely depends on the quality of the reviews you’ve lost. In some cases, the loss may have suddenly put you behind competitors, in terms of review count or rating. In others, the loss of anonymous negative reviews may have just resulted in your star rating improving — which would be great news!

As to Google’s intent with this action, my assumption is that it’s a step toward increasing transparency. Not their own transparency, but the accountability of the reviewing public. Google doesn’t really like to acknowledge it, but their review corpus is inundated with spam, some of it the product of global networks of bad actors who have made a business of leaving fake reviews. Personally, I welcome Google making any attempts to cope with this, but the removal of this specific type of anonymous review is definitely not an adequate solution to review spam when the livelihoods of real people are on the line.

Q: Does this Google update mean my business is now safe from anonymous reviews?

A: Unfortunately, no. While it does mean you’re unlikely to see reviews marked as being from “A Google User”, it does not in any way deter people from creating as many Google identities as they’d like to review your business. Consider:

  • Google’s review product has yet to reach a level of sophistication which could automatically flag reviews left by “Rocky Balboa” or “Whatever Whatever” as, perhaps, somewhat lacking in legitimacy
  • Google’s product also doesn’t appear to suspect profiles created solely to leave one-time reviews, though this is a clear hallmark of many instances of spam
  • Google won’t remove text-less negative star ratings, despite owner requests
  • Google hasn’t been historically swayed to remove reviews on the basis of the owner claiming no records show that a negative reviewer was ever a customer

Q: Should Google’s removal of anonymous reviews alter my review strategy?

A: No, not really. I empathize with the business owners expressing frustration over the loss of reviews they were proud of and had worked hard to earn. I see actions like this as important signals to all local businesses to remember that you don’t own your Google reviews or your Google My Business listing/Knowledge Panel. Google owns those assets, and manages them in any way they deem best for Google.

In the local SEO industry, we are increasingly seeing the transformation of businesses from the status of empowered “website owner” to the shakier “Google tenant,” with more and more consumer actions taking place within Google’s interface. The May removal of reviews should be one more nudge to your local brand to:

  • Be sure you have an ongoing, guideline-compliant Google review acquisition campaign in place so that reviews that become filtered out can be replaced with fresh reviews
  • Take an active approach to monitoring your GMB reviews so that you become aware of changes quickly. Software like Moz Local can help with this, especially if you own or market large, multi-location enterprises. Even when no action can be taken in response to a new Google policy, awareness is always a competitive advantage.
  • Diversify your presence on review platforms beyond Google
  • Collect reviews and testimonials directly from your customers to be placed on your own website; don’t forget the Schema markup while you’re at it
  • Diversify the ways in which you are cultivating positive consumer sentiment offline; word-of-mouth marketing, loyalty programs, and the development of real-world relationships with your customers are things you directly control
  • Keep collecting those email addresses and, following the laws of your country, cultivate non-Google-dependent lines of communication with your customers
  • Invest heavily in hiring and training practices that empower staff to offer the finest possible experience to customers at the time of service — this is the very best way to ensure you are building a strong reputation both on and offline
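On the Schema markup point above: testimonials placed on your own site can be annotated with schema.org `Review` structured data in JSON-LD. A minimal sketch follows; the business name, reviewer, and rating values are placeholder examples, not anything from the original post.

```python
import json

# Minimal JSON-LD sketch of a schema.org Review nested in a LocalBusiness.
# All names, ratings, and review text are placeholder values.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "review": {
        "@type": "Review",
        "author": {"@type": "Person", "name": "Jane Doe"},
        "reviewRating": {
            "@type": "Rating",
            "ratingValue": "5",
            "bestRating": "5",
        },
        "reviewBody": "Wonderful sourdough and friendly staff.",
    },
}

# Emit the <script> tag you would paste into the page's HTML <head> or <body>.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    markup, indent=2
)
print(snippet)
```

Note that search engines generally expect such markup to describe reviews collected directly by the business and displayed on the same page, not reviews copied from third-party platforms.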

Q: So, what should Google do next about review spam?

A: A Google rep once famously stated,

The wiki nature of Google Maps expands upon Google’s steadfast commitment to open community.

I’d welcome your opinions as to how Google should deal with review spam, as I find this a very hard question to answer. It may well be a case of trying to lock the barn door after the horse has bolted, and Google’s wiki mentality applied to real-world businesses is one with which our industry has contended for years.

You see, the trouble with Google’s local product is that it was never opt-in. Whether you list your business or not, it can end up in Google’s local business index, and that means you are open to reviews (positive, negative, and fallacious) on the most visible possible platform, like it or not. As I’m not seeing a way to walk this back, review spam should be Google’s problem to fix, and they are obliged to fix it if:

  • They are committed to their own earnings, based on the trust the public feels in their review corpus
  • They are committed to user experience, implementing necessary technology and human intervention to protect consumers from fake reviews
  • They want to stop treating the very businesses on whom their whole product is structured as unimportant in the scheme of things; companies going out of business due to review spam attacks really shouldn’t be viewed as acceptable collateral damage

Knowing that Alphabet has an estimated operating income of $7 billion for 2018, I believe Google could fund these safeguards:

  1. Take a bold step and resource human review mediators. Make this a new department within the local department. Google sends out lots of emails to businesses now. Let them all include clear contact options for reaching the review mediation department if the business experiences spam reviews. Put the department behind a wizard that walks the business owner through guidelines to determine if a review is truly spam, and if this process signals a “yes,” open a ticket and fix the issue. Don’t depend on volunteers in the GMB forum. Invest money in paid staff to maintain the quality of Google’s own product.
  2. If Google is committed to the review flagging process (which is iffy, at best), offer every business owner clear guidelines for flagging reviews within their own GMB dashboard, and then communicate about what is happening to the flagged reviews.
  3. Improve algorithmic detection of suspicious signals, like profiles with one-off reviews, the sudden influx of negative reviews and text-less ratings, global reviews within a single profile, and companies or profiles with a history of guideline violations. Hold the first few reviews left by any profile in a “sandbox,” à la Yelp.
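The detection signals in point 3 can be pictured as a simple additive scoring heuristic. The sketch below is purely hypothetical: the fields, thresholds, and weights are invented for illustration and have nothing to do with how Google actually scores reviews.

```python
from dataclasses import dataclass

# Hypothetical reviewer-profile signals, loosely matching the list above:
# one-off review accounts, brand-new profiles, and "global" reviewing patterns.
@dataclass
class ReviewerProfile:
    total_reviews: int          # lifetime reviews left by this profile
    account_age_days: int       # how old the profile is
    countries_reviewed_in: int  # distinct countries of reviewed businesses

def suspicion_score(profile: ReviewerProfile) -> int:
    """Crude additive score: higher means more worth a human moderator's look."""
    score = 0
    if profile.total_reviews == 1:         # review-and-vanish, one-off profile
        score += 2
    if profile.account_age_days < 7:       # freshly created account
        score += 2
    if profile.countries_reviewed_in > 3:  # implausibly global reviewing
        score += 1
    return score

def should_sandbox(profile: ReviewerProfile, threshold: int = 3) -> bool:
    """Yelp-style sandbox: hold the review out of display until it clears review."""
    return suspicion_score(profile) >= threshold
```

A day-old account leaving its first review would be sandboxed under this sketch, while an established local reviewer would pass straight through; the point is that the signals are cheap to compute, so the hard part is the human-mediation capacity behind them, not the detection itself.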

Now it’s your turn! Let’s look at Google’s removal of “A Google User” reviews as a first step in the right direction. If you had Google’s ear, what would you suggest they do next to combat review spam? I’d really like to know.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog


Disavow & Link Removal: Understanding Google

Fear Sells

Few SEOs took notice when Matt Cutts mentioned on TWIG that “breaking their spirits” was essential to stopping spammers. But that single piece of information adds layers of insight around things like:

  • duplicity on user privacy in organic versus AdWords
  • benefit of the doubt for big brands versus absolute apathy toward smaller entities
  • the importance of identity versus total wipeouts of those who are clipped
  • mixed messaging on how to use disavow & the general fear around links

From Growth to No Growth

Some people internalize failure when growth slows or stops. One can’t raise venture capital and keep selling the dream of the growth story unless the blame is internalized. If one understands that another dominant entity (monopoly) is intentionally subverting the market then a feel good belief in the story of unlimited growth flames out.

Most of the growth in the search channel is being absorbed by Google. In RKG’s Q4 report they mentioned that mobile ad clicks were up over 100% for the year & mobile organic clicks were only up 28%.

Investing in Fear

There’s a saying in investing that “genius is declining interest rates” but when the rates reverse the cost of that additional leverage surfaces. Risks from years ago that didn’t really matter suddenly do.

The same is true with SEO. A buddy of mine mentioned getting a bad link example from Google where the link was in place longer than Google has been in existence. Risk can arbitrarily be added after the fact to any SEO activity. Over time Google can keep shifting the norms of what is acceptable. So long as they are fighting off WordPress hackers and other major issues they are kept busy, but when they catch up on that stuff they can then focus on efforts to shift white to gray and gray to black – forcing people to abandon techniques which offered a predictable positive ROI.

Defunding SEO is an essential & virtuous goal.

Hiding data (and then giving crumbs of it back to profile webmasters) is one way of doing it, but adding layers of risk is another. What panda did to content was add a latent risk to content where the cost of that risk in many cases vastly exceeded the cost of the content itself. What penguin did to links was the same thing: make the latent risk much larger than the upfront cost.

As Google dials up their weighting on domain authority many smaller sites which competed on legacy relevancy metrics like anchor text slide down the result set. When they fall down the result set, many of those site owners think they were penalized (even if their slide was primarily driven by a reweighting of factors rather than an actual penalty). Since there is such rampant fearmongering on links, they start there. Nearly every widely used form of link building has been promoted by Google engineers as being spam.

  • Paid links? Spam.
  • Reciprocal links? Spam.
  • Blog comments? Spam.
  • Forum profile links? Spam.
  • Integrated newspaper ads? Spam.
  • Article databases? Spam.
  • Designed by credit links? Spam.
  • Press releases? Spam.
  • Web 2.0 profile & social links? Spam.
  • Web directories? Spam.
  • Widgets? Spam.
  • Infographics? Spam.
  • Guest posts? Spam.

It doesn’t make things any easier when Google sends out examples of spam links which are sites the webmaster has already disavowed or sites which Google explicitly recommended in their webmaster guidelines, like DMOZ.

It is quite the contradiction where Google suggests we should be aggressive marketers everywhere EXCEPT for SEO & basically any form of link building is far too risky.

It’s a strange world where when it comes to social media, Google is all promote promote promote. Or even in paid search, buy ads, buy ads, buy ads. But when it comes to organic listings, it’s just sit back and hope it works, and really don’t actively go out and build links, even though those are so important. – Danny Sullivan

Google is in no way a passive observer of the web. Rather they actively seek to distribute fear and propaganda in order to take advantage of the experiment effect.

They can find and discredit the obvious, but most on their “spam list” done “well” are ones they can’t detect. So, it’s easier to have webmasters provide you a list (disavows), scare the ones that aren’t crap sites providing the links into submission and damn those building the links as “examples” – dragging them into town square for a public hanging to serve as a warning to anyone who dare disobey the dictatorship. – Sugarrae

This propaganda is so effective that email spammers promoting “SEO solutions” are now shifting their pitches from “grow your business with SEO” to “recover your lost traffic.”

Where Do Profits Come From?

I saw Rand tweet this out a few days ago…

… and thought “wow, that couldn’t possibly be any less correct.”

When ecosystems are stable you can create processes which are profitable & pay for themselves over the longer term.

I very frequently get the question: ‘what’s going to change in the next 10 years?’ And that is a very interesting question; it’s a very common one. I almost never get the question: ‘what’s not going to change in the next 10 years?’ And I submit to you that that second question is actually the more important of the two – because you can build a business strategy around the things that are stable in time….in our retail business, we know that customers want low prices and I know that’s going to be true 10 years from now. They want fast delivery, they want vast selection. It’s impossible to imagine a future 10 years from now where a customer comes up and says, ‘Jeff I love Amazon, I just wish the prices were a little higher [or] I love Amazon, I just wish you’d deliver a little more slowly.’ Impossible. And so the effort we put into those things, spinning those things up, we know the energy we put into it today will still be paying off dividends for our customers 10 years from now. When you have something that you know is true, even over the long-term, you can afford to put a lot of energy into it. – Jeff Bezos at re: Invent, November, 2012

When ecosystems are unstable, anything approaching boilerplate has an outsized risk added by the dominant market participant. The quicker your strategy can be done at scale or in the third world, the quicker Google shifts it from a positive to a negative ranking signal. It becomes much harder to train entry level employees on the basics when some of the starter work they did in years past now causes penalties. It becomes much harder to manage client relationships when their traffic spikes up and down, especially if Google sends out rounds of warnings they later semi-retract.

What’s more, anything that is vastly beyond boilerplate tends to require a deeper integration and a higher level of investment – making it take longer to pay back. But the budgets for such engagement dry up when the ecosystem itself is less stable. Imagine the sales pitch, “I realize we are off 35% this year, but if we increase the budget 500% we should be in a good spot a half-decade from now.”

All great consultants aim to do more than the bare minimum in order to give their clients a sustainable competitive advantage, but by removing things which are scalable and low risk Google basically prices out the bottom 90% to 95% of the market. Small businesses which hire an SEO are almost guaranteed to get screwed because Google has made delivering said services unprofitable, particularly on a risk-adjusted basis.

Being an entrepreneur is hard. Today Google & Amazon are giants, but it wasn’t always that way. Add enough risk and those streams of investment in innovation disappear. Tomorrow’s Amazon or Google of other markets may die a premature death. You can’t see what isn’t there until you look back from the future – just like the answering machine AT&T held back from public view for decades.

Meanwhile, the Google Venture backed companies keep on keeping on – they are protected.

When ad agencies complain about the talent gap, what they are really complaining about is paying people what they are worth. But as the barrier to entry in search increases, independent players die, leaving more SEOs to chase fewer corporate jobs at lower wages. Even companies servicing fortune 500s are struggling.

On an individual basis, creating value and being fairly compensated for the value you create are not the same thing. Look no further than companies like Google & Apple which engage in flagrantly illegal anti-employee cartel agreements. These companies “partnered” with their direct competitors to screw their own employees. Even if you are on a winning team it does not mean that you will be a winner after you back out higher living costs and such illegal employer agreements.

This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead?” – William K Black

Meanwhile, complaints about the above sorts of inequality or other forms of asset stripping are pitched as being aligned with Nazi Germany’s treatment of Jews. Obviously we need more H-1B visas to further drive down wages even as graduates are underemployed with a mountain of debt.

A Disavow For Any (& Every) Problem

Removing links is perhaps the single biggest growth area in SEO.

Just this week I got an unsolicited email from an SEO listing directory

We feel you may qualify for a Top position among our soon to be launched Link Cleaning Services Category and we would like to learn more about Search Marketing Info. Due to the demand for link cleaning services we’re poised to launch the link cleaning category. I took a few minutes to review your profile and felt you may qualify. Do you have time to talk this Monday or Tuesday?

Most of the people I interact with tend to skew toward the more experienced end of the market. Some of the folks who join our site do so after their traffic falls off. In some cases the issues look intimately tied to Panda & the sites with hundreds of thousands of pages maybe only have a couple dozen inbound links. In spite of having few inbound links & us telling people the problem looks to be clearly aligned with Panda, some people presume that the issue is links & they still need to do a disavow file.

Why do they make that presumption? It’s the fear message Google has been selling nonstop for years.

Punishing people is much different, and dramatic, from not rewarding. And it feeds into the increasing fear that people might get punished for anything. – Danny Sullivan

What happens when Google hands out free all-you-can-eat gummy bear laxatives to children at the public swimming pool? A tragedy of the commons.

Rather than questioning or countering the fear stuff, the role of the SEO industry has largely been to act as lap dogs, syndicating & amplifying the fear.

  • link tool vendors want to sell proprietary clean up data
  • SEO consultants want to tell you that they are the best and if you work with someone else there is a high risk hidden in the low price
  • marketers who crap on SEO to promote other relabeled terms want to sell you on the new term and paint the picture that SEO is a self-limiting label & a backward looking view of marketing
  • paid search consultants want to enhance the perception that SEO is unreliable and not worthy of your attention or investment

Even entities with a 9 figure valuation (and thus plenty of resources to invest in a competent consultant) may be incorrectly attributing SEO performance problems to links.

A friend recently sent me a link removal request from Buy Domains referring to a post which linked to them.

On the face of this, it’s pretty absurd, no? A company which does nothing but trade in names themselves asks that their name reference be removed from a fairly credible webpage recommending them.

The big problem for Buy Domains is not backlinks. They may have had an issue with some of the backlinks from PPC park pages in the past, but now those run through a redirect and are nofollowed.

Their big issue is that they have less than great engagement metrics (as do most marketplace sites other than eBay & Amazon which are not tied to physical stores). That typically won’t work if the entity has limited brand awareness coupled with having nearly 5 million pages in Google’s index.

They not only have pages for each individual domain name, but they link to their internal search results from their blog posts & those search pages are indexed. Here’s part of a recent blog post

And here are examples of the thin listing sorts of pages which Panda was designed in part to whack. These pages were among the millions indexed in Google.

A marketplace with millions of pages that doesn’t have broad consumer awareness is likely to get nailed by Panda. And the websites linking to it are likely to end up in disavow files, not because they did anything wrong but because Google is excellent at nurturing fear.

What a Manual Penalty Looks Like

Expedia saw a 25% decline in search visibility due to an unnatural links penalty, causing their stock to fall 6.4%. Both Google & Expedia declined to comment. It appears that the eventual Expedia undoing stemmed from Hacker News feedback & coverage about an outing story on an SEO blog that certainly sounded like it stemmed from an extortion attempt. USA Today asked if the Expedia campaign was a negative SEO attack.

While Expedia’s stock drop was anything but trivial, they will likely recover within a week to a month.

Smaller players can wait and wait and wait and wait … and wait.

Manual penalties are no joke, especially if you are a small entity with no political influence. The impact of them can be absolutely devastating. Such penalties are widespread too.

In Google’s busting bad advertising practices post they highlighted having zero tolerance, banning more than 270,000 advertisers, removing more than 250,000 publishers accounts, and disapproving more than 3,000,000 applications to join their ad network. All that was in 2013 & Susan Wojcicki mentioned Google having 2,000,000 sites in their display ad network. That would mean that something like 12% of their business partners were churned last year alone.

If Google’s churn is that aggressive on their own partners (where Google has an economic incentive for the relationship) imagine how much broader the churn is among the broader web. In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & they get about 5,000 reconsideration request messages each week, so over 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time.

The Disavow Threat

Originally when disavow was launched it was pitched as something to be used with extreme caution:

This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.

Recently Matt Cutts has encouraged broader usage. He has one video which discusses proactively disavowing bad links as they come in & another where he mentioned how a large company disavowed 100% of the backlinks that came in for a year.

The idea of proactively monitoring your backlink profile is quickly becoming mainstream – yet another recurring fixed cost center in SEO with no upside to the client (unless you can convince the client SEO is unstable and they should be afraid – which would ultimately retard their long-term investment in SEO).

Given the harshness of manual actions & algorithms like Penguin, they drive companies to desperation, acting irrationally based on fear.

People are investing to undo past investments. It’s sort of like riding a stock down 60%, locking in the losses by selling it, and then using the remaining 40% of the money to buy put options or short sell the very same stock. :D

Some companies are so desperate to get links removed that they “subscribe” sites that linked to them organically with spam email messages asking the links be removed.

Some go so far that they not only email you on and on, but they created dedicated pages on their site claiming that the email was real.

What’s so risky about the above is that many webmasters will remove links sight unseen, even from an anonymous Gmail account. Mix in the above sort of “this message is real” stuff and how easy would it be for a competitor to target all your quality backlinks with a “please remove my links” message? Further, how easy would it be for a competitor aware of such a campaign to drop a few hundred dollars on Fiverr or XRumer or other similar link sources, building up your spam links while removing your quality links?

A lot of the “remove my link” messages are based around lying to the people who are linking & telling them that the outbound link is harming them as well: “As these links are harmful to both yours and our business after penguin2.0 update, we would greatly appreciate it if you would delete these backlinks from your website.”

Here’s the problem though. Even if you spend your resources and remove the links, people will still likely add your site to their disavow file. I saw a YouTube video recording of an SEO conference where 4 well known SEO consultants mentioned that even if they remove the links “go ahead and disavow anyhow,” so there is absolutely no upside for publishers in removing links.

How Aggregate Disavow Data Could Be Used

Recovery is by no means guaranteed. In fact, of the people who go to the trouble to remove many links & create a disavow file, only 15% claim to have seen any benefit.

The other 85% who weren’t sure of any benefit may not only have wasted their time, but may have moved some of their other projects closer toward being penalized.

Let’s look at the process:

  • For the disavow to work you also have to have some links removed.

    • Some of the links that are removed may not have been the ones that hurt you in Google, thus removing them could further lower your rank.
    • Some of the links you have removed may be the ones that hurt you in Google, while also being ones that helped you in Bing.
    • The Bing & Yahoo! Search traffic hit comes immediately, whereas the Google recovery only comes later (if at all).
  • Many forms of profit (from client services or running a network of sites) come from systematization. If you view everything that is systematized or scalable as spam, then you are not only disavowing to try to recover your penalized site, but you are sending co-citation disavow data to Google, which could have them torch other sites connected to those same sources.
    • If you run a network of sites & use the same sources across your network and/or cross link around your network, you may be torching your own network.
    • If you primarily do client services & disavow the same links you previously built for past clients, what happens to the reputation of your firm when dozens or hundreds of past clients get penalized? What happens if a discussion forum thread on Google Groups or elsewhere starts up where your company gets named & then a tsunami of pile on stuff fills out in the thread? Might that be brand destroying?

The disavow and review process is not about recovery, but is about collecting data and distributing pain in a game of one-way transparency. Matt has warned that people shouldn’t lie to Google…

…however Google routinely offers useless non-information in their responses.

Some Google webmaster messages leave a bit to be desired.

Recovery is uncommon. Your first response from Google might take a month or more. If you spend a week or two on cleanup and the response then takes a month, the penalty has already lasted at least six weeks. And that first response might be something like this:

Reconsideration request for site.com: Site violates Google’s quality guidelines

We received a reconsideration request from a site owner for site.com/.

We’ve reviewed your site and we believe that site.com/ still violates our quality guidelines. In order to preserve the quality of our search engine, pages from site.com/ may not appear or may not rank as highly in Google’s search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines.

For more specific information about the status of your site, visit the Manual Actions page in Webmaster Tools. From there, you may request reconsideration of your site again when you believe your site no longer violates the quality guidelines.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum.

Absolutely useless.

Zero useful information whatsoever.

As people are unsuccessful in the recovery process, they cut deeper and deeper. Some people have removed over 90% of their link profile without recovering & been nearly half a year into the (12-step) “recovery” process before even getting a single example of a bad link from Google. In some cases the bad links Google identified were obviously created by third-party scraper sites & were not in Google’s original sample of links to look at (so even if you looked at every single link they showed you & cleaned up 100% of the issues, you would still be screwed).

Another issue with aggregate disavow data is that there is a lot of ignorance in the SEO industry in general, and people who try to do things cheaply (essentially free) at scale have an outsized footprint in the aggregate data. For instance, our site’s profile links are nofollowed & our profiles are not indexed by Google. In spite of this, examples like the one below are associated with not one but three separate profiles for a single site.

Our site only has about 20,000 to 25,000 unique linking domains. However over the years we have had well over a million registered user profiles. If only 2% of the registered user profiles were ignorant spammers who spammed our profile pages and then later added our site to a disavow file, we would have more people voting *against* our site than we have voting for it. And that wouldn’t be because we did anything wrong, but rather because Google is fostering an environment of mixed messaging, fear & widespread ignorance.
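The back-of-the-envelope numbers in that scenario can be checked directly (all figures are taken from the paragraph above):

```python
registered_profiles = 1_000_000   # "well over a million registered user profiles"
spammer_share = 0.02              # "if only 2% ... were ignorant spammers"

# People who might later list the site in a disavow file
disavowing_spammers = int(registered_profiles * spammer_share)

# Unique linking domains "voting for" the site (low end of 20,000-25,000)
linking_domains_low = 20_000

print(disavowing_spammers)        # 20000 potential negative "votes"
assert disavowing_spammers >= linking_domains_low
```

So even a tiny fraction of ignorant profile spammers is enough to outweigh the site's entire legitimate link profile in the aggregate data.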

And if we are ever penalized, the hundreds of scraper sites built off scraping our RSS feed would make the recovery process absolutely brutal.

Another factor, with Google saying “you haven’t cut out enough bone marrow yet” while suggesting that virtually any/every type of link is spam, is that there are going to be many other forms of false positives in the aggregate data.

I know some companies specializing in link recovery which base aspects of their disavows in part on a site’s ranking footprint. If you get a manual penalty, a Panda penalty, or your site gets hacked, then those firms (working for the sites you link to) may re-confirm that your site deserves to be penalized, on a nearly automated basis with little to no thought, based on the fact that it is already penalized. Good luck recovering from that once Google folds aggregate disavow data into justifying further penalties.


All large ecosystems are gamed. We see it with app ratings & reviews, stealth video marketing, advertising, malware installs, and of course paid links.

Historically in search there has been the view that you are responsible for what you have done, but not the actions of others. The alternate roadmap would lead to this sort of insanity:

Our system has noticed that in the last week you received 240 spam emails. As a result, your email account was temporarily suspended. Please contact the spammers, and once you have proof they have unsubscribed you from their spam databases, we will reconsider reopening your email account.

As Google has closed down their own ecosystem, they allow their own $0 editorial content to rank front & center even if it is pure spam, but third parties are now held to a higher standard: you could be held liable for the actions of others.

At the extreme, one of Google’s self-promotional automated email spam messages sent a guy to jail. In spite of such issues, Google remains unfazed, adding a setting which allows anyone on Google+ to email other members.

Ask Google if they should be held liable for the actions of third parties and they will tell you to go to hell. Their approach to copyright remains fuzzy, they keep hosting more third party content on their own sites, and even when that content has been deemed illegal they scream that it undermines their first amendment rights if they are made to proactively filter:

Finally, they claimed they were defending free speech. But it’s the courts which said the pictures were illegal and should not be shown, so the issue is the rule of law, not freedom of speech.

the non-technical management, particularly in the legal department, seems to be irrational to the point of becoming adolescent. It’s almost as if they refuse to do something entirely sensible, and which would save them and others time and trouble, for no better reason than that someone asked them to.

Monopolies with nearly unlimited resources shall be held liable for nothing.

Individuals with limited resources shall be liable for the behavior of third parties.

Google Duplicity (beta).

Torching a Competitor

As people have become more acclimated toward link penalties, a variety of tools have been created to help make sorting through the bad ones easier.

“There have been a few tools coming out on the market since the first Penguin – but I have to say that LinkRisk wins right now for me on ease of use and intuitive accuracy. They can cut the time it takes to analyse and root out your bad links from days to minutes…” – Dixon Jones

But as more tools have been created for sorting out bad links & more tools created to automate sending link removal emails, two things have happened:

  • Google is demanding more links be removed to allow for recovery

  • people are becoming less responsive to link removal requests as they get bombarded with them
    • Some of these tools keep bombarding people over and over again weekly until the link is removed or the emails go to the spam bin
    • to many people the link removal emails are the new link request emails ;)
    • one highly trusted publisher who participates in our forums stated they filtered the word “disavow” to automatically go to their trash bin
    • on WebmasterWorld a member decided it was easier to delete their site than deal with the deluge of link removal spam emails

The problem with Google rewarding negative signals is that there are false positives, and it is far cheaper to kill a business than it is to build one. The technically savvy teenager who created the original version of the software used in the Target PoS attack sold the code for only $2,000.

There have been some idiotic articles, like this one on The Awl, suggesting that comment spamming is now dead as spammers run for the hills, but that couldn’t be further from the truth. Some (not particularly popular) blogs are getting hundreds to thousands of spam comments daily & WordPress can have trouble even backing up the database (unless the comment spam is regularly deleted), as the database can quickly reach a million records.

The spam continues but the targets change. A lot of these comments are now pointed at YouTube videos rather than ordinary websites.

As Google keeps leaning into negative signals, one can expect a greater share of spam links to be created for negative SEO purposes.

Maybe this maternity jeans comment spam is tied to the site owner, but if they didn’t do it, how do they prove it?

Once again, I’ll reiterate Bill Black

“This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead?” – William K Black

The cost of “an academic test” can be as low as $5. You know you might be in trouble when you see fiverr.com/conversations/theirusername in your referrers:

Our site was hit with negative SEO. We have manually collected about 24,000 bad links for our disavow file (so far). It probably cost the perp $5 on Fiverr to point these links at our site. Do you want to know how bad that sucks? I’ll tell you. A LOT!! Google should be sued en masse by web masters for wasting our time with this “bad link” nonsense. For a company with so many Ph.D’s on staff, I can’t believe how utterly stupid they are

Or, worse yet, you might see SAPE in your referrers

And if the attempt to get you torched fails, they can try & try again. The cost of failure is essentially zero. They can keep pouring on the fuel until the fire erupts.

Even Matt Cutts complains about website hacking, but that doesn’t mean you are free of risk if someone else links to your site from hacked blogs. I’ve been forwarded unnatural link messages from Google which came about after person’s site was added in on a SAPE hack by a third party in an attempt to conceal who the beneficial target was. When in doubt, Google may choose to blame all parties in a scorched Earth strategy.

If you get one of those manual penalties, you’re screwed.

Even if you are not responsible for such links, and even if you respond on the same day, and even if Google believes you, you are still likely penalized for AT LEAST a month. More likely, Google will presume you are a liar and you spend at least a second month in the penalty box. To recover, you might have to waste days (weeks?) of your life & remove some of your organic links to show that you have gone through sufficient pain to appease the abusive market monopoly.

As bad as the above is, it is just the tip of the iceberg.

  • People can redirect torched websites.
  • People can link to you from spam link networks which rotate links across sites, so you can’t possibly remove or even disavow all the link sources.
  • People can order you a subscription of those rotating spam links from hacked sites, where new spam links appear daily. Google mentioned discovering 9,500 malicious sites daily & surely the number has only increased from there.
  • People can tie any/all of the above with cloaking links or rel=canonical messages to GoogleBot & then potentially chain that through further redirects cloaked to GoogleBot.
  • And on and on … the possibilities are endless.


Another thing this link removal fiasco subsidizes is various layers of extortion.

Not only are there the harassing emails threatening to add sites to disavow lists if they don’t remove the links, but some companies quickly escalate things from there. I’ve seen hosting abuse, lawyer threat letters, and one friend was actually sued in court (and the people who sued him had actually placed the link themselves!)

Google created a URL removal tool which allows webmasters to remove pages from third party websites. How long until that is coupled with DDoS attacks? Once effective with removing one page, a competitor might decide to remove another.

Another approach to get links removed is to offer payment. But payment itself might encourage the creation of further spammy links as link networks look to replace their old cashflow with new sources.

The recent Expedia fiasco started as an extortion attempt: if the target didn’t pay to keep the post unpublished, the author would “sell the post to the highest bidder.”

Another nasty issue here is articles like this one on Link Research Tools, where they not only highlight client lists of particular firms, but then state which URLs have not yet been penalized followed by “most likely not yet visible.” So long as that sort of “publishing” is acceptable in the SEO industry, you can bet that some people will hire the SEOs nearly guaranteeing a penalty to work on their competitor’s sites, while having an employee write a “case study” for Link Research Tools. Is this the sort of bullshit we really want to promote?

Some folks are now engaging in overt extortion:

I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn’t pay him £10 per month to NOT do this.

Branding / Rebranding / Starting Over

Sites that are overly literal in branding likely have no chance at redemption. That triple hyphenated domain name in a market that is seen as spammy has zero chance of recovery.

Even being a generic unbranded site in a YMYL category can make you be seen as spam. The remote rater documents stated that the following site was spam…

… even though the spammiest thing on it was the stuff advertised in the AdSense ads:

For many (most?) people who receive a manual link penalty or are hit by Penguin it is going to be cheaper to start over than to clean up.

At the very minimum it can make sense to lay groundwork for a new project immediately just in case the old site can’t recover or takes nearly a year to recover. However, even if you figure out the technical bits, as soon as you have any level of success (or as soon as you connect your projects together in any way) you once again become a target.

And you can’t really invest in higher level branding functions unless you think the site is going to be around for many years to earn off the sunk cost.

Succeeding at SEO is not only about building rank while managing cashflow and staying unpenalized, but it is also about participating in markets where you are not marginalized due to Google inserting their own vertical search properties.

Even companies which are large and well funded may not succeed with a rebrand if Google comes after their vertical from the top down.

Hope & Despair

If you are a large partner affiliated with Google, hope is on your side & you can monetize the link graph: “By ensuring that our clients are pointing their links to maximize their revenue, we’re not only helping them earn more money, but we’re also stimulating the link economy.”

You have every reason to be Excited, as old projects like Excite or Merchant Circle can be relaunched again and again.

Even smaller players with the right employer or investor connections are exempt from these arbitrary risks.

You can even be an SEO and start a vertical directory knowing you will do well if you can get that Google Ventures investment, even as other similar vertical directories were torched by Panda.

For most other players in that same ecosystem, the above tailwind is a headwind. Don’t expect much 1 on 1 help in webmaster tools.

In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & gets about 5,000 reconsideration request messages each week, so roughly 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time. How many confirmed Penguin 1.0 recoveries are you aware of?
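Treating a month as 52/12 weeks, those figures can be sanity-checked in a couple of lines:

```python
# Figures quoted from Matt Cutts' video (per the paragraph above)
manual_actions_per_month = 400_000
reconsideration_requests_per_week = 5_000

requests_per_month = reconsideration_requests_per_week * 52 / 12  # ~21,667
reply_rate = requests_per_month / manual_actions_per_month        # ~5.4%

print(f"{1 - reply_rate:.1%} of notified sites never file a request")  # 94.6%
```

Close enough to the roughly-95% figure, and that is before counting the replies that go nowhere.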

Even if a recovery is deserved, it does not mean one will happen, as errors do happen. And on the off chance recovery happens, recovery does not mean a full restoration of rankings.

There are many things we can learn from Google’s messages, but probably the most important is this:

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way – in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only. – Charles Dickens, A Tale of Two Cities


SEO Book

Posted in IM News | Comments Off

Google’s URL/Content Removal Tool Now A Wizard

Google has updated their URL removal tool to make it easier and smarter to remove content specifically from third-party web sites…

Search Engine Roundtable

Posted in IM News | Comments Off

Ultimate Guide to Google Penalty Removal

Posted by PinpointDesigns


A few months back, I wrote an article on Moz all about a penalty our web agency received for unnatural links pointing to our website. At first, this was a bit of a shock to the system, but since then, we’ve learned so much about Google’s webmaster guidelines and we’ve helped lots of companies get their businesses back on track and remove manual penalties associated with their websites.

What did we get hit for?

Cutting a long story short, the main reason we were hit with a manual penalty was for adding followed anchor text to the bottom of clients’ websites that said ‘Web Design Yorkshire by Pinpoint Designs’ (both ‘web design yorkshire’ and ‘Pinpoint Designs’ linked to our website). At the time, we were just doing this out of habit, but we never varied the anchor text, always used followed links, and were basically violating Google’s quality guidelines.

After a lot of work and research, we managed to get the penalty removed from our website, and since then we’ve worked on lots of other clients’ websites to help them lift penalties. We’ve worked with clients who’ve had unnatural link penalties as well as on-site penalties for low-quality content, cloaking issues, and malware. We have a great success rate and are consistently trying to improve our processes to become even better!

What are we doing to improve?

Over the past few months, we’ve been trying out different tools to aid link recovery, speeding up our process of getting in touch with webmasters, finding new ways to contact webmasters and ultimately just trying to streamline the process of getting spammy links removed. In this guide, I’m going to review a few of the different link removal tools we’ve tried out and also give you some ideas of how you can carry out link removal work manually to get your website penalty removed.

This guide is mainly for people who’ve specifically received a penalty for unnatural links warnings. To find out if you’ve received a penalty for unnatural links, simply log in to Google Webmaster Tools and use Google’s new ‘Manual actions‘ tab to see what type of penalty you have. If it’s for ‘Unnatural links‘ with a yellow warning symbol next to it, then you’re on the right guide!

Unnatural Links to your site

My aim over the next few months is to try to write guides on different types of penalties in order to help people out further. We’re also working on a tool ourselves called Peel, which we’re building from the ground up in order to try and deliver exceptional analysis of links. You can sign up to our newsletter for tips and information about the launch of the product using the link above.

Let’s get started!

Step 1: collecting backlink data

First of all, we need to pull a list of all the links pointing to your website from a few different sources. A Google employee on the Webmaster Central Forums has recently stated that they recommend you focus on the Webmaster Tools links.

Webmaster Central Forums Response

Whether you choose to believe that, though, is another question; there’s an interesting discussion on this available at the Webmaster Central Forums:


Matt Cutts has also responded recently saying the following:

“It’s certainly the case that we endeavour to show examples from the set of links returned by Webmaster Tools, and likewise we prefer to assess reconsideration requests on that basis of those links. However, if there’s a really good example link that illustrates a problem, we do leave enough room to share that link, especially because it can help point the webmaster in a better direction to diagnose and fix issues.”

Whilst it may be that Google just want you to remove the majority of bad links and not necessarily every single one of them, I would personally recommend doing the job properly and starting by collecting as much data as possible from various different link sources:

  • Google Webmaster Tools – Login to Google Webmaster Tools and click into the website with the issues, go to ‘Search Traffic’ > ‘Links To Your Site’, then click the ‘more’ link under the ‘Who links the most’ tab. Once in here, click the ‘Download more sample links’ from the top.

    Google Webmaster Tools Export

  • Open Site Explorer – Visit http://www.opensiteexplorer.org and type in your domain name (make sure you get this absolutely correct, as a slight variation of your domain, such as missing the www., can return a different set of results). Once loaded, click the ‘Advanced Reports’ tab, select ‘Links that come from: External linking page’ and leave all the other settings as they are, then click export.
  • Ahrefs – Visit http://www.ahrefs.com and enter in your domain. Click the ‘CSV’ tab and export the list of ‘Backlinks/Ref pages’. Ahrefs is an absolutely brilliant tool that’s helped us so many times with link removal campaigns. It allows you to quickly narrow down sitewide links, followed/nofollowed links, and more.
  • Majestic SEO – Visit http://www.majesticseo.com, enter your domain name and select ‘Historic Index’. This will show all your previous links instead of just the most recent. Click on the backlinks tab, scroll to the bottom of the page and click ‘download data’.
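Once you have the exports, one way to combine them is a small script that pools each tool's URL column and de-duplicates by linking domain. The column names below are assumptions, since each tool labels its export differently; check the actual headers in your files:

```python
import csv
import io
from urllib.parse import urlparse

def linking_domains(csv_text, url_column):
    """Extract the set of linking domains from one tool's CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    domains = set()
    for row in reader:
        netloc = urlparse(row[url_column]).netloc.lower()
        domains.add(netloc.removeprefix("www."))  # treat www/non-www as one
    return domains

# Hypothetical exports with overlapping data (column names are assumptions):
ose = "URL,Anchor Text\nhttp://www.blog-a.com/post,web design\n"
ahrefs = "Referring Page URL,Link\nhttp://blog-a.com/post,x\nhttp://dir-b.net/links,y\n"

merged = linking_domains(ose, "URL") | linking_domains(ahrefs, "Referring Page URL")
print(sorted(merged))  # ['blog-a.com', 'dir-b.net']
```

Note how the same page exported by two tools (with and without www.) collapses to a single domain, which is exactly what you want when contacting webmasters.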

Most of the above sites will require you to have subscriptions in order to gather the data. Sites such as Majestic SEO will allow you to use the tool free of charge (or a limited amount) if you have the domain verified in your Webmaster Tools account. That being said, for the sake of one or two months’ membership, it’s worth paying for the wealth of data you’ll get.

Note: Google have previously recommended adding both the www. and non-www. versions of the domain to Webmaster Tools and gathering links from both sources. It’s worth searching different variations of your domain to get as much data as possible.

Step 2: documenting your work

Start by creating yourself a Google Docs Spreadsheet where you can track all of your work to see exactly where you are at with the link removal process. This will be extremely useful when you come to submitting a reconsideration request with Google, as they’ll be able to see exactly what you’ve done to sort out the issues with your site.

I should also point out that Google have said they tend not to trust documents hosted on sources they don’t know. For that reason, I recommend using a Google Spreadsheet to document your work, as it’s a Google product and trusted by them. You can also include a link to the document in your reconsideration request very easily, and it’s free!

We would usually start by creating tabs at the bottom of the spreadsheet for each of our data sources, for example Google Webmaster Tools, Ahrefs, Majestic SEO & Open Site Explorer. You don’t have to separate your data into different sheets, but I personally think it shows more work being carried out and allows more in-depth analysis.

The only disadvantage to doing this is that you’ll have lots of repeat links – we do however have this covered!

If you visit the Pinpoint Designs Blog, we’ve created a Google docs spreadsheet that you can use which helps to combine multiple sheets together and remove duplicates. This is a brilliant asset to use on your link removal campaign.

Google Docs only allows importing 50,000 characters at a time. This may sound like a lot, but it’s surprising how quickly you hit the limit. You can import data from a CSV by going to your Google Docs file, then clicking ‘File > Import’. Do this for each document you export from OSE, Ahrefs, Majestic, and Google Webmaster Tools, importing them into separate sheets within the same document. There’s still a limit this way, but it’s higher than 50,000 characters and will speed up the process.
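If an export is still too large for a single import, one workaround is splitting it into chunks that stay under the character budget. A rough sketch (the 50,000 figure mirrors the paste limit above; the splitting logic is my own):

```python
def split_for_import(rows, limit=50_000):
    """Group CSV lines into chunks whose total size stays under `limit` chars."""
    chunks, current, size = [], [], 0
    for row in rows:
        if size + len(row) + 1 > limit and current:
            chunks.append(current)
            current, size = [], 0
        current.append(row)
        size += len(row) + 1  # +1 for the newline
    if current:
        chunks.append(current)
    return chunks

# Hypothetical export of 4,000 backlink rows
lines = [f"http://example-{i}.com/page,anchor" for i in range(4000)]
chunks = split_for_import(lines)
print(len(chunks))  # each chunk now pastes safely under the limit
```

Each chunk can then be pasted or imported separately and recombined inside the spreadsheet.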

IMPORTANT: Once you’ve got all your data into the spreadsheet, make sure you add the following columns to the end of each of the sheets (or alternatively your ‘Master Sheet’ if you’re combining all the data into one sheet):

  • Contact Name
  • Contact Email Address
  • Website Contact Form
  • Twitter URL
  • Facebook URL
  • Google+ URL
  • LinkedIn URL
  • Date of 1st Contact
  • Date of 2nd Contact
  • Date of 3rd Contact
  • Link Status
  • Notes
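If you prefer to build the sheet programmatically and import it, the column list above translates into a few lines; the leading ‘Linking URL’ column, the file name, and the example row are my additions:

```python
import csv

# Tracking columns from the list above, plus a column for the link itself
COLUMNS = [
    "Linking URL", "Contact Name", "Contact Email Address",
    "Website Contact Form", "Twitter URL", "Facebook URL", "Google+ URL",
    "LinkedIn URL", "Date of 1st Contact", "Date of 2nd Contact",
    "Date of 3rd Contact", "Link Status", "Notes",
]

with open("link_removal_tracker.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    # Illustrative row; unfilled columns are simply left blank
    writer.writerow({"Linking URL": "http://dir.example.com/page",
                     "Link Status": "1st email sent"})
```

Importing the resulting CSV into Google Docs gives you the same tracking sheet ready to populate.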

This may seem like a lot of data, but it’s the best way you can document your work. That way, when you’re carrying out research on your domains, you’ll be able to start populating the spreadsheet, making your life easier in the long term. You’ll also be able to show Google all the work you’ve been doing to get the links removed, making it very easy for them to see everything you’ve done.

If you don’t want to go to the effort of populating all the data above, you could combine all of the different forms of contact method into one cell and just populate that. The chances are, you’re not going to need Twitter, Facebook, Google+, LinkedIn, Email and Website Contact Form for every single URL so it’s just down to preference. The above recommendation is the best way we’ve found of documenting data.

Finally, add an additional sheet to your Google Docs file called ‘Contact Examples’. In here, you can upload images of a few examples of the emails you’ve sent out to the webmasters you’ve been working with. Be careful what you put in here: Google will be reading these messages, so make sure you’re not badmouthing them.

Don’t threaten webmasters by saying Google will block their site or that they’re going to get harmed if they don’t remove your links. Instead, say that you’re trying to clear things up so that you’re fully compliant with Google’s webmaster guidelines. You can apologise for the inconvenience to the webmaster and thank them for their help (examples further down in this article). That way, when Google reads them, they’ll understand you’re genuinely trying to sort things out and will hopefully be a little more forgiving under the circumstances.

Tip: If you’re on a Mac, you can press ‘Cmd + Shift + 4’. This command allows you to take screenshots quickly and easily of a specific section of your monitor. Perfect for quickly snapping contact forms, emails you’re sending, etc., and uploading them to the ‘Contact Examples’ sheet in your Google Docs file.

Step 3: spotting low-quality links

This is a hugely important part of link removal. It sounds simple, but you have to be extremely careful about which links you attempt to remove. Good links generally take a long time to build, and if you ask for them to be removed thinking they’re potentially spammy, all that hard work may be for nothing.

Over the past year, we’ve learnt that the best way to identify spammy links is to manually review each and every one of them. If a link is good, mark it in your spreadsheet so you know not to remove it. Don’t delete any links from the sheet, as it’s all evidence for Google of the research you’ve been doing. Either highlight the cell in a colour or add a note in one of the columns so you know it’s safe/genuine.

So, how do we spot spammy links?

Some links are easy to identify. For example, if you’ve been featured on the BBC News, Guardian or a high quality, authoritative website, you can be fairly sure it’s a strong link that doesn’t need removing.

The definition of a natural link is fairly hard to summarise. I would class these as links that have appeared naturally from content you’ve written. For example, let’s say you were involved in the automotive industry and wrote an article all about the history of cars and the different manufacturers, going into great detail about every aspect. This type of article is obviously going to end up very big, hopefully interesting, and should be a brilliant read. If you’ve done a good job and shared the article in the right places, you should acquire links naturally. People will link to your guide/post without you having to ask for it, and those types of links are OK.

Throughout this article, I’ll link to other articles I’ve read on the internet that I believe are helpful for link removal. All the people I link to have created good-quality content that I believe will help you in removing your penalties with Google. They haven’t written an article for the sake of it; they’ve written it with an aim in mind, so that it’s beneficial.

On the flip side, it’s easy to spot some spam links. For example, if you’re listed on a website with 10,000 other links on the same page in a list, or if you’ve commented on a blog with xx,xxx other comments just for the backlink. I realise those are extreme examples, but hopefully my point is made. If you know the link has been placed on the site purely for SEO purposes, then it’s most likely unnatural.

Some links however are harder to spot, so here are my top tips for identifying lower quality links:

Whether the URL is indexed in Google or not:

If not, remove the links, as the site has most likely received a penalty. You can see whether a domain is indexed in Google by searching for ‘site:yourdomain.com’.

Advanced Search Operators - Google

Page Authority and Domain Authority

Don’t pay too much attention to this metric. A new domain name can still be a very strong link even though it has low page and domain authority. A poor quality site can also still have good page authority and domain authority, so just be careful when looking at this.

Site-wide Links:

Site-wide links are generally associated with websites you have an affiliation with (friendly links), or links you’ve paid for. Because of this, it’s better to make sure you don’t have links across every page of someone’s site. In this case, either nofollow the links, or remove them and only leave them on one page. Ahrefs does a brilliant job of spotting site-wide links quickly and easily.

That being said, the same rules still apply: if the links look spammy, remove all of them.
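Site-wide links are also easy to surface in a backlink export: group linking pages by domain and flag any domain that links from many of its pages. A minimal sketch (the threshold and example domains are assumptions):

```python
from collections import Counter
from urllib.parse import urlparse

def sitewide_suspects(linking_urls, threshold=10):
    """Flag domains with at least `threshold` distinct linking pages."""
    counts = Counter(urlparse(u).netloc.lower() for u in set(linking_urls))
    return {d: n for d, n in counts.items() if n >= threshold}

urls = [f"http://blog.example.net/page-{i}" for i in range(250)]  # footer link
urls += ["http://news.example.org/one-article"]                   # editorial link
print(sitewide_suspects(urls))  # {'blog.example.net': 250}
```

Anything the script flags still needs a manual look, since a legitimate blogroll or partner link can also appear on every page.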

Link Directories:

Link directories are fairly easy to spot: if they contain words like ‘backlinks’, ‘links’, ‘seo’, etc. in the URL, then the chances are they’re low-quality links that need removing. If a directory lists every category under the sun, from website developers to sunglasses, then it’s most likely one you need to remove yourself from!

There are some good link directories on the internet; generally, these are targeted to a particular niche, are manually reviewed, and are sometimes locally targeted. Ask yourself ‘Will I ever receive any traffic from this site?’ or ‘Is this genuinely a valuable link?’ If the answer is likely to be no, then the link should be removed. Be honest with yourself on this one: if it looks like spam, it most likely is.

If you want some more tips, here are a few bullet points:

Remove a link if:

  • The site is not indexed in Google, this would indicate a penalty.
  • The site automatically accepts links without any manual review.
  • The site has lots of spam links pointing to it (type the URL of the directory into Open Site Explorer and see what you can find!)
  • The site has categories for everything imaginable (cars, holidays, sunglasses, websites, hosting, dresses etc.).
  • The site is obviously part of a network where there are lots of related directories with similar names / similar looking websites etc.
  • The site contains keywords like 'best', 'links', 'seo' etc. in the name.

If the site shows 'powered by phpLD' or 'PHP Link Directory' in the footer, it's most likely a fairly spammy directory. That's not always the case, but nine times out of ten it will be.

During a recent link removal campaign, we managed to get a webmaster to take down a set of 20 link directories that were pointing to our client’s website. They couldn’t be bothered to remove the links from each site individually, so instead they took every site offline so everyone’s links disappeared!

Forum Profiles:

These are usually very spammy. If you have the odd profile link on a forum that you’re very active on, then this is generally fine. However, if you’re listed on a forum in Romania (an example only!) and have never posted before, get the link removed.

If you're a member of the forum purely to get a link back to your website, then the link should be removed. It's a very easy-to-identify spam technique, so stay safe and remove them.

Blog Comments:

Similar to Forum Profiles, blog comments are an easy one to spot. If you’ve commented every now and again on a blog that you generally feel has helped you, or the blog is in your industry and you’ve added value to the discussion with your comment (rather than just posting a boring comment like ‘Good work dude’), then it’s probably ok.

That being said, if you’ve got a large number of blog comments with very little substance, you should remove all the links. If the site has hundreds and hundreds of comments and you’re one of a huge list of spam comments, you should remove the link too.

Social Bookmarks:

Very similar to both blog comments and forum profiles, social bookmarks are ok if they're genuine. Remember that the penalties you have received are manual actions, and when you put in a reconsideration request, the chances are that a Google employee will manually be looking through some of your links. If your social bookmarks look spammy, remove them.

Paid Links:

If you’ve been paying for links, make sure you remove them or add a rel=”nofollow” attribute to the link. When you’re writing your reconsideration request, mention the fact that you have previously purchased links and have now rectified the issue by ensuring they are all nofollow / been removed.

Google is getting much smarter at detecting advertorial links, so don’t try to trick the system.

Blog Posts:

A slightly trickier one to detect straight off. Usually, you can spot spammy blog posts by looking at the URLs. If they're dynamic URLs ending in something like 'index.php?id=99', it's usually a sign of a site that was thrown up very quickly. The best way to identify spam blog posts is to load up each blog. Use these tell-tale signs to spot the low quality posts:

  • Does the article make sense? – Is it in English, do the sentences make sense? Or is it spun, low quality rubbish that benefits nobody?
  • Is the site nicely designed? – Does the website look like a genuine blog that's been looked after, cared for and regularly updated? Or is it using a standard template with awful layouts and content?
  • Is the website content unique? - You can use a website such as http://www.copyscape.com to find duplicate / spun content.
  • Is the article high quality? – Again, this is down to interpretation, but does the article provide any value to your business or is it there for the sake of being there? If it’s high quality, but just linked in the wrong way, ask the webmaster to add a nofollow attribute assigned to it.
  • Is the blog focused? - Does the blog post about shoes one day, software the next and medicine the day after? Also, does every post link out to a different website, run to the same length and use 'money keyword' anchor text? If so, the blog isn't focused and the owners are most likely either automating their posting or building the site purely for SEO purposes. Avoid these sites and remove the links.
  • Is there an easy way to contact the owners? - Lots of lower quality blogs will remove the contact form, or completely hide who is behind them. If you can’t get in touch with the owner, it’s likely to be low quality and should be removed.

If the answer to any of the above questions is no, then have the links removed. Other things you can look at are as follows:

  • Are you using money terms as anchor text in the article? – This is ok as long as it's not overdone and as long as it's genuine. You have to use common sense when it comes to looking at blog posts; if it could be considered a spam article, remove it to stay on the safe side.
  • Are there lots of keywords stuffed into the article? – If your article reads like something that's been written for a search engine, remove it.

If the answers to the above are yes, then remove the articles.

Link Networks:

Link networks are bad news, but many ‘SEO’ agencies still use them. You can use some of the metrics gathered in the reports above to detect link networks. For example:

  • IP Address – If the IP address is the same on multiple different domains, this is a sign that you could be part of a link network. You could run a quick whois search on the various domains to see who the owner is. We’ve carried out link removal campaigns on clients’ websites where over 150 domains are from the same IP address.
  • Whois Searches – By running a whois search on a domain name, you can see who the registered owner of the domain is (in most cases). If you find that a large number of the domains pointing to your site are registered to the same owner, this should send danger signals.
  • Google Analytics IDs – Look at the source code of the sites and search for the Google Analytics code. These follow the format UA-XXXXXXXX-X (where the X's are replaced with numbers). If you find the same UA- code being used on multiple sites, this is a sign of a link network.
  • Site Design – Does the site look exactly the same as other sites pointing to your domain? This can sometimes be the same links in the footer, a ‘sponsored company’ or even just the same look and layout. If so, it’s possibly built by the same person and an example of a network.

Some of the tools listed further down in this article will help you speed up locating link networks by automatically pulling in whois results, IP addresses and analytics details for each domain. You can then use their system, or software such as Excel, to filter through the data and spot offending links straight away.
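As an aside, once you've pulled IP addresses, whois registrants and Analytics IDs into your spreadsheet, the grouping itself is easy to script. Below is a minimal sketch (the domain names and field names are invented for illustration) that flags any attribute value shared by more than one referring domain:

```python
from collections import defaultdict

def find_clusters(domains, key):
    """Group referring domains by a shared attribute (e.g. IP address,
    whois registrant or Google Analytics ID) and return only groups
    with more than one member -- likely link networks."""
    groups = defaultdict(list)
    for name, meta in domains.items():
        value = meta.get(key)
        if value:
            groups[value].append(name)
    return {v: sorted(names) for v, names in groups.items() if len(names) > 1}

# Hypothetical metadata collected during the backlink audit:
backlinks = {
    "blog-one.example":   {"ip": "203.0.113.5",  "ga": "UA-11111111-1"},
    "blog-two.example":   {"ip": "203.0.113.5",  "ga": "UA-11111111-2"},
    "blog-three.example": {"ip": "198.51.100.9", "ga": "UA-11111111-1"},
    "clean-site.example": {"ip": "192.0.2.44",   "ga": "UA-99999999-1"},
}

print(find_clusters(backlinks, "ip"))
# Domains sharing 203.0.113.5 are flagged as a possible network.
```

Running the same function with `key="ga"` (or a `registrant` field, if you've collected one) surfaces the other footprints discussed above.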

Over Optimised Anchor Text:

If you have articles out on the internet which just contain anchor text pointing to keywords you want to appear on Google for, then these need removing. There has been a lot of talk online about the correct ‘Anchor text ratio’ to have for brand vs commercial anchor text terms, but I don’t think you should think of SEO in this way.

Instead, focus on building your brand. If it's appropriate to link to your money keywords, or keywords surrounding the money terms, link to them. If you're trying to build your brand, you'll find that you're linking to your brand name and its URL more regularly, and you'll end up with a much more natural and organic link profile.

Tip: Use Eppie Vojt’s Link Detective tool to see what your current link profile looks like. You’ll be able to see very quickly which keywords you’ve been targeting too regularly.

If your anchor text is only targeting money terms, remove the links. Chances are, it’s probably not the best article / content anyway. If it is, it’s probably worthy of a money term link.
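If you'd rather not eyeball this manually, a rough anchor-text breakdown can be scripted. This is a simplified sketch under the assumption that any anchor containing one of your brand terms counts as a brand anchor; real profiles still need human judgement:

```python
from collections import Counter

def anchor_profile(anchors, brand_terms):
    """Classify each backlink anchor as brand or money/other, so you can
    see at a glance whether the profile looks natural. `brand_terms` is
    whatever identifies your brand or URL -- an assumption of this
    sketch, not a fixed list."""
    counts = Counter()
    for anchor in anchors:
        text = anchor.lower()
        if any(term in text for term in brand_terms):
            counts["brand"] += 1
        else:
            counts["money/other"] += 1
        counts["total"] += 1
    return counts

# Invented example anchors:
anchors = ["Pinpoint Designs", "pinpointdesigns.co.uk",
           "cheap web design leeds", "web design company", "click here"]
profile = anchor_profile(anchors, brand_terms=["pinpoint"])
print(profile)  # a profile heavily skewed away from brand terms is a warning sign
```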

Link Exchanges:

This is a fairly old tactic these days, but many people still do it. Don't create a page on your site called 'links' and swap links with other people's 'links' pages. This creates a huge footprint of spammy link building which should be avoided.

Press Releases:

I still believe press releases are a great way to carry out SEO, but only if they’re done correctly. If your press release is simply put out there and contains 3-5 links back to your site with a money keyword as the anchor text, you’re doing it all wrong.

Press releases should be used to build your brand, shout out to the world about what your company is doing and only used when you have something relevant to tell people. If you’ve just received investment from a company, helped out your local community or had your best ever years profits, then write a really good quality press release that people find engaging and interesting. You can then put in links to your website under its brand name or the website URL.

If you are adding money terms in, I’d recommend adding a nofollow attribute in order to stay on the safe side as you can bet this is where Google will be targeting very soon. If you’ve got press releases out there which are blatantly spam with no real value, have them removed.


Malware:

A simple one. If the site contains malware, have your link removed. Alternatively, if it's a site that you know is hugely reputable, contact the owner and have them fix the problem as soon as possible.

The above section hopefully gives you an idea of how to spot some poor quality links. It’s not a definitive guide, but it should give you an idea of some of the more common sections we come across when carrying out work for clients.

The biggest tip I can give is to use common sense. Deep down, you'll know if a link is worth having or not. Sometimes links are good, but they may need a nofollow attribute adding to them. So just work through the list systematically and put yourself in Google's shoes.

All you need to do is go through each link manually and make a note of whether it needs to be removed or not. If you consider it a safe link, mark it so you know not to remove it. You could do this either with a colour marker, or just by adding a piece of text next to each link. If you've got a lot of colours going on, make sure to add a sheet at the bottom called 'Key' so that the Google employees can see what work you've carried out.
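If your backlink export is large, the bookkeeping can be semi-automated before the manual pass. The sketch below (with a made-up two-row CSV and a deliberately naive spam rule) just adds a status column that you'd then review by hand:

```python
import csv
import io

# Hypothetical export of your backlink list; in practice this would be
# the CSV downloaded from Ahrefs, Open Site Explorer etc.
raw = """url,anchor
http://spammy-directory.example/links.php,cheap seo
http://industry-blog.example/great-post,Pinpoint Designs
"""

def mark_links(csv_text, is_spam):
    """Read the export and tag each row REMOVE or SAFE according to a
    caller-supplied rule; the result is written back to your sheet."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    for row in rows:
        row["status"] = "REMOVE" if is_spam(row) else "SAFE"
    return rows

# A naive spam test purely for illustration -- never rely on one this crude:
marked = mark_links(raw, lambda r: "links" in r["url"] or "seo" in r["anchor"])
for row in marked:
    print(row["url"], row["status"])
```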

Tools for making your life easier…

There are a lot of tools available online that say they’ll help to identify spammy links in bulk. Some of these tools are brilliant for speeding up analysis, but I wouldn’t rely on them 100%. I’ve put some comments below on systems we’ve used in the past and my thoughts on using them.

Websites / Tools available for full link removal:

Link Research Tools / Link Detox

This is probably one of the better tools we’ve used over the past year. The detox tool assigns a risk score to each of your backlinks which it pulls from 22 different link sources. In addition to this, it will pull in lots of statistics about the links and will try to retrieve an email address / contact name for the owner of the offending website. Personally, I’ve found the contact finder to be brilliant and some of the additional metrics (such as IP address of the website and domain registrant) have helped hugely in discovering link networks. They still have a lot of work to do as far as the link scores are concerned. Sometimes, the ‘Very Low Risk’ links are obviously spammy and should be removed, so if you use this tool, you need to be very careful and double-check the links that you’re removing. Overall, a very good system though.

Link Risk

I've never actually used Link Risk, so I can't really comment on the results, but I've heard some good things about it, and the team behind it are certainly very talented and know their stuff.


Linkquidator

A new tool on the block, Linkquidator offers a 30-day free trial with a limited number of backlinks. I have signed up to look at the tool but haven't had a chance to give it a full test drive.


Remove'em

I've used Remove'em on a couple of campaigns we've done for clients as a tool to help us identify bad links. Personally, I found the interface very clunky for outreach to webmasters and the process very slow, so I'd opt to use email instead for speed. As for uncovering bad links, it did a fairly good job, but it didn't show me the safe links, so it's hard to say how many it may have missed.


Rmoov

Rmoov deliberately doesn't try to identify which links are spammy. Instead, it helps to speed up the process of contacting webmasters by pulling contact information for each domain, allowing you to create and send emails and then follow each one up with reminders. It will check the status of your links periodically and record each step of the process in a Google Doc. Personally, I think the idea behind this system is brilliant.

Barry Schwartz from Search Engine Roundtable has done a brilliant write-up on link tools identifying toxic links. It really highlights that manual review is still necessary when using tools such as the above and you should never rely on them 100%.

Our agency Pinpoint Designs is currently working on a link removal system called Peel App, which we're hoping will launch in 2014. Whilst we still stress that manual research is required, we'll be trying to make our system identify spam links in bulk, as we feel there are algorithms we can use to detect this better than some of the tools already on the market. If you're interested in hearing more about our tool, you can sign up for our prelaunch newsletter on the Peel App website.

Step 4: finding webmaster contact details

This is obviously one of the most important parts of your link removal process. First of all, you're going to need to collect contact details for each of the webmasters. Let's look at the data we wanted to collect again:

Contact Name, Email Address, Website Contact Form, Twitter, Facebook, Google+ and LinkedIn.

Some contact details are easy to spot. For example, if we use the example low quality website miriblack.com, we can see that it has a contact form. Simply copy and paste this URL into your spreadsheet and that's one method of outreach sorted. Easy! Others are a little more time consuming.

There are a variety of ways to find the contact name of the owner of a blog. One is to carry out a whois search on the website's domain name. This will sometimes show the registrant of the domain along with other information. It doesn't always work, but sometimes the information can be useful.

Using 'Miriblack' as the example again, look at the whois response below. We searched the domain name, and could only find the registrant's name, 'Matthew Hesser'. From here, we now have a name to search for other contact methods.

Basic Whois Search on Domain

First of all, if we go back to Google, we can type in different search terms to try and retrieve different methods of contact for a ‘Matthew Hesser’. A few examples are below:

Matthew Hesser Miriblack
Matthew Hesser Miri Black

Matthew Hesser LinkedIn
Matthew Hesser Facebook
Matthew Hesser Google+

We can also try to use the domain name to find out if someone is associated with it:

Miriblack.com LinkedIn
Miriblack.com Google+

and so on…

It just so happens that on this particular search, I came across Matthew Hesser as the president of 'Majon International'. For anyone who has done link removal campaigns in the past, Majon have a network of websites and charge $25 for link removal from all of them. In all honesty, I usually just pay it, as it's easier than arguing with the webmaster to remove the link, and they're always very quick.

That being said, you aren’t always going to know that a company is associated with this particular site, so you need to look out for people who are associated with ‘seo companies’, ‘marketing and advertising’ etc., as this can be a big giveaway.

Another thing to keep an eye out for is the section of the above whois result which explains that you can go to GoDaddy for full whois information. By doing this, we can get the following information:

Full Whois Details

So now we have an email address for the contact, their full postal address, their name and so on. This email address is obviously targeted towards link removal, so this one's a fairly easy case. Sometimes, though, you can use the email address to search Facebook, Twitter and LinkedIn for the user, then reach out from there explaining the situation and requesting to have your links removed.

You can now start to populate your spreadsheet with the data you’ve found and then contact each of the webmasters to request links to be removed. This particular webmaster usually sends an automated response back saying you have to pay a fee for the links to be removed from all of their networks. As long as they’re not charging extortionate rates, I personally would recommend you pay it and carry on. Otherwise, you’ll waste a lot more time arguing and most likely get nowhere.

There are always tools available to help make our lives easier:

Domain Contact Information:

http://tools.seogadget.co.uk/ – This tool is brilliant (and free!). It allows you to enter in the URL of the site with the penalty and pull a large number of different contact methods for each website. You can then populate your Google Docs spreadsheet with the information and use it for outreaching to the webmasters.

Contact Finder by Citation Labs – Garrett French has built a tool where you can input a list of URLs for which you’d like to collate email addresses, contact forms and ‘contact us’ pages. From here, you can export a CSV which could be useful when coming to finding additional forms of contact method.

Contact Finder by Link Research Tools – Another tool from Cemper that helps to find different forms of contact method including Facebook, Twitter, LinkedIn, Xing, Google+ and more. This is a paid tool, but will really help save time and legwork, especially if you’ve got a lot of data!

All the tools above will help you save time. Unfortunately, as they’re automated tools, they’re not necessarily going to find all of the relevant pages, but they’ll do a brilliant job of finding a large chunk of them. Manual work always helps to cover missing gaps, so bear that in mind when going through the link removal process.

Step 5: reaching out to webmasters

Outreach is a very important part of the link removal campaign. Your aim here is to get as many of those bad links removed to your website as possible. Everyone works slightly differently, so by this point, you should either have a large document showing all of your spam / safe links along with lots of different contact methods, or you will just have a list of your spam / safe links and will carry out the contact methods as you go along. This is really down to preference.

There are a few tips I would recommend when sending emails to webmasters:

  • Don't send multiple emails for each link you need removing – This is a big mistake that we've been guilty of in the past. Make sure you sort your data so you can easily spot all of the links coming from one domain. Otherwise, if a webmaster opens their inbox and finds that you're asking for links to be removed from 10 pages in 10 different emails, they're likely to close them and bin them. Try to collate all the URLs you want removing in one neat place and send a single email. If there are a lot of links, put them in a Google Docs file and send the webmaster a link to the file to take a look at.
  • Don’t spam the webmaster – Don’t contact the webmaster via every method all at once. First try email and wait a week, if that doesn’t work, send a follow up email. If you don’t hear from them after a few days, try another method of contact (LinkedIn, Facebook, Twitter, Google+ etc.). Don’t always send the exact same message, make it personal each time. Log everything in your spreadsheet.
  • Be polite – The webmasters are doing you a favour. Whether it was you or a previous SEO company that submitted the links to their website in the first place, it's not really their problem. Be polite, don't accuse or threaten them, and try to be as helpful as possible. Some link removal templates on the internet suggest telling the blog owner that if they don't remove the link, you'll add it to the disavow report, which will also decrease their rankings. This is tantamount to a threat and I would definitely recommend avoiding it. Don't make out that it's the webmaster's fault you're on there; it probably isn't.
  • Try to find the webmaster's name – Don't start the email with just 'hello'. Try to get the webmaster's first name and send an email that's a bit more personal. You are much more likely to get a response if you address the contact directly.
  • Try to email from the domain you're cleaning up – This one sounds simple, but if you're an agency like us, you'd sometimes outreach from the agency's email address. Instead, try to use an email address on the domain whose links you're trying to get removed. We've seen a much better response rate when doing this.
  • Keep it short and sweet – The shorter it is, the less time a webmaster has to spend on it and the more likely they are to help you. Thinking about things logically, they’re probably going to have had a few of these requests, so don’t make them spend a lot of time reading, especially on the first email.
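The first tip above (one email per webmaster, not per link) is easy to enforce programmatically. Here's a small sketch, assuming you have a flat list of offending URLs; the example URLs are invented:

```python
from collections import defaultdict
from urllib.parse import urlparse

def links_by_domain(urls):
    """Collate every offending URL under its host, so each webmaster
    receives a single email listing all of their pages."""
    grouped = defaultdict(list)
    for url in urls:
        # Normalise the host so www/non-www variants land together.
        host = urlparse(url).netloc.lower().removeprefix("www.")
        grouped[host].append(url)
    return dict(grouped)

offending = [
    "http://www.directory.example/cat/seo/1.html",
    "http://directory.example/cat/seo/2.html",
    "http://blog.example/spun-article",
]
print(links_by_domain(offending))
# Both directory.example pages end up in one bucket -> one email.
```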

There are a few posts online from people who've received emails from companies requesting link removal. Learn from their mistakes and do your research beforehand. An example can be found on Matt Bors' blog here – http://www.mattbors.com/blog/2013/08/02/link-removal-request/

Our link removal templates change fairly regularly depending on the number of responses we appear to be getting. Generally, I’d recommend something along the lines of:

Hi %name,

We have recently received a notification from Google stating that our website has unnatural links pointing towards it. This has really damaged our rankings on Google and as a result, we're trying to clear things up. Our website URL is %website_url.

We noticed the following links are pointing to our website from your site:


I appreciate this is inconvenient and isn’t a reflection on your website at all, but if you’re able to remove the links, we would really appreciate it and would be very grateful.

I look forward to hearing from you.



The email should be personalized to your exact requirements. It may be that an SEO company carried out the work; if so, you could name and shame the company. You should also personalize the outreach email depending on the type of link you're trying to remove. For example, if you've spammed someone's website and you're asking for a blog comment to be removed, they're probably not going to be the most forgiving, so you need to beg a little harder. If it's an article you've written, but it's just the link that you want removing, explain that.

To help save time each time you send an email, you can set up canned responses in Gmail. This will help you to create link removal templates very quickly and just fill in the missing gaps. A guide on this can be found here.
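If you're not using Gmail's canned responses, the same idea works in a few lines of code. The sketch below models the %-style placeholders from the template above using Python's string.Template ($-style); the name and URLs are placeholders, not real contacts:

```python
from string import Template

# A cut-down version of the outreach template, with $-style placeholders:
OUTREACH = Template("""Hi $name,

We have recently received a notification from Google stating that our
website has unnatural links pointing towards it. Our website URL is
$website_url.

We noticed the following links pointing to our website from your site:

$links

If you're able to remove the links, we would be very grateful.
""")

# Fill in the gaps per webmaster (all values here are invented):
email = OUTREACH.substitute(
    name="John",
    website_url="http://www.ourclient.example",
    links="http://yoursite.example/links.html",
)
print(email.splitlines()[0])  # -> Hi John,
```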

If it was you spamming their site, apologise for the inconvenience and explain it was naivety. Chances are, you're going to have to test the water a little with your outreach email and see what works best. Own up and be personal; put yourself in the webmaster's position and think about how likely you would be to help someone out in the same situation.

Be honest, be polite and keep it short.

Each time you contact a webmaster, make a note in your spreadsheet of the date and the method of contact. It's also good to create an additional sheet at the bottom of your Google Docs file with example emails in it. Remember that a Google employee will be reading these, so when you're writing the outreach, don't make out that Google is the devil, or that it's everyone's fault but your own. You don't need to do this for every single email, but keeping a small selection of messages written in contact forms, emails, tweets, Facebook messages etc. will certainly help fight your battle when it comes to the reconsideration request.

I would recommend contacting a webmaster a minimum of three times. That being said, if you have every form of contact method for a webmaster, make sure you use all the different types and try your absolute best. The harder you try, the more likely you are to have the link removed.

Hopefully by this point, you're starting to see how you're building up a picture for Google to look through and see that you're trying very hard to remove the links. Unfortunately, you're never going to get a link penalty removed unless you put in the effort, so all of the above is required. Take your time, be thorough and work through the list systematically.

Tip: It's always worth checking http://deletebacklinks.com/ to see if they can remove any of your backlinks in bulk. With one client, we managed to remove 92 backlinks for just over $50 via the Delete Backlinks service.

Step 6: the disavow tool

In my previous article, I stated that people really shouldn't use the disavow tool unless absolutely necessary. Whilst my views on that are still very similar, I now think the disavow tool should be used before a reconsideration request if you're unable to remove all the offending links.

Note: Do not use the disavow tool as a way to skip any of the above work; that will not work. Google wants to know that you've been working very hard to rectify your mistakes and to see proof that you won't be carrying out poor quality work again.

By the time you get to the stage of using the disavow tool, you’ve hopefully got a very healthy spreadsheet full of information about links which have been removed, all the methods you’ve contacted webmasters by and a list of all the links you’ve been unable to remove.

Sort your data so that you can see exactly which links you’ve been unable to remove – Try to sort this in order of domain.

Most of the time, you’re probably going to want to disavow a full domain, instead of just a URL. That being said, you will sometimes only want to disavow one link from a domain, so just keep an eye out as to which route you’re going to take.

For this example, I’m going to assume that we’re only disavowing full domains.

Look through your file and figure out which domains need disavowing, then open Notepad and copy and paste each domain into a file with the prefix 'domain:'. For example:

domain:yourdomain.com

Some tips for disavow files:

  • Don’t include the http://www. prior to a domain
  • Don't include anything after the domain extension, e.g. .com or .co.uk (unless you're only disavowing a specific URL).
  • Put each domain on a new line and add comments to the file so Google knows why you’re disavowing them. Prefix all comments with a # symbol.

By the time you’re finished, your disavow file should look something like this:
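As a hypothetical illustration (the domains below are placeholders, not real sites), a finished file might read:

```text
# Contacted the owner on 01/10/2013, 08/10/2013 and 15/10/2013 - no response.
domain:spammydirectory.example

# Owner asked for payment to remove the links; disavowing instead.
domain:blognetwork.example

# Single spam comment we were unable to get removed:
http://someblog.example/widgets-post
```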

Just make sure to comment your disavow file well, so Google can see exactly what you’re doing. This should match your Google Docs Spreadsheet so Google can see that you’re not trying to pull the wool over their eyes!

Google have written a very comprehensive guide on how to use the Disavow Tool. You should visit their page for more information on disavowing specific URLs.

Once you’re done, save the file and head over to the disavow tool page. Submit the file and you’ll be good to go.

Note: Google make a point of saying that you should make “every effort to clean up unnatural links pointing to your site. Simply disavowing them isn’t enough” – Don’t try to trick the system.

Step 7: submitting the reconsideration request

The reconsideration process is (hopefully!) the final part of removing a penalty. In the request, you're basically telling Google the following:

  • What you’ve done wrong in the past and that you’ve stopped doing it
  • What you’ve done to fix the problems (include a link to your Google Docs file)
  • How you know it won’t happen again (steps you’ve put in place, training etc.)
  • A little bit about the company (optional – include it only if it feels necessary)
  • An apology for the issues and inconvenience

It’s important to be specific in your reconsideration requests. This means owning up to anything that you’ve personally done wrong (whether it be paid links, comment spam on blogs, forum profiles etc.). If you previously employed the services of an ‘SEO company’ that built the poor quality links, tell Google the name of the company and explain that whilst you accept you employed them in the first place, you’ve learnt a lot about Google’s webmaster guidelines and are fully committed to ensuring your website stays within them.

In the past, we have tried mentioning a client's AdWords budget as a way to show Google that they're a genuine business. Unfortunately, this doesn't work. Don't mention irrelevant information that Google isn't interested in; they want to know that you've removed the spam, you know where you went wrong and you won't do it again.

We always mention if a client has good reviews online (especially from a 3rd party source such as TrustPilot), but I don’t believe this makes any real difference. It’s just a nice touch to show that you’re not trying to upset anyone.

Matt Cutts has explained what should be included in a proper reconsideration request – https://www.youtube.com/watch?feature=player_embedded&v=8MfPe1NbsoA

Only submit a reconsideration request once you’re happy that you’ve taken all the appropriate action to fix the issues associated with your website. Google have provided some good notes on their website blog, so this is worth reading through.

With all the above points in place, here is an example reconsideration request. Remember to tailor this to your own needs, every client is completely different and it needs to be personal to your circumstances.

Dear Google Webspam Team,

On the %date_of_penalty, we received a penalty from Google for unnatural links pointing to our website, %website_url.

Firstly, we would really like to apologise for the inconvenience we have caused and thank you in advance for reading our reconsideration request. In the past, we have carried out some techniques on our website which we now realise are outside your Google Webmaster Guidelines. This includes purchasing links which pass page rank, blog comments and forum spam (**example**).

After reading your webmaster guidelines, we realise we were completely in the wrong. To rectify the issue, we have checked all the backlinks pointing to our website using Google Webmaster Tools, Ahrefs, Open Site Explorer and Majestic SEO and contacted as many of the webmasters as possible in order to have these links removed.

All of the work we have carried out can be found in the Google Spreadsheet below:


In the above spreadsheet, you will see a list of all the links pointing to our site, the webmasters we have contacted and the links we have successfully removed. There are multiple sheets included in the above spreadsheet, which show screenshots of emails we have sent to webmasters, contact forms etc. Any links which we haven’t been able to remove have been added to a disavow file which we have submitted to Google.

I am very sorry for the work we have previously carried out. We now realise that we were completely in the wrong and we will ensure this will never happen again. I’m confident that our website is now in line with Google’s webmaster guidelines and we will ensure that this is always the case.

We look forward to hearing from you and once again, sorry for the inconvenience.



Another brilliant example of a reconsideration request written by Dev Basu can be found on the Powered By Search website.

Important Note:

Make sure the spreadsheet link you attach allows everyone with a link to view it. The worst thing you could do is attach a spreadsheet that Google don’t have permission to see. To do this, go to your Google document, click ‘share’ (top right) and then change the ‘who has access’ to ‘anyone with a link’. Simply copy that link into your reconsideration request.

Sharing Options on Google Docs

Step 8: sit back and wait

At this point, there’s nothing you can do except wait and cross your fingers. Reconsideration requests are generally answered within a few days, but in some cases a response can take weeks.

Fingers crossed, you’ll open your email one sunny morning and find a message in your Webmaster Tools account similar to the following:

Manual Spam Action Revoked

Good luck and if you have any questions at all, post them below and I’ll try my best to answer them!

Remember, if you want a head start on your link removal template, you can download a template by visiting our website here.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog

Posted in IM NewsComments Off

Google Penalty Removal Won’t Necessarily Improve Rankings

I’ve said this before, especially with regard to the disavow tool and removing links in general.

If your site has a link penalty or another penalty…

Search Engine Roundtable

Posted in IM NewsComments Off

Removal Requests Actually Down, Following Google Algorithm Change

On August 10, Google announced that it would be updating its algorithm the following week to include a new ranking signal for the number of “valid copyright removal notices” it receives for a given site.

“Sites with high numbers of removal notices may appear lower in our results,” said Google SVP of Engineering Amit Singhal at the time. “This ranking change should help users find legitimate, quality sources of content more easily—whether it’s a song previewed on NPR’s music website, a TV show on Hulu or new music streamed from Spotify.”

One might have expected the removal request floodgates to have been opened upon this news, but that does not appear to be the case. In fact, interestingly, it has been kind of the opposite, according to Google’s Transparency Report.

Barry Schwartz at Search Engine Roundtable points out that from August 13 to August 20, the number of URLs requested to be removed from Google search per week actually decreased, from 1,496,220 to 1,427,369. It’s only a slight decrease, but the fact that it decreased at all following this news is noteworthy.
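As a quick sanity check of the two figures above, the week-over-week drop works out to under five percent (a rough sketch; the figures are as reported in the Transparency Report):

```python
# Weekly URL removal requests, per Google's Transparency Report
week_aug_13 = 1_496_220
week_aug_20 = 1_427_369

# Absolute and relative week-over-week decrease
drop = week_aug_13 - week_aug_20
pct_drop = drop / week_aug_13 * 100

print(f"{drop:,} fewer URLs requested ({pct_drop:.1f}% decrease)")
```

So the "slight decrease" amounts to roughly a 4.6% dip in a metric that had otherwise been climbing all summer.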

URLs requested to be removed

August 20 is the latest date Google has data available for, so we’ll see what the following week looked like soon enough. As you can see from the graph, the number has been trending upward, and has jumped quite significantly over the course of this summer.

For the past month, Google says 5,680,830 URLs have been requested to be removed across 31,677 domains, by 1,833 copyright owners and 1,372 reporting organizations. The top copyright owners in the past month have been Froytal Services, RIAA member companies, Microsoft, NBCUniversal and BPI. The top specified domains have been filestube.com, torrenthound.com, isohunt.com, downloads.nl and filesonicsearch.com.


Posted in IM NewsComments Off