Tag Archive | "Understanding"

SearchCap: Google tests AMP labels, AdWords personalization & understanding user intent

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

The changing SERP: Understanding and adapting to dynamic search results

Search results have become more personalized and dynamic over the years, creating a more challenging SEO environment for search and content marketers. But columnist Jim Yu shows how these changes can create opportunities for those willing to do the work.

The post The changing SERP: Understanding…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

Understanding and Harnessing the Flow of Link Equity to Maximize SEO Ranking Opportunity – Whiteboard Friday

Posted by randfish

How does the flow of link equity work these days, and how can you harness its potential to help improve your rankings? Whether you’re in need of a refresher or you’ve always wanted a firmer grasp of the concept, this week’s Whiteboard Friday is required watching. Rand covers the basic principles of link equity, outlines common flow issues your site might be encountering, and provides a series of action items to ensure your site is riding the right currents.

Link equity flow

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about understanding and harnessing link equity flow, primarily internal link equity flow, so that you can get better rankings and execute on your SEO. A big thank you to William Chou, @WChouWMX on Twitter, for suggesting this topic. If you have a topic or something that you would like to see on Whiteboard Friday, tweet at me. We’ll add it to the list.

Principles of link equity

So, first, some principles of link equity to be aware of before we dive into some examples.

1. External links generally give more ranking value and potential ranking boosts than internal links.

That is not to say, though, that internal links provide no link equity. In fact, many pages that earn few or no external links can still rank well if the domain itself is well linked to and the page has links from other good, important pages on that domain. But if a page is orphaned, or if a domain has no links at all, it is extremely difficult to rank.

2. Well-linked-to pages, both internal and external, pass more link equity than those that are poorly linked to.

I think this makes intuitive sense to all of us who have understood the concept of PageRank over the years. Basically, if a page accrues many links, especially from other important pages, that page’s ability to pass its link equity to other pages, to give a boost in ranking ability is stronger than if a page is very poorly linked to or not linked to at all.

3. Pages with fewer links tend to pass more equity to their targets than pages with more links.

Again, going off the old concept of PageRank, if you have a page with hundreds or thousands of links on it, each of those receives a much more fractional, smaller amount of the link equity that could be passed to it than if you have a page with only a few links on it. This is not universally… well, I just want to say this doesn’t scale perfectly. So it’s not the case that if you were to trim down your high link earning pages to having only one link and point to this particular page on your site, then you suddenly get tremendously more benefit than if you had your normal navigation on that page and you link to your homepage and About page and products page. That’s not really the case. But if you had a page that had hundreds of links in a row and you instead made that page have only a few links to the most important, most valuable places, you’ll get more equity out of that, more rank boosting ability.
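To make the division idea concrete, here's a minimal sketch of the classic PageRank model Rand is referencing (a simplified illustration only; Google's current system is far more complex): each page's equity is split evenly across its outlinks, so fewer links means a bigger share per link.

```python
# Minimal sketch of classic PageRank-style equity flow (illustrative only).
DAMPING = 0.85

def iterate_pagerank(links, iterations=20):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, targets in links.items():
            for t in targets:
                # a page with few outlinks passes a bigger share per link
                new_rank[t] += DAMPING * rank[page] / len(targets)
        rank = new_rank
    return rank

# Toy site: the homepage's equity splits between just two internal pages.
site = {"home": ["about", "products"], "about": ["home"], "products": ["home"]}
print(iterate_pagerank(site))
```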

4. Hacks and tricks like “nofollow” are often ineffective at shaping the flow of link equity.

Using rel=”nofollow”, or embedding links in a remotely executed JavaScript file (so that browsers and visitors can see the links, but Google is unlikely to see or follow them), to shape the flow of your link equity is generally (a) a poor use of your time, because it doesn’t affect things that much. The old-school PageRank algorithm is not that hugely important anymore. And (b) Google is often pretty good at interpreting and discounting these things. So it tends to not be worth your time at all.

5. Redirects and canonicalization lose a small amount of link equity. Non-ideal ones like 302s, JS redirects, etc. may lose more than 301, rel=canonical, etc.

So if I have a 301 or a rel=canonical from one page to another, those will lose or cost you a small, a very small amount of link equity. But more potentially costly would be using non-ideal types of redirects or canonicalization methods, like a JavaScript-based redirect or a 302 or a 307 instead of a 301. If you’re going to do a redirect or if you’re going to do canonicalization, 301s or rel=canonicals are the way to go.
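As a minimal illustration of those two preferred mechanisms (using Flask purely as a stand-in framework; any server or CMS has an equivalent), a 301 redirect and a rel=canonical hint look like this:

```python
# Minimal Flask sketch (illustrative only) of the two preferred methods.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # 301 tells crawlers the move is permanent; a 302 or 307 signals
    # "temporary" and is the non-ideal option discussed above.
    return redirect("/new-page", code=301)

@app.route("/new-page")
def new_page():
    # rel=canonical marks this URL as the preferred version when several
    # URLs serve essentially the same content.
    return '<link rel="canonical" href="https://example.com/new-page">Content'

if __name__ == "__main__":
    app.run()
```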

So keeping in mind these principles, let’s talk through three of the most common link equity flow issues that we see websites facing.

Common link equity flow issues

A. A few pages on a large site get all the external links:

You have a relatively large site, let’s say thousands to tens of thousands, maybe even hundreds of thousands of pages, and only a few of those pages are earning any substantial quantity of external links. I have highlighted those in pink. So these pages are pointing to these pink ones. But on this website you have other pages, pages like these purple ones, where you essentially are wanting to earn link equity, because you know that you need to rank for these terms and pages that these purple ones are targeting, but they’re not getting the external links that these pink pages are. In these cases, it’s important to try a few things.

  1. We want to identify the most important non-link-earning pages, these purple ones. We’ve got to figure out what these actually are. What are the pages that you wish would rank but are not yet ranking for the terms and phrases they’re targeting?
  2. We want to optimize our internal links from these pink pages to these purple ones. So in an ideal world, we would say, “Aha, these pages are very strong. They’ve earned a lot of link equity.” You could use Open Site Explorer and look at Top Pages, or Ahrefs or any of our other competitors and look at your pages, the ones that have earned the most links and the most link equity. Then you could say, “Hey, can I find some relevance between these two or some user stories where someone who reaches this page needs something over here, and thus I’m going to create a link to and from there?” That’s a great way to pass equity.
  3. Retrofitting and republishing. So what I mean by this is essentially I’m going to take these pages, these purple ones that I want to be earning links, that are not doing well yet, and consider reworking their content, taking the lessons that I have learned from the pink pages, the ones that have earned link equity, that have earned external links and saying, “What did these guys do right that we haven’t done right on these guys, and what could we do to fix that situation?” Then I’m going to republish and restart a marketing, a link building campaign to try and get those links.

B. Only the homepage of a smaller site gets any external links.

This time we’re dealing with a small site, a very, very small site, 5 pages, 10 pages, maybe even up to 50 pages, but generally a very small site. Often a lot of small businesses, a lot of local businesses have this type of presence, and only the homepage gets any link equity at all. So what do we do in those cases? There’s not a whole lot to spread around. The homepage can only link to so many places. We have to serve users first. If we don’t, we’re definitely going to fall in the search engine rankings.

So in this case, where the pink link earner is the homepage, there are two things we can do:

  1. Make sure that the homepage is targeting and serves the most critical keyword targets. So we have some keyword targets that we know we want to go after. If there’s one phrase in particular that’s very important, rather than having the homepage target our brand, we could consider having the homepage target that specific query. Many times small businesses and small websites will make this mistake where they say, “Oh, our most important keyword, we’ll make that this page. We’ll try and rank it. We’ll link to it from the homepage.” That is generally not nearly as effective as making a homepage target that searcher intent. If it can fit with the user journey as well, that’s one of the best ways you can go.
  2. Consider some new pages for content, like essentially saying, “Hey, I recognize that these other pages, maybe they’re About and my Terms of Service and some of my products and services and whatnot, and they’re just not that link-worthy. They don’t deserve links. They’re not the type of pages that would naturally earn links.” So we might need to consider what are two or three types of pages or pages that we could produce, pieces of content that could earn those links, and think about it this way. You know who the people who are already linking to you are. It’s these folks. I have just made up some domains here. But the folks who are already linking to your homepage, those are likely to be the kinds of people who will link to your internal pages as well. So I would think about them as link targets and say, “What would I be pretty confident that they would link to, if only they knew that it existed on our website?” That’s going to give you a lot of success. Then I would check out some of our link building sections here on Whiteboard Friday and across the Moz Blog for more tips.

C. Mid-long tail KW-targeting pages are hidden or minimized by the site’s nav/IA.

So this is essentially where I have a large site, and I have pages that are targeting keywords that don’t get a ton of volume, but they’re still important. They could really boost the value that we get from our website, because they’re hyper-targeted to good customers for us. In this case, one of the challenges is they’re hidden by your information architecture. So your top-level navigation and maybe even your secondary-level navigation just doesn’t link to them. So they’re just buried deep down in the website, under a whole bunch of other stuff. In these cases, there are some really good solutions.

  1. Find semantic and user intent relationships. So semantic means these words appear on those pages. Let’s say one of these pages here is targeting the word “toothpaste,” for example, and I find that, oh, you know what, this page over here, which is well linked to in our navigation, mentions the word “toothpaste,” but it doesn’t link over here yet. I’m going to go create those links. That’s a semantic relationship. A user intent relationship would be, hey, this page over here talks about oral health. Well, oral health and toothpaste are actually pretty relevant. Let me make sure that I can create that user journey, because I know that people who’ve read about oral health on our website probably also later want to read about toothpaste, at least some of them. So let’s make that relationship also happen between those two pages. That would be a user intent type of relationship. You’re going to find those between your highly-linked-to external pages, your well-linked-to internal pages, and these long-tail pages that you’re trying to target. Then you’re going to create those new links (a short sketch after this list shows one way to automate the semantic check).
  2. Try and leverage the top-level category pages that you already have. If you have a top-level navigation and it links to whatever it is — home, products, services, About Us, Contact, the usual types of things — it’s those pages that are extremely well linked to already internally where you can add in content links to those long-tail pages and potentially benefit.
  3. Consider new top-level or second-level pages. If you’re having trouble adding them to these pages, they already have too many links, there’s no user story that makes good sense here, it’s too weird to jam them in, maybe engineering or your web dev team thinks that it’s ridiculous to try and jam those in there, consider creating new top-level pages. So essentially saying, “Hey, I want to add a page to our top-level navigation that is called whatever it is, Additional Resources or Resources for the Curious or whatever.” In this case, in my oral health and dentistry example, potentially I want an oral health page that is linked to from the top-level navigation. Then you get to use that new top-level page to link down and flow the link equity to all these different pages that you care about and that currently are getting buried in your navigation system.
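As referenced in the first item above, here's a minimal sketch of automating the semantic check (the directory name and file layout are hypothetical; a real version would walk your CMS or a crawl export):

```python
# Minimal sketch: find pages that mention a target keyword but do not yet
# link to the page targeting that keyword. Paths are hypothetical.
import os
import re

def find_link_opportunities(pages_dir, keyword, target_url):
    opportunities = []
    for name in os.listdir(pages_dir):
        if not name.endswith(".html"):
            continue
        with open(os.path.join(pages_dir, name), encoding="utf-8") as f:
            html = f.read()
        mentions = re.search(re.escape(keyword), html, re.IGNORECASE)
        already_links = target_url in html
        if mentions and not already_links:
            opportunities.append(name)  # candidate for a new internal link
    return opportunities

# Pages that mention "toothpaste" but don't yet link to the toothpaste page:
print(find_link_opportunities("site_pages", "toothpaste", "/toothpaste/"))
```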

All right, everyone. Hope you’ve enjoyed this edition of Whiteboard Friday. Give us your tips in the comments on how you’ve seen link equity flow, and the benefits or drawbacks you’ve seen in trying to control and optimize that flow. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Understanding the Customer Makes Mobile Search Meaningful #SESAtlanta

Understanding the growing mobile PPC market is about understanding the ways in which consumers use mobile, and targeting users with the right message at the right time.

Home – SearchEngineWatch

Posted in IM News | Comments Off

Giovanni Gallucci on Images as Content and Understanding Usage Rights


Giovanni Gallucci is one of the most generous people Technology Translated host Scott Ellis knows when it comes to sharing his knowledge, and he’s been teaching about image usage and optimization since 2008.

Giovanni is a successful social media consultant and practitioner, videographer, and photographer. He also has a knack for pushing the boundaries of SEO.

He stays on the “light side” of SEO, but by pushing the edges, he is able to find opportunities and gain advantages that most people don’t know about.

Let’s dig in …

In this 45-minute episode of Technology Translated, host Scott Ellis and Giovanni Gallucci discuss:

  • The importance of images in your content
  • The image as content
  • Image SEO and EXIF Data
  • Where you can find images you can use on your site
  • Image usage rights
  • Audience Q&A
  • Above all else … what’s most important
  • What constitutes Fair Use
  • DPI Standards

Click Here to Listen to Technology Translated on iTunes

Click Here to Listen on Rainmaker.FM

About the author

Rainmaker.FM

Rainmaker.FM is the premier digital marketing and sales podcast network. Get on-demand business advice from experts, whenever and wherever you want it.

The post Giovanni Gallucci on Images as Content and Understanding Usage Rights appeared first on Copyblogger.


Copyblogger

Posted in IM News | Comments Off

3 Keys To Understanding Your SEO Needs

Many businesses know they need search engine optimization, but they don’t know much more beyond that. Columnist Casie Gillette has tips for determining the specifics.

The post 3 Keys To Understanding Your SEO Needs appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

Understanding The Google Penguin Algorithm

Whenever Google does a major algorithm update, we all rush off to our data to see what moved in terms of rankings and search traffic, and then look for trends to try to figure out what changed.

The two people I chat most with during periods of big algorithmic changes are Joe Sinkwitz and Jim Boykin. I recently interviewed them about the Penguin algorithm.

Topics include:

  • what it is
  • its impact
  • why there hasn’t been an update in a while
  • how to determine if issues are related to Penguin or something else
  • the recovery process (from Penguin and manual link penalties)
  • and much, much more

Here’s a custom drawing we commissioned for this interview: “Pang Win.”

To date there have been 5 Penguin updates:

  • April 24, 2012
  • May 25, 2012
  • October 5, 2012
  • May 22, 2013 (Penguin 2.0)
  • October 4, 2013

There hasn’t been one in quite a while, which is frustrating many who haven’t been able to recover. On to the interview…

At its core what is Google Penguin?

Jim: It is a link filter that can cause penalties.

Joe: At its core, Penguin can be viewed as an algorithmic batch filter designed to punish lower quality link profiles.

What sort of ranking and traffic declines do people typically see from Penguin?

Jim: 30-98%. Actually, I’ve seen some “manual partial matches” where traffic was hardly hit…but that’s rare.

Joe: Near total. I should expand. Penguin 1.0 has been a different beast than its later iterations; the first one has been nearly a fixed flag whereas later iterations haven’t been quite as severe.

After the initial update there was another one about a month later & then one about every 6 months for a while. There hasn’t been one for about 10 months now. So why have the updates been so rare? And why hasn’t there been one for a long time?

Jim: Great question. We all believed there’d be an update every 6 months, and now it’s been way longer than 6 months…maybe because Matt’s on vacation…or maybe he knew it would be a long time until the next update, so he took some time off…or perhaps Google wants those with an algorithmic penalty to feel the pain for longer than 6 months.

Joe: 1.0 was temporarily escapable if you were willing to 301 your site; after 1.1 the redirect began to pass on the damage. My theory on why it has been so very long on the most recent update has to do with maximizing pain – Google doesn’t intend to lift its boot off the throats of webmasters just yet; no amount of groveling will do. Add to that the complexity of every idiot disavowing 90%+ of their clean link profiles, and ‘dirty’ vs. ‘clean’ links become difficult to ascertain from that signal.

Jim: Most people disavow some, then they disavow some more…then next month they disavow more…wait a year and they may disavow them all :)

Joe: Agreed.

Jim: Then Google will let them out…hehe, tongue in cheek…a little.

Joe: I’ve seen disavow files with over 98% of links in there, including Wikipedia, the Yahoo! Directory, and other great sources – absurd.

Jim: Me too. Most of the people are clueless…there are tons of people who are disavowing links just because their traffic has gone down, so they feel they must have been hit by Penguin, so they start disavowing links.

Joe: Yes; I’ve seen a lot of panda hits where the person wants to immediately disavow. “whoa, slow down there Tex!”

Jim: I’ve seen services where they guarantee you’ll get out of a penguin penalty, and we know that they’re just disavowing 100% of the links. Yes, you get your manual penalty removed that way, but then you’re left with nothing.

Joe: Good time to mention that any guarantee of getting out of a penalty is likely sold as a bag of smoke.

Jim: or as they are disavowing 100% of the links they can find going to the site.

OK. I think you mentioned an important point there Jim about “100% of the links they can find.” What are the link sources people should use & how comprehensive is the Google Webmaster Tools data? Is WMT data enough to get you recovered?

Joe: Rarely. I’ve seen where the examples listed in a manual action might be discoverable on Ahrefs, Majestic SEO, or in WMT, but upon cleaning them up (and disavowing further of course) that Google will come back with a few more links that weren’t initially in the WMT data dump. I’m dealing with a client on this right now that bought a premium domain as-is and has been spending about a year constantly disavowing and removing links. Google won’t let them up for air and won’t do the hard reset.

Jim: Well, first…if you’re getting your backlinks from Google, be sure to pull your backlinks from both the www and the non-www version of your site. You can’t just use one: you HAVE to pull backlinks from both, so you have to verify both the www and the non-www versions of your site with Google Webmaster Tools.

We often start with that. When we find big patterns that we feel are the cause, we’ll then go into OSE, Majestic SEO, and Ahrefs, pull those backlinks too, and pull out the ones that fit the patterns, but that’s after the Google backlink analysis.
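A minimal sketch of that merge-and-dedupe step (the file names, and the assumption that the first CSV column holds the linking URL, are hypothetical; each tool's real export format differs):

```python
# Minimal sketch: merge backlink exports from several tools and dedupe by
# linking domain. File names and layout are hypothetical.
import csv
from urllib.parse import urlparse

def linking_domains(csv_path):
    domains = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row:
                host = urlparse(row[0]).netloc.lower()
                domains.add(host.removeprefix("www."))
    return domains

# Pull WMT exports for BOTH hostname variants, plus third-party sources.
exports = ["wmt_www.csv", "wmt_non_www.csv", "ose.csv", "majestic.csv", "ahrefs.csv"]
all_domains = set().union(*(linking_domains(p) for p in exports))
print(len(all_domains), "unique linking domains to review")
```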

Joe, you mentioned people getting hit by Panda and mistakenly going off to the races to disavow links. What are the distinguishing characteristics between Penguin, Panda & manual link penalties?

Joe: Given they like to sandwich updates to make it difficult to discern, I like this question. Penguin is about links; it is the easiest to find but hardest to fix. When I first look at a URL, I’ll quickly check anchor-text percentage breakdowns, sources of links, etc. The big difference between Penguin and a manual link penalty (if you aren’t looking in WMT) is the timing: think of a bomb going off vs. a sniper…everyone complaining at once? Probably an algorithm. Just a few? Probably some manual actions. For manual actions, you’ll also get a note in WMT. With Panda I like to look first at the on-page to see if I can spot egregious keyword stuffing, weird infrastructure setups that result in thin/duplicated content, and then look into engagement metrics and my favorite…the ratio of externally supported pages to total indexed pages.

Jim: With a manual penalty, at least you can keep resubmitting and get a yes or no. With an algorithmic one, you’re screwed…because you’re waiting for the next refresh…hoping you did enough to get out.

I don’t mind going back and forth with Google with a manual penalty…at least I’m getting an answer.

If you see a drop in traffic, be sure to compare it to the dates of Panda and Penguin updates…if you see a drop on one of the update days, then you can know whether you have Panda or Penguin…and if your traffic is just falling, it could be just that, and no penalty.

Joe: While this interview was taking place, an employee pinged me to let me know about a manual action reconsideration that was denied, with an example URL being something akin to domain.com/?var=var&var=var – the entire domain was already disavowed. Those 20-second manual reviews by third parties without much of an understanding of search don’t generate a lot of confidence for me.

Jim: Yes, I posted this yesterday to SEOchat. Reviewers are definitely not looking at things.

You guys mentioned that anyone selling a guaranteed 100% recovery solution is likely selling a bag of smoke. What are the odds of recovery? When does it make sense to invest in recovery, when does it make sense to start a different site, and when does it make sense to do both in parallel?

Jim: Well, I’m one for trying to save a site. I haven’t once said “it’s over for that site, let’s start fresh.” Links are so important that if I can save even a few links going to a site, I’ll take it. I’m not a fan of doing two sites: it causes duplicate content issues, and now your efforts are split across two sites.

Joe: It depends on the infraction. I have a lot more success getting sites out of Panda, manual actions, and the later iterations of Penguin (theoretically including the latest one, once a refresh takes place); I won’t take anyone’s money for those hit by Penguin 1.0, though…I give free advice and add it to my DB tracking, but the very few examples I have where a recovery took place that I can confirm were Penguin 1.0, and not something else, happened due to being a beta user of the disavow tool and likely occurred for political reasons vs. tech reasons.

For churn and burn, redirects and canonicals can still work if you’re clever…but that’s not reinvestment so much as strategy shift I realize.

You guys mentioned the disavow process, where a person does some, does some more over time, etc. Is Google dragging out the process primarily to drive pain? Or are they leveraging the aggregate data in some way?

Joe: Oh absolutely they drag it out. Mathematically I think of triggers where a threshold to trigger down might be at X%, but the trigger for recovery might be X-10%. Further though, I think initially they looooooved all the aggregate disavow data, until the community freaked out and started disavowing everything. Let’s just say I know of a group of people that have a giant network where lots of quality sites are purposefully disavowed in an attempt to screw with the signal further. :)

Jim: Pain :) …not sure if they’re leveraging the data yet, but they might be. It shouldn’t be too hard for Google to see that a ton of people are disavowing links from a site like get-free-links-directory.com, and for Google to say, “no one else seems to trust these links, we should just nuke that site and not count any links from there.”

We can do this ourselves with the tools we have…I can see how many times I’ve seen a domain in my disavows, and how many times I disavowed it. I.e., if I’ve seen spamsite.com in 20 disavows I’ve done, and I disavowed it all 20 times I saw it, I can see this data…or if I’ve seen goodsite.com 20 times, and never once disavowed it, I can see that too. I’d assume Google must do something like this as well.
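A minimal sketch of that kind of aggregate check (the directory of past disavow files is hypothetical; it assumes each file uses Google's plain-text disavow format):

```python
# Minimal sketch: across past disavow files, count how often each domain was
# disavowed. Assumes Google's format: "domain:example.com" entries or raw
# URLs, with "#" starting comment lines.
import os
from collections import Counter
from urllib.parse import urlparse

def disavow_counts(disavow_dir):
    counts = Counter()
    for name in os.listdir(disavow_dir):
        with open(os.path.join(disavow_dir, name), encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue  # skip blanks and comments
                if line.startswith("domain:"):
                    domain = line.removeprefix("domain:")
                else:
                    domain = urlparse(line).netloc or line
                counts[domain.lower()] += 1
    return counts

# Domains that show up in many past cleanups are consistent offenders.
for domain, n in disavow_counts("past_disavows").most_common(10):
    print(domain, n)
```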

Given that they drag it out, on the manual penalties does it make sense to do a couched effort on the first rejection or two, in order to give the perception of a greater level of pain and effort as you scale things up on further requests? What level of resources does it make sense to devote to the initial effort vs the next one and so on? When does recovery typically happen (in terms of % of links filtered and in terms of how many reconsideration requests were filed)?

Joe: When I deliver “disavow these” and “say this” stuff, I give multiple levels, knowing full well that there might be deeper and deeper considerations of the pain. Now, there have been cases where the 1st try gets a site out, but I usually see 3 or more.

Jim: I figure it will take a few reconsideration requests…and yes, I start “big” and get “bigger.”

but that’s for a sitewide penalty…

We’ve seen sitewides get reduced to a partial penalty. And once we have a partial penalty, it’s much easier to identify the issues and take care of those, while leaving the links that go to pages that were not affected.

A sitewide manual penalty kills the site…a partial match penalty usually has some stuff that ranks well and some stuff that no longer ranks…once we’re at a partial match, I feel much more confident about getting that resolved.

Jim, I know you’ve mentioned the errors people make in either disavowing great links or disavowing links when they didn’t need to. You also mentioned the ability to leverage your old disavow data when processing new sites. When does it make sense to DIY on recovery versus hiring a professional? Are there any handy “rule of thumb” guidelines in terms of the rough cost of a recovery process based on the size of their backlink footprint?

Joe: It comes down to education, doesn’t it? Were you behind the reason it got dinged? You might try it yourself first vs. immediately hiring. Psychologically, it could even look like you’re more serious after the first disavow is declined, by showing you “invested” in the pain. Also, it comes down to opportunity cost: what is your personal time worth, divided by your perceived probability of fixing it?

Jim: We charge $5,000 for the analysis, and $5,000 for the link removal process…some may think that’s expensive…but removing good links will screw you, and not removing bad links will screw you…it’s a real science, and getting it wrong can cost you a lot more than this…of course I’d recommend seeing a professional, as I sell this service…but I can’t see anyone who’s not a true expert in links doing this themselves.

Oh…and once we start work for someone, we keep going at no further cost until they get out.

Joe: That’s a nice touch Jim.

Jim: Thank you.

Joe, during this interview you mentioned a reconsideration request rejection where the reviewer cited a link on a site that had already been disavowed. Given how many errors Google’s reviewers make, does it make sense to aggressively push to remove links rather than using disavow? What are the best strategies to get links removed?

Joe: DDoS

Jim: hehe

Joe: Really though, be upfront and honest when using those link removal services (which I’d do vs trying to do them one-by-one-by-one)

Jim: Only 1% of the people will remove links anyway; it’s more to show Google that you really tried to get the links removed.

Joe: Let the link holder know that you got hit with a penalty, you’re just trying to clean it up because your business is suffering, and ask politely that they do you a solid favor.

I’ve been on the receiving end of a lot of different strategies given the size of my domain portfolio. I’ve been sued before (as a first course of action!) by someone that PAID to put a link on my site….they never even asked, just filed the case.

Jim: We send 3 removal requests…and ping the links too…so when we do a reconsideration request, we can show Google the spreadsheet of who we emailed, when we emailed them, and who removed or nofollowed the links…but it’s more about “show” to Google.

Joe: Yep, not a ton of compliance; webmasters have link removal fatigue by now.

This is more of a business question than an SEO question, but … as much as budgeting for the monetary cost of recovery, an equally important form of budgeting is dealing with the reduced cashflow while the site is penalized. How many months does it typically take to recover from a manual penalty? When should business owners decide to start laying people off? Do you guys suggest people aggressively invest in other marketing channels while the SEO is being worked on in the background?

Jim: A manual penalty typically takes 2-4 months to recover from. “Recover” is a relative term. Some people get “your manual penalty has been removed” and their recovery is a tiny blip: up 5%, but still down 90% from what it was prior. Getting a manual penalty removed is great IF there are good links left in your profile…if you’ve disavowed everything and your penalty is removed…so what…you’ve got nothing…people often ask where they’ll be once they “recover” and I say “it depends on what you have left for links”…but it won’t be where you were.

Joe: It depends on how exposed they are per variable costs. If the costs are fixed, then one can generally wait longer (all things being equal) before cutting. If you have a quarter-million monthly link budget *cough*, then you’re going to want to trim as quickly as possible just in order to survive.

Per investing in other channels, I absolutely, wholeheartedly cannot emphasize enough how important it is to become an expert in one channel and at least a generalist in several others…even better, hire an expert in another channel to partner up with. In payday loans, one of the big players did okay in SEO but, even with a lot of turbulence, was doing great due to their TV and radio capabilities. Also, collect the damn email addresses; email is still a gold mine if you use it correctly.

One of my theories for why there hasn’t been a Penguin update in a long time was that as people have become more afraid of links, they’ve started using them as a weapon, & Google doesn’t want a bunch of false positives caused by competitors killing sites. One reason I’ve favored this theory over the pain-first motive is that Google could always put a time delay on recoveries while still allowing new sites to get penalized on updates. Joe, you mentioned that after the second Penguin update, penalties started passing forward through redirects. Do people take penalized sites and point them at competitors?

Joe: Yes, they do. They also take them and pass them into the natural links of their competitors. I’ve been railing on negative SEO for several years now…right about when the first manual action wave came out in Jan 2012; that was a tipping point. It is now more economical to take someone else’s rankings down than it is to (with a strong degree of confidence) invest in a link strategy to leapfrog them naturally.

I could speak for days straight in a congressional filibuster on link strategies used for Negative SEO. It is almost magical how pervasive it has become. I get a couple requests a week to do it even…by BIG companies. Brands being the mechanism to sort out the cesspool and all that.

Jim: Soon, everyone will be monitoring their backlinks on a monthly basis. I know one big company that submits an updated disavow list to Google every week.

That leads to a question about preemptive disavows. When does it make sense to do that? What businesses need to worry about that sort of stuff?

Joe: Are you smaller than a Fortune 500? Then the cards are stacked against you. At the very least, be aware of your link profile — I wouldn’t go so far as to preemptively disavow unless something major popped up.

Jim: I’ve done a preemptive disavow for my site. I’d say everyone should do a preemptive disavow to clean out the crap backlinks.

Joe: I can’t wait to launch an avow service…basically go around to everyone and charge a few thousand dollars to clean up their disavows. :)

Jim: We should team up Joe and do them together :)

Joe: I’ll have my spambots call your spambots.

Jim: saving the planet from penguin penalties. cleaning up the links of the web for Google.

Joe: For Google or from Google? :) The other dig, if there’s time, is that not all penalties are created equal, because there are several books of law in terms of how long a penalty might last. If I take an unknown site and do what RapGenius did, I’d still be waiting, even after fixing things (which RapGenius really didn’t do), largely because Google is not one of my direct or indirect investors.

Perhaps SEOs will soon offer a service for perfecting your pitch deck for the Google Ventures or Google Capital teams so it is easier to BeatThatPenalty? BanMeNot ;)

Joe: Or to extract money from former Googlers…there’s a funding bubble right now where those guys can write their own ticket by VCs chasing the brand. Sure the engineer was responsible for changing the font color of a button, but they have friends on the inside still that might be able to reverse catastrophe.

Outside of getting a Google investment, what are some of the best ways to minimize SEO risk if one is entering a competitive market?

Jim: Don’t try to rank for specific phrases anymore. It’s a long slow road now.

Joe: Being less dependent on Google gives you power; think of it like a job interview. Do you need that job? The less you need it, the more bargaining power you have. If you have more and more income coming into your site from other channels, chances are you are also hitting on some important brand signals.

Jim: You must create great things, and build your brand…that has to be the focus…unless you want to do things to rank higher quicker, and take the risk of a penalty with Google.

Joe: Agreed. I do far fewer premium domaining + SEO-only plays anymore. For a while they worked; just a different landscape now.

Some (non-link builders) mention how foolish SEOs are for wasting so many thought cycles on links. Why are core content, user experience, and social media all vastly more important than link building?

Jim: Links are still the biggest part of the Google algorithm; they cannot be ignored. People must have things going on that will get them mentions across the web, and ideally some links as well. Links are still #1 today…but yes, after links, you need great content, good user experience, and more.

Joe: CopyPress sells content (please buy some content people; I have three kids to feed here), however it is important to point out that the most incredible content doesn’t mean anything in a vacuum. How are you going to get a user experience with 0 users? Link building, purchasing traffic, DRIVING attention are crucial not just to SEO but to marketing in general. Google is using links as votes; while the variability has changed and evolved over time, it is still very much there. I don’t see it going away in the next year or two.

An analogy: I wrote two books of poetry in college; I think they are ok, but I never published them and tried to get any attention, so how good are they really? Without promotion and amplification, we’re all just guessing.

Thanks guys for sharing your time & wisdom!


About our contributors:

Jim Boykin is the Founder and CEO of Internet Marketing Ninjas, and owner of Webmasterworld.com, SEOChat.com, Cre8asiteForums.com and other community websites. Jim specializes in creating digital assets for sites that attract natural backlinks, and in analyzing links to disavow non-natural links for penalty recoveries.

Joe Sinkwitz, known as Cygnus, is current Chief of Revenue for CopyPress.com. He enjoys long walks on the beach, getting you the content you need, and then whispering in your ear how to best get it ranking.


SEO Book

Posted in IM News | Comments Off

Disavow & Link Removal: Understanding Google

Fear Sells

Few SEOs took notice when Matt Cutts mentioned on TWIG that “breaking their spirits” was essential to stopping spammers. But that single piece of information adds layers of insight around things like:

  • duplicity on user privacy on organic versus AdWords
  • benefit of the doubt for big brands versus absolute apathy toward smaller entities
  • the importance of identity versus total wipeouts of those who are clipped
  • mixed messaging on how to use disavow & the general fear around links

From Growth to No Growth

Some people internalize failure when growth slows or stops. One can’t raise venture capital and keep selling the dream of the growth story unless the blame is internalized. If one understands that another dominant entity (a monopoly) is intentionally subverting the market, then a feel-good belief in the story of unlimited growth flames out.

Most of the growth in the search channel is being absorbed by Google. In RKG’s Q4 report they mentioned that mobile ad clicks were up over 100% for the year & mobile organic clicks were only up 28%.

Investing in Fear

There’s a saying in investing that “genius is declining interest rates,” but when the rates reverse, the cost of that additional leverage surfaces. Risks from years ago that didn’t really matter suddenly do.

The same is true with SEO. A buddy of mine mentioned getting a bad link example from Google where the link was in place longer than Google has been in existence. Risk can arbitrarily be added after the fact to any SEO activity. Over time Google can keep shifting the norms of what is acceptable. So long as they are fighting off WordPress hackers and other major issues they are kept busy, but when they catch up on that stuff they can then focus on efforts to shift white to gray and gray to black – forcing people to abandon techniques which offered a predictable positive ROI.

Defunding SEO is an essential & virtuous goal.

Hiding data (and then giving crumbs of it back to profile webmasters) is one way of doing it, but adding layers of risk is another. What panda did to content was add a latent risk to content where the cost of that risk in many cases vastly exceeded the cost of the content itself. What penguin did to links was the same thing: make the latent risk much larger than the upfront cost.

As Google dials up their weighting on domain authority many smaller sites which competed on legacy relevancy metrics like anchor text slide down the result set. When they fall down the result set, many of those site owners think they were penalized (even if their slide was primarily driven by a reweighting of factors rather than an actual penalty). Since there is such rampant fearmongering on links, they start there. Nearly every widely used form of link building has been promoted by Google engineers as being spam.

  • Paid links? Spam.
  • Reciprocal links? Spam.
  • Blog comments? Spam.
  • Forum profile links? Spam.
  • Integrated newspaper ads? Spam.
  • Article databases? Spam.
  • Designed by credit links? Spam.
  • Press releases? Spam.
  • Web 2.0 profile & social links? Spam.
  • Web directories? Spam.
  • Widgets? Spam.
  • Infographics? Spam.
  • Guest posts? Spam.

It doesn’t make things any easier when Google sends out examples of spam links which are sites the webmaster has already disavowed or sites which Google explicitly recommended in their webmaster guidelines, like DMOZ.

It is quite the contradiction where Google suggests we should be aggressive marketers everywhere EXCEPT for SEO & basically any form of link building is far too risky.

It’s a strange world where when it comes to social media, Google is all promote promote promote. Or even in paid search, buy ads, buy ads, buy ads. But when it comes to organic listings, it’s just sit back and hope it works, and really don’t actively go out and build links, even those are so important. – Danny Sullivan

Google is in no way a passive observer of the web. Rather they actively seek to distribute fear and propaganda in order to take advantage of the experiment effect.

They can find and discredit the obvious, but most on their “spam list” done “well” are ones they can’t detect. So, it’s easier to have webmasters provide you a list (disavows), scare the ones that aren’t crap sites providing the links into submission and damn those building the links as “examples” – dragging them into town square for a public hanging to serve as a warning to anyone who dare disobey the dictatorship. – Sugarrae

This propaganda is so effective that email spammers promoting “SEO solutions” are now shifting their pitches from growing your business with SEO to recovering your lost traffic.

Where Do Profits Come From?

I saw Rand tweet this out a few days ago…

… and thought “wow, that couldn’t possibly be any less correct.”

When ecosystems are stable you can create processes which are profitable & pay for themselves over the longer term.

I very frequently get the question: ‘what’s going to change in the next 10 years?’ And that is a very interesting question; it’s a very common one. I almost never get the question: ‘what’s not going to change in the next 10 years?’ And I submit to you that that second question is actually the more important of the two – because you can build a business strategy around the things that are stable in time….in our retail business, we know that customers want low prices and I know that’s going to be true 10 years from now. They want fast delivery, they want vast selection. It’s impossible to imagine a future 10 years from now where a customer comes up and says, ‘Jeff I love Amazon, I just wish the prices were a little higher [or] I love Amazon, I just wish you’d deliver a little more slowly.’ Impossible. And so the effort we put into those things, spinning those things up, we know the energy we put into it today will still be paying off dividends for our customers 10 years from now. When you have something that you know is true, even over the long-term, you can afford to put a lot of energy into it. – Jeff Bezos at re: Invent, November, 2012

When ecosystems are unstable, anything approaching boilerplate has an outsized risk added by the dominant market participant. The quicker your strategy can be done at scale or in the third world, the quicker Google shifts it from a positive to a negative ranking signal. It becomes much harder to train entry level employees on the basics when some of the starter work they did in years past now causes penalties. It becomes much harder to manage client relationships when their traffic spikes up and down, especially if Google sends out rounds of warnings they later semi-retract.

What’s more, anything that is vastly beyond boilerplate tends to require a deeper integration and a higher level of investment – making it take longer to pay back. But the budgets for such engagement dry up when the ecosystem itself is less stable. Imagine the sales pitch, “I realize we are off 35% this year, but if we increase the budget 500% we should be in a good spot a half-decade from now.”

All great consultants aim to do more than the bare minimum in order to give their clients a sustainable competitive advantage, but by removing things which are scalable and low risk Google basically prices out the bottom 90% to 95% of the market. Small businesses which hire an SEO are almost guaranteed to get screwed because Google has made delivering said services unprofitable, particularly on a risk-adjusted basis.

Being an entrepreneur is hard. Today Google & Amazon are giants, but it wasn’t always that way. Add enough risk and those streams of investment in innovation disappear. Tomorrow’s Amazon or Google of other markets may die a premature death. You can’t see what isn’t there until you look back from the future – just like the answering machine AT&T held back from public view for decades.

Meanwhile, the Google Venture backed companies keep on keeping on – they are protected.

When ad agencies complain about the talent gap, what they are really complaining about is paying people what they are worth. But as the barrier to entry in search increases, independent players die, leaving more SEOs to chase fewer corporate jobs at lower wages. Even companies servicing fortune 500s are struggling.

On an individual basis, creating value and being fairly compensated for the value you create are not the same thing. Look no further than companies like Google & Apple which engage in flagrantly illegal anti-employee cartel agreements. These companies “partnered” with their direct competitors to screw their own employees. Even if you are on a winning team it does not mean that you will be a winner after you back out higher living costs and such illegal employer agreements.

“This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead?” – William K Black

Meanwhile, complaints about the above sorts of inequality or other forms of asset stripping are pitched as being aligned with Nazi Germany’s treatment of Jews. Obviously we need more H-1B visas to further drive down wages even as graduates are underemployed with a mountain of debt.

A Disavow For Any (& Every) Problem

Removing links is perhaps the single biggest growth area in SEO.

Just this week I got an unsolicited email from an SEO listing directory:

We feel you may qualify for a Top position among our soon to be launched Link Cleaning Services Category and we would like to learn more about Search Marketing Info. Due to the demand for link cleaning services we’re poised to launch the link cleaning category. I took a few minutes to review your profile and felt you may qualify. Do you have time to talk this Monday or Tuesday?

Most of the people I interact with tend to skew toward the more experienced end of the market. Some of the folks who join our site do so after their traffic falls off. In some cases the issues look intimately tied to Panda & the sites with hundreds of thousands of pages maybe only have a couple dozen inbound links. In spite of having few inbound links & us telling people the problem looks to be clearly aligned with Panda, some people presume that the issue is links & they still need to do a disavow file.

Why do they make that presumption? It’s the fear message Google has been selling nonstop for years.

Punishing people is much different, and dramatic, from not rewarding. And it feeds into the increasing fear that people might get punished for anything. – Danny Sullivan

What happens when Google hands out free all-you-can-eat gummy bear laxatives to children at the public swimming pool? A tragedy of the commons.

Rather than questioning or countering the fear stuff, the role of the SEO industry has largely been to act as lap dogs, syndicating & amplifying the fear.

  • link tool vendors want to sell proprietary clean up data
  • SEO consultants want to tell you that they are the best and if you work with someone else there is a high risk hidden in the low price
  • marketers who crap on SEO to promote other relabeled terms want to sell you on the new term and paint the picture that SEO is a self-limiting label & a backward looking view of marketing
  • paid search consultants want to enhance the perception that SEO is unreliable and not worthy of your attention or investment

Even entities with a 9 figure valuation (and thus plenty of resources to invest in a competent consultant) may be incorrectly attributing SEO performance problems to links.

A friend recently sent me a link removal request from Buy Domains referring to a post which linked to them.

On the face of this, it’s pretty absurd, no? A company which does nothing but trade in names themselves asks that their name reference be removed from a fairly credible webpage recommending them.

The big problem for Buy Domains is not backlinks. They may have had an issue with some of the backlinks from PPC park pages in the past, but now those run through a redirect and are nofollowed.

Their big issue is that they have less than great engagement metrics (as do most marketplace sites other than eBay & Amazon which are not tied to physical stores). That typically won’t work if the entity has limited brand awareness coupled with having nearly 5 million pages in Google’s index.

They not only have pages for each individual domain name, but they link to their internal search results from their blog posts & those search pages are indexed. Here’s part of a recent blog post:

And here are examples of the thin listing sorts of pages which Panda was designed in part to whack. These pages were among the millions indexed in Google.

A marketplace with millions of pages that doesn’t have broad consumer awareness is likely to get nailed by Panda. And the websites linking to it are likely to end up in disavow files, not because they did anything wrong but because Google is excellent at nurturing fear.

What a Manual Penalty Looks Like

Expedia saw a 25% decline in search visibility due to an unnatural links penalty, causing their stock to fall 6.4%. Both Google & Expedia declined to comment. It appears that the eventual Expedia undoing stemmed from Hacker News feedback & coverage of an outing story on an SEO blog that certainly sounded like an extortion attempt. USA Today asked if the Expedia campaign was a negative SEO attack.

While Expedia’s stock drop was anything but trivial, they will likely recover within a week to a month.

Smaller players can wait and wait and wait and wait … and wait.

Manual penalties are no joke, especially if you are a small entity with no political influence. The impact of them can be absolutely devastating. Such penalties are widespread too.

In Google’s busting bad advertising practices post they highlighted having zero tolerance, banning more than 270,000 advertisers, removing more than 250,000 publishers accounts, and disapproving more than 3,000,000 applications to join their ad network. All that was in 2013 & Susan Wojcicki mentioned Google having 2,000,000 sites in their display ad network. That would mean that something like 12% of their business partners were churned last year alone.

If Google’s churn is that aggressive with their own partners (where Google has an economic incentive to maintain the relationship), imagine how much broader the churn is across the wider web. In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & they get about 5,000 reconsideration request messages each week (roughly 21,700 a month, or around 5% of the actions), so over 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time.

The Disavow Threat

Originally when disavow was launched it was pitched as something to be used with extreme caution:

This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.
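For reference, the disavow file itself is just a plain-text list, one entry per line, uploaded through the disavow tool (the domains below are placeholders):

```text
# Example disavow file. Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:spam-directory-example.com
# Disavow links from a single page:
http://link-network-example.com/paid-links.html
```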

Recently, Matt Cutts has encouraged broader usage. He has one video which discusses proactively disavowing bad links as they come in & another where he mentioned how a large company disavowed 100% of the backlinks that came in for a year.

The idea of proactively monitoring your backlink profile is quickly becoming mainstream – yet another recurring fixed cost center in SEO with no upside to the client (unless you can convince the client that SEO is unstable and they should be afraid – which would ultimately retard their long-term investment in SEO).

Given the harshness of manual actions & algorithms like Penguin, they drive companies to desperation, acting irrationally based on fear.

People are investing to undo past investments. It’s sort of like riding a stock down 60%, locking in the losses by selling it, and then using the remaining 40% of the money to buy put options or short sell the very same stock. :D

Some companies are so desperate to get links removed that they “subscribe” sites that linked to them organically with spam email messages asking the links be removed.

Some go so far that they not only email you on and on, but they created dedicated pages on their site claiming that the email was real.

What’s so risky about the above is that many webmasters will remove links sight unseen, even from an anonymous Gmail account. Mix in the above sort of “this message is real” stuff, and how easy would it be for a competitor to target all your quality backlinks with a “please remove my links” message? Further, how easy would it be for a competitor aware of such a campaign to drop a few hundred dollars on Fiverr or Xrummer or other similar link sources, building up your spam links while removing your quality links?

A lot of the “remove my link” messages are based around lying to the people who are linking & telling them that the outbound link is harming them as well: “As these links are harmful to both yours and our business after penguin2.0 update, we would greatly appreciate it if you would delete these backlinks from your website.”

Here’s the problem though. Even if you spend your resources and remove the links, people will still likely add your site to their disavow file. I saw a YouTube video recording of an SEO conference where 4 well-known SEO consultants mentioned that even if the links get removed you should “go ahead and disavow anyhow,” so there is absolutely no upside for publishers in removing links.

How Aggregate Disavow Data Could Be Used

Recovery is by no means guaranteed. In fact, of the people who go to the trouble to remove many links & create a disavow file, only 15% claim to have seen any benefit.

The other 85% who weren’t sure of any benefit may not have only wasted their time, but they may have moved some of their other projects closer toward being penalized.

Let’s look at the process:

  • For the disavow to work you also have to have some links removed.

    • Some of the links that are removed may not have been the ones that hurt you in Google, thus removing them could further lower your rank.
    • Some of the links you have removed may be the ones that hurt you in Google, while also being ones that helped you in Bing.
    • The Bing & Yahoo! Search traffic hit comes immediately, whereas the Google recovery only comes later (if at all).
  • Many forms of profit (from client services or running a network of sites) come from systematization. If you view everything that is systematized or scalable as spam, then you are not only disavowing to try to recover your penalized site, but you are sending co-citation disavow data to Google which could have them torch other sites connected to those same sources.
    • If you run a network of sites & use the same sources across your network and/or cross link around your network, you may be torching your own network.
    • If you primarily do client services & disavow the same links you previously built for past clients, what happens to the reputation of your firm when dozens or hundreds of past clients get penalized? What happens if a discussion thread on Google Groups or elsewhere starts up where your company gets named & then a tsunami of pile-on stuff fills out the thread? Might that be brand-destroying?

The disavow and review process is not about recovery, but is about collecting data and distributing pain in a game of one-way transparency. Matt has warned that people shouldn’t lie to Google…

…however Google routinely offers useless non-information in their responses.

Some Google webmaster messages leave a bit to be desired.

Recovery is uncommon. Your first response from Google might take a month or more. If you spend a week or two on cleanup & the response takes a month, the penalty has already lasted at least six weeks. And that first response might be something like this:

Reconsideration request for site.com: Site violates Google’s quality guidelines

We received a reconsideration request from a site owner for site.com/.

We’ve reviewed your site and we believe that site.com/ still violates our quality guidelines. In order to preserve the quality of our search engine, pages from site.com/ may not appear or may not rank as highly in Google’s search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines.

For more specific information about the status of your site, visit the Manual Actions page in Webmaster Tools. From there, you may request reconsideration of your site again when you believe your site no longer violates the quality guidelines.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum.

Absolutely useless.

Zero useful information whatsoever.

As people are unsuccessful in the recovery process they cut deeper & deeper. Some people have removed over 90% of their link profile without recovering & been nearly half a year into the (12-step) “recovery” process before even getting a single example of a bad link from Google. In some cases the bad links Google identified were obviously created by third-party scraper sites & were not in Google’s original sample of links to look at (so even if you looked at every single link they showed you & cleaned up 100% of the issues, you would still be screwed).

Another issue with aggregate disavow data is that there is a lot of ignorance in the SEO industry in general, and people who try to do things cheaply (essentially free) at scale have an outsized footprint in the aggregate data. For instance, our site’s profile links are nofollowed & our profiles are not indexed by Google. In spite of this, examples like the one below are associated with not 1 but 3 separate profiles for a single site.

Our site only has about 20,000 to 25,000 unique linking domains. However, over the years we have had well over a million registered user profiles. If only 2% of those registered profiles belonged to ignorant spammers who spammed our profile pages and then later added our site to a disavow file – 2% of a million is 20,000, roughly as many as our entire count of unique linking domains – we would have more people voting *against* our site than we have voting for it. And that wouldn’t be because we did anything wrong, but because Google is fostering an environment of mixed messaging, fear & widespread ignorance.

And if we are ever penalized, the hundreds of scraper sites built off scraping our RSS feed would make the recovery process absolutely brutal.

Another consequence of Google saying “you haven’t cut out enough bone marrow yet” while suggesting that virtually any/every type of link is spam is that there will be a lot of other false positives in the aggregate data.

I know some companies specializing in link recovery which base some aspects of their disavows on a site’s ranking footprint. If you get a manual penalty, a Panda penalty, or your site gets hacked, then those sorts of companies may re-confirm that your site deserves to be penalized (on a nearly automated basis, with little to no thought) based on the fact that it is already penalized. Good luck recovering from that as Google folds in aggregate disavow data to justify further penalties.

Responsibility

All large ecosystems are gamed. We see it with app ratings & reviews, stealth video marketing, advertising, malware installs, and of course paid links.

Historically in search there has been the view that you are responsible for what you have done, but not for the actions of others. The alternate roadmap would lead to this sort of insanity:

Our system has noticed that in the last week you received 240 spam emails. As a result, your email account has been temporarily suspended. Please contact the spammers, and once you have proof that they have unsubscribed you from their spam databases, we will consider reopening your email account.

As Google has closed down their own ecosystem, they allow their own $0 editorial to rank front & center even if it is pure spam, but third parties are now held to a higher standard – you could be held liable for the actions of others.

At the extreme, one of Google’s self-promotional automated email spam messages sent a guy to jail. In spite of such issues, Google remains unfazed, adding a setting which allows anyone on Google+ to email other members.

Ask Google if they should be held liable for the actions of third parties and they will tell you to go to hell. Their approach to copyright remains fuzzy, they keep hosting more third-party content on their own sites, and even when that content has been deemed illegal they scream that being made to proactively filter undermines their First Amendment rights:

Finally, they claimed they were defending free speech. But it’s the courts which said the pictures were illegal and should not be shown, so the issue is the rule of law, not freedom of speech.

the non-technical management, particularly in the legal department, seems to be irrational to the point of becoming adolescent. It’s almost as if they refuse to do something entirely sensible, and which would save them and others time and trouble, for no better reason than that someone asked them to.

Monopolies with nearly unlimited resources shall be held liable for nothing.

Individuals with limited resources shall be liable for the behavior of third parties.

Google Duplicity (beta).

Torching a Competitor

As people have become more acclimated to link penalties, a variety of tools have been created to help make sorting through the bad links easier.

“There have been a few tools coming out on the market since the first Penguin – but I have to say that LinkRisk wins right now for me on ease of use and intuitive accuracy. They can cut the time it takes to analyse and root out your bad links from days to minutes…” – Dixon Jones

But as more tools have been created for sorting out bad links & more tools have been created to automate sending link removal emails, two things have happened:

  • Google is demanding more links be removed to allow for recovery.

  • People are becoming less responsive to link removal requests as they get bombarded with them.
    • Some of these tools keep bombarding people over and over again, weekly, until the link is removed or the emails go to the spam bin.
    • To many people the link removal emails are the new link request emails ;)
    • One highly trusted publisher who participates in our forums stated they filtered the word “disavow” to automatically go to their trash bin.
    • On WebmasterWorld a member decided it was easier to delete their site than deal with the deluge of link removal spam emails.

The problem with Google rewarding negative signals is that there are false positives & it is far cheaper to kill a business than to build one. The technically savvy teenager who created the original version of the software used in the Target PoS attack sold the code for only $2,000.

There have been some idiotic articles, like this one on The Awl, suggesting that comment spamming is now dead as spammers run for the hills, but that couldn’t be further from the truth. Some (not particularly popular) blogs get hundreds to thousands of spam comments daily & WordPress can have trouble even backing up the database (unless the comment spam is regularly deleted), as it can quickly grow past a million records.

The spam continues but the targets change. A lot of these comments are now pointed at YouTube videos rather than ordinary websites.

As Google keeps leaning into negative signals, one can expect a greater share of spam links to be created for negative SEO purposes.

Maybe this maternity jeans comment spam is tied to the site owner, but if they didn’t do it, how do they prove it?

Once again, I’ll reiterate Bill Black:

“This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead?” – William K Black

The cost of “an academic test” can be as low as $5. You know you might be in trouble when you see fiverr.com/conversations/theirusername in your referrers:

Our site was hit with negative SEO. We have manually collected about 24,000 bad links for our disavow file (so far). It probably cost the perp $5 on Fiverr to point these links at our site. Do you want to know how bad that sucks? I’ll tell you. A LOT!! Google should be sued enmass by web masters for wasting our time with this “bad link” nonsense. For a company with so many Ph.D’s on staff, I can’t believe how utterly stupid they are

Or, worse yet, you might see SAPE in your referrers

And if the attempt to get you torched fails, they can try & try again. The cost of failure is essentially zero. They can keep pouring on the fuel until the fire erupts.

Even Matt Cutts complains about website hacking, but that doesn’t mean you are free of risk if someone else links to your site from hacked blogs. I’ve been forwarded unnatural link messages from Google which came about after a person’s site was added to a SAPE hack by a third party in an attempt to conceal who the beneficial target was. When in doubt, Google may choose to blame all parties in a scorched-earth strategy.

If you get one of those manual penalties, you’re screwed.

Even if you are not responsible for such links, and even if you respond the same day, and even if Google believes you, you are still likely penalized for AT LEAST a month. More likely Google will presume you are a liar and you will spend at least a second month in the penalty box. To recover you might have to waste days (weeks?) of your life & remove some of your organic links to show that you have gone through sufficient pain to appease the abusive market monopoly.

As bad as the above is, it is just the tip of the iceberg.

  • People can redirect torched websites.
  • People can link to you from spam link networks which rotate links across sites, so you can’t possibly remove or even disavow all the link sources.
  • People can order you a subscription of those rotating spam links from hacked sites, where new spam links appear daily. Google mentioned discovering 9,500 malicious sites daily & surely the number has only increased since then.
  • People can tie any/all of the above with cloaking links or rel=canonical messages to GoogleBot & then potentially chain that through further redirects cloaked to GoogleBot.
  • And on and on … the possibilities are endless.

Extortion

Another thing this link removal fiasco subsidizes is various layers of extortion.

Not only are there the harassing emails threatening to add sites to disavow lists if they don’t remove the links, but some companies quickly escalate things from there. I’ve seen hosting abuse, lawyer threat letters, and one friend was actually sued in court (and the people who sued him actually had the link placed!)

Google created a URL removal tool which allows webmasters to remove pages from third-party websites. How long until that is coupled with DDoS attacks? Once effective in removing one page, a competitor might decide to remove another.

Another approach to get links removed is to offer payment. But payment itself might encourage the creation of further spammy links as link networks look to replace their old cashflow with new sources.

The recent Expedia fiasco started as an extortion attempt: “If I wanted him to not publish it, he would ‘sell the post to the highest bidder.’”

Another nasty issue here is articles like this one on Link Research Tools, where they not only highlight client lists of particular firms, but then state which URLs have not yet been penalized, followed by “most likely not yet visible.” So long as that sort of “publishing” is acceptable in the SEO industry, you can bet that some people will hire the SEOs most likely to earn a penalty to work on their competitors’ sites, while having an employee write a “case study” for Link Research Tools. Is this the sort of bullshit we really want to promote?

Some folks are now engaging in overt extortion:

I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn’t pay him £10 per month to NOT do this.

Branding / Rebranding / Starting Over

Sites that are overly literal in branding likely have no chance at redemption. A triple-hyphenated domain name in a market that is seen as spammy has zero chance of recovery.

Even being a generic unbranded site in a YMYL category can make you be seen as spam. The remote rater documents stated that the following site was spam…

… even though the spammiest thing on it was the stuff advertised in the AdSense ads:

For many (most?) people who receive a manual link penalty or are hit by Penguin it is going to be cheaper to start over than to clean up.

At the very minimum it can make sense to lay the groundwork for a new project immediately, just in case the old site can’t recover or takes nearly a year to recover. However, even if you figure out the technical bits, as soon as you have any level of success (or as soon as you connect your projects together in any way) you once again become a target.

And you can’t really invest in higher-level branding functions unless you think the site is going to be around for many years to earn back the sunk cost.

Succeeding at SEO is not only about building rank while managing cashflow and staying unpenalized, but it is also about participating in markets where you are not marginalized due to Google inserting their own vertical search properties.

Even companies which are large and well funded may not succeed with a rebrand if Google comes after their vertical from the top down.

Hope & Despair

If you are a large partner affiliated with Google, hope is on your side & you can monetize the link graph: “By ensuring that our clients are pointing their links to maximize their revenue, we’re not only helping them earn more money, but we’re also stimulating the link economy.”

You have every reason to be Excited, as old projects like Excite or Merchant Circle can be relaunched again and again.

Even smaller players with the right employer or investor connections are exempt from these arbitrary risks.

You can even be an SEO and start a vertical directory knowing you will do well if you can get that Google Ventures investment, even as other similar vertical directories were torched by Panda.

For most other players in that same ecosystem, the above tailwind is a headwind. Don’t expect much 1 on 1 help in webmaster tools.

In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & receives about 5,000 reconsideration requests each week. Since 5,000 a week works out to roughly 21,700 a month – just over 5% of 400,000 – more than 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time. How many confirmed Penguin 1.0 recoveries are you aware of?

Even if a recovery is deserved, it does not mean one will happen, as errors do happen. And on the off chance recovery happens, recovery does not mean a full restoration of rankings.

There are many things we can learn from Google’s messages, but probably the most important is this:

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way – in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only. – Charles Dickens, A Tale of Two Cities

SEO Book

Posted in IM NewsComments Off

Digital Marketing: Understanding customer sentiment

Metrics should be a large part of marketers’ lives, as they can provide valuable analysis of their customers. Read how IBM uses consumer sentiment analytics to segment customers and prospects by their value and engagement within concentric circles.
MarketingSherpa Blog

Posted in IM NewsComments Off

Understanding How Your Marketing Analytics Gives Credit for Conversions


When chatting with marketers, one of the most common questions we hear at HubSpot concerns “first touch” versus “last touch” attribution in marketing analytics. First touch, last touch, and assist reports are different ways to attribute conversions on your website, and each of these attribution methods tells you something different and important about the effectiveness of your marketing and the behavior of your visitors.

The following guide will help you understand the difference between “last touch,” “first touch,” and “assists” attribution, as well as give you a sense of the primary use cases for each approach. As a wise man once said, you should always give credit where credit is due!

What Are ‘Attributions’ in Marketing Analytics?

Before we begin, first a definition …

‘Attribution’ is a way of understanding which marketing channels or campaigns contributed to a conversion on your website. In HubSpot software, for example, you’ll notice that our Marketing Analytics tools report on the number of leads and customers generated through various marketing efforts — that information is what you’d call an attribution. But because a lead’s or customer’s lifecycle with your company is made up of a number of different interactions, there are multiple ways to report on attribution. Understanding how attribution works will help you understand which of your marketing efforts are actually generating results.
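
To make the idea concrete, here is a minimal sketch in Python of how a lead’s history can be modeled. The Touch class and the sample journey below are invented for illustration – this is not HubSpot’s actual data model:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Touch:
        """One interaction a visitor had with your site (hypothetical model)."""
        channel: str        # e.g. "organic_search", "social", "email"
        page: str           # the URL the visitor landed on
        timestamp: datetime

    # A single hypothetical lead's journey, oldest touch first. The attribution
    # question is: which of these touches gets credit for the eventual conversion?
    journey = [
        Touch("organic_search", "/blog/some-post", datetime(2014, 1, 2)),
        Touch("social", "/photo-gallery", datetime(2014, 1, 9)),
        Touch("email", "/landing/ebook-offer", datetime(2014, 1, 15)),
    ]

Each attribution method described below is simply a different rule for picking one or more touches out of that list.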

Now that we’ve gotten that out of the way, let’s discuss the different attribution methods that can be used in your marketing analytics.

Last Touch Attribution

Most analytics packages, including Google Analytics, use last touch attribution as their main method of reporting. Last touch data shows you the most recent interaction a lead had on your website before converting.
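
The underlying logic is trivial. Continuing the hypothetical journey sketch above, last touch attribution just picks the most recent interaction:

    def last_touch(journey):
        """Credit the conversion to the most recent touch before converting."""
        return max(journey, key=lambda t: t.timestamp)

    print(last_touch(journey).channel)  # -> "email", the touch right before conversion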

When It’s Useful

As its name suggests, last touch reporting is useful for determining what happened right before your leads converted. If we were presenting last touch data for a given soccer game, for example, it would attribute the winning goal to whoever kicked the ball into the net. Last touch analytics, therefore, is often a good measure of the effectiveness of different landing pages, email campaigns, or other efforts that tend to lead to a direct conversion. What it doesn’t tell you, however, is anything about what led up to that conversion. So, to extend the same soccer analogy, it wouldn’t give credit to the defender who made the great forward pass that made the goal possible.

HubSpot’s Landing Page Analytics report (pictured below), for instance, uses last touch attribution to help marketers evaluate which landing pages were most effective at generating leads and customers. Looking at first touch attribution for the two customers who converted on the Introduction to Business Blogging ebook offer, however, would show marketers an entirely different view. 

[Image: HubSpot Landing Page Analytics report]

First Touch Attribution

First touch attribution answers the question, “How did this lead or customer originally find you?” What brought him or her across your digital doorstep for the very first time? In HubSpot software, for example, first touch attribution is used in the Sources report, which shows marketers a breakdown of which channels brought in leads and customers in a given time frame.

(Note: Google Analytics doesn’t have a first touch attribution report out of the box, but if you are tech-savvy, Will Critchlow of Distilled put together some helpful instructions on how to adapt Google Analytics with a bit of JavaScript to show first touch attribution.)
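
Back in the hypothetical Python sketch from earlier, first touch attribution simply looks at the other end of the timeline:

    def first_touch(journey):
        """Credit the conversion to the touch that originally brought the visitor in."""
        return min(journey, key=lambda t: t.timestamp)

    print(first_touch(journey).channel)  # -> "organic_search", how the lead found you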

When It’s Useful

First touch attribution is useful for evaluating the effectiveness of different channels at generating website visitors and leads. Often, first touch reveals valuable, closed-loop ROI information for channels that are traditionally difficult to measure, like social media or search. Below, you can see that organic search brought us at HubSpot more than 1,400 leads and one customer since the beginning of the month. That one customer may not have purchased our software the very first time he or she visited us through search, but it was search that brought the customer in originally, so through first touch attribution, search is credited with bringing in that customer. 

[Image: HubSpot Sources report]

Assists Attribution

If first touch attribution shows you how a lead originally came across your website, and last touch attribution shows you the final interaction that triggered a conversion, I bet you can guess what assists attribution reveals. Marketers use assists reporting to identify the pages that were viewed throughout the lifecycle of people who ended up converting.

(Note: Different analytics platforms handle assists reporting in different ways. Google’s multi-channel funnels detail assisting interactions in the 30 days prior to a conversion. HubSpot’s Conversion Assists version, pictured below, shows you the web pages, blog articles, and landing pages that were most commonly viewed by people who ended up converting as leads or customers.)
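
Continuing the same hypothetical sketch, assists are everything between the first and last touch. Real products layer their own windows and filters on top of this, so treat it as the core idea only:

    def assists(journey):
        """Touches that influenced the conversion without starting or finishing it."""
        ordered = sorted(journey, key=lambda t: t.timestamp)
        return ordered[1:-1]

    print([t.page for t in assists(journey)])  # -> ["/photo-gallery"]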

When It’s Useful

Just because a page wasn’t the first page people saw or the final page they viewed before converting or buying, doesn’t mean it was insignificant in their decision-making process. Assists reports can help you identify and optimize influential pages on your site, and we’ve actually written an in-depth article about how assist reports can help marketers do this.

Ultimately, you’ll want to use an assist report for insight into the middle of your marketing funnel. For example, Olympia Steel Buildings, a HubSpot customer, used assists data to find that a photo gallery of its pre-engineered steel buildings was influential to a sizeable number of people who ended up converting into leads. Armed with that information, Olympia Steel made that gallery easier to find by integrating it into their homepage navigation and including it in their lead nurturing emails. Below is another example of HubSpot’s own Conversion Assists report and some valuable information our own marketing team could glean from assists data:

[Image: HubSpot Conversion Assists report]

Which Attribution Method Does Your Marketing Analytics Software Use?

Because you can slice marketing data a number of different ways, it can sometimes be difficult to understand exactly what you’re measuring. The best approach to marketing analytics is to start with a question. Determine what it is you want to know, and then find the attribution method and analytics report that will get you the closest to the answer.

If you’re not sure how your marketing analytics provider handles attribution, make it a priority to find out. As you’ve seen in this post, HubSpot’s analytics tools leverage different attribution reporting methods depending on the goals of its various reports. Your analytics package might do things differently. Either way, it behooves you to know how your analytics reports attribution so you can fully understand the data you’re gathering from your marketing efforts.

Image Credit: A6U571N



HubSpot’s Inbound Internet Marketing Blog

Posted in IM NewsComments Off
