Tag Archive | "Graph"

Yoast SEO 11.3 lets you add an image of a person to its structured data graph

In its update notes, Yoast reminds us that sites running WordPress versions below 5.2 may no longer be supported.

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

How Mobile-First Indexing Disrupts the Link Graph

Posted by rjonesx.

It’s happened to all of us. You bring up a webpage on your mobile device, only to find out that a feature you were accustomed to using on desktop simply isn’t available on mobile. Frustrating as it is, it has always been a struggle for web developers and designers alike to simplify and condense their site on mobile screens without stripping features or content that would otherwise clutter a smaller viewport. The worst-case scenario for these trade-offs is that some features would be reserved for desktop environments, or perhaps a user might be able to opt out of the mobile view. Below is an example of how my personal blog displays the mobile version using a popular plugin by ElegantThemes called HandHeld. As you can see, the page is heavily stripped down and is far easier to read… but at what cost? And at what cost to the link graph?

My personal blog drops 75 of the 87 links, and all of the external links, when the mobile version is accessed. So what happens when the mobile versions of sites become the primary way the web is accessed, at scale, by the bots which power major search engines?

Google’s announcement to proceed with a mobile-first index raises new questions about how the link structure of the web as a whole might be influenced once these truncated web experiences become the first (and sometimes only) version of the web Googlebot encounters.

So, what’s the big deal?

The concern, which no doubt Google engineers have studied internally, is that mobile websites often remove content and links in order to improve user experience on a smaller screen. This abbreviated content fundamentally alters the link structure which underlies one of the most important factors in Google’s rankings. Our goal is to try and understand the impact this might have.

Before we get started, one giant unknown variable which I want to be quick to point out is we don’t know what percentage of the web Google will crawl with both its desktop and mobile bots. Perhaps Google will choose to be “mobile-first” only on sites that have historically displayed an identical codebase to both the mobile and desktop versions of Googlebot. However, for the purposes of this study, I want to show the worst-case scenario, as if Google chose not only to go “mobile-first,” but in fact to go “mobile-only.”

Methodology: Comparing mobile to desktop at scale

For this brief research, I decided to grab 20,000 random websites from the Quantcast Top Million. I would then crawl two levels deep, spoofing both the Google mobile and Google desktop versions of Googlebot. With this data, we can begin to compare how different the link structure of the web might look.
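As a rough sketch of how this kind of dual crawl works, the snippet below fetches a page twice while spoofing different User-Agent headers and collects the links each “bot” sees. The user-agent strings are illustrative (Google’s smartphone UA has changed over the years), and the URL in the usage comment is a placeholder:

```python
import urllib.request
from html.parser import HTMLParser

# Illustrative user-agent strings; treat them as placeholders, not
# canonical values, since Google's smartphone UA has changed over time.
DESKTOP_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
             "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

class LinkParser(HTMLParser):
    """Collect href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def fetch_links(url, user_agent):
    """Fetch a page while spoofing the given UA; return the links it exposes."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = LinkParser()
    parser.feed(html)
    return set(parser.links)

# Usage (network call, so commented out): diff what each "bot" sees.
# mobile = fetch_links("https://example.com/", MOBILE_UA)
# desktop = fetch_links("https://example.com/", DESKTOP_UA)
# print(len(desktop - mobile), "links visible only to the desktop bot")
```

Repeat that for each site, two levels deep, and you have the raw material for the comparisons below.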

Homepage metrics

Let’s start with some descriptive statistics of the home pages of these 20,000 randomly selected sites. Of the sites analyzed, 87.42% had the same number of links on their homepage regardless of whether the bot was mobile- or desktop-oriented. Of the remaining 12.58%, 9% had fewer links and 3.58% had more. This doesn’t seem too disparate at first glance.

Perhaps more importantly, only 79.87% had identical links on the homepage when visited by desktop and mobile bots. Just because the same number of links were found didn’t mean they were actually the same links. This is important to take into consideration because links are the pathways which bots use to find content on the web. Different paths mean a different index.
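The homepage comparison boils down to simple set arithmetic. Here is a sketch of the checks described above, using made-up link sets for one hypothetical site:

```python
def compare_homepages(mobile_links, desktop_links):
    """Classify one site's homepage given the link sets each bot saw.

    Returns the comparisons used in the study: same link count,
    identical link set, and the links unique to each crawler.
    """
    return {
        "same_link_count": len(mobile_links) == len(desktop_links),
        "identical_links": mobile_links == desktop_links,
        "desktop_only": desktop_links - mobile_links,
        "mobile_only": mobile_links - desktop_links,
    }

# Example: the same number of links (3 each), but not the same links.
mobile = {"/about", "/blog", "https://twitter.com/me"}
desktop = {"/about", "/blog", "https://linkedin.com/in/me"}
report = compare_homepages(mobile, desktop)
print(report["same_link_count"])   # True
print(report["identical_links"])   # False
```

This is exactly why matching link counts alone understate the divergence: the counts agree while the paths differ.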

Among the homepage links, we found a 7.4% drop in external links. This could mean a radical shift in some of the most important links on the web, given that homepage links often carry a great deal of link equity. Interestingly, the biggest “losers” as a percentage tended to be social sites. In retrospect, it seems reasonable that one of the common types of links a website might remove from their mobile version would be social share buttons because they’re often incorporated into the “chrome” of a page rather than the content, and the “chrome” often changes to accommodate a mobile version.

The biggest losers as a percentage in order were:

  1. linkedin.com
  2. instagram.com
  3. twitter.com
  4. facebook.com

So what’s the big deal about 5–15% differences in links when crawling the web? Well, it turns out that these numbers are biased towards sites with lots of links that don’t have a separate mobile version. Most of those links are main navigation links: when you crawl deeper, you just find the same links. But the sites that do deviate end up having radically different second-level crawls.

Second-level metrics

Now this is where the data gets interesting. As we continue to crawl out on the web using crawl sets that are influenced by the links discovered by a mobile bot versus a desktop bot, we’ll continue to get more and more divergent results. But how far will they diverge? Let’s start with size. While we crawled an identical number of home pages, the second-tier results diverged based on the number of links found on those original home pages. Thus, the mobile crawlset contained 977,840 unique URLs, while the desktop crawlset contained 1,053,785. Already we can see a different index taking shape — the desktop index would be much larger. Let’s dig deeper.

I want you to take a moment and really focus on this graph. Notice there are three categories:

  • Mobile Unique: Blue bars represent unique items found by the mobile bot
  • Desktop Unique: Orange bars represent unique items found by the desktop bot
  • Shared: Gray bars represent items found by both

Notice also that there are four tests:

  • Number of URLs discovered
  • Number of Domains discovered
  • Number of Links discovered
  • Number of Root Linking Domains discovered

Now here is the key point, and it’s really big. There are more URLs, Domains, Links, and Root Linking Domains unique to the desktop crawl result than there are shared between the desktop and mobile crawler. The orange bar is always taller than the gray. This means that by just the second level of the crawl, the majority of link relationships, pages, and domains are different in the indexes. This is huge. This is a fundamental shift in the link graph as we have come to know it.

And now for the big question, what we all care about the most — external links.

A whopping 63% of external links are unique to the desktop crawler. In a mobile-only crawling world, the total number of external links was halved.

What is happening at the micro level?

So, what’s really causing this huge disparity in the crawl? Well, we know it has something to do with a few common shortcuts to making a site “mobile-friendly,” which include:

  1. Subdomain versions of the content that have fewer links or features
  2. The removal of links and features by user-agent detecting plugins

Of course, these changes might make the experience better for your users, but it does create a different experience for bots. Let’s take a closer look at one site to see how this plays out.
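Before digging into that site, here is a toy sketch of the second pattern: naive user-agent sniffing that serves in-content links to everyone but drops sidebar and footer links for anything that looks mobile. The hint strings and link categories are assumptions for illustration; real plugins vary:

```python
# Assumed mobile hints, mimicking naive UA-sniffing plugins.
MOBILE_HINTS = ("Mobile", "Android", "iPhone")

def is_mobile(user_agent):
    """Crude user-agent sniffing, as many 'mobile-friendly' plugins do."""
    return any(hint in user_agent for hint in MOBILE_HINTS)

def served_links(user_agent, content_links, sidebar_links, footer_links):
    """Return the link set a visitor (or bot) actually receives.

    UA-detecting plugins typically keep in-content links but drop the
    sidebar/footer "chrome" for mobile visitors; that stripped view is
    exactly what a mobile-first crawler would then index.
    """
    if is_mobile(user_agent):
        return set(content_links)  # stripped-down mobile view
    return set(content_links) | set(sidebar_links) | set(footer_links)

content = {"/post-1", "https://cited-source.example"}
sidebar = {"https://blogroll-friend.example"}
footer = {"https://twitter.com/example"}

# Googlebot's smartphone UA contains "Android" and "Mobile", so it gets
# the stripped view; the desktop UA does not.
print(len(served_links("Mozilla/5.0 (Linux; Android 6.0) Mobile Safari",
                       content, sidebar, footer)))  # 2
print(len(served_links("Mozilla/5.0 (compatible; Googlebot/2.1)",
                       content, sidebar, footer)))  # 4
```

In this toy case, the sidebar and footer links (often the external ones) simply never reach the mobile bot.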

This site has ~10,000 pages according to Google and has a Domain Authority of 72 and 22,670 referring domains according to the new Moz Link Explorer. However, the site uses a popular WordPress plugin that abbreviates the content down to just the articles and pages on the site, removing links from descriptions in the articles on the category pages and removing most if not all extraneous links from the sidebar and footer. This particular plugin is used on over 200,000 websites. So, what happens when we fire up a six-level-deep crawl with Screaming Frog? (It’s great for this kind of analysis because we can easily change the user-agent and restrict settings to just crawl HTML content.)

The difference is shocking. First, notice that in the mobile crawl on the left, there is clearly a low number of links per page and that number of links is very steady as you crawl deeper through the site. This is what produces such a steady, exponential growth curve. Second, notice that the crawl abruptly ended at level four. The site just didn’t have any more pages to offer the mobile crawler! Only ~3,000 of the ~10,000 pages Google reports were found.

Now, compare this to the desktop crawler. It explodes in pages at level two, collecting nearly double the total pages of the mobile crawl at this level alone. Now, recall the graph before showing that there were more unique desktop pages than there were shared pages when we crawled 20,000 sites. Here is confirmation of exactly how it happens. Ultimately, 6x the content was made available to the desktop crawler in the same level of crawl depth.

But what impact did this have on external links?

Wow. 75% of the external, outbound links were culled in the mobile version: 4,905 external links were found in the desktop crawl, while only 1,162 were found in the mobile crawl. Remember, this is a DA 72 site with over twenty thousand referring domains. Imagine a link from a site like this vanishing because the mobile index no longer finds the backlink. What should we do? Is the sky falling?

Take a deep breath

Mobile-first isn’t mobile-only

The first important caveat to all this research is that Google isn’t giving up on the desktop — they’re simply prioritizing the mobile crawl. This makes sense, as the majority of search traffic is now mobile. If Google wants to make sure quality mobile content is served, they need to shift their crawl priorities. But they also have a competing desire to find content, and doing so requires using a desktop crawler so long as webmasters continue to abbreviate the mobile versions of their sites.

This reality isn’t lost on Google. In the Original Official Google Mobile First Announcement, they write…

If you are building a mobile version of your site, keep in mind that a functional desktop-oriented site can be better than a broken or incomplete mobile version of the site.

Google took the time to state that a desktop version can be better than an “incomplete mobile version.” I don’t intend to read too much into this statement other than to say that Google wants a full mobile version, not just a postcard.

Good link placements will prevail

One anecdotal outcome of my research was that the external links which tended to survive the cull of a mobile version were often placed directly in the content. External links in sidebars like blog-rolls were essentially annihilated from the index, but in-content links survived. This may be a signal Google picks up on. External links that are both in mobile and desktop tend to be the kinds of links people might click on.

So, while there may be fewer links powering the link graph (or at least there might be a subset that is specially identified), if your links are good, content-based links, then you have a chance to see improved performance.

I was able to confirm this by looking at a subset of known good links. Using Fresh Web Explorer, I looked up fresh links to toysrus.com, which is currently gaining a great deal of attention due to stores closing. We can feel confident that most of these links will be in-content because the articles themselves are about the relevant, breaking news regarding Toys R Us. Sure enough, after testing 300+ mentions, we found the links to be identical in the mobile and desktop crawls. These were good, in-content links and, subsequently, they showed up in both versions of the crawl.

Selection bias and convergence

It is probably the case that popular sites are more likely to have a mobile version than non-popular sites. Now, they might be responsive — at which point they would yield no real differences in the crawl — but at least some percentage would likely be m.* domains or utilize plugins like those mentioned above which truncate the content. At the lower rungs of the web, older, less professional content is likely to have only one version which is shown to mobile and desktop devices alike. If this is the case, we can expect that over time the differences in the index might begin to converge rather than diverge, as my study looked only at sites that were in the top million and only crawled two levels deep.

Moreover (and this is a bit speculative), I think that over time the mobile and desktop indexes will converge. I don’t think the link graphs will grow exponentially different, as the linked web is only so big. Rather, the paths by which certain pages are reached, and the frequency with which they are reached, will change quite a bit. So, while the link graph will differ, the set of URLs making up the link graph will largely be the same. Of course, some percentage of the mobile web will remain wholly disparate. The large number of sites that use dedicated mobile subdomains or plugins that remove substantial sections of content will remain like mobile islands in the linked web.

Impact on SERPs

It’s difficult at this point to say what the impact on search results will be. It will certainly not leave the SERPs unchanged. What would be the point of Google making and announcing a change to its indexing methods if it didn’t improve the SERPs?

That being said, this study wouldn’t be complete without some form of impact assessment. Hat tip to JR Oakes for giving me this critique, otherwise I would have forgotten to take a look.

First, there are a couple of things which could mitigate dramatic shifts in the SERPs already, regardless of the veracity of this study:

  • A slow rollout means that shifts in SERPs will be lost to the natural ranking fluctuations we already see.
  • Google can seed URLs found by mobile or by desktop into their respective crawlers, thereby limiting index divergence. (This is a big one!)
  • Google could choose to consider, for link purposes, the aggregate of both mobile and desktop crawls, not counting one to the exclusion of the other.

Second, the relationships between domains may be less affected than other index metrics. What is the likelihood that the relationship between Domain X and Domain Y (more or less links) is the same for both the mobile- and desktop-based indexes? If the relationships tend to remain the same, then the impact on SERPs will be limited. We will call this relationship being “directionally consistent.”

To accomplish this part of the study, I took a sample of domain pairs from the mobile index and compared their relationship (more or less links) to their performance in the desktop index. Did the first have more links than the second in both the mobile and desktop? Or did they perform differently?

It turns out that the indexes were fairly close in terms of directional consistency. That is to say that while the link graphs as a whole were quite different, when you compared one domain to another at random, they tended to be directionally consistent in both data sets. Approximately 88% of the domain pairs compared maintained directional consistency across the indexes. This test was only run comparing the mobile index domains to the desktop index domains. Future research might explore the reverse relationship.
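For the curious, the directional-consistency check can be expressed in a few lines. The domains and link counts below are made up for illustration:

```python
def directional_consistency(mobile_counts, desktop_counts, pairs):
    """Fraction of domain pairs whose more-links/fewer-links relationship
    holds in both indexes (a sketch of the comparison described above)."""
    consistent = 0
    for a, b in pairs:
        m = mobile_counts[a] - mobile_counts[b]
        d = desktop_counts[a] - desktop_counts[b]
        # Directionally consistent when the differences share a sign
        # (or both are zero).
        if (m > 0) == (d > 0) and (m < 0) == (d < 0):
            consistent += 1
    return consistent / len(pairs)

# Hypothetical per-domain link counts from the two crawls.
mobile = {"x.com": 120, "y.com": 90, "z.com": 15}
desktop = {"x.com": 400, "y.com": 310, "z.com": 12}
pairs = [("x.com", "y.com"), ("y.com", "z.com"), ("x.com", "z.com")]
print(directional_consistency(mobile, desktop, pairs))  # 1.0
```

Even though the absolute counts differ wildly between the two crawls here, every pair keeps its ordering, which is the property that would shield rankings from the index shift.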

So what’s next?: Moz and the mobile-first index

Our goal for the Moz link index has always been to be as much like Google as possible. It is with that in mind that our team is experimenting with a mobile-first index as well. Our new link index and Link Explorer in Beta seeks to be more than simply one of the largest link indexes on the web, but the most relevant and useful, and we believe part of that means shaping our index with methods similar to Google. We will keep you updated!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog


Knowledge Graph Eats Featured Snippets, Jumps +30%

Posted by Dr-Pete

Over the past two years, we’ve seen a steady and substantial increase in Featured Snippets on Google SERPs. In our 10,000-keyword daily tracking set, Featured Snippets have gone from about 5.5% of queries in November 2015 to a recent high of just over 16% (roughly tripling). Other data sets, with longer tail searches, have shown even higher prevalence.

Near the end of October (far-right of the graph), we saw our first significant dip (spotted by Brian Patterson on SEL). This dip occurred over about a 4-day period, and represents roughly a 10% drop in searches with Featured Snippets. Here’s an enhanced, 2-week view (note: Y-axis is expanded to show the day-over-day changes more clearly):

Given the up-and-to-the-right history of Featured Snippets and the investments people have been making optimizing for these results, a 10% drop is worthy of our attention.

What happened, exactly?

To be honest, when we investigate changes like this, the best we can usually do is produce a list of keywords that lost Featured Snippets. Usually, we focus on high-volume keywords, which tend to be more interesting. Here’s a list of keywords that lost Featured Snippets during that time period:

  • CRM
  • ERP
  • MBA
  • buddhism
  • web design
  • anger management
  • hosting
  • DSL
  • ActiveX
  • ovulation

From an explanatory standpoint, this list isn’t usually very helpful – what exactly do “web design”, “buddhism”, and “ovulation” have in common (please, don’t answer that)? In this case, though, there was a clear and interesting pattern. Almost all of the queries that lost Featured Snippets gained Knowledge Panels that look something like this one:

These new panels account for the vast majority of the lost Featured Snippets I’ve spot-checked, and all of them are general Knowledge Panels coming directly from Wikipedia. In some cases, Google is using a more generic Knowledge Graph entry. For example, “HDMI cables”, which used to show a Featured Snippet (dominated by Amazon, last I checked), now shows no snippet and a generic panel for “HDMI”:

In very rare cases, a SERP added the new Knowledge Panel but retained the Featured Snippet, such as the top of this search for “credit score”:

These situations seemed to be the exceptions to the rule.

What about other SERPs?

The SERPs that lost Featured Snippets were only one part of this story. Over the same time period, we saw an explosion (about +30%) in Knowledge Panels:

This Y-axis has not been magnified – the jump in Knowledge Panels is clearly visible even at normal scale. Other tracking sites saw similar, dramatic increases, including this data from RankRanger. This jump appears to be a similar type of descriptive panel, ranging from commercial keywords, like “wedding dresses” and “Halloween costumes”…

…to brand keywords, like “Ray-Ban”…

Unlike definition boxes, many of these new panels appear on words and phrases that are common knowledge and add little value. Here’s a panel on “job search”…

I suspect that most people searching for “job search” or “job hunting” don’t need it defined. Likewise, people searching for “travel” probably weren’t confused about what travel actually is…

Thanks for clearing that up, Google. I’ve decided to spare you all and leave out a screenshot for “toilet” (go ahead and Google it). Almost all of these new panels appear to be driven by Wikipedia (or Wikidata), and most of them are single-paragraph definitions of terms.

Were there other changes?

During the exact same period, we also noticed a drop in SERPs with inline image results. Here’s a graph of the same 2-week period reported for the other features:

This drop almost exactly mirrors the increase in Knowledge Panels. In cases where the new panels were added, those panels almost always contain a block of images at the top. This block seems to have replaced inline image results. It’s interesting to note that, because image blocks in the left-hand column consume an organic position, this change freed up an organic spot on the first page of results for those terms.

Why did Google do this?

It’s likely that Google is trying to standardize answers for common terms, and perhaps they were seeing quality or consistency issues in Featured Snippets. In some cases, like “HDMI cables”, Featured Snippets were often coming from top e-commerce sites, which are trying to sell products. These aren’t always a good fit for unbiased definitions. It’s also likely that Google would like to beef up the Knowledge Graph and rely less, where possible, on outside sites for answers.

Unfortunately, this also means that the answers are coming from a much less diverse pool (and, from what we’ve seen, almost entirely from Wikipedia), and it reduces the organic opportunity for sites that were previously ranking for or trying to compete for Featured Snippets. In many cases, these new panels also seem to add very little. Someone searching for “ERP” might be helped by a brief definition, but someone searching for “travel” is unlikely looking to have it explained to them.

As always, there’s not much we can do but monitor the situation and adapt. Featured Snippets are still at historically high levels and represent a legitimate organic opportunity. There’s also a win-win here, since efforts invested in winning Featured Snippets tend to improve organic rankings and, done right, can produce a better user experience for both searchers and website visitors.


SearchCap: Google Assistant, AdWords ad labels & knowledge graph

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Assistant, AdWords ad labels & knowledge graph appeared first on Search Engine Land.


Brand vs. local Knowledge Graph result: Which is better?

Columnist Tony Edward explains which Knowledge Graph results are appropriate in different scenarios. Which one is right for your business?

The post Brand vs. local Knowledge Graph result: Which is better? appeared first on Search Engine Land.


Google Drops The Knowledge Graph Snippet Overlay

It looks like the feature where you could see a miniature knowledge graph overlay for individual snippets is gone from Google…

Search Engine Roundtable


Extract SEO Value from SERPs with Knowledge Graph and Answer Boxes – Whiteboard Friday

Posted by randfish

Many SEOs are frustrated by the ever-expanding repertoire of answer boxes and results from the Knowledge Graph on Google’s SERPs. One thing we can be sure of is that they’re not going away anytime soon, so in today’s Whiteboard Friday, Rand offers some strategic advice (as well as several tactics) for getting SEO value from those SERPs, and even from those boxes.

For reference, here’s a still of this week’s whiteboard. Click on it to open a high-resolution image in a new tab!

SERPS in Knowledge Graph Whiteboard

Howdy, Moz Fans!

…and welcome to another edition of Whiteboard Friday. This week we’re talking about the rich answers, instant answers, direct answers, whatever you want to call them, that Google and Bing are providing in search results, that are taking away a lot of clicks from those search results that people look for.

4 Types of challenging results

I think the big four that SEOs are concerned about are what I’m going to walk through first, and then we’ll talk about some tactical ways that marketers can actually work around them or even take advantage of them.

#1 – Customized instant answer (a.k.a. answer boxes)

So the first one here is what I’m calling the “customized instant answer.” This is where Google has essentially said, “Hey, we’re going to custom-build something for this type of a query that shows this kind of an answer box.” You can see these for math equations, for weather kinds of searches.

The one I’m showing here is specifically around sports and schedules. So I search for Seattle Mariners, the local baseball team here in Seattle, whose opening day, as of filming, is today, April 7. So the Seattle Mariners’ schedule is actually showing. Well, it’s actually showing the score in real time, since the game’s going on as I’m filming, but below that it shows scores and schedules. It says, “4-6 versus the Angels, live at the bottom of the 6th, 4-7 versus the Angels 7:10 p.m.” etc., etc.

Then, it actually goes all the way down here, and if you click on any of these, you’ll get more detail about where the game’s being hosted and where you can watch it on television. Google’s essentially said, “Hey, you know what? No one should ever have to click on any results to get all of the key information about the schedule.” If you’re looking for far-out scheduling stuff or very specific kinds of things, maybe you might go there, but they even have links in here directly to buy tickets online. So really, they’re taking away a lot of the clicks here.

#2 – Knowledge Graph answer

Second, the Knowledge Graph answer. This is where Google essentially is using their Knowledge Graph to provide a specific answer. You can think of this as connecting up with entities or concepts, brands, those kinds of things. I search for “Mariners mascot,” Google will give me this little box from their Knowledge Graph that says, “Mariner Moose, the Seattle Mariners, mascot,” and they have a little logo there. They don’t actually show a picture. I have to scroll down if I want to get images. But the next link there, of course, is the Mariner Moose webpage on Seattle.Mariners.mlb.com, that’s a lot of sub-domains, Major League Baseball, but we’ll deal with that later.

#3 – Knowledge Graph sidebar

Then the third one, Knowledge Graph sidebar, this is probably what we’re most accustomed to when it comes to the Knowledge Graph, where I search, I get a list of results, but then there’s also a Knowledge Graph piece on the right-hand side. This one is showing Seattle Mariners baseball team. There’s the logo, arena, location, our manager, some details about the team, where some of this data is extracted from, etc. Typical Knowledge Graph kind of result off to the side.

#4 – Extracted instant answers

Then fourth, and potentially most perturbing, I think, for many SEOs and marketers, is the extracted instant answer. This is an answer that Google has pulled from a website, potentially your website, and they’re showing that full answer right in the results, without a searcher needing to visit the page. Most of the time they will cite the page. Some of the time they don’t even cite where the answer came from, which means you don’t even have an opportunity to earn that click. Even more insanely frustrating.

But in this case I’ve searched for baseball, how many players on a baseball team. You can phrase this query in a bunch of different ways, and you will get Major League Baseball rosters, a roster of players able to play, blah, blah, blah, blah, the 25-man roster, and the 40-man roster. So there are two different kinds of rosters in baseball, and Google is building a big, long answer to try and explain this and then sending you to the Wikipedia page if you need more detail.

So these four kinds of things are causing a ton of consternation in our field. There was a study from Stone Temple Consulting, out in Boston, that Eric Enge and Mark Traphagen put together, where they looked at a large set of search results, I believe it was an 800,000-result set, and found that almost 20% (19%, to be precise) of those had rich answers in some format, direct answers in the SERPs, like one of these, and I think that even excluded number three here, the Knowledge Graph. So that’s a lot of queries where Google is taking away, potentially, a ton of traffic.

You might say “Hey, well, in the long tail and in the chunky middle, it’s probably not as bad as it is in the fat head of query demand.” But this is still very significant for a lot of folks. So there are two ways to think about this. One is strategically, and one is tactically.

Strategic plans to consider

My first advice is on the strategy side of SEO. So when you’re thinking about Knowledge Graph and instant answers and these kinds of things, and how they affect your results, I’d ask:

#1 – Decide whether branding is worth it

“Is the branding of extracted answers a worthwhile SEO investment?” If you get this, like Wikipedia has here, is that actually worthwhile for you? Or is that something where you say, “You know what? We’re not going to concentrate on it”? From there, you might say, “Hey, that’s not an investment for us. Any time we start to see these types of results, we’re no longer going to make a considerable SEO investment there. We don’t really care if our competition gets it. We’d rather focus our energy, attention, dollars, people, and time on the organic results, where we think we can earn a higher click-through rate and actual traffic, rather than just the brand association.” Or you might say, “Brand association matters hugely. We want very much to be associated with baseball. We’re trying to build this up. It would be great if we could replace Wikipedia here. We’d be thrilled even if we didn’t get the traffic.”

Then, you need to go through the step of actually building some analytics. We need to say, “Hey, how are we going to measure, when we get these kinds of results, the potential volume that’s going on there, and then how are we going to record that as a success?” You won’t be able to see it in your visitor analytics or in any metric that is directly associated with your website.

#2 – Evaluate the likelihood of Google replacing your content

What kinds of content investments could Google replace with instant answers? Content investments that we are making or that we are planning to make. If they could replace them, how likely do we think that is, and does that change our strategy around what we want to invest in from a content perspective, from an SEO perspective, for the future?

If we say “Hey, you know what? We are in the online printing business, and we think that Google will soon have a price comparison, in-search, direct answer kind of a thing, like they have for flight search, in our field. You know what? Maybe we want to shy away from that, and we want to go down a different avenue for the content that we’re going to create.” That could be something that goes into your calculus around decision-making. I would urge you to at least consider the possibility and know where your threat vectors are from a “Google taking us over in the SERPs” perspective.

#3 – Decide if customized answers will help or hinder your plans

Do we want more customized answers? If you’re Major League Baseball or the Seattle Mariners, this is actually probably a godsend. This is a wonderful thing, the reason being it helps folks find, very quickly, where they can watch the game on television and where to buy tickets online. This is actually probably wonderful for the Seattle Mariners. They don’t actually care that much, at least from a strategic and overall perspective, whether this is costing them a lot of traffic to their website, because it’s bringing great value to the brand. Google is sending folks directly to the authoritative site. So this means, from an SEO perspective, you don’t get spammers or manipulators or ticket resellers taking over this search space for them.

So depending on the kind of brand you are, the kind of organization you are, instant answers might be a great thing, in which case you might want to think about, “How can we partner more deeply with Google? What can we provide in a structured format? How do we get that information to them in that kind of way, where they will, hopefully, replace a lot of these fat-head queries with exactly what we’re hoping they do in a fast, efficient manner for searchers?”

Tactical plans to consider

Next up is tactical plans to consider, and I think the first one’s most important here.

#1 – Evaluate SERP opportunity

When you are doing your keyword research and your keyword evaluation, I think one of the things that many, many folks are still missing from this is a column that looks at keyword opportunity. So historically, we’ve had a bunch of things when we do keyword research. Here’s our keyword column. We look at difficulty. We might look at volume. We might look at potential value to the business. Maybe we’re looking at how successful it was when we purchased that keyword and what the conversion rates were like, all those kinds of things, path to conversion, etc., etc.

But one of the things we have not historically focused on is keyword opportunity, meaning the click-through rate opportunity. You could do something like a bucket of high, medium, and low (I put HML here), or you could say something like, "Hey, we think this alters the click-through rate curve by, random guess, 30%, 40%, whatever it is," and use some numbers to classify those. Then, when you're actually doing keyword research and choosing which keywords to consider, you make the right kinds of decisions, because a lot of the time you might see, hey, this has high volume, the difficulty's not that bad, oh shoot, but Google has an instant extracted answer that is taking up 30% or 40% of the above-the-fold space. SERP position one is probably getting 20% of the click-through rate that it would ordinarily get if that instant answer weren't there. That needs to be a part of our calculations going forward.
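The opportunity column described above can be folded into a keyword scoring sheet with simple arithmetic. This is a minimal sketch of that idea; the CTR modifiers, baseline click-through rates, and keywords below are illustrative guesses, not measured values.

```python
# Sketch: fold a SERP-feature "opportunity" modifier into keyword scoring.
# All numbers here are illustrative assumptions, not real benchmarks.

# Rough share of clicks that survive when an instant answer occupies
# the top of the SERP, bucketed high/medium/low opportunity (HML).
OPPORTUNITY_MODIFIER = {
    "H": 1.0,   # no instant answer or Knowledge Graph panel
    "M": 0.6,   # partial answer box; some clicks survive
    "L": 0.2,   # full instant answer; most clicks absorbed
}

def adjusted_value(volume, baseline_ctr, opportunity_bucket):
    """Estimate the clicks a top ranking might actually receive."""
    return volume * baseline_ctr * OPPORTUNITY_MODIFIER[opportunity_bucket]

keywords = [
    # (keyword, monthly volume, baseline CTR at position 1, bucket)
    ("seattle mariners schedule", 40000, 0.30, "L"),
    ("mariner moose videos",       1500, 0.30, "H"),
]

for kw, vol, ctr, bucket in keywords:
    print(f"{kw}: ~{adjusted_value(vol, ctr, bucket):,.0f} clicks/mo")
```

A high-volume keyword in the "L" bucket can end up worth fewer estimated clicks than a modest long-tail keyword in the "H" bucket, which is exactly the trade-off the opportunity column is meant to surface.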

#2 – Identify content and intent gaps in Knowledge Graph and answer boxes

What kinds of content, and what kinds of intent, are searchers who are not satisfied by the Knowledge Graph or instant answer listing looking for, and how are they searching? That is an opportunity for us to get around this problem. If I search for "Seattle Mariners schedule," but what I'm actually looking for is only the away games in the three states I'm going to be visiting, well, you know what? The instant answer isn't enough. I need to go directly to the page, and so that might be an intent that I'm going to try and serve very easily, from a user experience perspective, on the page that ranks first on seattle.mariners.mlb.com.

If you can answer that question effectively and find a bunch of those gaps, you may, in fact, over time be able to get rid of those instant answers. You've probably seen examples where Google had an instant answer or a Knowledge Graph panel and then got rid of it, and my perception is that a big reason is that searchers weren't clicking those. They weren't taking advantage of them and instead were choosing results below the fold, below the instant answer, so Google removed it. That's a nice opening for the rest of us.

#3 – Decide if structuring data for Google helps or hurts your cause

Should we be creating or avoiding structured data for Google to use, and will our competition do that? So you need to make a decision. Hey, should we create structured data that Google can easily pull into Knowledge Graph, easily pull into instant answers? If we don’t do it, will our competition do it? Do we care if they do it, if we don’t do it? It’s a little bit of prisoner’s dilemma sometimes, but you’ve got to make the call there, and I think that’s something SEOs should do in their tactical plan around keywords.
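If the decision above comes out in favor of feeding Google, the mechanics are usually schema.org markup embedded as JSON-LD. Here is a minimal sketch, generated in Python for consistency with the other examples; the event, venue, and ticket URL are hypothetical, not real Mariners data.

```python
import json

# Hypothetical schema.org SportsEvent markup that search engines can pull
# into Knowledge Graph panels and instant answers. All details are made up.
event = {
    "@context": "https://schema.org",
    "@type": "SportsEvent",
    "name": "Mariners vs. Athletics",
    "startDate": "2015-04-10T19:10:00-07:00",
    "location": {
        "@type": "StadiumOrArena",
        "name": "Safeco Field",
    },
    "offers": {
        "@type": "Offer",
        "url": "https://seattle.mariners.mlb.com/tickets",  # hypothetical URL
    },
}

# This JSON would be embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(event, indent=2))
```

The prisoner's dilemma framing applies directly: publishing this markup makes it easier for Google to answer the query without a click, but withholding it may just hand the panel to a competitor who does publish it.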

#4 – Create control data (where possible) for search traffic analytics

Next, do we need to control for search traffic changes from Knowledge Graph and instant answers in our analytics or our forward-looking estimates? If you say to yourself, "Hey, we started seeing some tests here; we expect Google's going to roll this out in our space, around our site," then how is that going to impact us, and what does that mean for our year-over-year SEO traffic estimates, or for how much we think we can grow this year and in the future? Looking backwards, if this has already been introduced, how much did it change our result sets and our traffic, and do we think that could happen more? Have we put that into our analytics, so that we can recognize: hey, our SEOs did great work; Google just took a lot of the opportunity away from us?
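One lightweight way to build that control data is to annotate keyword-level traffic with the date an instant answer appeared, so year-over-year comparisons can separate "our SEO got worse" from "Google absorbed the clicks." This sketch assumes rollout dates observed manually or via a rank tracker; the keywords, dates, and click counts are illustrative.

```python
from datetime import date

# (keyword, month, organic clicks) — illustrative numbers only.
traffic = [
    ("mariners schedule", date(2014, 6, 1), 10000),
    ("mariners schedule", date(2015, 6, 1),  4000),
]

# Date an instant answer first appeared for each keyword, recorded by
# hand or pulled from a rank-tracking tool (hypothetical data).
instant_answer_rollout = {"mariners schedule": date(2015, 3, 1)}

def affected(keyword, month):
    """True if an instant answer was live for this keyword in this month."""
    rollout = instant_answer_rollout.get(keyword)
    return rollout is not None and month >= rollout

for kw, month, clicks in traffic:
    flag = "instant answer present" if affected(kw, month) else "clean baseline"
    print(f"{kw} {month:%Y-%m}: {clicks:,} clicks ({flag})")
```

With that flag in place, the 2014 month serves as the control period and the 2015 drop can be attributed, at least in part, to the SERP change rather than to the SEO team's work.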

#5 – Focus on longer-tail searcher intent

Then the last one I think we need to think about is very deep in the tactical trenches: the titles, descriptions, and even the keyword targets that we focus on may need to target longer-tail or more specific types of questions and searcher intent. So we might say, "Hey, you know what? I'm willing to sacrifice ranking for 'Mariners mascot,' but I really want to rank for 'Mariner Moose videos' or 'Mariner Moose costumes,' or whatever those extra intents, those things deeper down the funnel, those long-tail parts of the query, might be." That could change the types of content and keywords that you invest in.

All right everyone, I know Knowledge Graph and instant answers can sometimes be very frustrating for us. But I hope you’ll apply these tactics and recommend some more in the comments and that we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Moz Blog

Posted in IM News | Comments Off

SearchCap: Google Adds 2015 Oscar Nominations To Knowledge Graph, Local Search Challenges For Franchises & More

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Adds 2015 Oscar Nominations To Knowledge Graph, Local Search Challenges For Franchises & More appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


SearchCap: Google Maps App Revised, Social Links In Knowledge Graph, Google Structured Data Updated

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google Maps App Revised, Social Links In Knowledge Graph, Google Structured Data Updated appeared first on Search Engine Land.


Video Games Added To Google’s Knowledge Graph

Google is upping its game for gamers, now including video game information in its knowledge graph. Search queries on video games will result in a knowledge graph panel that includes details like the game’s release date, supported platforms, developers, review scores and more. In a report on…
