Tag Archive | "Ranking"

Friday The 13th Google Search Ranking Algorithm Update Chatter

I am seeing some early chatter of a Google search ranking algorithm update today on Friday the 13th, September 13, 2019. The chatter is super early but I am seeing it not just across WebmasterWorld, but also Black Hat World and social media. Some of the automated tracking tools are also picking up on some of these signs.


Search Engine Roundtable

Posted in IM News | Comments Off

Content accuracy is not a ranking factor

Google’s Danny Sullivan explained that its systems rely on topic relevance and authority to rank content.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


View Velocity Is The Secret To Ranking On YouTube

“To determine rankings on their platform, YouTube uses a metric called the View Velocity,” says HubSpot SEO expert Braden Becker. “The View Velocity metric measures the number of subscribers who watch your video right after it’s been published. The higher your video’s view velocity, the higher your videos will rank. YouTube also accounts for the number of active subscribers you have when they rank your videos.”

Braden Becker, Senior SEO Strategist at HubSpot, reveals the secrets of YouTube’s Ranking Algorithm in his latest video:

The Secrets of YouTube’s Ranking Algorithm

Since marketers are at the mercy of algorithms on nearly every publishing channel, knowing how each of these unique algorithms works is crucial to attracting and maintaining an audience. Luckily, while some channels are rather reserved about the secrets of their algorithms, YouTube has been remarkably transparent. To figure out which videos and channels users are most likely to enjoy watching, YouTube follows their audience. This means they pay attention to which videos each user watches, what they don’t watch, how much time they spend watching each video, their likes, their dislikes, and their “not interested” feedback.


What YouTube Pays The Most Attention To

Ranking High In YouTube Search Results

YouTube’s algorithm also uses different signals and metrics to rank and recommend videos on each section of their platform. With this in mind, let’s go over how the algorithm decides to serve content to its users in the search results, homepage, suggested videos, trending, and subscription sections. First are the search results. The two biggest factors that affect your video search rankings are its keywords and relevance. When ranking videos in search, YouTube will consider how well your titles, descriptions, and content match each user’s queries. They’ll also consider how many videos users have watched from your channel and the last time they watched other videos on the same topic as your video.

Positive Engagement With Your Videos Is Key

Next is the home page and suggested videos. No two users will have the same experience on YouTube. They want to serve the most relevant personalized recommendations to each of their viewers. To do this, they first analyze each user’s activity history and find hundreds of videos that could be relevant to them. Then they rank these videos by how well each video has engaged and satisfied similar users, how often each viewer watches videos from a particular channel or topic, and how many times YouTube has already shown each video to its users.

Ranking On The Trending Page

Next is trending. The trending page is a feed of new and popular videos in a user’s specific country. YouTube wants to balance popularity with novelty when they rank videos in this section, so they heavily consider view count and rate of view growth for each video they rank.

High “View Velocity” = High Ranking

Last is subscriptions. YouTube has a subscriptions page where users can view all the recently uploaded videos from the channels they subscribe to. But this page isn’t the only benefit that channels get when they acquire a ton of subscribers. To determine rankings on their platform, YouTube uses a metric called the View Velocity, which measures the number of subscribers who watch your video right after it’s been published. The higher your video’s view velocity, the higher your videos will rank. YouTube also accounts for the number of active subscribers you have when they rank your videos.

The Secrets of YouTube’s Ranking Algorithm with HubSpot SEO expert Braden Becker

The post View Velocity Is The Secret To Ranking On YouTube appeared first on WebProNews.


WebProNews


Signs Of Another Google Search Ranking Algorithm Update

I am seeing signs, both in early chatter within the SEO community and industry and in the automated tracking tools, of an update brewing. The update may have started sometime last night, but let’s say this is currently an unconfirmed August 1st update. Google has not pre-announced any core update, but who knows.


Search Engine Roundtable


Vlog #5: Rand Fishkin On Google Anti-Competitive, Ranking Studies, EAT & YMYL & Bing (Part Two)

In our fifth vlog episode, I was in Seattle for Mozcon and I asked Rand Fishkin (@randfish) if I could interview him for my new vlog series. He invited me to his home/office, aka ShedToro, and despite some angry bees, I think the interview went pretty well. This is part two of a two-part interview; here we discuss Google being anti-competitive, ranking studies, EAT & YMYL SEO, and Microsoft Bing.


Search Engine Roundtable


How to Automate Keyword Ranking with STAT and Google Data Studio

Posted by TheMozTeam

This blog post was originally published on the STAT blog.


We asked SEO Analyst Emily Christon of Ovative Group to share how her team makes keyword rank reporting easy and digestible for all stakeholders. Read on to see how she combines the power of STAT with Google Data Studio for streamlined reporting that never fails to impress her clients.

Why Google Data Studio

Creating reports for your clients is a vital part of SEO. It’s also one of the most daunting and time-consuming tasks. Your reports need to contain all the necessary data, while also having clear visuals, providing quick wins, and being easy to understand.

At Ovative Group, we’re big advocates for reporting tools that save time and make data easier to understand. This is why we love Google Data Studio.

This reporting tool was created with the user in mind and allows for easy collaboration and sharing with teams. It’s also free, and its reporting dashboard is designed to take the complexity and stress out of visualizing data.

Don’t get us wrong. We still love our spreadsheets, but tools like Excel aren’t ideal for building interactive dashboards. They also don’t allow for easy data pulls — you have to manually add your data, which can eat up a lot of time and cause a lot of feelings.

Data Studio, however, pulls all your data into one place from multiple sources, like spreadsheets, Google Analytics accounts, and AdWords. You can then customize how all that data is viewed so you can surface quick insights.

How does this relate to keyword reporting?

Creating an actionable keyword report that is beneficial for both SEO and your stakeholders can be a challenge. Data Studio makes things a bit easier for us at Ovative in a variety of ways:

Automated data integration

Our team uses the STAT API — which can be connected to Data Studio through a little technical magic and Google Big Query — to pull in all our raw data. You can select what data points you want to be collected from the API, including rank, base rank, competitors, search volume, local information, and more.

Once your data is collected and living in Big Query, you can access it through the Data Studio interface. If you want to learn more about STAT’s API, go here.
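As a rough illustration of that pipeline, here is a minimal Python sketch of the flattening step that prepares API output for a BigQuery load. The endpoint URL and response field names below are assumptions for illustration only; check STAT’s API documentation for the real paths and fields.

```python
import json
import urllib.request

# NOTE: this URL and the response shape below are illustrative
# assumptions, not STAT's actual API contract.
STAT_API_URL = "https://yourcompany.getstat.com/api/v2/YOUR_KEY/bulk/ranks"

def flatten_rankings(payload):
    """Flatten a nested rankings payload into one flat dict per
    keyword per day -- the row shape a BigQuery load job expects."""
    rows = []
    for kw in payload.get("keywords", []):
        for day in kw.get("rankings", []):
            rows.append({
                "keyword": kw["keyword"],
                "search_volume": kw.get("search_volume"),
                "date": day["date"],
                "rank": day.get("rank"),
                "base_rank": day.get("base_rank"),
            })
    return rows

def fetch_rankings(url=STAT_API_URL):
    """Pull one report from the API and flatten it (network call)."""
    with urllib.request.urlopen(url) as resp:
        return flatten_rankings(json.load(resp))
```

Once the rows are flat like this, a scheduled job can stream them into a Big Query table that Data Studio reads directly.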

Customization

Do you care about current rank? Rank over time? Major movers – those that changed +20 positions week over week? Or are you just after how many keywords you have ranking number one?

All of this is doable — and easy — once you’re comfortable in Data Studio. You can easily customize your reports to match your goals.

“Our team uses the STAT API — which can be connected to Data Studio through a little technical magic and Google Big Query — to pull in all our raw data.” — Emily Christon, SEO Analyst at Ovative Group

Custom dashboards make reporting and insights efficient and client-facing, transforming all that raw data into easy-to-understand metrics, which tell a more compelling story.

How to build your custom Google Data Studio dashboard

There are a myriad of ways to leverage Google Data Studio for major insights. Here are just a few features we use to help visualize our data.

Keyword rank

This report gives you a snapshot of how many keywords you have in each ranking group and how things are trending. You can also scroll through your list of keywords to see what the traffic-driving queries are.

One cool feature of Data Studio when it comes to rank is period over period comparisons. For example, if you set the date range to the previous week, it will automatically pull week over week rank change. If you set the date range to the previous month, it pulls a month over month rank change.

At Ovative, we do weekly, monthly, and yearly keyword rank change reporting.
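The period-over-period logic Data Studio automates can be sketched in plain Python. The data shape here is hypothetical (a dict keyed by keyword and date); Data Studio computes this for you once the date range is set.

```python
from datetime import date, timedelta

def rank_change(history, keyword, as_of, period_days=7):
    """Period-over-period rank change for one keyword.

    history: {(keyword, date): rank} -- a hypothetical flat lookup.
    Returns positions gained: moving from rank 7 to rank 3 is +4.
    Returns None if either period is missing data."""
    current = history.get((keyword, as_of))
    previous = history.get((keyword, as_of - timedelta(days=period_days)))
    if current is None or previous is None:
        return None
    # Lower rank number is better, so improvement = previous - current.
    return previous - current
```

Setting `period_days=30` gives the month-over-month view, and `period_days=365` the yearly one, mirroring the weekly, monthly, and yearly reporting described above.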

Keyword look-up tool

If you notice that traffic has declined in a specific keyword set, pop down to the keyword look-up tool to track rank trends over time. This view is extremely helpful — it shows the progress or decline of rank to help explain traffic variability.

Campaign or priority tracker

To support newly launched pages or priority keywords, create a separate section just for these keywords. This will make it easy for you to quickly check the performance and trends of chosen keyword sets.

What’s next? 

Google Data Studio is only as powerful as you make it.

The STAT API integration in Google Data Studio represents one page of our typical client’s reporting studio; we make sure to add in a page for top-level KPI trends, a page for Search Console keyword performance, and other relevant sources for ease of use for ourselves and the client.

Want more? 

Want to dive deeper into STAT? Got questions about our API? You can book a demo with us and get a personalized walk through. 

You can also chat with our rad team at MozCon this July 15–17 to see how you can go seriously deep with your data. Ask about our specialty API — two additional services to give you everything a 100-result SERP has to offer, and perfect if you’ve built your own connector.

Grab my MozCon ticket now!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Google Ranking Update Fluctuations Continue (3/29-4/1)

These fluctuations won’t die down. It seems like we’ve been reporting these tremors and fluctuations a lot since the beginning of the year, and even more so since the release of the March 12th core update. We are seeing more chatter around these fluctuations from around March 29th through today, April 1st – and no, this is no April Fools joke.


Search Engine Roundtable


When Naming The Google 3/12 Broad Core Ranking Update Florida 2

As you know, I reported yesterday morning, well before Google confirmed it, that there was an update. The reason Google even confirmed the update was that I emailed them asking them to confirm, which they did on Twitter. I didn’t want to name it yet, but honestly it is a beautiful thing for Brett Tabke of WebmasterWorld to name an update again.


Search Engine Roundtable


Local Search Ranking Factors 2018: Local Today, Key Takeaways, and the Future

Posted by Whitespark

In the past year, local SEO has run at a startling and near-constant pace of change. From an explosion of new Google My Business features to an ever-increasing emphasis on the importance of reviews, it’s almost too much to keep up with. In today’s Whiteboard Friday, we welcome our friend Darren Shaw to explain what local is like today, dive into the key takeaways from his 2018 Local Search Ranking Factors survey, and offer us a glimpse into the future according to the local SEO experts.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I’m Darren Shaw from Whitespark, and today I want to talk to you about the local search ranking factors. So this is a survey that David Mihm has run for the past like 10 years. Last year, I took it over, and it’s a survey of the top local search practitioners, about 40 of them. They all contribute their answers, and I aggregate the data and say what’s driving local search. So this is what the opinion of the local search practitioners is, and I’ll kind of break it down for you.

Local search today

So these are the results of this year’s survey. We had Google My Business factors at about 25%. That was the biggest piece of the pie. We have review factors at 15%, links at 16%, on-page factors at 14%, behavioral at 10%, citations at 11%, personalization and social at 6% and 3%. So that’s basically the makeup of the local search algorithm today, based on the opinions of the people that participated in the survey.

The big story this year is Google My Business. Google My Business factors are way up, compared to last year, a 32% increase in Google My Business signals. I’ll talk about that a little bit more over in the takeaways. Review signals are also up, so more emphasis on reviews this year from the practitioners. Citation signals are down again, and that makes sense. They continue to decline I think for a number of reasons. They used to be the go-to factor for local search. You just built out as many citations as you could. Now the local search algorithm is so much more complicated and there’s so much more to it that it’s being diluted by all of the other factors. Plus it used to be a real competitive difference-maker. Now it’s not, because everyone is pretty much getting citations. They’re considered table stakes now. By seeing a drop here, it doesn’t mean you should stop doing them. They’re just not the competitive difference-maker they used to be. You still need to get listed on all of the important sites.

Key takeaways

All right, so let’s talk about the key takeaways.

1. Google My Business

The real story this year was Google My Business, Google My Business, Google My Business. Everyone in the comments was talking about the benefits they’re seeing from investing in a lot of these new features that Google has been adding.

Google has been adding a ton of new features lately — services, descriptions, Google Posts, Google Q&A. There’s a ton of stuff going on in Google My Business now that allows you to populate Google My Business with a ton of extra data. So this was a big one.

✓ Take advantage of Google Posts

Everyone talked about Google Posts, how they’re seeing Google Posts driving rankings. There are a couple of things there. One is the semantic content that you’re providing Google in a Google post is definitely helping Google associate those keywords with your business. Engagement with Google Posts as well could be driving rankings up, and maybe just being an active business user continuing to post stuff and logging in to your account is also helping to lift your business entity and improve your rankings. So definitely, if you’re not on Google Posts, get on it now.

If you search for your category, you’ll see a ton of businesses are not doing it. So it’s also a great competitive difference-maker right now.

✓ Seed your Google Q&A

Google Q&A, a lot of businesses are not even aware this exists. There’s a Q&A section now. Your customers are often asking questions, and they’re being answered by not you. So it’s valuable for you to get in there and make sure you’re answering your questions and also seed the Q&A with your own questions. So add all of your own content. If you have a frequently asked questions section on your website, take that content and put it into Google Q&A. So now you’re giving lots more content to Google.

✓ Post photos and videos

Photos and videos, continually post photos and videos, maybe even encourage your customers to do that. All of that activity is helpful. A lot of people don’t know that you can now post videos to Google My Business. So get on that if you have any videos for your business.

✓ Fill out every field

There are so many new fields in Google My Business. If you haven’t edited your listing in a couple of years, there’s a lot more stuff in there that you can now populate and give Google more data about your business. All of that really leads to engagement. All of these extra engagement signals that you’re now feeding Google, from being a business owner that’s engaged with your listing and adding stuff and from users, you’re giving them more stuff to look at, click on, and dwell on your listing for a longer time, all that helps with your rankings.

2. Reviews

✓ Get more Google reviews

Reviews continue to increase in importance in local search, so, obviously, getting more Google reviews. It used to be a bit more of a competitive difference-maker. It’s becoming more and more table stakes, because everybody seems to be having lots of reviews. So you definitely want to make sure that you are competing with your competition on review count and lots of high-quality reviews.

✓ Keywords in reviews

Getting keywords in reviews, so rather than just asking for a review, it’s useful to ask your customers to mention what service they had provided or whatever so you can get those keywords in your reviews.

✓ Respond to reviews (users get notified now!)

Responding to reviews. Google recently started notifying users that if the owner has responded to you, you’ll get an email. So all of that is really great, and those responses, it’s another signal to Google that you’re an engaged business.

✓ Diversify beyond Google My Business for reviews

Diversify. Don’t just focus on Google My Business. Look at other sites in your industry that are prominent review sites. You can find them if you just look for your own business name plus reviews, if you search that in Google, you’re going to see the sites that Google is saying are important for your particular business.

You can also find out like what are the sites that your competitors are getting reviews on. Then if you just do a search like keyword plus city, like “lawyers + Denver,” you might find sites that are important for your industry as well that you should be listed on. So check out a couple of your keywords and make sure you’re getting reviews on more sites than just Google.

3. Links

Then links, of course, links continue to drive local search. A lot of people in the comments talked about how a handful of local links have been really valuable. This is a great competitive difference-maker, because a lot of businesses don’t have any links other than citations. So when you get a few of these, it can really have an impact.

✓ From local industry sites and sponsorships

They really talk about focusing on local-specific sites and industry-specific sites. So you can get a lot of those from sponsorships. They’re kind of the go-to tactic. If you do a search for intitle:sponsors plus your city name, you’re going to find a lot of sites that are listing their sponsors, and those are opportunities for you in your city: you could sponsor that event or organization and get a link.

The future!

All right. So I also asked in the survey: Where do you see Google going in the future? We got a lot of great responses, and I tried to summarize that into three main themes here for you.

1. Keeping users on Google

This is a really big one. Google does not want to send its users to your website to get the answer. Google wants to have the answer right on Google so that they don’t have to click. It’s this zero-click search result. So you see Rand Fishkin talking about this. This has been happening in local for a long time, and it’s really amplified with all of these new features Google has been adding. They want to have all of your data so that they don’t have to send users to find it somewhere else. Then that means in the future less traffic to your website.

So Mike Blumenthal and David Mihm also talk about Google as your new homepage, and this concept is like branded search.

  • What does your branded search look like?
  • So what sites are you getting reviews on?
  • What does your knowledge panel look like?

Make that all look really good, because Google doesn’t want to send people to your website.

2. More emphasis on behavioral signals

David Mihm is a strong voice in this. He talks about how Google is trying to diversify how they rank businesses based on what’s happening in the real world. They’re looking for real-world signals that actual humans care about this business and they’re engaging with this business.

So there’s a number of things that they can do to track that — so branded search, how many people are searching for your brand name, how many people are clicking to call your business, driving directions. This stuff is all kind of hard to manipulate, whereas you can add more links, you can get more reviews. But this stuff, this is a great signal for Google to rely on.

Engagement with your listing, engagement with your website, and actual humans in your business. If you’ve seen the knowledge panel for a brick-and-mortar business, sometimes it will show busy times. They know when people are actually at your business. They have counts of how many people are going into your business. So that’s a great signal for them to use to understand the prominence of your business. Is this a busy business compared to all the other ones in the city?

3. Google will monetize everything

Then, of course, a trend to monetize as much as they can. Google is a publicly traded company. They want to make as much money as possible. They’re on a constant growth path. So there are a few things that we see coming down the pipeline.

Local service ads are expanding across the country and globally and in different industries. So this is like a paid program. You have to apply to get into it, and then Google takes a cut of leads. So if you are a member of this, then Google will send leads to you. But you have to be verified to be in there, and you have to pay to be in there.

Then taking a cut from bookings, you can now book directly on Google for a lot of different businesses. If you think about Google Flights and Google Hotels, Google is looking for a way to monetize all of this local search opportunity. That’s why they’re investing heavily in local search so they can make money from it. So seeing more of these kinds of features rolling out in the future is definitely coming. Transactions from other things. So if I did book something, then Google will take a cut for it.

So that’s the future. That’s sort of the news of the local search ranking factors this year. I hope it’s been helpful. If you have any questions, just leave some comments and I’ll make sure to respond to them all. Thanks, everybody.

Video transcription by Speechpad.com


If you missed our recent webinar on the Local Search Ranking Factors survey with Darren Shaw and Dr. Pete, don’t worry! You can still catch the recording here:

Check out the webinar

You’ll be in for a jam-packed hour of deeper insights and takeaways from the survey, as well as some great audience-contributed Q&A.



Moz Blog


7 Search Ranking Factors Analyzed: A Follow-Up Study

Posted by Jeff_Baker

Grab yourself a cup of coffee (or two) and buckle up, because we’re doing maths today.

Again.

Back it on up…

A quick refresher from last time: I pulled data from 50 keyword-targeted articles written on Brafton’s blog between January and June of 2018.

We used a technique for writing these articles (published earlier on Moz) that generates some seriously awesome results (we’re talking more than doubling our organic traffic in the last six months, but we will get to that in another publication).

We pulled this data again… Only I updated and reran all the data manually, doubling the dataset. No APIs. My brain is Swiss cheese.

We wanted to see how newly written, original content performs over time, and which factors may have impacted that performance.

Why do this the hard way, dude?

“Why not just pull hundreds (or thousands!) of data points from search results to broaden your dataset?”, you might be thinking. It’s been done successfully quite a few times!

Trust me, I was thinking the same thing while weeping tears into my keyboard.

The answer was simple: I wanted to do something different from the massive aggregate studies. I wanted a level of control over as many potentially influential variables as possible.

By using our own data, the study benefited from:

  • The same root Domain Authority across all content.
  • Similar individual URL link profiles (some laughs on that later).
  • Known original publish dates, with no reoptimization efforts or tinkering.
  • Known original keyword targets for each blog (rather than guessing).
  • Known and consistent content depth/quality scores (MarketMuse).
  • Similar content writing techniques for targeting specific keywords for each blog.

You will never eliminate the possibility of misinterpreting correlation as causation. But controlling some of the variables can help.

As Rand once said in a Whiteboard Friday, “Correlation does not imply causation (but it sure is a hint).”

Caveat:

What we gained in control, we lost in sample size. A sample size of 96 is much less useful than ten thousand, or a hundred thousand. So look at the data carefully and use discretion when considering the ranking factors you find most likely to be true.

This resource can help gauge the confidence you should put into each Pearson Correlation value. Generally, the stronger the relationship, the smaller the sample size needed to be confident in the results.

So what exactly have you done here?

We have generated hints at what may influence the organic performance of newly created content. No more, and no less. But they are indeed interesting hints and maybe worth further discussion or research.

What have you not done?

We have not published sweeping generalizations about Google’s algorithm. This post should not be read as a definitive guide to Google’s algorithm, nor should you assume that your site will demonstrate the same correlations.

So what should I do with this data?

The best way to read this article is to observe the potential correlations we found in our data and consider how those correlations may or may not apply to your own content and strategy.

I’m hoping that this study takes a new approach to studying individual URLs and stimulates constructive debate and conversation.

Your constructive criticism is welcome, and hopefully pushes these conversations forward!

The stat sheet

So quit jabbering and show me the goods, you say? Alright, let’s start with our stats sheet, formatted like a baseball card, because why not?:

*Note: Only blogs with complete ranking data were used in the study. We threw out blogs with missing data rather than adding arbitrary numbers.

And as always, here is the original data set if you care to reproduce my results.

So now the part you have been waiting for…

The analysis

To start, please see the refresher on the Pearson Correlation Coefficient (PCC) from my last blog post, or Rand’s.

1. Time and performance

I started with a question: “Do blogs age like a Macallan 18 served up neat on a warm summer Friday afternoon, or like tepid milk on a hot summer Tuesday?”

Does the time indexed play a role in how a piece of content performs?

Correlation 1: Time and target keyword position

First we will map the target keyword ranking positions against the number of days its corresponding blog has been indexed. Visually, if there is any correlation we will see some sort of negative or positive linear relationship.

There is a clear negative relationship between the two variables, which means the two variables may be related. But we need to go beyond visuals and use the PCC.

Days live vs. target keyword position

  • PCC: -.343
  • Relationship: Moderate

The data shows a moderate relationship between how long a blog has been indexed and the positional ranking of the target keyword.
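For readers who want to reproduce the PCC on their own keyword data, here is a minimal, self-contained implementation. The input pairs here would be (days indexed, target keyword position) per article; the values used below are illustrative, not the study’s raw data.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples.

    Returns a value in [-1, 1]: negative means y tends to fall as x
    rises (e.g. ranking position improving as days indexed grow)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A moderate negative value like -.343 falls well short of a perfect linear fit, which is exactly why the post cross-checks the hint with averages and heatmaps rather than trusting one statistic.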

But before getting carried away, we shouldn’t solely trust one statistical method and call it a day. Let’s take a look at things another way: Let’s compare the average age of articles whose target keywords rank in the top ten against the average age of articles whose target keywords rank outside the top ten.

Average age of articles based on position

  • Target KW position ≤ 10: 144.8 days
  • Target KW position > 10: 84.1 days

Now a story is starting to become clear: Our newly written content takes a significant amount of time to fully mature.

But for the sake of exhausting this hint, let’s look at the data one final way. We will group the data into buckets of target keyword positions, and days indexed, then apply them to a heatmap.

This should show us a clear visual clustering of how articles perform over time.

This chart, quite literally, paints a picture. According to the data, we shouldn’t expect a new article to realize its full potential until at least 100 days, and likely longer. As a blog post ages, it appears to gain more favorable target keyword positioning.
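The clustering behind a heatmap like this is just a two-dimensional bucket count. A small sketch, with illustrative bucket edges (the study’s actual bucket boundaries are not given):

```python
def bucket_counts(records, day_edges, pos_edges):
    """Cluster (days_indexed, keyword_position) pairs into a grid of
    counts -- the table a heatmap visualizes.

    Each edges list holds inclusive upper bounds per bucket; use
    float("inf") as the last edge to catch everything else."""
    grid = [[0] * len(pos_edges) for _ in day_edges]
    for days, pos in records:
        di = next(i for i, edge in enumerate(day_edges) if days <= edge)
        pi = next(i for i, edge in enumerate(pos_edges) if pos <= edge)
        grid[di][pi] += 1
    return grid
```

Feeding the grid to any plotting library as a heatmap would show the same pattern described above: a dense cluster of strong positions in the older-than-100-days row.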

Correlation 2: Time and total ranking keywords on URL

You’ll find that when you write an article, it will (hopefully) rank for the keyword you target. But oftentimes it will also rank for other keywords. Some of these are variants of the target keyword, some are tangentially related, and some are purely random noise.

Instinct will tell you that you want your articles to rank for as many keywords as possible (ideally variants and tangentially related keywords).

Predictably, we have found that the relationship between the number of keywords an article ranks for and its estimated monthly organic traffic (per SEMrush) is strong (.447).

We want all of our articles to do things like this:

We want lots of variants each with significant search volume. But, does an article increase the total number of keywords it ranks for over time? Let’s take a look.

Visually this graph looks a little murky due to the existence of two clear outliers on the far right. We will first run the analysis with the outliers, and again without. With the outliers, we observe the following:

Days live vs. total keywords ranking on URL (w/outliers)

  • PCC: .281
  • Relationship: Weak/borderline moderate

There appears to be a relationship between the two variables, but it isn’t as strong. Let’s see what happens when we remove those two outliers:

Visually, the relationship looks stronger. Let’s look at the PCC:

Days live vs. total keywords ranking on URL (without outliers)

  • PCC: .390
  • Relationship: Moderate/borderline strong

The relationship appears to be much stronger with the two outliers removed.

But again, let’s look at things another way.

Let’s look at the average age of the top 25% of articles and compare them to the average age of the bottom 25% of articles:

Average age of top 25% of articles versus bottom 25%

  • Top 25%: 148.9 days
  • Bottom 25%: 73.8 days

This is exactly why we look at data multiple ways! The top 25% of blog posts with the most ranking keywords have been indexed an average of 149 days, while the bottom 25% have been indexed 74 days — roughly half.
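The top-versus-bottom quartile comparison is straightforward to reproduce. A sketch with made-up per-article data, assuming one age and one ranking-keyword count per URL:

```python
import numpy as np

# Hypothetical per-article data: days indexed, and total keywords ranking on the URL
ages = np.array([30, 45, 60, 75, 90, 110, 130, 150, 170, 200, 220, 250])
keywords = np.array([3, 5, 4, 8, 12, 20, 25, 30, 28, 45, 50, 60])

# Sort articles by ranking-keyword count, then compare the average age
# of the bottom and top quartiles
order = np.argsort(keywords)
quartile = len(order) // 4
bottom_avg_age = ages[order[:quartile]].mean()   # fewest ranking keywords
top_avg_age = ages[order[-quartile:]].mean()     # most ranking keywords
```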

To be fully sure, let’s again cluster the data into a heatmap to observe where performance falls on the time continuum:

We see a very similar pattern as in our previous analysis: a clustering of top-performing blogs starting at around 100 days.

Time and performance assumptions

You still with me? Good, because we are saying something BIG here. In our observation, it takes between 3 and 5 months for new content to perform in organic search, or at the very least to mature.

To look at this one final way, I’ve created a scatterplot of only the top 25% of highest performing blogs and compared them to their time indexed:

There are 48 data points on this chart. The blue points represent the top 25% of articles in terms of strongest target keyword ranking position; the orange points represent the top 25% of articles with the highest number of keyword rankings on their URL. (These can be, and some are, the same URLs.)

Looking at the data a little more closely, we see the following:

90% of the top 25% of highest-performing content took at least 100 days to mature, and only two articles took less than 75 days.

Time and performance conclusion

For those of you just starting a content marketing program, remember that you may not see the full organic potential for your first piece of content until month 3 at the earliest. And, it takes at least a couple months of content production to make a true impact, so you really should wait a minimum of 6 months to look for any sort of results.

In conclusion, we expect new content to take at least 100 days to fully mature.

2. Links

But wait, some of you may be saying. What about links, buddy? Articles build links over time, too!

It stands to reason that a blog post will gain links (and ranking potential) over time. Links matter, and higher-ranked pages tend to earn links at a faster rate. Thus, we are at risk of mistaking correlation for causation if we don't look at this carefully.

But here's what you don't know: being the terrible SEO that I am, I had no linking strategy with this campaign.

And I mean zero strategy. The average article generated 1.3 links from .5 linking domains.

Nice.

Linking domains vs. target keyword position

PCC: -.022
Relationship: None
Average linking domains to top 25% of articles: .46
Average linking domains to bottom 25% of articles: .46

The one thing consistent across all the articles was a shocking and embarrassing lack of inbound links. This is demonstrated by an insignificant correlation coefficient of -.022. The same goes for the total number of links per URL, with a correlation coefficient of -.029.

These articles appear to have performed primarily on their content rather than inbound links.

(And they certainly would have performed much better with a strong, or any, linking strategy. Nobody is arguing the value of links here.) But mostly…

Shame on me.

Shame. Shame. Shame.

But on a positive note, we were able to generate a more controlled experiment on the effects of time and blog performance. So, don’t fire me just yet?

Note: It would be interesting to pull link quality metrics into the discussion (for the precious few links we did earn) rather than total volume. However, after a cursory look at the data, nothing stood out as being significant.

3. Word count

Content marketers and SEOs love talking about word count. And for good reason. When we collectively agreed that “quality content” was the key to rankings, it would stand to reason that longer content would be more comprehensive, and thus do a better job of satisfying searcher intent. So let’s test that theory.

Correlation 1: Target keyword position versus total word count

Will longer articles increase the likelihood of ranking for the keyword you are targeting?

Not in our case. To be sure, let’s run a similar analysis as before.

Word count vs. target keyword position

PCC: .111
Relationship: Negligible
Average word count of top 25% of articles: 1,774
Average word count of bottom 25% of articles: 1,919

The data shows no impact on rankings based on the length of our articles.

Correlation 2: Total keywords ranking on URL versus word count

One would think that longer content would result in additional ranking keywords, right? Even by accident, you would think that the more related topics you discuss in an article, the more keywords you will rank for. Let's see if that's true:

Total keywords ranking on URL vs. word count

PCC: -.074
Relationship: None

Not in this case.

Word count, speculative tangent

So how can it be that so many studies demonstrate higher word counts result in more favorable rankings? Some reconciliation is in order, so allow me to speculate on what I think may be happening in these studies.

  1. Most likely: Measurement techniques. These studies generally look at one factor relative to rankings: average absolute word count based on position. (And, there actually isn't much of a difference in average word count between position one and ten.) As we are demonstrating in this article, there may be many other factors at play that need to be isolated and tested for correlations in order to get the full picture, such as: time indexed, on-page SEO (to be discussed later), Domain Authority, link profile, and depth/quality of content (also to be discussed later, with MarketMuse as a measure). It's possible that correlation does not imply causation, and by using word count averages as the single method of measure, we may be painting with too broad a brush.
  2. Likely: High-quality content is longer, by nature. We know that "quality content" is discussed in terms of how well a piece satisfies the intent of the reader. In an ideal scenario, you will create content that fully satisfies everything a searcher would want to know about a given topic. Ideally you own the resource center for the topic, and the searcher does not need to revisit SERPs and weave together answers from multiple sources. By nature, this type of comprehensive content is quite lengthy. Long-form content is arguably a byproduct of creating for quality. Cyrus Shepard does a better job of explaining this likelihood here.
  3. Less likely: Long-form threshold. The articles we wrote for this study ranged from just under 1,000 words to nearly as high as 4,000 words. One could consider all of these as "long-form content," and perhaps Google does as well. Perhaps there is a word count threshold that Google uses.

This is all speculation. What we can say for certain is that all of our content is 900 words and up, and we saw no incremental benefit from additional length.

Feel free to disagree with any (or all) of my speculations about these discrepancies, but with the information available, I tend to share Brian Dean's opinion.

4. MarketMuse

At this point, most of you are familiar with MarketMuse. They have created a number of AI-powered tools that help with content planning and optimization.

We use the Content Optimizer tool, which evaluates the top 20 results for any keyword and generates an outline of all the major topics being discussed in SERPs. This helps you create content that is more comprehensive than your competitors, which can lead to better performance in search.

Based on the competitive landscape, the tool will generate a recommended content score (their proprietary algorithm) that you should hit in order to compete with the pages already ranking in SERPs.

But… if you’re a competitive fellow, what happens if you want to blow the recommended score out of the water? Do higher scores have an impact on rankings? Does it make a difference if your competition has a very low average score?

We pulled every article’s content score, along with MarketMuse’s recommended scores and the average competitor scores, to answer these questions.

Correlation 1: Overall MarketMuse content score

Does a higher overall content score result in better rankings? Let’s take a look:

Absolute MarketMuse score vs. target keyword position

PCC: .000
Relationship: None

A perfect zero! We weren’t able to beat the system by racking up points. I also checked to see if a higher absolute score would result in a larger number of keywords ranking on the URL — it doesn’t.

Correlation 2: Beating the recommended score

As mentioned, based on the competitive landscape, MarketMuse will generate a recommended content score. What happens if you blow the recommended score out of the water? Do you get bonus points?

In order to calculate this correlation, we pulled the content score percentage attainment and compared it to the target keyword position. For example, if we scored a 30 against a recommended score of 25, we hit 120% attainment. Let's see if it matters:

Percentage content score attainment vs. target keyword position

PCC: .028
Relationship: None

No bonus points for doing extra credit!
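The attainment metric used above is just a ratio. A tiny illustrative helper (the naming here is mine, not MarketMuse's):

```python
def attainment_pct(actual_score, baseline_score):
    """Percentage of a baseline content score actually achieved."""
    return actual_score * 100 / baseline_score

attainment_pct(30, 25)  # scoring 30 against a recommended 25 → 120.0
```

The same helper covers the competitor-average comparison below: a 30 against an average competitor score of 10 is `attainment_pct(30, 10)`, or 300%.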

Correlation 3: Beating the average competitors’ scores

Okay, if you beat MarketMuse’s recommendations, you don’t get any added benefit, but what if you completely destroy your competitors’ average content scores?

We will calculate this correlation the same way we previously did, with percentage attainment over the average competitor. For example, if we scored a 30 against a competitor average of 10, we hit 300% attainment. Let's see if that matters:

Percentage attainment over average competitor score vs. target KW position

PCC: -.043
Relationship: None

That didn’t work either! Seems that there are no hacks or shortcuts here.

MarketMuse summary

We know that MarketMuse works, but it seems that there are no additional tricks to this tool.

If you regularly hit the recommended score as we did (average 110% attainment, with 81% of blogs hitting 100% attainment or better) and cover the topics prescribed, you should do well. But don’t fixate on competitor scores or blowing the recommended score out of the water. You may just be wasting your time.

Note: It’s worth noting that we probably would have shown stronger correlations had we intentionally bombed a few MarketMuse scores. Perhaps a test for another day.

5. On-page optimization

Ah, old-school technical SEO. This type of work warms the cockles of a seasoned SEO’s heart. But does it still have a place in our constantly evolving world? Has Google advanced to the point where it doesn’t need technical cues from SEOs to understand what a page is about?

To find out, I have pulled Moz’s on-page optimization score for every article and compared them to the target keywords’ positional rankings:

Let’s take a look at the scatterplot for all the keyword targets.

Now looking at the math:

On-page optimization score vs. target keyword position

PCC: -.384
Relationship: Moderate/strong
Average on-page score for top 25%: 91%
Average on-page score for bottom 25%: 87%

If you have a keen eye, you may have noticed a few strong outliers on the scatterplot. If we remove the three largest outliers, the correlation strengthens to -.435, a strong relationship.

Before we jump to conclusions, let’s look at this data one final way.

Let’s take a look at the percentage of articles with their target keywords ranking 1–10 that also have a 90% on-page score or better. We will compare that number to the percentage of articles ranking outside the top ten that also have a 90% on-page score or better.

If our assumption is correct, we will see a much higher percentage of keywords ranking 1–10 with an on-page score of 90% or better, and a lower number for articles ranking greater than 10.

On-page optimization score by rankings

Percentage of KWs ranking 1–10 with ≥ 90% score: 73.5%
Percentage of KWs ranking >10 with ≥ 90% score: 53.2%
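That comparison can be sketched as a simple filter-and-count, again with hypothetical (position, on-page score) pairs:

```python
# Hypothetical (target keyword position, on-page score) pair per article
articles = [(3, 0.95), (7, 0.92), (2, 0.88), (15, 0.91), (22, 0.80), (9, 0.93)]

def pct_meeting_score(rows, min_score=0.90):
    """Share of articles (as a percentage) whose on-page score meets the threshold."""
    if not rows:
        return 0.0
    hits = sum(1 for _, score in rows if score >= min_score)
    return hits * 100 / len(rows)

# Split into keywords ranking in the top ten vs. outside it
top_ten = [(pos, s) for pos, s in articles if pos <= 10]
outside_ten = [(pos, s) for pos, s in articles if pos > 10]

pct_top = pct_meeting_score(top_ten)          # → 75.0 (3 of 4)
pct_outside = pct_meeting_score(outside_ten)  # → 50.0 (1 of 2)
```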

This is enough of a hint for me. I’m implementing a 90% minimum on-page score from here on out.

Old school SEOs, rejoice!

6. The competition’s average word count

We won’t put this “word count” argument to bed just yet…

Let’s ask ourselves, “Does it matter how long the average content of the top 20 results is?”

Is there a relationship between the length of your content versus the average competitor?

What if your competitors are writing very short form, and you want to beat them with long-form content?

We will measure this the same way as before, with percentage attainment. For example, if the average word count of the top 20 results for “content marketing agency” is 300, and our piece is 450 words, we hit 150% attainment.

Let’s see if you can “out-verbose” your opponents.

Percentage word count attainment vs. target KW position

PCC: .062
Relationship: None

Alright, I’ll put word count to bed now, I promise.

7. Keyword density

You’ve made it to the last analysis. Congratulations! How many cups of coffee have you consumed? No judgment; this report was responsible for entire coffee farms being completely decimated by yours truly.

For selfish reasons, I couldn’t resist the temptation to dispel this ancient tactic of “using target keywords” in blog content. You know what I’m talking about: when someone says “This blog doesn’t FEEL optimized… did you use the target keyword enough?”

There are still far too many people that believe that littering target keywords throughout a piece of content will yield results. And misguided SEO agencies, along with certain SEO tools, perpetuate this belief.

Yoast has a tool in WordPress that some digital marketers live and die by. They don’t think that a blog is complete until Yoast shows the magical green light, indicating that the content has satisfied the majority of its SEO recommendations:

Uh oh, keyword density is too low! Let's see if that ACTUALLY matters.

Not looking so good, my keyword-stuffing friends! Let’s take a look at the PCC:

Target keyword ranking position vs. Yoast keyword density

PCC: .097
Relationship: None/Negligible

Believers would like to see a negative relationship here: as keyword density goes up, the ranking position number goes down (meaning better rankings), producing a downward-sloping line.

What we are looking at instead is a slightly upward-sloping line, which would indicate losing rankings by keyword stuffing. Fortunately, given the low correlation value, it's not TOO upward-sloping.

Okay, so PLEASE let that be the end of “keyword density.” This practice has been disproven in past studies, as referenced by Zyppy. Let’s confidently put this to bed, forever. Please.

Oh, and just for kicks: the Flesch Reading Ease score has no bearing on rankings either (-.03 correlation). Write to a third-grade level or a college level; it doesn't matter.

TL;DR (I don’t blame you)

What we learned from our data

  1. Time: It took 100 days or more for an article to fully mature and show its true potential. A content marketing program probably shouldn’t be fully scrutinized until month 5 or 6 at the very earliest.
  2. Links: Links matter, I’m just terrible at generating them. Shame.
  3. Word count: It’s not about the length of the content, in absolute terms or relative to the competition. It’s about what is written and how resourceful it is.
  4. MarketMuse: We have proven that MarketMuse works as it prescribes, but there is no added benefit to breaking records.
  5. On-page SEO: Our data demonstrates that it still matters. We all still have a job.
  6. Competitor content length: We weren’t successful at blowing our competitors out of the water with longer content.
  7. Keyword density: Just stop. Join us in modern times. The water is warm.

In conclusion, some reasonable guidance we agree on is:

Wait at least 100 days to evaluate the performance of your content marketing program, write comprehensive content, and make sure your on-page SEO score is 90%+.

Oh, and build links. Unlike me. Shame.

Now go take a nap.

Moz Blog
