Tag Archive | "Score"

Shaan Patel: A Perfect SAT Score Led To A Super-Successful SAT Prep Business And A Life-Changing Deal With Mark Cuban On Shark Tank

[ Download MP3 | Transcript | iTunes | Soundcloud | Stitcher | Spotify | Raw RSS ] Shaan Patel grew up the son of Indian immigrants who worked hard to save enough money for a deposit on a property in their hometown of Las Vegas. However, rather than buy a house, the Patels bought a small […]

The post Shaan Patel: A Perfect SAT Score Led To A Super-Successful SAT Prep Business And A Life-Changing Deal With Mark Cuban On Shark Tank appeared first on Yaro.Blog.

Entrepreneurs-Journey.com by Yaro Starak

Posted in IM News | Comments Off

How to Impress and Score Your Next Freelance Writing Client

I have an affinity for service businesses. I love when people: Recognize that they possess specific skills that can help…

The post How to Impress and Score Your Next Freelance Writing Client appeared first on Copyblogger.


Copyblogger


Google (Almost Certainly) Has an Organic Quality Score (Or Something a Lot Like It) that SEOs Need to Optimize For – Whiteboard Friday

Posted by randfish

Entertain the idea, for a moment, that Google assigned a quality score to organic search results. Say it was based off of click data and engagement metrics, and that it would function in a similar way to the Google AdWords quality score. How exactly might such a score work, what would it be based off of, and how could you optimize for it?

While there’s no hard proof it exists, the organic quality score is a concept that’s been pondered by many SEOs over the years. In today’s Whiteboard Friday, Rand examines this theory inside and out, then offers some advice on how one might boost such a score.

Google's Organic Quality Score


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about organic quality score.

So this is a concept. This is not a real thing that we know Google definitely has. But there’s this concept that SEOs have been feeling for a long time: similar to the paid quality score Google has in their AdWords program, where a page has a certain score assigned to it, on the organic side Google almost definitely has something comparable. I’ll give you an example of how that might work.

So, for example, if on my site.com I have these three — this is a very simplistic website — but I have these three subfolders: Products, Blog, and About. I might have a page in my products, 14axq.html, and it has certain metrics that Google associates with it through activity that they’ve seen from browser data, from clickstream data, from search data, and from visit data from the searches and bounces back to the search results, and all these kinds of things, all the engagement and click data that we’ve been talking about a lot this year on Whiteboard Friday.

So they may have these metrics: pogo-stick rate, bounce rate, and deep click rate (the rate at which someone clicks to the site and then goes further in from that page), the time that they spend on the site on average, the direct navigations that people make to it each month through their browsers, the search impressions and search clicks, perhaps a bunch of other statistics, like whether people search directly for this URL, or whether they perform branded searches. At what rate do unique devices in one area versus another do these things? Is there a bias based on geography or device type or personalization or all these kinds of things?

But regardless of that, you get this idea that Google has this sort of sense of how the page performs in their search results. That might be very different across different pages and obviously very different across different sites. So maybe this blog post over here on /blog is doing much, much better in all these metrics and has a much higher quality score as a result.
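As a concrete (and entirely hypothetical) illustration, here's how engagement metrics like these could be aggregated per URL from a log of search visits. The field names and numbers are invented and don't reflect any real Google data model:

```python
from collections import defaultdict

# Hypothetical search-visit log: (url, seconds_on_site, pages_viewed,
# bounced_back_to_serp). Purely invented; not Google's data model.
visits = [
    ("/products/14axq.html", 8, 1, True),     # quick pogo-stick back
    ("/products/14axq.html", 45, 1, False),
    ("/blog/quality-score", 180, 4, False),   # deep click, long visit
    ("/blog/quality-score", 240, 6, False),
    ("/blog/quality-score", 20, 1, True),
]

def engagement_metrics(visits):
    """Aggregate per-URL engagement stats from individual search visits."""
    by_url = defaultdict(list)
    for url, secs, pages, pogo in visits:
        by_url[url].append((secs, pages, pogo))
    stats = {}
    for url, rows in by_url.items():
        n = len(rows)
        stats[url] = {
            "visits": n,
            "pogo_stick_rate": sum(1 for _, _, p in rows if p) / n,
            "deep_click_rate": sum(1 for _, pg, _ in rows if pg > 1) / n,
            "avg_time_on_site": sum(s for s, _, _ in rows) / n,
        }
    return stats

stats = engagement_metrics(visits)
```

A real system would also fold in impressions, branded-search volume, and device/geo splits, but the shape is the same: per-URL aggregates of visit behavior.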

Current SEO theories about organic quality scoring:

Now, when we talk to SEOs, and I spend a lot of time talking to my fellow SEOs about theories around this, a few things emerge. I think most folks are generally of the opinion that if there is something like an organic quality score…

1. It is probably based on this type of data — queries, clicks, engagements, visit data of some kind.

We don’t doubt for a minute that Google has much more sophistication than the super-simplified stuff that I’m showing you here. Google publicly denies using many individual metrics, like, “No, we don’t use time on site. Time on site could be very variable, and sometimes low time on site is actually a good thing.” Fine. But there’s something in there, right? They use some more sophisticated version of that.

2. We also are pretty sure that this is applying on three different levels:

This is an observation from experimentation as well as from Google’s statements:

  • Domain-wide. Across one domain, if there are many pages with high quality scores, Google might view that domain differently from a domain with a mix of quality scores, or one with generally low ones.
  • The same thing for a subdomain. It could be that a subdomain is looked at differently than the main domain, or that two different subdomains are viewed differently. If content appears to have high quality scores on one but not on the other, Google might not pass all the ranking signals, or give the same weight to the quality scores, across them.
  • The same thing is true for subfolders, although to a lesser extent. In fact, this is in descending order: you can generally surmise that Google will pass these signals more across subfolders than across subdomains, and more across subdomains than across root domains.
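To make the three levels concrete, here's a speculative sketch that rolls a page-level quality score up to the subfolder, subdomain, and root-domain levels. The scores are invented, and the root-domain extraction is naive (real code would use a public-suffix list):

```python
from collections import defaultdict
from statistics import mean
from urllib.parse import urlsplit

# Hypothetical page-level quality scores (numbers invented).
page_scores = {
    "https://blog.site.com/posts/a": 8.0,
    "https://blog.site.com/posts/b": 9.0,
    "https://blog.site.com/about": 7.0,
    "https://shop.site.com/products/x": 3.0,
}

def rollup(page_scores):
    """Average page-level scores at the subfolder, subdomain, and
    root-domain levels."""
    levels = {"subfolder": defaultdict(list),
              "subdomain": defaultdict(list),
              "domain": defaultdict(list)}
    for url, score in page_scores.items():
        parts = urlsplit(url)
        host = parts.hostname                  # e.g. "blog.site.com"
        root = ".".join(host.split(".")[-2:])  # naive eTLD+1: "site.com"
        folder = host + "/" + parts.path.strip("/").split("/")[0]
        levels["subfolder"][folder].append(score)
        levels["subdomain"][host].append(score)
        levels["domain"][root].append(score)
    return {level: {key: mean(vals) for key, vals in groups.items()}
            for level, groups in levels.items()}

scores = rollup(page_scores)
```

In this toy data the blog subdomain averages far higher than the shop subdomain, which is exactly the kind of asymmetry the bullets above describe.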

3. A higher density of good scores to bad ones can mean a bunch of good things:

  • More ranking visibility, even without other signals. So even if a page is somewhat lacking in these other quality signals, if it sits in this blog section, and this blog section tends to have high quality scores for all its pages, Google might give that page an opportunity to rank well that it wouldn’t ordinarily get in another subfolder, on another subdomain, or on another website entirely.
  • Some sort of what we might call “benefit of the doubt”-type of boost, even for new pages. So a new page is produced. It doesn’t yet have any quality signals associated with it, but it does particularly well.

    As an example, within a few minutes of this Whiteboard Friday being published on Moz’s website, which is usually late Thursday night or very early Friday morning, at least Pacific time, I will bet that you can search for “Google organic quality score” or even just “organic quality score” in Google’s engine, and this Whiteboard Friday will perform very well. One of the reasons that probably is, is because many other Whiteboard Friday videos, which are in this same subfolder, Google has seen them perform very well in the search results. They have whatever you want to call it — great metrics, a high organic quality score — and because of that, this Whiteboard Friday that you’re watching right now, the URL that you see in the bar up above is almost definitely going to be ranking well, possibly in that number one position, even though it’s brand new. It hasn’t yet earned the quality signals, but Google assumes, it gives it the benefit of the doubt because of where it is.

  • We surmise that there’s also more value that gets passed from links, both internal and external, from pages with high quality scores. That is right now a guess, but something we hope to validate more, because we’ve seen some signs and some testing that that’s the case.

3 ways to boost your organic quality score

If this is true, and it’s up to you whether you want to believe that it is or not, even if you don’t believe it, you’ve almost certainly seen signs that something like it is going on. I would urge you to do these three things to boost your organic quality score, or whatever you believe is causing these same effects.

1. You could add more high-performing pages. So if you know that pages perform well and you know what those look like versus ones that perform poorly, you can make more good ones.

2. You can improve the quality score of existing pages. So if one of them is kind of low, and you’re seeing that its engagement and usage metrics, its SERP click-through rate, and its bounce rate from organic search visits all look poor compared to your other stuff, you can boost it. Improve the content, the navigation, the usability and user experience of the page, the load time, the visuals, whatever you’ve got there to hold searchers’ attention longer, to keep them engaged, and to make sure that you’re solving their problem. When you do that, you will get higher quality scores.

3. Remove low-performing pages through a variety of means. You could take a low-performing page and say, “Hey, I’m going to redirect that to this other page, which does a better job answering the query anyway.” Or, “Hey, I’m going to 404 that page. I don’t need it anymore. In fact, no one needs it anymore.” Or, “I’m going to noindex it. Some people may need it, maybe the ones who are visitors to my website, who need it for some particular direct navigation or internal purpose. But Google doesn’t need to see it. Searchers don’t need it. I’m going to use noindex, either in the meta robots tag or in the robots.txt file.”
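The three dispositions above can be expressed as a simple decision rule. This is a purely illustrative sketch; the thresholds and field names are invented, not part of anything Google or Moz publishes:

```python
def triage(page):
    """Choose a disposition for a low-performing page (hypothetical rules)."""
    if page["better_alternative_url"]:
        # A stronger page answers the same query: consolidate via redirect.
        return ("redirect", page["better_alternative_url"])
    if page["monthly_visits"] == 0 and not page["internal_use"]:
        # Nobody needs the page at all.
        return ("404", None)
    if page["internal_use"]:
        # Keep it for direct/internal visitors but hide it from search,
        # e.g. with <meta name="robots" content="noindex">.
        return ("noindex", None)
    return ("keep", None)

decision = triage({"better_alternative_url": None,
                   "monthly_visits": 12,
                   "internal_use": True})
```

A real cleanup would key these decisions off the same engagement metrics discussed earlier, plus a human review of each page.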

One thing that’s really interesting to note is that we’ve seen a bunch of case studies, especially since MozCon, when Britney Muller, Moz’s Head of SEO, shared the fact that she had done some great testing around removing tens of thousands of really low-quality, poorly performing pages from Moz’s own website, and had seen our rankings and our traffic for the remainder of our content go up quite significantly, even controlling for seasonality and other factors.

That was pretty exciting. When we shared that, we got a bunch of other people from the audience and on Twitter saying, “I did the same thing. When I removed low-performing pages, the rest of my site performed better,” which really strongly suggests that there’s something like a system in this fashion that works in this way.

So I’d urge you to go look at your metrics, go find pages that are not performing well, see what you can do about improving them or removing them, see what you can do about adding new ones that are high organic quality score, and let me know your thoughts on this in the comments.

We’ll look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


SearchCap: Google AdWords ad rank, quality score data & SEO strategies

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google AdWords ad rank, quality score data & SEO strategies appeared first on Search Engine Land.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


SearchCap: Google AdWords Play Store, Quality Score Update & Bing Ads Windows 10

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The post SearchCap: Google AdWords Play Store, Quality Score Update & Bing Ads Windows 10 appeared first on Search Engine Land.





Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


New Features in OSE’s Spam Score & the Mozscape API

Posted by randfish

This week, we launched a feature inside Open Site Explorer that I’m very excited about – the Spam Score Histogram, found by clicking on the “spam analysis” tab:

The histogram is particularly useful for visualizing the distribution of potentially spammy links that show up in a site’s link profile. Above, for example, we’re looking at Moz.com, with a strong distribution of sites that have 0-5 spam flags. According to our research, that means the vast majority of those sites are unlikely to be penalized or banned by Google.
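Conceptually, the histogram is just a frequency count of linking subdomains bucketed by how many spam flags each one triggers. A minimal sketch with invented flag counts:

```python
from collections import Counter

# Invented spam-flag counts for the subdomains linking to a site.
flag_counts = [0, 1, 1, 2, 0, 3, 1, 0, 2, 11, 4, 1]

histogram = Counter(flag_counts)       # flags fired -> number of domains
low_risk = sum(n for flags, n in histogram.items() if flags <= 5)
```

Here 11 of the 12 linking domains fall in the 0-5 flag range, the kind of healthy distribution described above, with one outlier worth investigating.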

For more detail on Moz’s Spam Score, check out the original blog post and my Whiteboard Friday.

The new histogram view lets us do nice comparisons like these:

Houzz.com has a very large list of sketchy-looking sites linking to them (many seem to be very thin content sites from China, curiously).

Competitor (well, sort-of-competitor), Porch.com, has a much smaller link profile with a very different distribution. Their link profile looks even healthier than Moz’s to me!

But, the new spam score histogram isn’t the only new feature. We’ve also got a new power available to API users – the ability to query data from the previous index. If you want to know what a previous Domain Authority score looked like, or how many links we reported to a page in our last index, you can now do so using the Moz API. If you want to get started, check out the documentation here, or get in touch directly with Chris Airola (email chris.airola at moz.com), who manages paid API accounts and loves to help.

Many thanks to the Research Tools and Big Data teams at Moz, who’ve worked to make this possible. I’m happy to answer questions and will try to be in the comments here frequently. I wish you good spam exploring my friends!



Moz Blog


Spam Score: Moz’s New Metric to Measure Penalization Risk

Posted by randfish

Today, I’m very excited to announce that Moz’s Spam Score, an R&D project we’ve worked on for nearly a year, is finally going live. In this post, you can learn more about how we’re calculating spam score, what it means, and how you can potentially use it in your SEO work.

How does Spam Score work?

Over the last year, our data science team, led by Dr. Matt Peters, examined a great number of potential factors that predicted that a site might be penalized or banned by Google. We found strong correlations with 17 unique factors we call “spam flags,” and turned them into a score.

Almost every subdomain in Mozscape (our web index) now has a Spam Score attached to it, and this score is viewable inside Open Site Explorer (and soon, the MozBar and other tools). The score is simple; it just records the quantity of spam flags the subdomain triggers. Our correlations showed that no particular flag was more likely than others to mean a domain was penalized/banned in Google, but firing many flags had a very strong correlation (you can see the math below).

Spam Score currently operates only on the subdomain level—we don’t have it for pages or root domains. It’s been my experience, and the experience of many other SEOs in the field, that a great deal of link spam is tied to the subdomain level. There are plenty of exceptions—manipulative links can and do live on plenty of high-quality sites—but as we’ve tested, we’ve found that subdomain-level Spam Score was the best solution we could create at web scale. It does a solid job with the most obvious, nastiest spam, and a decent job highlighting risk in other areas, too.

How to access Spam Score

Right now, you can find Spam Score inside Open Site Explorer, both in the top metrics (just below domain/page authority) and in its own tab labeled “Spam Analysis.” Spam Score is only available for Pro subscribers right now, though in the future, we may make the score in the metrics section available to everyone (if you’re not a subscriber, you can check it out with a free trial).

The current Spam Analysis page includes a list of subdomains or pages linking to your site. You can toggle the target to look at all links to a given subdomain on your site, given pages, or the entire root domain. You can further toggle source tier to look at the Spam Score for incoming linking pages or subdomains (but in the case of pages, we’re still showing the Spam Score for the subdomain on which that page is hosted).

You can click on any Spam Score row and see the details about which flags were triggered. We’ll bring you to a page like this:

Back on the original Spam Analysis page, at the very bottom of the rows, you’ll find an option to export a disavow file, which is compatible with Google Webmaster Tools. You can choose to filter the file to contain only those sites with a given spam flag count or higher:

Disavow exports usually take less than 3 hours to finish. We can send you an email when it’s ready, too.

WARNING: Please do not export this file and simply upload it to Google! You can really, really hurt your site’s ranking and there may be no way to recover. Instead, carefully sort through the links therein and make sure you really do want to disavow what’s in there. You can easily remove/edit the file to take out links you feel are not spam. When Moz’s Cyrus Shepard disavowed every link to his own site, it took more than a year for his rankings to return!

We’ve actually made the file not-wholly-ready for upload to Google in order to be sure folks aren’t too cavalier with this particular step. You’ll need to open it up and make some edits (specifically to lines at the top of the file) in order to ready it for Webmaster Tools.
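For reference, the disavow format Google accepts is plain text, with one `domain:` rule or full URL per line and `#` comments. Here's a hypothetical sketch of the kind of flag-count filter such an export applies (domains and threshold invented):

```python
def build_disavow(links, min_flags):
    """Emit Google-style disavow rules for linking domains whose spam
    flag count meets a threshold. `links` is (domain, flag_count) pairs."""
    lines = ["# Review every line carefully before uploading to Google!"]
    for domain, flags in sorted(links):
        if flags >= min_flags:
            lines.append(f"domain:{domain}")
    return "\n".join(lines)

links = [("cheap-links.example", 12),
         ("respectable.example", 1),
         ("spun-content.example", 9)]
disavow = build_disavow(links, min_flags=7)
```

As the warning above stresses, no threshold substitutes for reviewing the list by hand before uploading anything.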

In the near future, we hope to have Spam Score in the Mozbar as well, which might look like this: 

Sweet, right? :-)

Potential use cases for Spam Analysis

This list probably isn’t exhaustive, but these are a few of the ways we’ve been playing around with the data:

  1. Checking for spammy links to your own site: Almost every site has at least a few bad links pointing to it, but it’s been hard to know how much or how many potentially harmful links you might have until now. Run a quick spam analysis and see if there’s enough there to cause concern.
  2. Evaluating potential links: This is a big one where we think Spam Score can be helpful. It’s not going to catch every potentially bad link, and you should certainly still use your brain for evaluation too, but as you’re scanning a list of link opportunities or surfing to various sites, having the ability to see if they fire a lot of flags is a great warning sign.
  3. Link cleanup: Link cleanup projects can be messy, involved, precarious, and massively tedious. Spam Score might not catch everything, but sorting links by it can be hugely helpful in identifying potentially nasty stuff and in filtering out the links that are probably clean.
  4. Disavow Files: Again, because Spam Score won’t perfectly catch everything, you will likely need to do some additional work here (especially if the site you’re working on has done some link buying on more generally trustworthy domains), but it can save you a heap of time evaluating and listing the worst and most obvious junk.

Over time, we’re also excited about using Spam Score to help improve the PA and DA calculations (it’s not currently in there), as well as adding it to other tools and data sources. We’d love your feedback and insight about where you’d most want to see Spam Score get involved.

Details about Spam Score’s calculation

This section comes courtesy of Moz’s head of data science, Dr. Matt Peters, who created the metric and deserves (at least in my humble opinion) a big round of applause. – Rand

Definition of “spam”

Before diving into the details of the individual spam flags and their calculation, it’s important to first describe our data gathering process and “spam” definition.

For our purposes, we followed Google’s definition of spam and gathered labels for a large number of sites as follows.

  • First, we randomly selected a large number of subdomains from the Mozscape index stratified by mozRank.
  • Then we crawled the subdomains and threw out any that didn’t return a “200 OK” (redirects, errors, etc.).
  • Finally, we collected the top 10 de-personalized, geo-agnostic Google-US search results using the full subdomain name as the keyword and checked whether any of those results matched the original keyword. If they did not, we called the subdomain “spam,” otherwise we called it “ham.”
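The labeling heuristic in that last step can be sketched as follows, assuming the top-10 results have already been fetched. The code can't query Google itself; `top_results` is a stand-in list of result URLs:

```python
from urllib.parse import urlsplit

def label_subdomain(subdomain, top_results):
    """Label a subdomain 'ham' if it shows up in the top 10 results for
    its own name, else 'spam'. `top_results` is a list of result URLs."""
    for url in top_results[:10]:
        if urlsplit(url).hostname == subdomain:
            return "ham"
    return "spam"

# A penalized or banned site typically won't rank even for its own name.
label = label_subdomain("blog.example.com",
                        ["https://blog.example.com/",
                         "https://other.example/page"])
```

The underlying intuition: a site that can't rank for a search on its own subdomain name has very likely been penalized or banned.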

We performed the most recent data collection in November 2014 (after the Penguin 3.0 update) for about 500,000 subdomains.

Relationship between number of flags and spam

The overall Spam Score is currently an aggregate of 17 different “flags.” You can think of each flag as a potential “warning sign” that signals that a site may be spammy. The overall likelihood of spam increases as a site accumulates more and more flags, so that the total number of flags is a strong predictor of spam. Accordingly, the flags are designed to be used together—no single flag, or even a few flags, is cause for concern (and indeed most sites will trigger at least a few flags).
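Since the score is just a count of triggered flags, the computation itself is trivial; what matters is the empirical mapping from flag count to penalization probability. The flag names below are paraphrased from this post, and the boolean values are invented:

```python
# Flags fired for one subdomain; names paraphrased, values invented.
flags = {
    "low_moztrust_to_mozrank": True,
    "large_site_few_links": False,
    "low_link_diversity": True,
    "thin_content": True,
    "no_contact_info": False,
    # ...the real model checks 17 such flags in total
}

spam_score = sum(flags.values())   # Spam Score = count of flags fired
```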

The following table shows the relationship between the number of flags and percent of sites with those flags that we found Google had penalized or banned:

ABOVE: The overall probability of spam vs. the number of spam flags. Data collected in Nov. 2014 for approximately 500K subdomains. The table also highlights the three overall danger levels: low/green (<10%), moderate/yellow (10-50%), and high/red (>50%).

The overall spam percentage, averaged across a large number of sites, increases in lockstep with the number of flags; however, there are outliers in every category. For example, a small number of sites with very few flags are tagged as spam by Google, and conversely, a small number of sites with many flags are not spam.

Spam flag details

The individual spam flags capture a wide range of spam signals: link profiles, anchor text, on-page signals, and properties of the domain name. At a high level, the process to determine the spam flags for each subdomain is:

  • Collect link metrics from Mozscape (mozRank, mozTrust, number of linking domains, etc.).
  • Collect anchor text metrics from Mozscape (top anchor text phrases sorted by number of links).
  • Collect the top five pages by Page Authority on the subdomain from Mozscape.
  • Crawl the top five pages plus the home page and process them to extract on-page signals.
  • Provide the output for Mozscape to include in the next index release cycle.

Since the spam flags are incorporated into the Mozscape index, fresh data is released with each new index. Right now, we crawl and process the spam flags for each subdomain every two to three months, although this may change in the future.

Link flags

The following table lists the link and anchor text related flags along with the odds ratio for each flag. For each flag, we can compute two percentages: the percentage of sites with that flag that are penalized by Google, and the percentage of sites with that flag that were not penalized. The odds ratio is the ratio of these percentages and gives the increase in likelihood that a site is spam if it has the flag. For example, the first row says that a site with this flag is 12.4 times more likely to be spam than one without the flag.

ABOVE: Description and odds ratio of link and anchor text related spam flags. In addition to a description, the table lists the odds ratio for each flag, which gives the overall increase in spam likelihood if the flag is present.

Working down the table, the flags are:

  • Low mozTrust to mozRank ratio: Sites with low mozTrust compared to mozRank are likely to be spam.
  • Large site with few links: Large sites with many pages tend to also have many links and large sites without a corresponding large number of links are likely to be spam.
  • Site link diversity is low: If a large percentage of links to a site are from a few domains it is likely to be spam.
  • Ratio of followed to nofollowed subdomains/domains (two separate flags): Sites with a large number of followed links relative to nofollowed are likely to be spam.
  • Small proportion of branded links (anchor text): Organically occurring links tend to contain a disproportionately high amount of branded keywords. If a site does not have a lot of branded anchor text, it’s a signal the links are not organic.
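For readers who want the arithmetic: one conventional way to compute an odds ratio like the 12.4 quoted above is from a 2×2 table of flag presence versus penalization. The counts below are invented to reproduce that figure; they are not Moz's actual data:

```python
def odds_ratio(spam_flag, ham_flag, spam_noflag, ham_noflag):
    """Odds of penalization among sites with the flag, divided by the
    odds among sites without it (a 2x2 contingency-table odds ratio)."""
    return (spam_flag / ham_flag) / (spam_noflag / ham_noflag)

# Invented counts chosen to reproduce the 12.4x figure quoted above.
ratio = odds_ratio(spam_flag=124, ham_flag=100,
                   spam_noflag=10, ham_noflag=100)
```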

On-page flags

Similar to the link flags, the following table lists the on page and domain name related flags:

ABOVE: Description and odds ratio of on-page and domain name related spam flags. In addition to a description, the table lists the odds ratio for each flag, which gives the overall increase in spam likelihood if the flag is present.

  • Thin content: If a site has a relatively small ratio of content to navigation chrome, it’s likely to be spam.
  • Site mark-up is abnormally small: Non-spam sites tend to invest in rich user experiences with CSS, JavaScript, and extensive mark-up. Accordingly, a large ratio of text to mark-up is a spam signal.
  • Large number of external links: A site with a large number of external links may look spammy.
  • Low number of internal links: Real sites tend to link heavily to themselves via internal navigation, and a relative lack of internal links is a spam signal.
  • Anchor text-heavy page: Sites with a lot of anchor text are more likely to be spam than those with more content and fewer links.
  • External links in navigation: Spam sites may hide external links in the sidebar or footer.
  • No contact info: Real sites prominently display their social and other contact information.
  • Low number of pages found: A site with only one or a few pages is more likely to be spam than one with many pages.
  • TLD correlated with spam domains: Certain TLDs are more spammy than others (e.g., .pw).
  • Domain name length: A long subdomain name like “bycheapviagra.freeshipping.onlinepharmacy.com” may indicate keyword stuffing.
  • Domain name contains numerals: Domain names with numerals may be automatically generated and therefore spam.

If you’d like some more details on the technical aspects of Spam Score, check out the video of Matt’s 2012 MozCon talk about Algorithmic Spam Detection, or the slides (many of the details have evolved, but the overall ideas are the same).

We’d love your feedback

As with all metrics, Spam Score won’t be perfect. We’d love to hear your feedback and ideas for improving the score, as well as what you’d like to see from its in-product application in the future. Feel free to leave comments on this post, or email Matt (matt at moz dot com) and me (rand at moz dot com) privately with any suggestions.

Good luck cleaning up and preventing link spam!








Moz Blog


Google Quality Score Algorithm Update?

A couple of savvy advertisers posted in WebmasterWorld that they noticed major changes to their quality score metrics for a significant number of keywords in mid-October…


Search Engine Roundtable


SearchCap: ForgetMe Link Removals, AdWords Quality Score Primer, Knowledge Graph Shows “How To”

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: Reputation VIP Launches Link Removal Service In Response To EU “Right To Be Forgotten” Ruling In response to the European Court’s “right to be forgotten” ruling…





Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Google Gets Lowest Score Ever In Customer Satisfaction Survey

There’s a mystery surrounding the latest customer satisfaction numbers released by ForeSee Results (for the American Customer Satisfaction Index [ACSI]). Published late last night, they indicate the lowest levels of consumer satisfaction with search engines (and portals) since 2003. The…





Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

