
Now Live for Your SEO Learning Pleasure: The NEW Beginner’s Guide to SEO!

Posted by FeliciaCrawford

It feels like it’s been a king’s age since we first began our long journey to rewrite and revamp the Beginner’s Guide to SEO. For all the long months of writing and rewriting, of agonizing over details and deleting/replacing sections every so often as Google threw us for a loop, it’s hard to believe it’s finally ready to share:

The new Beginner’s Guide to SEO is here!

What makes this new version so darn special and sparkly, anyway?

I’m glad you asked! Our design team would breathe a sigh of relief and tell you it’s because this baby is on-brand and ready to rock your eyeballs to next Tuesday with its use of fancy, scalable SVGs and images complete with alt text descriptions. Our team of SEO experts would blot the sweat from their collective brow and tell you it’s because we’ve retooled and completely updated all our recommendations to ensure we’re giving fledgling learners the most accurate push out of the digital marketing nest that we can. Our developers would tell you it’s because it lives on a brand-spankin’-new CMS and they no longer have to glare silently at my thirteenth Slack message of the day asking them to fix the misplaced period on the fourth paragraph from the top in Chapter 7.

All joking aside, every bit of the above is true, and each perspective pulls together a holistic answer: this version of the Beginner’s Guide represents a new era for the number-one resource for learning SEO, one where we can update it at the drop of a Google algorithm-shaped hat, where it’s easier than ever to access and learn for a greater variety of people, where you can rely on the fact that the information is solid, up-to-date, and molded to best fit the learning journey unique to SEO.

I notice the structure is a little different. What gives?

We can’t escape your eagle eyes! We structured the new guide quite differently from the original. Everything is explained in our introduction, but here’s the gist: taking inspiration from Maslow’s hierarchy of needs, we built each chapter based on the core foundation of how one ought to go about doing SEO, covering the most integral needs first before leveling up to the next.

A pyramid of SEO needs mimicking Maslow's Hierarchy of Needs theory of psychology.

We affectionately call this “Mozlow’s Hierarchy of Needs.” Please forgive us.

A small but mighty team

While it may have taken us a full year and a half to get to this point, there was but a small team behind the effort. We owe a huge amount of gratitude to the following folks for balancing their other priorities with the needs of the new Beginner’s Guide and putting their all into making this thing shine:

Britney Muller, our brilliant SEO scientist and the brains behind all the new content. Words cannot do justice to the hours she spent alone and after hours before a whiteboard, Post-Its and dry-erase notes making up the bones and muscles and soul of what would someday become this fully-fleshed-out guide. For all the many, many blog comments answered and incorporated, for all the emails and Twitter messages fielded, for all the love and hard work and extra time she spent pouring into the new content, we have to give a heartfelt and extremely loud and boisterous THANK YOU. This guide wouldn’t exist without her expertise, attention to detail, and commitment to excellence.

Kameron Jenkins, our SEO wordsmith and all-around content superheroine. Her exquisite grasp of the written word and extensive experience as an agency SEO were paramount in pulling together disparate feedback, finessing complicated concepts into simple and understandable terms, and organizing the information in ways most conducive to aiding new learners. Again, this guide wouldn’t be here without her positive attitude and incredible, expert help.

Trevor Klein, editor extraordinaire. His original vision of organizing it according to the SEO hierarchy of needs provided the insight and architecture necessary to structure the guide in a completely new and utterly helpful way. Much of the wording, voice, and tone therein belongs to him, and we deeply appreciate the extra polish and shine he lent to this monumental effort.

Skye Stewart, talented designer and UX aficionado. All the delightful images you’ll find within the chapters are compliments of her careful handiwork, from the robo-librarian of Chapter 2 to the meat-grinder-turned-code-renderer of Chapter 5. The new Beginner’s Guide would be an infinitely less whimsical experience without her creativity and vision.

Casey Coates, software engineer and mystical CMS-wizard-cum-miracle-maker. I can safely say that there is no way you would be exploring the brand-new Beginner’s Guide in any coherent manner without his help. For all the last-minute additions to CMS deploys, for calmly fielding all the extra questions and asks, for being infinitely responsive and helpful (case in point: adding alt text to the image block less than two minutes after I asked for it) and for much, much more, we are grateful.

There are a great many other folks who helped get this effort underway: Shelly Matsudaira, Aaron Kitney, Jeff Crump, and Cyrus Shepard for their integral assistance moving this thing past the finish line; Rand Fishkin, of course, for creating the original and longest-enduring version of this guide; and to all of you, our dear community, for all the hours you spent reading our first drafts and sharing your honest thoughts, extremely constructive criticisms, and ever-humbling praise. This couldn’t exist without you!

Y’all ready for this?

With tender pride and only a hint of the sort of naturally occurring anxiety that accompanies any big content debut, we’re delighted and excited for you to dive into the brand-new Beginner’s Guide to SEO. The original has been read over ten million times, a mind-boggling and truly humbling number. We can only hope that our newest incarnation is met by a similar number of bright minds eager to dive into the exhilarating, challenging, complex, and lucrative world of SEO.

Whether you’re just starting out, want to jog your memory on the fundamentals, need to clue in colleagues to the complexity of your work, or are just plain curious about what’s changed, we hope from the bottom of our hearts that you get what you need from the new Beginner’s Guide.

Dive in and let us know what you think!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News

Rewriting the Beginner’s Guide to SEO, Chapter 7: Measuring, Prioritizing, & Executing SEO

Posted by BritneyMuller

It’s finally here, for your review and feedback: Chapter 7 of the new Beginner’s Guide to SEO, the last chapter. We cap off the guide with advice on how to measure, prioritize, and execute on your SEO. And if you missed them, check out the drafts of our outline, Chapter One, Chapter Two, Chapter Three, Chapter Four, Chapter Five, and Chapter Six for your reading pleasure. As always, let us know what you think of Chapter 7 in the comments!


Set yourself up for success.

They say if you can measure something, you can improve it.

In SEO, it’s no different. Professional SEOs track everything from rankings and conversions to lost links and more to help prove the value of SEO. Measuring the impact of your work, and refining your approach as you go, is critical to your SEO success, client retention, and perceived value.

It also helps you pivot your priorities when something isn’t working.

Start with the end in mind

While it’s common to have multiple goals (both macro and micro), establishing one specific primary end goal is essential.

The only way to know what a website’s primary end goal should be is to have a strong understanding of the site owner’s objectives and/or client needs. Good client questions are not only helpful in strategically directing your efforts, but they also show that you care.

Client question examples:

  1. Can you give us a brief history of your company?
  2. What is the monetary value of a newly qualified lead?
  3. What are your most profitable services/products (in order)?

Keep the following tips in mind while establishing a website’s primary goal, additional goals, and benchmarks:

Goal setting tips

  • Measurable: If you can’t measure it, you can’t improve it.
  • Be specific: Don’t let vague industry marketing jargon water down your goals.
  • Share your goals: Studies have shown that writing down and sharing your goals with others boosts your chances of achieving them.

Measuring

Now that you’ve set your primary goal, evaluate which additional metrics could help support your site in reaching its end goal. Measuring additional (applicable) benchmarks can help you keep a better pulse on current site health and progress.

Engagement metrics

How are people behaving once they reach your site? That’s the question that engagement metrics seek to answer. Some of the most popular metrics for measuring how people engage with your content include:

Conversion rate – The number of conversions (for a single desired action/goal) divided by the number of unique visits. A conversion rate can be applied to anything, from an email signup to a purchase to account creation. For example, if 50 out of 1,000 unique visits end in an email signup, your conversion rate for that goal is 5%. Knowing your conversion rate can help you gauge the return on investment (ROI) your website traffic might deliver.

In Google Analytics, you can set up goals to measure how well your site accomplishes its objectives. If your objective for a page is a form fill, you can set that up as a goal. When site visitors accomplish the task, you’ll be able to see it in your reports.

Time on page – How long did people spend on your page? If you have a 2,000-word blog post that visitors are only spending an average of 10 seconds on, the chances are slim that this content is being consumed (unless they’re a mega-speed reader). However, if a URL has a low time on page, that’s not necessarily bad either. Consider the intent of the page. For example, it’s normal for “Contact Us” pages to have a low average time on page.

Pages per visit – Was the goal of your page to keep readers engaged and take them to the next step? If so, then pages per visit can be a valuable engagement metric. If the goal of your page is independent of other pages on your site (ex: visitor came, got what they needed, then left), then low pages per visit are okay.

Bounce rate – “Bounced” sessions indicate that a searcher visited the page and left without browsing your site any further. Many people try to lower this metric because they believe it’s tied to website quality, but it actually tells us very little about a user’s experience. We’ve seen cases of bounce rate spiking for redesigned restaurant websites that are doing better than ever. Further investigation discovered that people were simply coming to find business hours, menus, or an address, then bouncing with the intention of visiting the restaurant in person. A better metric to gauge page/site quality is scroll depth.

Scroll depth – This measures how far visitors scroll down individual webpages. Are visitors reaching your important content? If not, test different ways of providing the most important content higher up on your page, such as multimedia, contact forms, and so on. Also consider the quality of your content. Are you omitting needless words? Is it enticing for the visitor to continue down the page? Scroll depth tracking can be set up in your Google Analytics.

Search traffic

Ranking is a valuable SEO metric, but measuring your site’s organic performance can’t stop there. The goal of showing up in search is to be chosen by searchers as the answer to their query. If you’re ranking but not getting any traffic, you have a problem.

But how do you even determine how much traffic your site is getting from search? One of the most precise ways to do this is with Google Analytics.

Using Google Analytics to uncover traffic insights

Google Analytics (GA) is bursting at the seams with data — so much so that it can be overwhelming if you don’t know where to look. This is not an exhaustive list, but rather a general guide to some of the traffic data you can glean from this free tool.

Isolate organic traffic – GA allows you to view traffic to your site by channel. This will mitigate any scares caused by changes to another channel (ex: total traffic dropped because a paid campaign was halted, but organic traffic remained steady).

Traffic to your site over time – GA allows you to view total sessions/users/pageviews to your site over a specified date range, as well as compare two separate ranges.

How many visits a particular page has received – Site Content reports in GA are great for evaluating the performance of a particular page — for example, how many unique visitors it received within a given date range.

Traffic from a specified campaign – You can use UTM (Urchin Tracking Module) codes for better attribution. Designate the source, medium, and campaign, then append the codes to the end of your URLs. When people start clicking on your UTM-tagged links, that data will start to populate in GA’s “campaigns” report.
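For instance, a URL tagged for an email newsletter campaign might look like this (the domain and the parameter values here are hypothetical examples, not prescribed names):

```
https://example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

GA groups these visits by the source, medium, and campaign values you chose, so keeping your naming consistent across campaigns makes the reports far easier to read.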

Click-through rate (CTR) – Your CTR from search results to a particular page (meaning the percent of people that clicked your page from search results) can provide insights on how well you’ve optimized your page title and meta description. You can find this data in Google Search Console, a free Google tool.

In addition, Google Tag Manager is a free tool that allows you to manage and deploy tracking pixels to your website without having to modify the code. This makes it much easier to track specific triggers or activity on a website.

Additional common SEO metrics

  • Domain Authority & Page Authority (DA/PA) – Moz’s proprietary authority metrics provide powerful insights at a glance and are best used as benchmarks relative to your competitors’ Domain Authority and Page Authority.
  • Keyword rankings – A website’s ranking position for desired keywords. This should also include SERP feature data, like featured snippets and People Also Ask boxes that you’re ranking for. Try to avoid vanity metrics, such as rankings for competitive keywords that are desirable but often too vague and don’t convert as well as longer-tail keywords.
  • Number of backlinks – Total number of links pointing to your website or the number of unique linking root domains (meaning one per unique website, as websites often link out to other websites multiple times). While these are both common link metrics, we encourage you to look more closely at the quality of backlinks and linking root domains your site has.

How to track these metrics

Lots of different tools, such as Moz Pro and STAT, are available for keeping track of your site’s position in SERPs, site crawl health, SERP features, and link metrics.

The Moz and STAT APIs (among other tools) can also be pulled into Google Sheets or other customizable dashboard platforms for clients and quick at-a-glance SEO check-ins. This also allows you to provide more refined views of only the metrics you care about.

Dashboard tools like Data Studio, Tableau, and PowerBI can also help to create interactive data visualizations.

Evaluating a site’s health with an SEO website audit

By having an understanding of certain aspects of your website — its current position in search, how searchers are interacting with it, how it’s performing, the quality of its content, its overall structure, and so on — you’ll be able to better uncover SEO opportunities. Leveraging the search engines’ own tools can help surface those opportunities, as well as potential issues:

  • Google Search Console – If you haven’t already, sign up for a free Google Search Console (GSC) account and verify your website(s). GSC is full of actionable reports you can use to detect website errors, opportunities, and user engagement.
  • Bing Webmaster Tools – Bing Webmaster Tools has similar functionality to GSC. Among other things, it shows you how your site is performing in Bing and opportunities for improvement.
  • Lighthouse Audit – Google’s automated tool for measuring a website’s performance, accessibility, progressive web apps, and more. This data improves your understanding of how a website is performing. Gain specific speed and accessibility insights for a website here.
  • PageSpeed Insights – Provides website performance insights using Lighthouse and Chrome User Experience Report data from real user measurement (RUM) when available.
  • Structured Data Testing Tool – Validates that a website is using schema markup (structured data) properly.
  • Mobile-Friendly Test – Evaluates how easily a user can navigate your website on a mobile device.
  • Web.dev – Surfaces website improvement insights using Lighthouse and provides the ability to track progress over time.
  • Tools for web devs and SEOs – Google often provides new tools for web developers and SEOs alike, so keep an eye on any new releases here.

While we don’t have room to cover every SEO audit check you should perform in this guide, we do offer an in-depth Technical SEO Site Audit course for more info. When auditing your site, keep the following in mind:

Crawlability: Are your primary web pages crawlable by search engines, or are you accidentally blocking Googlebot or Bingbot via your robots.txt file? Does the website have an accurate sitemap.xml file in place to help direct crawlers to your primary pages?
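As a concrete sketch, a minimal robots.txt that lets crawlers reach everything except a (hypothetical) admin area, while pointing them to your sitemap, might look like this:

```
# Allow all crawlers everywhere except the admin area
# (the /admin/ path and domain are hypothetical examples)
User-agent: *
Disallow: /admin/

# Tell crawlers where to find the sitemap
Sitemap: https://example.com/sitemap.xml
```

Be careful here: a single `Disallow: /` under `User-agent: *` would block your entire site from well-behaved crawlers, which is one of the accidental-blocking mistakes this audit step is meant to catch.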

Indexed pages: Can your primary pages be found using Google? Doing a site:yoursite.com or site:yoursite.com/specific-page search in Google can help answer this question. If you notice some are missing, check to make sure a meta robots noindex tag isn’t excluding pages that should be indexed and found in search results.
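The noindex directive lives in the page’s head section; a page you want kept out of search results would carry markup like this:

```html
<!-- Placed in the <head>: tells search engines not to include this page in their index -->
<meta name="robots" content="noindex">
```

If you find this tag on a page that should be ranking, removing it (and requesting re-indexing via Google Search Console) is the fix.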

Check page titles & meta descriptions: Do your titles and meta descriptions do a good job of summarizing the content of each page? How are their CTRs in search results, according to Google Search Console? Are they written in a way that entices searchers to click your result over the other ranking URLs? Which pages could be improved? Site-wide crawls are essential for discovering on-page and technical SEO opportunities.

Page speed: How does your website perform on mobile devices and in Lighthouse? Which images could be compressed to improve load time?

Content quality: How well does the current content of the website meet the target market’s needs? Is the content 10X better than other ranking websites’ content? If not, what could you do better? Think about things like richer content, multimedia, PDFs, guides, audio content, and more.

Pro tip: Website pruning!

Removing thin, old, low-quality, or rarely visited pages from your site can help improve your website’s perceived quality. Performing a content audit will help you discover these pruning opportunities. Three primary ways to prune pages include:

  1. Delete the page (4XX): Use when a page adds no value (ex: traffic, links) and/or is outdated.
  2. Redirect (3XX): Redirect the URLs of pages you’re pruning when you want to preserve the value they add to your site, such as inbound links to that old URL.
  3. NoIndex: Use this when you want the page to remain on your site but be removed from the index.
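How you implement a 301 redirect depends on your server; on Apache, for example, a one-line rule in an .htaccess file does the job (the paths below are hypothetical):

```
# Permanently redirect a pruned URL to its closest relevant replacement
Redirect 301 /old-blog-post/ /updated-blog-post/
```

The permanent (301) status code is what signals that the move is intentional, helping preserve the value of inbound links pointing at the old URL, as described in option 2 above.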

Keyword research and competitive website analysis (performing audits on your competitors’ websites) can also provide rich insights on opportunities for your own website.

For example:

  • Which keywords are competitors ranking on page 1 for, but your website isn’t?
  • Which keywords is your website ranking on page 1 for that also have a featured snippet? You might be able to provide better content and take over that snippet.
  • Which websites link to more than one of your competitors, but not to your website?

Discovering website content and performance opportunities will help you devise a more data-driven SEO plan of attack! Keep an ongoing list in order to prioritize your tasks effectively.

Prioritizing your SEO fixes

In order to prioritize SEO fixes effectively, it’s essential to first have specific, agreed-upon goals established between you and your client.

While there are a million different ways you could prioritize SEO, we suggest you rank them in terms of importance and urgency. Which fixes could provide the most ROI for a website and help support your agreed-upon goals?

Stephen Covey, author of The 7 Habits of Highly Effective People, developed a handy time management grid that can ease the burden of prioritization:


Source: Stephen Covey, The 7 Habits of Highly Effective People

Putting out small, urgent SEO fires might feel most effective in the short term, but this often leads to neglecting non-urgent important fixes. The not urgent & important items are ultimately what often move the needle for a website’s SEO. Don’t put these off.

SEO planning & execution

“Without strategy, execution is aimless. Without execution, strategy is useless.”
- Morris Chang

Much of your success depends on effectively mapping out and scheduling your SEO tasks. You can use free tools like Google Sheets to plan out your SEO execution (we have a free template here), but you can use whatever method works best for you. Some people prefer to schedule out their SEO tasks in their Google Calendar, in a kanban or scrum board, or in a daily planner.

Use what works for you and stick to it.

Measuring your progress along the way via the metrics mentioned above will help you monitor your effectiveness and allow you to pivot your SEO efforts when something isn’t working. Say, for example, you changed a primary page’s title and meta description, only to notice that the CTR for that page decreased. Perhaps you changed it to something too vague or strayed too far from the on-page topic — it might be good to try a different approach. Keeping an eye on drops in rankings, CTRs, organic traffic, and conversions can help you manage hiccups like this early, before they become a bigger problem.

Communication is essential for SEO client longevity

Many SEO fixes are implemented without being noticeable to a client (or user). This is why it’s essential to employ good communication skills around your SEO plan, the time frame in which you’re working, and your benchmark metrics, as well as frequent check-ins and reports.





Posted in IM News

Rewriting the Beginner’s Guide to SEO, Chapter 6: Link Building & Establishing Authority

Posted by BritneyMuller

In Chapter 6 of the new Beginner’s Guide to SEO, we’ll be covering the dos and don’ts of link building and ways your site can build its authority. If you missed them, we’ve got the drafts of our outline, Chapter One, Chapter Two, Chapter Three, Chapter Four, and Chapter Five for your reading pleasure. Be sure to let us know what you think of Chapter 6 in the comments!


Chapter 6: Link Building & Establishing Authority

Turn up the volume.

You’ve created content that people are searching for, that answers their questions, and that search engines can understand, but those qualities alone don’t mean it’ll rank. To outrank the rest of the sites with those qualities, you have to establish authority. That can be accomplished by earning links from authoritative websites, building your brand, and nurturing an audience who will help amplify your content.

Google has confirmed that links and quality content (which we covered back in Chapter 4) are two of the three most important ranking factors for SEO. Trustworthy sites tend to link to other trustworthy sites, and spammy sites tend to link to other spammy sites. But what is a link, exactly? How do you go about earning them from other websites? Let’s start with the basics.

What are links?

Inbound links, also known as backlinks or external links, are HTML hyperlinks that point from one website to another. They’re the currency of the Internet, as they act a lot like real-life reputation. If you went on vacation and asked three people (all completely unrelated to one another) what the best coffee shop in town was, and they all said, “Cuppa Joe on Main Street,” you would feel confident that Cuppa Joe is indeed the best coffee place in town. Links do that for search engines.

Since the late 1990s, search engines have treated links as votes for popularity and importance on the web.

Internal links, or links that connect internal pages of the same domain, work very similarly for your website. A high amount of internal links pointing to a particular page on your site will provide a signal to Google that the page is important, so long as it’s done naturally and not in a spammy way.
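In markup terms, an internal link is just a regular anchor whose destination lives on the same domain as the page it sits on (the URL and anchor text here are hypothetical):

```html
<!-- Internal link: source page and destination share the same domain -->
<a href="https://example.com/blog/seo-basics">Learn the SEO basics</a>
```

Descriptive, natural anchor text like this also helps search engines understand what the destination page is about.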

The engines themselves have refined the way they view links, now using algorithms to evaluate sites and pages based on the links they find. But what’s in those algorithms? How do the engines evaluate all those links? It all starts with the concept of E-A-T.

You are what you E-A-T

Google’s Search Quality Rater Guidelines put a great deal of importance on the concept of E-A-T — an acronym for expert, authoritative, and trustworthy. Sites that don’t display these characteristics tend to be seen as lower-quality in the eyes of the engines, while those that do are subsequently rewarded. E-A-T is becoming more and more important as search evolves and increases the importance of solving for user intent.

Creating a site that’s considered expert, authoritative, and trustworthy should be your guiding light as you practice SEO. Not only will it simply result in a better site, but it’s future-proof. After all, providing great value to searchers is what Google itself is trying to do.

E-A-T and links to your site

The more popular and important a site is, the more weight the links from that site carry. A site like Wikipedia, for example, has thousands of diverse sites linking to it. This indicates it provides lots of expertise, has cultivated authority, and is trusted among those other sites.

To earn trust and authority with search engines, you’ll need links from websites that display the qualities of E-A-T. These don’t have to be Wikipedia-level sites, but they should provide searchers with credible, trustworthy content.

  • Tip: Moz has proprietary metrics to help you determine how authoritative a site is: Domain Authority, Page Authority, and Spam Score. In general, you’ll want links from sites with a higher Domain Authority than your own site.

Followed vs. nofollowed links

Remember how links act as votes? The rel=nofollow attribute (pronounced as two words, “no follow”) allows you to link to a resource while removing your “vote” for search engine purposes.

Just like it sounds, “nofollow” tells search engines not to follow the link. Some engines still follow them simply to discover new pages, but these links don’t pass link equity (the “votes of popularity” we talked about above), so they can be useful in situations where a page is either linking to an untrustworthy source or was paid for or created by the owner of the destination page (making it an unnatural link).

Say, for example, you write a post about link building practices, and want to call out an example of poor, spammy link building. You could link to the offending site without signaling to Google that you trust it.

Standard links (ones that haven’t had nofollow added) look like this:

<a href="https://moz.com">I love Moz</a>

Nofollow link markup looks like this:

<a href="https://moz.com" rel="nofollow">I love Moz</a>

If follow links pass all the link equity, shouldn’t that mean you want only follow links?

Not necessarily. Think about all the legitimate places you can create links to your own website: a Facebook profile, a Yelp page, a Twitter account, etc. These are all natural places to add links to your website, but they shouldn’t count as votes for your website. (Setting up a Twitter profile with a link to your site isn’t a vote from Twitter that they like your site.)

It’s natural for your site to have a balance between nofollowed and followed backlinks in its link profile (more on link profiles below). A nofollow link might not pass authority, but it could send valuable traffic to your site and even lead to future followed links.

  • Tip: Use the MozBar extension for Google Chrome to highlight links on any page to find out whether they’re nofollow or follow without ever having to view the source code!

Your link profile

Your link profile is an overall assessment of all the inbound links your site has earned: the total number of links, their quality (or spamminess), their diversity (is one site linking to you hundreds of times, or are hundreds of sites linking to you once?), and more. The state of your link profile helps search engines understand how your site relates to other sites on the Internet. There are various SEO tools that allow you to analyze your link profile and begin to understand its overall makeup.

How can I see which inbound links point to my website?

Visit Moz Link Explorer and type in your site’s URL. You’ll be able to see how many and which websites are linking back to you.

What are the qualities of a healthy link profile?

When people began to learn about the power of links, they began manipulating them for their benefit. They’d find ways to gain artificial links just to increase their search engine rankings. While these dangerous tactics can sometimes work, they are against Google’s terms of service and can get a website deindexed (removal of web pages or entire domains from search results). You should always try to maintain a healthy link profile.

A healthy link profile is one that indicates to search engines that you’re earning your links and authority fairly. Just like you shouldn’t lie, cheat, or steal, you should strive to ensure your link profile is honest and earned via your hard work.

Links are earned or editorially placed

Editorial links are links added naturally by sites and pages that want to link to your website.

The foundation of acquiring earned links is almost always through creating high-quality content that people genuinely wish to reference. This is where creating 10X content (a way of describing extremely high-quality content) is essential! If you can provide the best and most interesting resource on the web, people will naturally link to it.

Naturally earned links require no specific action from you, other than the creation of worthy content and the ability to create awareness about it.

  • Tip: Earned mentions are often unlinked! When websites are referring to your brand or a specific piece of content you’ve published, they will often mention it without linking to it. To find these earned mentions, use Moz’s Fresh Web Explorer. You can then reach out to those publishers to see if they’ll update those mentions with links.

Links are relevant and from topically similar websites

Links from websites within a topic-specific community are generally better than links from websites that aren’t relevant to your site. If your website sells dog houses, a link from the Society of Dog Breeders matters much more than one from the Roller Skating Association. Additionally, links from topically irrelevant sources can send confusing signals to search engines regarding what your page is about.

  • Tip: Linking domains don’t have to match the topic of your page exactly, but they should be related. Avoid pursuing backlinks from sources that are completely off-topic; there are far better uses of your time.

Anchor text is descriptive and relevant, without being spammy

Anchor text helps tell Google what the topic of your page is about. If dozens of links point to a page with a variation of a word or phrase, the page has a higher likelihood of ranking well for those types of phrases. However, proceed with caution! Too many backlinks with the same anchor text could indicate to the search engines that you’re trying to manipulate your site’s ranking in search results.

Consider this: you ask ten separate friends, at separate times, how their day is going, and each responds with the same phrase:

“Great! I started my day by walking my dog, Peanut, and then had a picante beef Top Ramen for lunch.”

That’s strange, and you’d be quite suspicious of your friends. The same goes for Google. Describing the content of the target page with the anchor text helps search engines understand what the page is about, but the same description over and over from multiple sources starts to look suspicious. Aim for relevance; avoid spam.

  • Tip: Use the “Anchor Text” report in Moz’s Link Explorer to see what anchor text other websites are using to link to your content.

Links send qualified traffic to your site

Link building should never be solely about search engine rankings. Esteemed SEO and link building thought leader Eric Ward used to say that you should build your links as though Google might disappear tomorrow. In essence, you should focus on acquiring links that will bring qualified traffic to your website — another reason why it’s important to acquire links from relevant websites whose audience would find value in your site, as well.

  • Tip: Use the “Referral Traffic” report in Google Analytics to evaluate websites that are currently sending you traffic. How can you continue to build relationships with similar types of websites?

Link building don’ts & things to avoid

Spammy link profiles are just that: full of links built in unnatural, sneaky, or otherwise low-quality ways. Practices like buying links or engaging in a link exchange might seem like the easy way out, but doing so is dangerous and could put all of your hard work at risk. Google penalizes sites with spammy link profiles, so don’t give in to temptation.

A guiding principle for your link building efforts is to never try to manipulate a site’s ranking in search results. But isn’t that the entire goal of SEO? To increase a site’s ranking in search results? And herein lies the confusion. Google wants you to earn links, not build them, but the line between the two is often blurry. To avoid penalties for unnatural links (known as “link spam”), Google has made clear what should be avoided.

Purchased links

Google and Bing both seek to discount the influence of paid links in their organic search results. While a search engine can’t know which links were earned versus paid for just by looking at the link itself, there are clues it uses to detect patterns that indicate foul play. Websites caught buying or selling followed links risk severe penalties that can drop their rankings dramatically. (By the way, exchanging goods or services for a link is also a form of payment and qualifies as buying links.)

Link exchanges / reciprocal linking

If you’ve ever received a “you link to me and I’ll link to you” email from someone you have no affiliation with, you’ve been targeted for a link exchange. Google’s quality guidelines caution against “excessive” link exchanges and similar partner programs conducted exclusively for the sake of cross-linking, so there is some indication that this type of exchange on a smaller scale might not trigger any link spam alarms.

It is acceptable, and even valuable, to link to people you work with, partner with, or have some other affiliation with and have them link back to you.

It’s the exchange of links at mass scale with unaffiliated sites that can warrant penalties.

Low-quality directory links

These used to be a popular source of manipulation. A large number of pay-for-placement web directories exist to serve this market and pass themselves off as legitimate, with varying degrees of success. These types of sites tend to look very similar, with large lists of websites and their descriptions (typically, the site’s critical keyword is used as the anchor text to link back to the submitter’s site).

There are many more manipulative link building tactics that search engines have identified. In most cases, search engines have found algorithmic methods for reducing the impact of those tactics. As new spam systems emerge, engineers will continue to fight them with targeted algorithms, human reviews, and the collection of spam reports from webmasters and SEOs. By and large, it isn’t worth finding ways around them.

If your site does get a manual penalty, there are steps you can take to get it lifted.

How to build high-quality backlinks

Link building comes in many shapes and sizes, but one thing is always true: link campaigns should always match your unique goals. With that said, there are some popular methods that tend to work well for most campaigns. This is not an exhaustive list, so visit Moz’s blog posts on link building for more detail on this topic.

Find customer and partner links

If you have partners you work with regularly, or loyal customers that love your brand, there are ways to earn links from them with relative ease. You might send out partnership badges (graphic icons that signify mutual respect), or offer to write up testimonials of their products. Both of those offer things they can display on their website along with links back to you.

Publish a blog

This content and link building strategy is so popular and valuable that it’s one of the few recommended personally by the engineers at Google. Blogs have the unique ability to contribute fresh material on a consistent basis, generate conversations across the web, and earn listings and links from other blogs.

Careful, though — you should avoid low-quality guest posting just for the sake of link building. Google has advised against this and your energy is better spent elsewhere.

Create unique resources

Creating unique, high-quality resources is no easy task, but it’s well worth the effort. High-quality content that is promoted in the right ways can be widely shared, and the resources that earn the most links tend to be genuinely useful, original, and engaging.

Creating a resource like this is a great way to attract a lot of links with one page. You could also create a highly specific resource — without as broad an appeal — that targets a handful of websites. You might see a higher rate of success, but that approach isn’t as scalable.

Users who see this kind of unique content often want to share it with friends, and bloggers/tech-savvy webmasters who see it will often do so through links. These high quality, editorially earned votes are invaluable to building trust, authority, and rankings potential.

Build resource pages

Resource pages are a great way to build links. However, to find them you’ll want to know some advanced Google search operators to make discovering them a bit easier.

For example, if you were doing link building for a company that made pots and pans, you could search for: cooking intitle:”resources” and see which pages might be good link targets.
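A few more examples of that pattern (intitle:, inurl:, and the minus sign for excluding a site are real Google search operators; the keyword combinations below are purely illustrative):

```
cooking intitle:"resources"
cookware intitle:"useful links"
baking inurl:resources
cooking intitle:"resources" -site:pinterest.com
```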

This can also give you great ideas for content creation — just think about which types of resources you could create that these pages would all like to reference/link to.

Get involved in your local community

For a local business (one that meets its customers in person), community outreach can result in some of the most valuable and influential links.

  • Engage in sponsorships and scholarships.
  • Host or participate in community events, seminars, workshops, and organizations.
  • Donate to worthy local causes and join local business associations.
  • Post jobs and offer internships.
  • Promote loyalty programs.
  • Run a local competition.
  • Develop real-world relationships with related local businesses to discover how you can team up to improve the health of your local economy.

All of these smart and authentic strategies provide good local link opportunities.

Refurbish top content

You likely already know which of your site’s content earns the most traffic, converts the most customers, or retains visitors for the longest amount of time.

Take that content and refurbish it for other platforms (Slideshare, YouTube, Instagram, Quora, etc.) to expand your acquisition funnel beyond Google.

You can also dust off, update, and simply republish older content on the same platform. If you discover that a few trusted industry websites all linked to a popular resource that’s gone stale, update it and let those industry websites know — you may just earn a good link.

You can also do this with images. Reach out to websites that are using your images and not citing/linking back to you and ask if they’d mind including a link.

Be newsworthy

Earning the attention of the press, bloggers, and news media is an effective, time-honored way to earn links. Sometimes this is as simple as giving something away for free, releasing a great new product, or stating something controversial. Since so much of SEO is about creating a digital representation of your brand in the real world, succeeding in SEO means being a great brand.

Be personal and genuine

The most common mistake new SEOs make when trying to build links is not taking the time to craft a custom, personal, and valuable initial outreach email. You know as well as anyone how annoying spammy emails can be, so make sure yours doesn’t make people roll their eyes.

Your goal for an initial outreach email is simply to get a response. These tips can help:

  • Make it personal by mentioning something the person is working on, where they went to school, their dog, etc.
  • Provide value. Let them know about a broken link on their website or a page that isn’t working on mobile.
  • Keep it short.
  • Ask one simple question (typically not for a link; you’ll likely want to build a rapport first).

Pro Tip:

Earning links can be very resource-intensive, so you’ll likely want to measure your success to prove the value of those efforts.

Metrics for link building should match up with the site’s overall KPIs. These might be sales, email subscriptions, page views, etc. You should also evaluate Domain and/or Page Authority scores, the ranking of desired keywords, and the amount of traffic to your content — but we’ll talk more about measuring the success of your SEO campaigns in Chapter 7.

Beyond links: How awareness, amplification, and sentiment impact authority

A lot of the methods you’d use to build links will also indirectly build your brand. In fact, you can view link building as a great way to increase awareness of your brand, the topics on which you’re an authority, and the products or services you offer.

Once your target audience knows about you and you have valuable content to share, let your audience know about it! Sharing your content on social platforms will not only make your audience aware of your content, but it can also encourage them to amplify that awareness to their own networks, thereby extending your own reach.

Are social shares the same as links? No. But shares to the right people can result in links. Social shares can also promote an increase in traffic and new visitors to your website, which can grow brand awareness, and with a growth in brand awareness can come a growth in trust and links. The connection between social signals and rankings seems indirect, but even indirect correlations can be helpful for informing strategy.

Trustworthiness goes a long way

For search engines, trust is largely determined by the quality and quantity of the links your domain has earned, but that’s not to say that there aren’t other factors at play that can influence your site’s authority. Think about all the different ways you come to trust a brand:

  • Awareness (you know they exist)
  • Helpfulness (they provide answers to your questions)
  • Integrity (they do what they say they will)
  • Quality (their product or service provides value; possibly more than others you’ve tried)
  • Continued value (they continue to provide value even after you’ve gotten what you needed)
  • Voice (they communicate in unique, memorable ways)
  • Sentiment (others have good things to say about their experience with the brand)

That last point is what we’re going to focus on here. Reviews of your brand, its products, or its services can make or break a business.

In your effort to establish authority from reviews, follow these review rules of thumb:

  • Never pay any individual or agency to create a fake positive review for your business or a fake negative review of a competitor.
  • Don’t review your own business or the businesses of your competitors. Don’t have your staff do so either.
  • Never offer incentives of any kind in exchange for reviews.
  • All reviews must be left directly by customers in their own accounts; never post reviews on behalf of a customer or employ an agency to do so.
  • Don’t set up a review station/kiosk in your place of business; many reviews stemming from the same IP can be viewed as spam.
  • Read the guidelines of each review platform where you’re hoping to earn reviews.

Be aware that review spam is a problem that’s taken on global proportions, and that violation of governmental truth-in-advertising guidelines has led to legal prosecution and heavy fines. It’s just too dangerous to be worth it. Playing by the rules and offering exceptional customer experiences is the winning combination for building both trust and authority over time.

Authority is built when brands are doing great things in the real-world, making customers happy, creating and sharing great content, and earning links from reputable sources.

In the next and final section, you’ll learn how to measure the success of all your efforts, as well as tactics for iterating and improving upon them. Onward!


Rewriting the Beginner’s Guide to SEO, Chapter 5: Technical Optimization

Posted by BritneyMuller

After a short break, we’re back to share our working draft of Chapter 5 of the Beginner’s Guide to SEO with you! This one was a whopper, and we’re really looking forward to your input. Giving beginner SEOs a solid grasp of just what technical optimization for SEO is and why it matters — without overwhelming them or scaring them off the subject — is a tall order indeed. We’d love to hear what you think: did we miss anything you think is important for beginners to know? Leave us your feedback in the comments!

And in case you’re curious, check back on our outline, Chapter One, Chapter Two, Chapter Three, and Chapter Four to see what we’ve covered so far.


Chapter 5: Technical Optimization

Basic technical knowledge will help you optimize your site for search engines and establish credibility with developers.

Now that you’ve crafted valuable content on the foundation of solid keyword research, it’s important to make sure it’s not only readable by humans, but by search engines too!

You don’t need to have a deep technical understanding of these concepts, but it is important to grasp what these technical assets do so that you can speak intelligently about them with developers. Speaking your developers’ language is important because you will likely need them to carry out some of your optimizations. They’re unlikely to prioritize your asks if they can’t understand your request or see its importance. When you establish credibility and trust with your devs, you can begin to tear away the red tape that often blocks crucial work from getting done.

Pro tip: SEOs need cross-team support to be effective

It’s vital to have a healthy relationship with your developers so that you can successfully tackle SEO challenges from both sides. Don’t wait until a technical issue causes negative SEO ramifications to involve a developer. Instead, join forces for the planning stage with the goal of avoiding the issues altogether. If you don’t, it can cost you in time and money later.

Beyond cross-team support, understanding technical optimization for SEO is essential if you want to ensure that your web pages are structured for both humans and crawlers. To that end, we’ve divided this chapter into three sections:

  1. How websites work
  2. How search engines understand websites
  3. How users interact with websites

Since the technical structure of a site can have a massive impact on its performance, it’s crucial for everyone to understand these principles. It might also be a good idea to share this part of the guide with your programmers, content writers, and designers so that all parties involved in a site’s construction are on the same page.

1. How websites work

If search engine optimization is the process of optimizing a website for search, SEOs need at least a basic understanding of the thing they’re optimizing!

Below, we outline the website’s journey from domain name purchase all the way to its fully rendered state in a browser. An important component of the website’s journey is the critical rendering path, which is the process of a browser turning a website’s code into a viewable page.

Knowing this about websites is important for SEOs to understand for a few reasons:

  • The steps in this webpage assembly process can affect page load times, and speed is not only important for keeping users on your site, but it’s also one of Google’s ranking factors.
  • Google renders certain resources, like JavaScript, on a “second pass.” Google will look at the page without JavaScript first, then a few days to a few weeks later, it will render JavaScript, meaning SEO-critical elements that are added to the page using JavaScript might not get indexed.

Imagine that the website loading process is your commute to work. You get ready at home, gather your things to bring to the office, and then take the fastest route from your home to your work. It would be silly to put on just one of your shoes, take a longer route to work, drop your things off at the office, then immediately return home to get your other shoe, right? That’s sort of what inefficient websites do. This chapter will teach you how to diagnose where your website might be inefficient, what you can do to streamline, and the positive ramifications on your rankings and user experience that can result from that streamlining.

Before a website can be accessed, it needs to be set up!

  1. Domain name is purchased. Domain names like moz.com are purchased from a domain name registrar such as GoDaddy or HostGator. These registrars are just organizations that manage the reservations of domain names.
  2. Domain name is linked to IP address. The Internet doesn’t understand names like “moz.com” as website addresses without the help of domain name servers (DNS). The Internet uses a series of numbers called an Internet protocol (IP) address (ex: 127.0.0.1), but we want to use names like moz.com because they’re easier for humans to remember. We need to use a DNS to link those human-readable names with machine-readable numbers.

How a website gets from server to browser

  1. User requests domain. Now that the name is linked to an IP address via DNS, people can request a website by typing the domain name directly into their browser or by clicking on a link to the website.
  2. Browser makes requests. That request for a web page prompts the browser to make a DNS lookup request to convert the domain name to its IP address. The browser then makes a request to the server for the code your web page is constructed with, such as HTML, CSS, and JavaScript.
  3. Server sends resources. Once the server receives the request for the website, it sends the website files to be assembled in the searcher’s browser.
  4. Browser assembles the web page. The browser has now received the resources from the server, but it still needs to put it all together and render the web page so that the user can see it in their browser. As the browser parses and organizes all the web page’s resources, it’s creating a Document Object Model (DOM). The DOM is what you can see when you right click + “inspect element” on a web page in your Chrome browser (learn how to inspect elements in other browsers).
  5. Browser makes final requests. The browser will only show a web page after all the page’s necessary code is downloaded, parsed, and executed, so at this point, if the browser needs any additional code in order to show your website, it will make an additional request from your server.
  6. Website appears in browser. Whew! After all that, your website has now been transformed (rendered) from code to what you see in your browser.

Pro tip: Talk to your developers about async!

Something you can bring up with your developers is shortening the critical rendering path by setting scripts to “async” when they’re not needed to render content above the fold, which can make your web pages load faster. Async tells the DOM that it can continue to be assembled while the browser is fetching the scripts needed to display your web page. If the DOM has to pause assembly every time the browser fetches a script (called “render-blocking scripts”), it can substantially slow down your page load.

It would be like going out to eat with your friends and having to pause the conversation every time one of you went up to the counter to order, only resuming once they got back. With async, you and your friends can continue to chat even when one of you is ordering. You might also want to bring up other optimizations that devs can implement to shorten the critical rendering path, such as removing unnecessary scripts entirely, like old tracking scripts.
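As a concrete sketch (the file names here are hypothetical), the difference comes down to a single attribute on the script tag:

```html
<!-- Render-blocking: the browser pauses HTML parsing to fetch and execute this -->
<script src="/js/app.js"></script>

<!-- Async: fetched in parallel with parsing, executed as soon as it arrives -->
<script async src="/js/analytics.js"></script>

<!-- Defer: fetched in parallel, executed only after the document is parsed -->
<script defer src="/js/widgets.js"></script>
```

As a rule of thumb, async suits independent scripts like analytics, while defer suits scripts that need the full page but aren’t required for above-the-fold content.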

Now that you know how a website appears in a browser, we’re going to focus on what a website is made of — in other words, the code (programming languages) used to construct those web pages.

The three most common are:

  • HTML – What a website says (titles, body content, etc.)
  • CSS – How a website looks (color, fonts, etc.)
  • JavaScript – How it behaves (interactive, dynamic, etc.)

HTML: What a website says

HTML stands for hypertext markup language, and it serves as the backbone of a website. Elements like headings, paragraphs, lists, and links are all defined in the HTML.

Here’s an example of a webpage, and what its corresponding HTML looks like:
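As a simple illustration (the page content here is invented for the example), a small web page’s HTML might look like this:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>How to Bake a Cake</title>
  </head>
  <body>
    <h1>How to Bake a Cake</h1>
    <p>Great cakes start with a few simple ingredients:</p>
    <ul>
      <li>Flour</li>
      <li>Sugar</li>
      <li>Eggs</li>
    </ul>
  </body>
</html>
```

The heading, paragraph, and list a visitor sees on the page are each defined by a corresponding HTML element.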

HTML is important for SEOs to know because it’s what lives “under the hood” of any page they create or work on. While your CMS likely doesn’t require you to write your pages in HTML (ex: selecting “hyperlink” will allow you to create a link without you having to type out an <a href=""> tag), HTML is what you’re modifying every time you do something to a web page, such as adding content, changing the anchor text of internal links, and so on. Google crawls these HTML elements to determine how relevant your document is to a particular query. In other words, what’s in your HTML plays a huge role in how your web page ranks in Google organic search!

CSS: How a website looks

CSS stands for cascading style sheets, and this is what causes your web pages to take on certain fonts, colors, and layouts. HTML was created to describe content, rather than to style it, so when CSS entered the scene, it was a game-changer. With CSS, web pages could be “beautified” without requiring manual coding of styles into the HTML of every page — a cumbersome process, especially for large sites.
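For instance (a minimal sketch; the style rules and file path are made up), the same styles can live inside the page or in a separate file:

```html
<!-- Internal CSS, written directly in the page's <head> -->
<style>
  h1 { color: #2a2a2a; font-family: Georgia, serif; }
</style>

<!-- External CSS: the same rules kept in a separate stylesheet,
     which keeps the HTML itself lighter -->
<link rel="stylesheet" href="/css/styles.css">
```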

It wasn’t until 2014 that Google’s indexing system began to render web pages more like an actual browser, as opposed to a text-only browser. A black-hat SEO practice that tried to capitalize on Google’s older indexing system was hiding text and links via CSS for the purpose of manipulating search engine rankings. This “hidden text and links” practice is a violation of Google’s quality guidelines.

Components of CSS that SEOs, in particular, should care about:

  • Since style directives can live in external stylesheet files (CSS files) instead of your page’s HTML, it makes your page less code-heavy, reducing file transfer size and making load times faster.
  • Browsers still have to download resources like your CSS file, so compressing them can make your web pages load faster, and page speed is a ranking factor.
  • Having your pages be more content-heavy than code-heavy can lead to better indexing of your site’s content.
  • Using CSS to hide links and content can get your website manually penalized and removed from Google’s index.

JavaScript: How a website behaves

In the earlier days of the Internet, web pages were built with HTML. When CSS came along, webpage content had the ability to take on some style. When the programming language JavaScript entered the scene, websites could have not only structure and style, but dynamic behavior as well.

JavaScript has opened up a lot of opportunities for non-static web page creation. When someone attempts to access a page that is enhanced with this programming language, that user’s browser will execute the JavaScript against the static HTML that the server returned, resulting in a web page that comes to life with some sort of interactivity.

You’ve definitely seen JavaScript in action — you just may not have known it! That’s because JavaScript can do almost anything to a page. It could create a pop up, for example, or it could request third-party resources like ads to display on your page.
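A minimal sketch of that kind of interactivity (the button and message are invented for the example):

```html
<button id="subscribe">Subscribe</button>
<p id="message"></p>

<script>
  // When the button is clicked, JavaScript adds text to the page
  // without any request back to the server
  document.getElementById("subscribe").addEventListener("click", function () {
    document.getElementById("message").textContent = "Thanks for subscribing!";
  });
</script>
```

Notice that the “Thanks for subscribing!” text exists nowhere in the static HTML; it only appears once the script runs, which is why content added client-side can be invisible to crawlers that don’t execute JavaScript.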

JavaScript can pose some problems for SEO, though, since search engines don’t view JavaScript the same way human visitors do. That’s because of client-side versus server-side rendering. Most JavaScript is executed in a client’s browser. With server-side rendering, on the other hand, the files are executed at the server and the server sends them to the browser in their fully rendered state.

SEO-critical page elements such as text, links, and tags that are loaded on the client’s side with JavaScript, rather than represented in your HTML, are invisible from your page’s code until they are rendered. This means that search engine crawlers won’t see what’s in your JavaScript — at least not initially.

Google says that, as long as you’re not blocking Googlebot from crawling your JavaScript files, they’re generally able to render and understand your web pages just like a browser can, which means that Googlebot should see the same things as a user viewing a site in their browser. However, due to this “second wave of indexing” for client-side JavaScript, Google can miss certain elements that are only available once JavaScript is executed.

There are also some other things that could go wrong during Googlebot’s process of rendering your web pages, which can prevent Google from understanding what’s contained in your JavaScript:

  • You’ve blocked Googlebot from JavaScript resources (ex: with robots.txt, like we learned about in Chapter 2)
  • Your server can’t handle all the requests to crawl your content
  • The JavaScript is too complex or outdated for Googlebot to understand
  • The JavaScript “lazy loads” content into the page only after the crawler has finished with the page and moved on

Needless to say, while JavaScript does open a lot of possibilities for web page creation, it can also have some serious ramifications for your SEO if you’re not careful. Thankfully, there is a way to check whether Google sees the same thing as your visitors. To see a page the way Googlebot views it, use Google Search Console’s “Fetch and Render” tool. From your site’s Google Search Console dashboard, select “Crawl” from the left navigation, then “Fetch as Google.”

From this page, enter the URL you want to check (or leave blank if you want to check your homepage) and click the “Fetch and Render” button. You also have the option to test either the desktop or mobile version.

In return, you’ll get a side-by-side view of how Googlebot saw your page versus how a visitor to your website would have seen the page. Below, Google will also show you a list of any resources they may not have been able to get for the URL you entered.

Understanding the way websites work lays a great foundation for what we’ll talk about next, which is technical optimizations to help Google understand the pages on your website better.

2. How search engines understand websites

Search engines have gotten incredibly sophisticated, but they can’t (yet) find and interpret web pages quite like a human can. The following sections outline ways you can better deliver content to search engines.

Help search engines understand your content by structuring it with Schema

Imagine being a search engine crawler scanning down a 10,000-word article about how to bake a cake. How do you identify the author, recipe, ingredients, or steps required to bake a cake? This is where schema (Schema.org) markup comes in. It allows you to spoon-feed search engines more specific classifications for what type of information is on your page.

Schema is a way to label or organize your content so that search engines have a better understanding of what certain elements on your web pages are. This code provides structure to your data, which is why schema is often referred to as “structured data.” The process of structuring your data is often referred to as “markup” because you are marking up your content with organizational code.

JSON-LD is Google’s preferred schema markup (announced in May ‘16), which Bing also supports. To view a full list of the thousands of available schema markups, visit Schema.org or view the Google Developers Introduction to Structured Data for additional information on how to implement structured data. After you implement the structured data that best suits your web pages, you can test your markup with Google’s Structured Data Testing Tool.
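As a sketch (the recipe, author, and values are invented for the example; Recipe, name, author, and recipeIngredient are genuine Schema.org types and properties), JSON-LD markup is added inside a script tag on your page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Simple Chocolate Cake",
  "author": {
    "@type": "Person",
    "name": "Jane Baker"
  },
  "recipeIngredient": [
    "2 cups flour",
    "1 cup sugar",
    "3 eggs"
  ]
}
</script>
```

A crawler reading this block knows unambiguously that the page is a recipe, who wrote it, and what the ingredients are, without having to infer any of that from the body copy.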

In addition to helping bots like Google understand what a particular piece of content is about, schema markup can also enable special features to accompany your pages in the SERPs. These special features are referred to as “rich snippets,” and you’ve probably seen them in action. They’re things like:

  • Top Stories carousel
  • Review stars
  • Sitelinks search boxes
  • Recipes

Remember, using structured data can help enable a rich snippet to be present, but does not guarantee it. Other types of rich snippets will likely be added in the future as the use of schema markup increases.

Some last words of advice for schema success:

  • You can use multiple types of schema markup on a page. However, if you mark up one element, like a product for example, and there are other products listed on the page, you must also mark up those products.
  • Don’t mark up content that is not visible to visitors and follow Google’s Quality Guidelines. For example, if you add review structured markup to a page, make sure those reviews are actually visible on that page.
  • If you have duplicate pages, Google asks that you mark up each duplicate page with your structured markup, not just the canonical version.
  • Provide original and updated (if applicable) content on your structured data pages.
  • Structured markup should be an accurate reflection of your page.
  • Try to use the most specific type of schema markup for your content.
  • Marked-up reviews should not be written by the business. They should be genuine unpaid business reviews from actual customers.

Tell search engines about your preferred pages with canonicalization

When Google crawls the same content on different web pages, it sometimes doesn’t know which page to index in search results. This is why the canonical tag was invented: to help search engines index the preferred version of content rather than all of its duplicates.

The rel=”canonical” tag allows you to tell search engines where the original, master version of a piece of content is located. You’re essentially saying, “Hey search engine! Don’t index this; index this source page instead.” So, if you want to republish a piece of content, whether exactly or slightly modified, but don’t want to risk creating duplicate content, the canonical tag is here to save the day.

Proper canonicalization ensures that every unique piece of content on your website has only one URL. To prevent search engines from indexing multiple versions of a single page, Google recommends having a self-referencing canonical tag on every page on your site. Without a canonical tag telling Google which version of your web page is the preferred one, http://www.example.com could get indexed separately from http://example.com, creating duplicates.
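As a sketch, a self-referencing canonical tag lives in the page’s <head> (the URL is illustrative):

```html
<!-- In the <head> of https://www.example.com/ -->
<link rel="canonical" href="https://www.example.com/" />
```

Duplicate or near-duplicate versions of the page would carry the same tag, all pointing at this one preferred URL.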

“Avoid duplicate content” is an Internet truism, and for good reason! Google wants to reward sites with unique, valuable content — not content that’s taken from other sources and repeated across multiple pages. Because engines want to provide the best searcher experience, they will rarely show multiple versions of the same content, opting instead to show only the canonicalized version, or if a canonical tag does not exist, whichever version they deem most likely to be the original.

Pro tip: Distinguishing between content filtering & content penalties
There is no such thing as a duplicate content penalty. However, you should try to keep duplicate content from causing indexing issues by using the rel=”canonical” tag when possible. When duplicates of a page exist, Google will choose a canonical and filter the others out of search results. That doesn’t mean you’ve been penalized. It just means that Google only wants to show one version of your content.

It’s also very common for websites to have multiple duplicate pages due to sort and filter options. For example, on an e-commerce site, you might have what’s called a faceted navigation that allows visitors to narrow down products to find exactly what they’re looking for, such as a “sort by” feature that reorders results on the product category page from lowest to highest price. This could create a URL that looks something like this: example.com/mens-shirts?sort=price_ascending. Add in more sort/filter options like color, size, material, brand, etc. and just think about all the variations of your main product category page this would create!

To learn more about different types of duplicate content, this post by Dr. Pete helps distill the different nuances.

3. How users interact with websites

In Chapter 1, we said that despite SEO standing for search engine optimization, SEO is as much about people as it is about search engines themselves. That’s because search engines exist to serve searchers. This goal helps explain why Google’s algorithm rewards websites that provide the best possible experiences for searchers, and why some websites, despite having qualities like robust backlink profiles, might not perform well in search.

When we understand what makes searchers’ web browsing experience optimal, we can create those experiences for maximum search performance.

Ensuring a positive experience for your mobile visitors

Given that well over half of all web traffic today comes from mobile, it’s safe to say that your website should be accessible and easy to navigate for mobile visitors. In April 2015, Google rolled out an update to its algorithm that promoted mobile-friendly pages over non-mobile-friendly pages. So how can you ensure that your website is mobile friendly? Although there are three main ways to configure your website for mobile, Google recommends responsive web design.

Responsive design

Responsive websites are designed to fit the screen of whatever type of device your visitors are using. You can use CSS to make the web page “respond” to the device size. This is ideal because it prevents visitors from having to double-tap or pinch-and-zoom in order to view the content on your pages. Not sure if your web pages are mobile friendly? You can use Google’s mobile-friendly test to check!
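A bare-bones sketch of the ingredients involved (the breakpoint and class name are illustrative):

```html
<!-- The viewport meta tag, placed in the <head>, tells mobile browsers to use the device’s width -->
<meta name="viewport" content="width=device-width, initial-scale=1" />

<style>
  /* An illustrative CSS media query: switch to a single-column layout on narrow screens */
  @media (max-width: 600px) {
    .content { width: 100%; float: none; }
  }
</style>
```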

AMP

AMP stands for Accelerated Mobile Pages, and it is used to deliver content to mobile visitors at speeds much greater than with non-AMP delivery. AMP is able to deliver content so fast because it delivers content from its cache servers (not the original site) and uses a special AMP version of HTML and JavaScript. Learn more about AMP.

Mobile-first indexing

As of 2018, Google started switching websites over to mobile-first indexing. That change sparked some confusion between mobile-friendliness and mobile-first, so it’s helpful to disambiguate. With mobile-first indexing, Google crawls and indexes the mobile version of your web pages. Making your website compatible with mobile screens is good for users and your performance in search, but mobile-first indexing happens independently of mobile-friendliness.

This has raised some concerns for websites that lack parity between mobile and desktop versions, such as showing different content, navigation, links, etc. on their mobile view. A mobile site with different links, for example, will alter the way in which Googlebot (mobile) crawls your site and sends link equity to your other pages.

Breaking up long content for easier digestion

When sites have very long pages, they have the option of breaking them up into multiple parts of a whole. This is called pagination, and it’s similar to pages in a book. In order to avoid giving the visitor too much all at once, you can break up your single page into multiple parts. This can be great for visitors, especially on e-commerce sites where a category contains a lot of product results, but there are some steps you should take to help Google understand the relationship between your paginated pages: the rel=”next” and rel=”prev” attributes.

You can read more about pagination in Google’s official documentation, but the main takeaways are that:

  • The first page in a sequence should only have rel=”next” markup
  • The last page in a sequence should only have rel=”prev” markup
  • Pages that have both a preceding and following page should have both rel=”next” and rel=”prev”
  • Since each page in the sequence is unique, don’t canonicalize them to the first page in the sequence. Only use a canonical tag to point to a “view all” version of your content, if you have one.
  • When Google sees a paginated sequence, it will typically consolidate the pages’ linking properties and send searchers to the first page

Pro tip: rel=”next/prev” should still have anchor text and live within an <a> link
This helps Google ensure that they pick up the rel=”next/prev”.
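For example, page 2 of a paginated category might carry annotations like these in its <head> (URLs are illustrative):

```html
<!-- On page 2 of a paginated series: point back to page 1 and forward to page 3 -->
<link rel="prev" href="https://www.example.com/mens-shirts?page=1" />
<link rel="next" href="https://www.example.com/mens-shirts?page=3" />
```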

Improving page speed to mitigate visitor frustration

Google wants to serve content that loads lightning-fast for searchers. We’ve come to expect fast-loading results, and when we don’t get them, we’ll quickly bounce back to the SERP in search of a better, faster page. This is why page speed is a crucial aspect of on-site SEO. We can improve the speed of our web pages by taking advantage of tools like the ones we’ve mentioned below. Click on the links to learn more about each.

Images are one of the main culprits of slow pages!

As discussed in Chapter 4, images are one of the number-one reasons for slow-loading web pages! In addition to image compression, optimizing image alt text, choosing the right image format, and submitting image sitemaps, there are other technical ways to optimize the speed and way in which images are shown to your users. Some primary ways to improve image delivery are as follows:

SRCSET: How to deliver the best image size for each device

The SRCSET attribute allows you to have multiple versions of your image and then specify which version should be used in different situations. This piece of code is added to the <img> tag (where your image is located in the HTML) to provide unique images for specific-sized devices.

This is like the concept of responsive design that we discussed earlier, except for images!

This doesn’t just speed up your image load time, it’s also a unique way to enhance your on-page user experience by providing different and optimal images to different device types.

Pro tip: There are more than just three image size versions!
It’s a common misconception that you just need a desktop, tablet, and mobile-sized version of your image. There are a huge variety of screen sizes and resolutions. Learn more about SRCSET.
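A sketch of what that looks like in practice (the file names and breakpoints are illustrative):

```html
<!-- The browser picks the best candidate from srcset based on viewport size and screen resolution -->
<img src="photo-800.jpg"
     srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="A description of the photo" />
```

The plain src attribute remains as a fallback for browsers that don’t understand srcset.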

Show visitors image loading is in progress with lazy loading

Lazy loading occurs when you go to a webpage and, instead of seeing a blank white space for where an image will be, a blurry lightweight version of the image or a colored box in its place appears while the surrounding text loads. After a few seconds, the image clearly loads in full resolution. The popular blogging platform Medium does this really well.

The low-resolution version loads first, followed by the full high-resolution version. This also helps to optimize your critical rendering path! While all of your other page resources are being downloaded, you’re showing a low-resolution teaser image that tells users that things are happening/being loaded. For more information on how you should lazy load your images, check out Google’s Lazy Loading Guidance.
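Techniques vary — Medium’s blurry-placeholder approach requires JavaScript — but as one minimal sketch, modern browsers also support a native lazy-loading attribute (the file name is illustrative):

```html
<!-- loading="lazy" asks the browser to defer fetching the image until the user scrolls near it;
     width/height reserve space so the layout doesn’t jump when the image arrives -->
<img src="cupcake.jpg" alt="A frosted cupcake" width="800" height="600" loading="lazy" />
```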

Improve speed by condensing and bundling your files

Page speed audits will often make recommendations such as “minify resource,” but what does that actually mean? Minification condenses a code file by removing things like line breaks and spaces, as well as abbreviating code variable names wherever possible.

“Bundling” is another common term you’ll hear in reference to improving page speed. The process of bundling combines a bunch of the same coding language files into one single file. For example, a bunch of JavaScript files could be put into one larger file to reduce the amount of JavaScript files for a browser.

By both minifying and bundling the files needed to construct your web page, you’ll speed up your website and reduce the number of your HTTP (file) requests.
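To illustrate with a toy example (not a real minifier’s output), minification preserves behavior while stripping everything the browser doesn’t need:

```javascript
// Before minification: readable source with line breaks, spaces, and descriptive names
function calculateTotal(price, quantity) {
  var total = price * quantity;
  return total;
}

// After minification: identical behavior in far fewer bytes
function t(p,q){return p*q}

console.log(calculateTotal(5, 3)); // prints 15
console.log(t(5, 3));              // prints 15
```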

Improving the experience for international audiences

Websites that target audiences from multiple countries should familiarize themselves with international SEO best practices in order to serve up the most relevant experiences. Without these optimizations, international visitors might have difficulty finding the version of your site that caters to them.

There are two main ways a website can be internationalized:

  • Language
    Sites that target speakers of multiple languages are considered multilingual websites. These sites should add something called an hreflang tag to show Google that your page has copy for another language. Learn more about hreflang.
  • Country
    Sites that target audiences in multiple countries are called multi-regional websites and they should choose a URL structure that makes it easy to target their domain or pages to specific countries. This can include the use of a country code top level domain (ccTLD) such as “.ca” for Canada, or a generic top-level domain (gTLD) with a country-specific subfolder such as “example.com/ca” for Canada. Learn more about locale-specific URLs.
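As a sketch, hreflang annotations are typically placed in the <head> of every version of a page, with each version listing all of its alternates (URLs and locales are illustrative):

```html
<!-- English (US), English (Canada), French (Canada), plus an x-default fallback -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/" />
<link rel="alternate" hreflang="fr-ca" href="https://www.example.com/ca/fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```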

You’ve researched, you’ve written, and you’ve optimized your website for search engines and user experience. The next piece of the SEO puzzle is a big one: establishing authority so that your pages will rank highly in search results.


Rewriting the Beginner’s Guide to SEO, Chapter 2: Crawling, Indexing, and Ranking

Posted by BritneyMuller

It’s been a few months since our last share of our work-in-progress rewrite of the Beginner’s Guide to SEO, but after a brief hiatus, we’re back to share our draft of Chapter Two with you! This wouldn’t have been possible without the help of Kameron Jenkins, who has thoughtfully contributed her great talent for wordsmithing throughout this piece.

This is your resource, the guide that likely kicked off your interest in and knowledge of SEO, and we want to do right by you. You left amazingly helpful commentary on our outline and draft of Chapter One, and we’d be honored if you would take the time to let us know what you think of Chapter Two in the comments below.


Chapter 2: How Search Engines Work – Crawling, Indexing, and Ranking

First, show up.

As we mentioned in Chapter 1, search engines are answer machines. They exist to discover, understand, and organize the internet’s content in order to offer the most relevant results to the questions searchers are asking.

In order to show up in search results, your content needs to first be visible to search engines. It’s arguably the most important piece of the SEO puzzle: If your site can’t be found, there’s no way you’ll ever show up in the SERPs (Search Engine Results Page).

How do search engines work?

Search engines have three primary functions:

  1. Crawl: Scour the Internet for content, looking over the code/content for each URL they find.
  2. Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
  3. Rank: Provide the pieces of content that will best answer a searcher’s query. Order the search results by the most helpful to a particular query.

What is search engine crawling?

Crawling is the discovery process in which search engines send out a team of robots (known as crawlers or spiders) to find new and updated content. Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered by links.

The bot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, crawlers are able to find new content and add it to their index — a massive database of discovered URLs — to later be retrieved when a searcher is seeking information that the content on that URL is a good match for.

What is a search engine index?

Search engines process and store information they find in an index, a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.

Search engine ranking

When someone performs a search, search engines scour their index for highly relevant content and then order that content in the hopes of solving the searcher’s query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.

It’s possible to block search engine crawlers from part or all of your site, or instruct search engines to avoid storing certain pages in their index. While there can be reasons for doing this, if you want your content found by searchers, you have to first make sure it’s accessible to crawlers and is indexable. Otherwise, it’s as good as invisible.

By the end of this chapter, you’ll have the context you need to work with the search engine, rather than against it!

Note: In SEO, not all search engines are equal

Many beginners wonder about the relative importance of particular search engines. Most people know that Google has the largest market share, but how important is it to optimize for Bing, Yahoo, and others? The truth is that despite the existence of more than 30 major web search engines, the SEO community really only pays attention to Google. Why? The short answer is that Google is where the vast majority of people search the web. If we include Google Images, Google Maps, and YouTube (a Google property), more than 90% of web searches happen on Google — that’s nearly 20 times Bing and Yahoo combined.

Crawling: Can search engines find your site?

As you’ve just learned, making sure your site gets crawled and indexed is a prerequisite for showing up in the SERPs. First things first: You can check to see how many and which pages of your website have been indexed by Google using “site:yourdomain.com“, an advanced search operator.

Head to Google and type “site:yourdomain.com” into the search bar. This will return results Google has in its index for the site specified:


The number of results Google displays (the “About __ results” figure) isn’t exact, but it does give you a solid idea of which pages are indexed on your site and how they are currently showing up in search results.

For more accurate results, monitor and use the Index Coverage report in Google Search Console. You can sign up for a free Google Search Console account if you don’t currently have one. With this tool, you can submit sitemaps for your site and monitor how many submitted pages have actually been added to Google’s index, among other things.

If you’re not showing up anywhere in the search results, there are a few possible reasons why:

  • Your site is brand new and hasn’t been crawled yet.
  • Your site isn’t linked to from any external websites.
  • Your site’s navigation makes it hard for a robot to crawl it effectively.
  • Your site contains some basic code called crawler directives that is blocking search engines.
  • Your site has been penalized by Google for spammy tactics.

If your site doesn’t have any other sites linking to it, you still might be able to get it indexed by submitting your XML sitemap in Google Search Console or manually submitting individual URLs to Google. There’s no guarantee they’ll include a submitted URL in their index, but it’s worth a try!

Can search engines see your whole site?

Sometimes a search engine will be able to find parts of your site by crawling, but other pages or sections might be obscured for one reason or another. It’s important to make sure that search engines are able to discover all the content you want indexed, and not just your homepage.

Ask yourself this: Can the bot crawl through your website, and not just to it?

Is your content hidden behind login forms?

If you require users to log in, fill out forms, or answer surveys before accessing certain content, search engines won’t see those protected pages. A crawler is definitely not going to log in.

Are you relying on search forms?

Robots cannot use search forms. Some individuals believe that if they place a search box on their site, search engines will be able to find everything their visitors search for; in reality, content reachable only through a search box is invisible to crawlers.

Is text hidden within non-text content?

Non-text media forms (images, video, GIFs, etc.) should not be used to display text that you wish to be indexed. While search engines are getting better at recognizing images, there’s no guarantee they will be able to read and understand them just yet. It’s always best to add text within the HTML markup of your webpage.

Can search engines follow your site navigation?

Just as a crawler needs to discover your site via links from other sites, it needs a path of links on your own site to guide it from page to page. If you’ve got a page you want search engines to find but it isn’t linked to from any other pages, it’s as good as invisible. Many sites make the critical mistake of structuring their navigation in ways that are inaccessible to search engines, hindering their ability to get listed in search results.

Common navigation mistakes that can keep crawlers from seeing all of your site:

  • Having a mobile navigation that shows different results than your desktop navigation
  • Any type of navigation where the menu items are not in the HTML, such as JavaScript-enabled navigations. Google has gotten much better at crawling and understanding Javascript, but it’s still not a perfect process. The more surefire way to ensure something gets found, understood, and indexed by Google is by putting it in the HTML.
  • Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler
  • Forgetting to link to a primary page on your website through your navigation — remember, links are the paths crawlers follow to new pages!

This is why it’s essential that your website has a clear navigation and helpful URL folder structures.

Information architecture

Information architecture is the practice of organizing and labeling content on a website to improve efficiency and findability for users. The best information architecture is intuitive, meaning that users shouldn’t have to think very hard to flow through your website or to find something.

Your site should also have a useful 404 (page not found) page for when a visitor clicks on a dead link or mistypes a URL. The best 404 pages allow users to click back into your site so they don’t bounce off just because they tried to access a nonexistent link.

Tell search engines how to crawl your site

In addition to making sure crawlers can reach your most important pages, it’s also pertinent to note that you’ll have pages on your site you don’t want them to find. These might include things like old URLs that have thin content, duplicate URLs (such as sort-and-filter parameters for e-commerce), special promo code pages, staging or test pages, and so on.

Blocking pages from search engines can also help crawlers prioritize your most important pages and maximize your crawl budget (the average number of pages a search engine bot will crawl on your site).

Crawler directives allow you to control what you want Googlebot to crawl and index using a robots.txt file, meta tag, sitemap.xml file, or Google Search Console.

Robots.txt

Robots.txt files are located in the root directory of websites (ex. yourdomain.com/robots.txt) and suggest which parts of your site search engines should and shouldn’t crawl via specific robots.txt directives. This is a great solution when trying to block search engines from non-private pages on your site.

You wouldn’t want to block private/sensitive pages from being crawled here because the file is easily accessible by users and bots.
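A minimal robots.txt sketch (the paths are illustrative):

```
# https://www.example.com/robots.txt
# Applies to all crawlers
User-agent: *
Disallow: /staging/
Disallow: /cart/

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```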

Pro tip:

  • If Googlebot can’t find a robots.txt file for a site (40X HTTP status code), it proceeds to crawl the site.
  • If Googlebot finds a robots.txt file for a site (20X HTTP status code), it will usually abide by the suggestions and proceed to crawl the site.
  • If Googlebot finds neither a 20X nor a 40X HTTP status code (ex. a 501 server error), it can’t determine whether you have a robots.txt file and won’t crawl your site.

Meta directives

The two types of meta directives are the meta robots tag (more commonly used) and the x-robots-tag. Each provides crawlers with stronger instructions on how to crawl and index a URL’s content.

The x-robots tag provides more flexibility and functionality if you want to block search engines at scale because you can use regular expressions, block non-HTML files, and apply sitewide noindex tags.

These are the best options for blocking more sensitive*/private URLs from search engines.

*For very sensitive URLs, it is best practice to remove them from the site or require a secure login to view the pages.
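For instance, a meta robots tag sits in a page’s <head>, while the x-robots-tag is delivered as an HTTP response header (the directives shown are illustrative):

```html
<!-- Meta robots: ask engines not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow" />

<!-- The x-robots-tag equivalent is sent as an HTTP header rather than in the HTML, e.g.:
     X-Robots-Tag: noindex -->
```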

WordPress Tip: In Dashboard > Settings > Reading, make sure the “Search Engine Visibility” box is not checked. When checked, this setting blocks search engines from coming to your site via your robots.txt file!

Avoid these common pitfalls, and you’ll have clean, crawlable content that will allow bots easy access to your pages.

Once you’ve ensured your site has been crawled, the next order of business is to make sure it can be indexed.

Sitemaps

A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. One of the easiest ways to ensure Google is finding your highest priority pages is to create a file that meets Google’s standards and submit it through Google Search Console. While submitting a sitemap doesn’t replace the need for good site navigation, it can certainly help crawlers follow a path to all of your important pages.
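A minimal XML sitemap sketch (the URLs and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/mens-shirts/</loc>
  </url>
</urlset>
```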

Google Search Console

Some sites (most common with e-commerce) make the same content available on multiple different URLs by appending certain parameters to URLs. If you’ve ever shopped online, you’ve likely narrowed down your search via filters. For example, you may search for “shoes” on Amazon, and then refine your search by size, color, and style. Each time you refine, the URL changes slightly. How does Google know which version of the URL to serve to searchers? Google does a pretty good job at figuring out the representative URL on its own, but you can use the URL Parameters feature in Google Search Console to tell Google exactly how you want them to treat your pages.

Indexing: How do search engines understand and remember your site?

Once you’ve ensured your site has been crawled, the next order of business is to make sure it can be indexed. That’s right — just because your site can be discovered and crawled by a search engine doesn’t necessarily mean that it will be stored in their index. In the previous section on crawling, we discussed how search engines discover your web pages. The index is where your discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page’s contents. All of that information is stored in its index.

Read on to learn about how indexing works and how you can make sure your site makes it into this all-important database.

Can I see how a Googlebot crawler sees my pages?

Yes, the cached version of your page will reflect a snapshot of the last time Googlebot crawled it.

Google crawls and caches web pages at different frequencies. More established, well-known sites that post frequently like https://www.nytimes.com will be crawled more frequently than the much-less-famous website for Roger the Mozbot’s side hustle, http://www.rogerlovescupcakes.com (if only it were real…)

You can view what your cached version of a page looks like by clicking the drop-down arrow next to the URL in the SERP and choosing “Cached”:

You can also view the text-only version of your site to determine if your important content is being crawled and cached effectively.

Are pages ever removed from the index?

Yes, pages can be removed from the index! Some of the main reasons why a URL might be removed include:

  • The URL is returning a “not found” error (4XX) or server error (5XX) – This could be accidental (the page was moved and a 301 redirect was not set up) or intentional (the page was deleted and 404ed in order to get it removed from the index)
  • The URL had a noindex meta tag added – This tag can be added by site owners to instruct the search engine to omit the page from its index.
  • The URL has been manually penalized for violating the search engine’s Webmaster Guidelines and, as a result, was removed from the index.
  • The URL has been blocked from crawling with the addition of a password required before visitors can access the page.

If you believe that a page on your website that was previously in Google’s index is no longer showing up, you can manually submit the URL to Google by navigating to the “Submit URL” tool in Search Console.

Ranking: How do search engines rank URLs?

How do search engines ensure that when someone types a query into the search bar, they get relevant results in return? That process is known as ranking, or the ordering of search results by most relevant to least relevant to a particular query.

To determine relevance, search engines use algorithms: processes or formulas by which stored information is retrieved and ordered in meaningful ways. These algorithms have gone through many changes over the years in order to improve the quality of search results. Google, for example, makes algorithm adjustments every day — some of these updates are minor quality tweaks, whereas others are core/broad algorithm updates deployed to tackle a specific issue, like Penguin to tackle link spam. Check out our Google Algorithm Change History for a list of both confirmed and unconfirmed Google updates going back to the year 2000.

Why does the algorithm change so often? Is Google just trying to keep us on our toes? While Google doesn’t always reveal specifics as to why they do what they do, we do know that Google’s aim when making algorithm adjustments is to improve overall search quality. That’s why, in response to algorithm update questions, Google will answer with something along the lines of: “We’re making quality updates all the time.” This means that, if your site suffered after an algorithm adjustment, you should compare it against Google’s Quality Guidelines and Search Quality Rater Guidelines; both are very telling in terms of what search engines want.

What do search engines want?

Search engines have always wanted the same thing: to provide useful answers to searchers’ questions in the most helpful formats. If that’s true, then why does it appear that SEO is different now than in years past?

Think about it in terms of someone learning a new language.

At first, their understanding of the language is very rudimentary — “See Spot Run.” Over time, their understanding starts to deepen, and they learn semantics — the meaning behind language and the relationship between words and phrases. Eventually, with enough practice, the student knows the language well enough to even understand nuance, and is able to provide answers to even vague or incomplete questions.

When search engines were just beginning to learn our language, it was much easier to game the system by using tricks and tactics that actually go against quality guidelines. Take keyword stuffing, for example. If you wanted to rank for a particular keyword like “funny jokes,” you might add the words “funny jokes” a bunch of times onto your page, and make it bold, in hopes of boosting your ranking for that term:

Welcome to funny jokes! We tell the funniest jokes in the world. Funny jokes are fun and crazy. Your funny joke awaits. Sit back and read funny jokes because funny jokes can make you happy and funnier. Some funny favorite funny jokes.

This tactic made for terrible user experiences, and instead of laughing at funny jokes, people were bombarded by annoying, hard-to-read text. It may have worked in the past, but this is never what search engines wanted.

The role links play in SEO

When we talk about links, we could mean two things. Backlinks or “inbound links” are links from other websites that point to your website, while internal links are links on your own site that point to your other pages (on the same site).

Links have historically played a big role in SEO. Very early on, search engines needed help figuring out which URLs were more trustworthy than others to help them determine how to rank search results. Calculating the number of links pointing to any given site helped them do this.

Backlinks work very similarly to real life WOM (Word-Of-Mouth) referrals. Let’s take a hypothetical coffee shop, Jenny’s Coffee, as an example:

  • Referrals from others = good sign of authority
    Example: Many different people have all told you that Jenny’s Coffee is the best in town
  • Referrals from yourself = biased, so not a good sign of authority
    Example: Jenny claims that Jenny’s Coffee is the best in town
  • Referrals from irrelevant or low-quality sources = not a good sign of authority and could even get you flagged for spam
    Example: Jenny paid to have people who have never visited her coffee shop tell others how good it is.
  • No referrals = unclear authority
    Example: Jenny’s Coffee might be good, but you’ve been unable to find anyone who has an opinion so you can’t be sure.

This is why PageRank was created. PageRank (part of Google’s core algorithm) is a link analysis algorithm named after one of Google’s founders, Larry Page. PageRank estimates the importance of a web page by measuring the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will have earned.

The more natural backlinks you have from high-authority (trusted) websites, the better your odds are to rank higher within search results.

The role content plays in SEO

There would be no point to links if they didn’t direct searchers to something. That something is content! Content is more than just words; it’s anything meant to be consumed by searchers — there’s video content, image content, and of course, text. If search engines are answer machines, content is the means by which the engines deliver those answers.

Any time someone performs a search, there are thousands of possible results, so how do search engines decide which pages the searcher is going to find valuable? A big part of determining where your page will rank for a given query is how well the content on your page matches the query’s intent. In other words, does this page match the words that were searched and help fulfill the task the searcher was trying to accomplish?

Because of this focus on user satisfaction and task accomplishment, there are no strict benchmarks for how long your content should be, how many times it should contain a keyword, or what you put in your header tags. All of those can play a role in how well a page performs in search, but the focus should be on the users who will be reading the content.

Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher’s intent), and RankBrain.

What is RankBrain?

RankBrain is the machine learning component of Google’s core algorithm. Machine learning is a technique by which a computer program continually improves its predictions through new observations and training data. In other words, it’s always learning, and because it’s always learning, search results should be constantly improving.

For example, if RankBrain notices a lower ranking URL providing a better result to users than the higher ranking URLs, you can bet that RankBrain will adjust those results, moving the more relevant result higher and demoting less relevant pages as a byproduct.

Like most things with the search engine, we don’t know exactly what comprises RankBrain, but apparently, neither do the folks at Google.

What does this mean for SEOs?

Because Google will continue leveraging RankBrain to promote the most relevant, helpful content, we need to focus on fulfilling searcher intent more than ever before. Provide the best possible information and experience for searchers who might land on your page, and you’ve taken a big first step to performing well in a RankBrain world.

Engagement metrics: correlation, causation, or both?

With Google rankings, engagement metrics are most likely part correlation and part causation.

When we say engagement metrics, we mean data that represents how searchers interact with your site from search results. This includes things like:

  • Clicks (visits from search)
  • Time on page (amount of time the visitor spent on a page before leaving it)
  • Bounce rate (the percentage of all website sessions where users viewed only one page)
  • Pogo-sticking (clicking on an organic result and then quickly returning to the SERP to choose another result)
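To make these definitions concrete, here is a small illustration of how clicks, bounce rate, and average time on page fall out of a session log. The session data is entirely made up for the example:

```python
# Hypothetical search-driven sessions: pages viewed per session and
# seconds spent on the landing page. The numbers are invented.
sessions = [
    {"pages_viewed": 1, "time_on_page": 8},    # quick exit: pogo-stick risk
    {"pages_viewed": 3, "time_on_page": 95},
    {"pages_viewed": 1, "time_on_page": 40},
    {"pages_viewed": 2, "time_on_page": 120},
]

clicks = len(sessions)  # visits from search
# Bounce rate: share of sessions where only one page was viewed.
bounce_rate = sum(1 for s in sessions if s["pages_viewed"] == 1) / clicks
avg_time_on_page = sum(s["time_on_page"] for s in sessions) / clicks
```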

Many tests, including Moz’s own ranking factor survey, have indicated that engagement metrics correlate with higher ranking, but causation has been hotly debated. Are good engagement metrics just indicative of highly ranked sites? Or are sites ranked highly because they possess good engagement metrics?

What Google has said

While they’ve never used the term “direct ranking signal,” Google has been clear that they absolutely use click data to modify the SERP for particular queries.

According to Google’s former Chief of Search Quality, Udi Manber:

“The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we’ll switch it.”

Another comment from former Google engineer Edmond Lau corroborates this:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. The actual mechanics of how click data is used is often proprietary, but Google makes it obvious that it uses click data with its patents on systems like rank-adjusted content items.”

Because Google needs to maintain and improve search quality, it seems inevitable that engagement metrics are more than correlation. Google appears to stop short of calling engagement metrics a “ranking signal,” though, because those metrics are used to improve search quality, and the rank of individual URLs is just a byproduct of that.

What tests have confirmed

Various tests have confirmed that Google will adjust SERP order in response to searcher engagement:

  • Rand Fishkin’s 2014 test resulted in a #7 result moving up to the #1 spot after getting around 200 people to click on the URL from the SERP. Interestingly, ranking improvement seemed to be isolated to the location of the people who visited the link. The rank position spiked in the US, where many participants were located, whereas it remained lower on the page in Google Canada, Google Australia, etc.
  • Larry Kim’s comparison of top pages and their average dwell time pre- and post-RankBrain seemed to indicate that the machine-learning component of Google’s algorithm demotes the rank position of pages that people don’t spend as much time on.
  • Darren Shaw’s testing has shown user behavior’s impact on local search and map pack results as well.

Since user engagement metrics are clearly used to adjust the SERPs for quality, and rank position changes as a byproduct, it’s safe to say that SEOs should optimize for engagement. Engagement doesn’t change the objective quality of your web page, but rather reflects your value to searchers relative to other results for that query. That’s why, after no changes to your page or its backlinks, it could decline in rankings if searchers’ behavior indicates they like other pages better.

In terms of ranking web pages, engagement metrics act like a fact-checker. Objective factors such as links and content first rank the page, then engagement metrics help Google adjust if they didn’t get it right.

The evolution of search results

Back when search engines lacked a lot of the sophistication they have today, the term “10 blue links” was coined to describe the flat structure of the SERP. Any time a search was performed, Google would return a page with 10 organic results, each in the same format.

In this search landscape, holding the #1 spot was the holy grail of SEO. But then something happened. Google began adding results in new formats on their search result pages, called SERP features. Some of these SERP features include:

  • Paid advertisements
  • Featured snippets
  • People Also Ask boxes
  • Local (map) pack
  • Knowledge panel
  • Sitelinks

And Google is adding new ones all the time. It even experimented with “zero-result SERPs,” a phenomenon where only one result from the Knowledge Graph was displayed on the SERP with no results below it except for an option to “view more results.”

The addition of these features caused some initial panic for two main reasons. For one, many of these features push organic results further down the SERP. Another byproduct is that fewer searchers click on organic results, since more queries are being answered on the SERP itself.

So why would Google do this? It all goes back to the search experience. User behavior indicates that some queries are better satisfied by different content formats. Notice how the different types of SERP features match the different types of query intents.

Query intent → possible SERP feature triggered:

  • Informational → Featured Snippet
  • Informational with one answer → Knowledge Graph / Instant Answer
  • Local → Map Pack
  • Transactional → Shopping

We’ll talk more about intent in Chapter 3, but for now, it’s important to know that answers can be delivered to searchers in a wide array of formats, and how you structure your content can impact the format in which it appears in search.

Localized search

A search engine like Google has its own proprietary index of local business listings, from which it creates local search results.

If you are performing local SEO work for a business that has a physical location customers can visit (ex: dentist) or for a business that travels to visit their customers (ex: plumber), make sure that you claim, verify, and optimize a free Google My Business Listing.

When it comes to localized search results, Google uses three main factors to determine ranking:

  1. Relevance
  2. Distance
  3. Prominence

Relevance

Relevance is how well a local business matches what the searcher is looking for. To ensure that the business is doing everything it can to be relevant to searchers, make sure the business’ information is thoroughly and accurately filled out.

Distance

Google uses your geo-location to better serve you local results. Local search results are extremely sensitive to proximity, which refers to the location of the searcher and/or the location specified in the query (if the searcher included one).

Organic search results are sensitive to a searcher’s location too, though the effect is seldom as pronounced as it is in local pack results.

Prominence

With prominence as a factor, Google is looking to reward businesses that are well-known in the real world. In addition to a business’ offline prominence, Google also looks to some online factors to determine local ranking, such as:

Reviews

The number of Google reviews a local business receives, and the sentiment of those reviews, have a notable impact on its ability to rank in local results.

Citations

A “business citation” or “business listing” is a web-based reference to a local business’ “NAP” (name, address, phone number) on a localized platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).

Local rankings are influenced by the number and consistency of local business citations. Google pulls data from a wide variety of sources to continuously build its local business index. When Google finds multiple consistent references to a business’s name, location, and phone number, its “trust” in the validity of that data is strengthened, and it can show the business with a higher degree of confidence. Google also uses information from other sources on the web, such as links and articles.
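That consistency check is, in spirit, a normalization problem. Here’s a rough sketch of how you might audit NAP consistency across citation sources yourself; the listings and the deliberately naive normalizer are both hypothetical:

```python
import re

def normalize_nap(name, address, phone):
    """Naively normalize a (name, address, phone) triple for comparison."""
    digits = re.sub(r"\D", "", phone)                 # keep phone digits only
    clean_name = re.sub(r"[^\w\s]", "", name.lower())
    addr = re.sub(r"[^\w\s]", "", address.lower())    # drop punctuation
    addr = re.sub(r"\bst\b", "street", addr)          # expand a common abbreviation
    return (clean_name.strip(), addr.strip(), digits)

# Hypothetical citations for the Jenny's Coffee example from earlier.
citations = {
    "Yelp":     ("Jenny's Coffee", "12 Main St.", "(555) 010-2030"),
    "YP":       ("Jenny's Coffee", "12 Main Street", "555-010-2030"),
    "Localeze": ("Jennys Coffee Co", "12 Main St", "555.010.2031"),  # mismatch
}

normalized = {src: normalize_nap(*nap) for src, nap in citations.items()}
consistent = len(set(normalized.values())) == 1  # False: Localeze disagrees
```

A real citation audit would handle far more address variants, but the principle is the same: superficially different listings should normalize to the same record.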


Organic ranking

SEO best practices also apply to local SEO, since Google also considers a website’s position in organic search results when determining local ranking.

In the next chapter, you’ll learn on-page best practices that will help Google and users better understand your content.

[Bonus!] Local engagement

Although Google doesn’t list engagement as a local ranking determiner, its role is only going to increase as time goes on. Google continues to enrich local results by incorporating real-world data like popular times to visit and average length of visits…

Screenshot of Google SERP result for a local business showing busy times of day

…and even provides searchers with the ability to ask the business questions!

Screenshot of the Questions & Answers portion of a local Google SERP result

Now more than ever, local results are being influenced by real-world data. This interactivity reflects how searchers interact with and respond to local businesses, rather than relying on purely static (and game-able) information like links and citations.

Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real-time engagement metrics to determine quality and relevance.


You don’t have to know the ins and outs of Google’s algorithm (that remains a mystery!), but by now you should have a great baseline knowledge of how the search engine finds, interprets, stores, and ranks content. Armed with that knowledge, let’s learn about choosing the keywords your content will target!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Rewriting the Beginner’s Guide to SEO, Chapter 3: Keyword Research

Posted by BritneyMuller

Welcome to the draft of Chapter Three of the new and improved Beginner’s Guide to SEO! So far you’ve been generous and energizing with your feedback for our outline, Chapter One, and Chapter Two. We’re asking for a little more of your time as we debut our third chapter on keyword research. Please let us know what you think in the comments!


Chapter 3: Keyword Research

Understand what your audience wants to find.

Now that you’ve learned how to show up in search results, let’s determine which strategic keywords to target in your website’s content, and how to craft that content to satisfy both users and search engines.

The power of keyword research lies in better understanding your target market and how they are searching for your content, services, or products.

Keyword research provides you with specific search data that can help you answer questions like:

  • What are people searching for?
  • How many people are searching for it?
  • In what format do they want that information?

In this chapter, you’ll get tools and strategies for uncovering that information, as well as learn tactics that’ll help you avoid keyword research foibles and build strong content. Once you uncover how your target audience is searching for your content, you begin to uncover a whole new world of strategic SEO!

What terms are people searching for?

You may know what you do, but how do people search for the product, service, or information you provide? Answering this question is a crucial first step in the keyword research process.

Discovering keywords

You likely have a few keywords in mind that you would like to rank for. These will be things like your products, services, or other topics your website addresses, and they are great seed keywords for your research, so start there! You can enter those keywords into a keyword research tool to discover average monthly search volume and similar keywords. We’ll get into search volume in greater depth in the next section, but during the discovery phase, it can help you determine which variations of your keywords are most popular amongst searchers.

Once you enter your seed keywords into a keyword research tool, you will begin to discover other keywords, common questions, and topics for your content that you might have otherwise missed.

Let’s use the example of a florist that specializes in weddings.

Typing “wedding” and “florist” into a keyword research tool, you may discover highly relevant, highly searched for related terms such as:

  • Wedding bouquets
  • Bridal flowers
  • Wedding flower shop

In the process of discovering relevant keywords for your content, you will likely notice that the search volume of those keywords varies greatly. While you definitely want to target terms that your audience is searching for, in some cases, it may be more advantageous to target terms with lower search volume because they’re far less competitive.

Since both high- and low-competition keywords can be advantageous for your website, learning more about search volume can help you prioritize keywords and pick the ones that will give your website the biggest strategic advantage.

Pro tip: Diversify!

It’s important to note that entire websites don’t rank for keywords; pages do. With big brands, we often see the homepage ranking for many keywords, but for most websites, this isn’t usually the case. Many websites receive more organic traffic to pages other than the homepage, which is why it’s so important to diversify your website’s pages by optimizing each for uniquely valuable keywords.

How often are those terms searched?

Uncovering search volume

The higher the search volume for a given keyword or keyword phrase, the more work is typically required to achieve higher rankings. This is often referred to as keyword difficulty and occasionally incorporates SERP features; for example, if many SERP features (like featured snippets, the knowledge graph, or carousels) are clogging up a keyword’s result page, difficulty will increase. Big brands often take up the top 10 results for high-volume keywords, so if you’re just starting out on the web and going after the same keywords, the uphill battle for ranking can take years of effort.

Typically, the higher the search volume, the greater the competition and effort required to achieve organic ranking success. Go too low, though, and you risk not drawing any searchers to your site. In many cases, it may be most advantageous to target highly specific, lower competition search terms. In SEO, we call those long-tail keywords.
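One naive way to weigh that volume-versus-competition trade-off is to discount each keyword’s search volume by its difficulty. The keywords, numbers, and scoring formula below are purely illustrative; real difficulty scores come from keyword research tools:

```python
# Hypothetical keywords with made-up monthly volume and difficulty (0-100).
keywords = [
    {"term": "shoes", "volume": 50000, "difficulty": 98},
    {"term": "womens running shoes", "volume": 4000, "difficulty": 60},
    {"term": "best price red womens size 7 running shoe",
     "volume": 90, "difficulty": 15},
]

# Naive heuristic: higher volume helps, higher difficulty hurts.
for kw in keywords:
    kw["priority"] = kw["volume"] * (1 - kw["difficulty"] / 100)

ranked = sorted(keywords, key=lambda kw: kw["priority"], reverse=True)
```

Under this toy scoring, the mid-volume term outranks both the ultra-competitive head term and the tiny long-tail term, showing why moderately specific keywords are often the most strategic place to start.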

Understanding the long tail

It would be great to rank #1 for the keyword “shoes”… or would it?

It’s wonderful to deal with keywords that have 50,000 searches a month, or even 5,000, but in reality, these popular search terms make up only a fraction of all searches performed on the web. In fact, keywords with very high search volumes may even indicate ambiguous intent; targeting these terms could put you at risk of drawing visitors to your site whose goals don’t match the content your page provides.

Does the searcher want to know the nutritional value of pizza? Order a pizza? Find a restaurant to take their family to? Google doesn’t know, so it offers SERP features to help searchers refine what they’re looking for. Targeting “pizza” means that you’re likely casting too wide a net.

The remaining 75% lie in the “chunky middle” and “long tail” of search.

Don’t underestimate these less popular keywords. Long-tail keywords with lower search volume often convert better, because searchers are more specific and intentional in their searches. For example, a person searching for “shoes” is probably just browsing, whereas someone searching for “best price red womens size 7 running shoe” practically has their wallet out!

Pro tip: Questions are SEO gold!

Discovering what questions people are asking in your space, and adding those questions and their answers to an FAQ page, can yield incredible organic traffic for your website.

Getting strategic with search volume

Now that you’ve discovered relevant search terms for your site and their corresponding search volumes, you can get even more strategic by looking at your competitors and figuring out how searches might differ by season or location.

Keywords by competitor

You’ll likely compile a lot of keywords. How do you know which to tackle first? It could be a good idea to prioritize high-volume keywords that your competitors are not currently ranking for. On the flip side, you could also see which keywords from your list your competitors are already ranking for and prioritize those. The former is great when you want to take advantage of your competitors’ missed opportunities, while the latter is an aggressive strategy that sets you up to compete for keywords your competitors are already performing well for.

Keywords by season

Knowing about seasonal trends can be advantageous in setting a content strategy. For example, if you know that “christmas box” starts to spike in October through December in the United Kingdom, you can prepare content months in advance and give it a big push around those months.

Keywords by region

You can more strategically target a specific location by narrowing down your keyword research to specific towns, counties, or states in the Google Keyword Planner, or evaluate “interest by subregion” in Google Trends. Geo-specific research can help make your content more relevant to your target audience. For example, you might find out that in Texas, the preferred term for a large truck is “big rig,” while in New York, “tractor trailer” is the preferred terminology.

Which format best suits the searcher’s intent?

In Chapter 2, we learned about SERP features. That background is going to help us understand how searchers want to consume information for a particular keyword. The format in which Google chooses to display search results depends on intent, and every query has a unique one. While there are thousands of possible search types, there are five major categories to be aware of:

1. Informational queries: The searcher needs information, such as the name of a band or the height of the Empire State Building.

2. Navigational queries: The searcher wants to go to a particular place on the Internet, such as Facebook or the homepage of the NFL.

3. Transactional queries: The searcher wants to do something, such as buy a plane ticket or listen to a song.

4. Commercial investigation: The searcher wants to compare products and find the best one for their specific needs.

5. Local queries: The searcher wants to find something locally, such as a nearby coffee shop, doctor, or music venue.

An important step in the keyword research process is surveying the SERP landscape for the keyword you want to target in order to get a better gauge of searcher intent. If you want to know what type of content your target audience wants, look to the SERPs!

Google has closely evaluated the behavior of trillions of searches in an attempt to provide the most desired content for each specific keyword search.

Take the search “dresses,” for example:

From the shopping carousel, you can infer that Google has determined many people who search for “dresses” want to shop for dresses online.

There is also a Local Pack feature for this keyword, indicating Google’s desire to help searchers who may be looking for local dress retailers.

If the query is ambiguous, Google will also sometimes include the “refine by” feature to help searchers further specify what they’re looking for. By doing so, the search engine can provide results that better help the searcher accomplish their task.

Google has a wide array of result types it can serve up depending on the query, so if you’re going to target a keyword, look to the SERP to understand what type of content you need to create.

Tools for determining the value of a keyword

How much value would a keyword add to your website? These tools can help you answer that question, so they’d make great additions to your keyword research arsenal:

  • Moz Keyword Explorer – Our own Moz Keyword Explorer tool extracts accurate search volume data, keyword difficulty, and keyword opportunity metrics by using live clickstream data. To learn more about how we’re producing our keyword data, check out Announcing Keyword Explorer.
  • Google Keyword Planner – Google’s AdWords Keyword Planner has historically been the most common starting point for SEO keyword research. However, Keyword Planner does restrict search volume data by lumping keywords together into large search volume range buckets. To learn more, check out Google Keyword Planner’s Dirty Secrets.
  • Google Trends – Google’s keyword trend tool is great for finding seasonal keyword fluctuations. For example, “funny halloween costume ideas” will peak in the weeks before Halloween.
  • AnswerThePublic – This free tool populates commonly searched for questions around a specific keyword. Bonus! You can use this tool in tandem with another free tool, Keywords Everywhere, to prioritize ATP’s suggestions by search volume.
  • SpyFu Keyword Research Tool – Provides some really neat competitive keyword data.

Download our free keyword research template!

Keyword research can yield a ton of data. Stay organized by downloading our free keyword research template. Customize the template to fit your unique needs. Happy keyword researching!

Now that you know how to uncover what your target audience is searching for and how often, it’s time to move onto the next step: crafting pages in a way that users will love and search engines can understand.




Rewriting the Beginner’s Guide to SEO, Chapter 1: SEO 101

Posted by BritneyMuller

Back in mid-November, we kicked off a campaign to rewrite our biggest piece of content: the Beginner’s Guide to SEO. You offered up a huge amount of helpful advice and insight with regards to our outline, and today we’re here to share our draft of the first chapter.

In many ways, the Beginner’s Guide to SEO belongs to each and every member of our community; it’s important that we get this right, for your sake. So without further ado, here’s the first chapter — let’s dive in!


Chapter 1: SEO 101

What is it, and why is it important?

Welcome! We’re excited that you’re here!

If you already have a solid understanding of SEO and why it’s important, you can skip to Chapter 2 (though we’d still recommend skimming the best practices from Google and Bing at the end of this chapter; they’re useful refreshers).

For everyone else, this chapter will help build your foundational SEO knowledge and confidence as you move forward.

What is SEO?

SEO stands for “search engine optimization.” It’s the practice of increasing both the quality and quantity of website traffic, as well as exposure to your brand, through non-paid (also known as “organic”) search engine results.

Despite the acronym, SEO is as much about people as it is about search engines themselves. It’s about understanding what people are searching for online, the answers they are seeking, the words they’re using, and the type of content they wish to consume. Leveraging this data will allow you to provide high-quality content that your visitors will truly value.

Here’s an example. Frankie & Jo’s (a Seattle-based vegan, gluten-free ice cream shop) has heard about SEO and wants help improving how and how often they show up in organic search results. In order to help them, you need to first understand their potential customers:

  • What types of ice cream, desserts, snacks, etc. are people searching for?
  • Who is searching for these terms?
  • When are people searching for ice cream, snacks, desserts, etc.?
    • Are there seasonality trends throughout the year?
  • How are people searching for ice cream?
    • What words do they use?
    • What questions do they ask?
    • Are more searches performed on mobile devices?
  • Why are people seeking ice cream?
    • Are individuals looking specifically for health-conscious ice cream, or just looking to satisfy a sweet tooth?
  • Where are potential customers located — locally, nationally, or internationally?

And finally — here’s the kicker — how can you help provide the best content about ice cream to cultivate a community and fulfill what all those people are searching for?

Search engine basics

Search engines are answer machines. They scour billions of pieces of content and evaluate thousands of factors to determine which content is most likely to answer your query.

Search engines do all of this by discovering and cataloguing all available content on the Internet (web pages, PDFs, images, videos, etc.) via a process known as “crawling and indexing.”

What are “organic” search engine results?

Organic search results are search results that aren’t paid for (i.e. not advertising). These are the results that you can influence through effective SEO. Traditionally, these were the familiar “10 blue links.”

Today, search engine results pages — often referred to as “SERPs” — are filled with both more advertising and more dynamic organic results formats (called “SERP features”) than we’ve ever seen before. Some examples of SERP features are featured snippets (or answer boxes), People Also Ask boxes, image carousels, etc. New SERP features continue to emerge, driven largely by what people are seeking.

For example, if you search for “Denver weather,” you’ll see a weather forecast for the city of Denver directly in the SERP instead of a link to a site that might have that forecast. And, if you search for “pizza Denver,” you’ll see a “local pack” result made up of Denver pizza places. Convenient, right?

It’s important to remember that search engines make money from advertising. Their goal is to better solve searchers’ queries (within SERPs), to keep searchers coming back, and to keep them on the SERPs longer.

Some SERP features on Google are organic and can be influenced by SEO. These include featured snippets (a promoted organic result that displays an answer inside a box) and related questions (a.k.a. “People Also Ask” boxes).

It’s worth noting that there are many other search features that, even though they aren’t paid advertising, can’t typically be influenced by SEO. These features often have data acquired from proprietary data sources, such as Wikipedia, WebMD, and IMDb.

Why SEO is important

While paid advertising, social media, and other online platforms can generate traffic to websites, the majority of online traffic is driven by search engines.

Organic search results cover more digital real estate, appear more credible to savvy searchers, and receive far more clicks than paid advertisements. For example, only ~2.8% of all US searches result in a click on a paid advertisement.

In a nutshell: SEO has ~20X more traffic opportunity than PPC on both mobile and desktop.

SEO is also one of the only online marketing channels that, when set up correctly, can continue to pay dividends over time. If you provide a solid piece of content that deserves to rank for the right keywords, your traffic can snowball over time, whereas advertising needs continuous funding to send traffic to your site.

Search engines are getting smarter, but they still need our help.

Optimizing your site will help deliver better information to search engines so that your content can be properly indexed and displayed within search results.

Should I hire an SEO professional, consultant, or agency?

Depending on your bandwidth, willingness to learn, and the complexity of your website(s), you could perform some basic SEO yourself. Or, you might discover that you would prefer the help of an expert. Either way is okay!

If you end up looking for expert help, it’s important to know that while many agencies and consultants “provide SEO services,” their quality can vary widely. Knowing how to choose a good SEO company can save you a lot of time and money, as the wrong SEO techniques can actually harm your site more than they help.

White hat vs black hat SEO

“White hat SEO” refers to SEO techniques, best practices, and strategies that abide by search engine rules, with a primary focus on providing more value to people.

“Black hat SEO” refers to techniques and strategies that attempt to spam/fool search engines. While black hat SEO can work, it puts websites at tremendous risk of being penalized and/or de-indexed (removed from search results) and has ethical implications.

Penalized websites have bankrupted businesses. It’s just another reason to be very careful when choosing an SEO expert or agency.

Search engines share similar goals with the SEO industry

Search engines want to help you succeed. They’re actually quite supportive of efforts by the SEO community. Digital marketing conferences, such as Unbounce, MNsearch, SearchLove, and Moz’s own MozCon, regularly attract engineers and representatives from major search engines.

Google assists webmasters and SEOs through their Webmaster Central Help Forum and by hosting live office hour hangouts. (Bing, unfortunately, shut down their Webmaster Forums in 2014.)

While webmaster guidelines vary from search engine to search engine, the underlying principles stay the same: Don’t try to trick search engines. Instead, provide your visitors with a great online experience.

Google webmaster guidelines

Basic principles:

  • Make pages primarily for users, not search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging.

Things to avoid:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content (i.e. copied from somewhere else)
  • Cloaking — the practice of showing search engine crawlers different content than visitors.
  • Hidden text and links
  • Doorway pages — pages created to rank well for specific searches to funnel traffic to your website.

Full Google Webmaster Guidelines version here.

Bing webmaster guidelines

Basic principles:

  • Provide clear, deep, engaging, and easy-to-find content on your site.
  • Keep page titles clear and relevant.
  • Links are regarded as a signal of popularity and Bing rewards links that have grown organically.
  • Social influence and social shares are positive signals and can have an impact on how you rank organically in the long run.
  • Page speed is important, along with a positive, useful user experience.
  • Use alt attributes to describe images, so that Bing can better understand the content.

Things to avoid:

  • Thin content: pages showing mostly ads or affiliate links, or that otherwise redirect visitors away to other sites, will not rank well.
  • Abusive link tactics that aim to inflate the number and nature of inbound links, such as buying links or participating in link schemes; these can lead to de-indexing.
  • Messy URL structures: keep URLs clean, concise, and keyword-inclusive; dynamic parameters can dirty up your URLs and cause duplicate content issues.
  • Non-descriptive URLs: make your URLs descriptive, short, and keyword-rich when possible, and avoid non-letter characters.
  • Burying links in JavaScript/Flash/Silverlight; keep content out of these as well.
  • Duplicate content
  • Keyword stuffing
  • Cloaking — the practice of showing search engine crawlers different content than visitors.

Guidelines for representing your local business on Google

These guidelines govern what you should and shouldn’t do in creating and managing your Google My Business listing(s).

Basic principles:

  • Be sure you’re eligible for inclusion in the Google My Business index; you must have a physical address, even if it’s your home address, and you must serve customers face-to-face, either at your location (like a retail store) or at theirs (like a plumber).
  • Honestly and accurately represent all aspects of your local business data, including its name, address, phone number, website address, business categories, hours of operation, and other features.

Things to avoid:

  • Creation of Google My Business listings for entities that aren’t eligible
  • Misrepresentation of any of your core business information, including “stuffing” your business name with geographic or service keywords, or creating listings for fake addresses
  • Use of PO boxes or virtual offices instead of authentic street addresses
  • Abuse of the review portion of the Google My Business listing, via fake positive reviews of your business or fake negative ones of your competitors
  • Costly, novice mistakes stemming from failure to read the fine details of Google’s guidelines

Fulfilling user intent

Understanding and fulfilling user intent is critical. When a person searches for something, they have a desired outcome. Whether it’s an answer, concert tickets, or a cat photo, that desired content is their “user intent.”

If a person performs a search for “bands,” is their intent to find musical bands, wedding bands, band saws, or something else?

Your job as an SEO is to quickly provide users with the content they desire in the format in which they desire it.

Common user intent types:

Informational: Searching for information. Example: “How old is Issa Rae?”

Navigational: Searching for a specific website. Example: “HBOGO Insecure”

Transactional: Searching to buy something. Example: “where to buy ‘We got y’all’ Insecure t-shirt”

You can get a glimpse of user intent by Googling your desired keyword(s) and evaluating the current SERP. For example, if there’s a photo carousel, it’s very likely that people searching for that keyword want photos.

Also evaluate what content your top-ranking competitors are providing that you currently aren’t. How can you provide 10X the value on your website?

Providing relevant, high-quality content on your website will help you rank higher in search results, and more importantly, it will establish credibility and trust with your online audience.

Before you do any of that, you have to first understand your website’s goals to execute a strategic SEO plan.

Know your website/client’s goals

Every website is different, so take the time to really understand a specific site’s business goals. This will not only help you determine which areas of SEO you should focus on, where to track conversions, and how to set benchmarks, but it will also help you create talking points for negotiating SEO projects with clients, bosses, etc.

What will your KPIs (Key Performance Indicators) be to measure the return on SEO investment? More simply, what is your barometer to measure the success of your organic search efforts? You’ll want to have it documented, even if it’s this simple:

For the website ________________________, my primary SEO KPI is _______________.

Here are a few common KPIs to get you started:

  • Sales
  • Downloads
  • Email signups
  • Contact form submissions
  • Phone calls

And if your business has a local component, you’ll want to define KPIs for your Google My Business listings, as well. These might include:

  • Clicks-to-call
  • Clicks-to-website
  • Clicks-for-driving-directions

Notice how “Traffic” and “Ranking” are not on the above lists? This is because, for most websites, ranking well for keywords and increasing traffic won’t matter if the new traffic doesn’t convert (to help you reach the site’s KPI goals).

You don’t want to send 1,000 people to your website a month and have only 3 of them convert to customers (a 0.3% conversion rate). You’d much rather send 300 people to your site a month and have 40 of them convert (over 13%).

This guide will help you become more data-driven in your SEO efforts. Rather than haphazardly throwing arrows all over the place (and getting lucky every once in a while), you’ll put more wood behind fewer arrows.

Grab a bow (and some coffee); let’s dive into Chapter 2 (Crawlers & Indexation).


We’re looking forward to hearing your thoughts on this draft of Chapter 1. What works? Anything you feel could be added or explained differently? Let us know your suggestions, questions, and thoughts in the comments.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Rewriting the Beginner’s Guide to SEO

Posted by BritneyMuller


Many of you reading likely cut your teeth on Moz’s Beginner’s Guide to SEO. Since it was launched, it’s easily been our top-performing piece of content:

Most months see 100k+ views (the reverse plateau in 2013 is when we changed domains).

While Moz’s Beginner’s Guide to SEO still gets well over 100k views a month, the current guide itself is fairly outdated. This big update has been on my personal to-do list since I started at Moz, and we need to get it right because — let’s get real — you all deserve a bad-ass SEO 101 resource!

However, updating the guide is no easy feat. Thankfully, I have the help of my fellow Mozzers. Our content team has been a collective voice of reason, wisdom, and organization throughout this process and has kept this train on its tracks.

Despite the effort we’ve put into this already, it felt like something was missing: your input! We’re writing this guide to be a go-to resource for all of you (and everyone who follows in your footsteps), and want to make sure that we’re including everything that today’s SEOs need to know. You all have a better sense of that than anyone else.

So, in order to deliver the best possible update, I’m seeking your help.

This is similar to the way Rand did it back in 2007; upon re-reading your many “more examples” requests, we’ve continued to integrate more examples throughout.

The plan:

  • Over the next 6–8 weeks, I’ll be updating sections of the Beginner’s Guide and posting them, one by one, on the blog.
  • I’ll solicit feedback from you incredible people and implement top suggestions.
  • The guide will be reformatted/redesigned, and I’ll 301 all of the blog entries that will be created over the next few weeks to the final version.
  • It’s going to remain 100% free to everyone — no registration required, no premium membership necessary.

To kick things off, here’s the revised outline for the Beginner’s Guide to SEO:


Chapter 1: SEO 101

What is it, and why is it important?

Chapter 2: Crawlers & Indexing

First, you need to show up.

Chapter 3: Keyword Research

Next, know what to say and how to say it.

Chapter 4: On-Page SEO

Next, structure your message to resonate and get it published.

Chapter 5: Technical SEO

Next, translate your site into Google’s language.

Chapter 6: Establishing Authority

Finally, turn up the volume.

Chapter 7: Measuring and Tracking SEO

Pivot based on what’s working.


Appendix A: Glossary of Terms

Appendix B: List of Additional Resources

Appendix C: Contributors & Credits


What did you struggle with most when you were first learning about SEO? What would you have benefited from understanding from the get-go?

Are we missing anything? Any section you wish wouldn’t be included in the updated Beginner’s Guide? Leave your suggestions in the comments!

Thanks in advance for contributing.





The Beginner’s Guide to Structured Data for SEO: A Two-Part Series

Posted by bridget.randolph

Part 1: An overview of structured data for SEO

SEOs have been talking about structured data for a few years now — ever since Google, Bing, Yahoo!, and Yandex got together in 2011 to create a standardized list of attributes and entities which they all agreed to support, and which became known as Schema.org. However, there’s still a lot of confusion around what structured data is, what it’s for, and how and when to implement structured data for SEO purposes. In fact, a survey carried out last year by Bing found that only 17% of marketers were using (or planning to use) Schema.org structured data markup.

In this two-part series, you’ll learn the basics of structured data: first we’ll talk about what it is, and how it relates to SEO (Part 1), and then I’ll take you through a simple process for identifying structured data opportunities and implementing structured data on your own site (Part 2).

What is “structured data”?

“Structured data” as a general term simply refers to any data which is organized (i.e., given “structure”). For example, if you have a bunch of scattered Post-It notes with phone messages about meetings, dates, times, people, etc., and you organize these into a table with labeled rows and columns for each type of information, you’re structuring the data.

Example of unstructured data

Post-It 1: “John called, confirming 3pm on Wed at Coffee Shop”

Post-It 2: “Don’t forget your 10am meeting at Mary’s Office this Friday”

Example of structured data

Meeting With    Date        Time    Location
John            Wednesday   3pm     Coffee Shop
Mary            Friday      10am    Office
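The same transformation can be sketched in code. This is purely our illustration (the field names and parsing are not part of the guide): free-text notes become records with labeled fields, which can then be queried consistently.

```python
# The scattered Post-It notes, as unstructured text:
notes = [
    "John called, confirming 3pm on Wed at Coffee Shop",
    "Don't forget your 10am meeting at Mary's Office this Friday",
]

# The same information, structured with labeled fields:
meetings = [
    {"meeting_with": "John", "date": "Wednesday", "time": "3pm", "location": "Coffee Shop"},
    {"meeting_with": "Mary", "date": "Friday", "time": "10am", "location": "Office"},
]

# Structure supports precise questions the raw notes don't:
wednesday_meetings = [m for m in meetings if m["date"] == "Wednesday"]
```

The point is not the code itself but the shift: once each piece of information has a labeled slot, any consumer (human or machine) can interpret it without guesswork.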


Structured data can be used in many different ways, such as using Open Graph markup to specify a Facebook title and description, or using SQL to query a relational database. In an SEO context, “structured data” usually refers to implementing some type of markup on a webpage, in order to provide additional detail around the page’s content. This markup improves the search engines’ understanding of that content, which can help with relevancy signals and also enables a site to benefit from enhanced results in SERPs (rich snippets, rich cards, carousels, knowledge boxes, etc). Because this type of markup needs to be parsed and understood consistently by search engines as well as by people, there are standardized implementations (known as formats and/or syntaxes) and classifications of concepts, relationships, and terms (known as vocabularies) which should be used.

There are three syntaxes which search engines will typically support (Microdata, JSON-LD, and microformats) and two common vocabularies which can be used with these syntaxes: Schema.org and Microformats.org. Schema.org can be used with either the Microdata or JSON-LD syntax, while the microformats syntax and vocabulary go together. If you’re reading up on this topic, you may also see references to RDFa, which is another syntax.

This all gets pretty confusing, so if you’re feeling less-than-crystal-clear right now, you might want to check out this great glossary cheat sheet from Aaron Bradley.


When we talk about structured data for SEO, we’re usually talking about the particular vocabulary known as “Schema.org.” Schema.org is the most commonly used approach to structured data markup for SEO purposes. It isn’t the only one, though. Some websites use the Microformats.org vocabulary, most often for marking up product reviews (h-review markup) or defining a physical location (h-card markup).

In addition to being able to use different vocabularies to mark up your site, you can also implement this markup using different syntaxes. For the Schema.org vocabulary, the two most common ways to add markup to your site are the Microdata format and JSON-LD. With Microdata markup, your structured data is integrated within the main HTML of the page, whereas JSON-LD uses a JavaScript object to insert all of your markup into the head of the page, which is often a cleaner, simpler implementation from a development perspective.

The Microdata approach was originally the recommended one for SEO purposes, but Google’s JSON-LD support has improved in the past few years and now it is their recommended approach when possible. Note, however, that Bing does not currently support JSON-LD (although hopefully this may be changing soon).
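As a rough illustration of the JSON-LD approach, here is what a Schema.org item looks like when serialized into a single script block. The property values below are invented placeholders; this is a sketch, not markup from any real page.

```python
import json

# A minimal Schema.org "Article" expressed in the JSON-LD syntax.
# All values here are placeholders for illustration.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Beginner's Guide to Structured Data",
    "author": {"@type": "Person", "name": "Jane Author"},
    "datePublished": "2017-06-01",
}

# JSON-LD lives in one <script> block (typically in the page head),
# kept separate from the visible HTML; Microdata, by contrast, is
# woven into the HTML elements themselves as attributes.
json_ld_tag = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(article, indent=2)
)
```

Because the whole item is one self-contained block, JSON-LD can be added or updated without touching the page template, which is part of why it is often the simpler implementation.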

How does structured data support SEO?

Google, Bing, and other search engines encourage webmasters to use structured data, and incentivize that usage by providing benefits to websites with structured data correctly implemented.

Some of these benefits include search result enhancements and content-specific features, such as:

  • Rich search results: Includes styling, images, and other visual enhancements
  • Rich cards: A variation on rich search results, similar to rich snippets and designed for mobile users
  • Enriched search results: Includes interactive or immersive features
  • Knowledge Graph: Information about an entity such as a brand
  • Breadcrumbs: Breadcrumbs in your search result
  • Carousels: A collection of multiple rich results in a carousel style
  • Rich results for AMP: To have your AMP (Accelerated Mobile Pages) appear in carousels and with rich results, you’ll need to include structured data

These enhanced search results can also improve your click-through rate (CTR) and drive additional traffic, because they are more visually appealing and provide additional information to searchers. And improved CTR can also indirectly improve your rankings, as a user behavior signal.

Implementing structured data on your site is also a way to prepare for the future of search, as Google in particular continues to move in the direction of hyper-personalization and solving problems and answering questions directly. Tom Anthony gave a presentation about this topic not too long ago, titled Five Emerging Trends in Search.

Common uses for structured data

Part 2 of this series will go into more detail around specific structured data opportunities and how to implement them. However, there are certain common uses for structured data which almost any website or brand can benefit from:

Knowledge Graph

If you have a personal or business brand, you can edit the information which appears on the right-hand side of the SERP for branded searches. Google uses structured data to populate the Knowledge Graph box.

Rich snippets and rich cards

The most commonly used markup allows you to provide additional context for:

  • Articles
  • Recipes
  • Products
  • Star Ratings and Product Reviews
  • Videos

Using this markup allows your site to show up in the SERPs as a rich snippet or rich card:

Google’s rich cards examples for “Recipe”

If your site has several items that would fit the query, you can also get a “host carousel” result, as in the results for “chicken recipes.”


In addition to these types of content markup, Google is currently experimenting with “action markup,” which enables users to take an action directly from the SERP, such as booking an appointment or watching a movie. If this is relevant to your business, you may want to express interest in participating.

AMP (Accelerated Mobile Pages)

If your site uses AMP (Accelerated Mobile Pages), you’ll want to make sure you include structured data markup on both the regular and AMP pages. This will allow your AMP pages to appear in rich results, including the Top Stories carousel and host carousels.

Social cards

Although Open Graph, Twitter cards, and other social-specific markup may not have a big impact from a purely SEO perspective, this markup is visible to search engines and Bing specifically notes that their search engine can understand Open Graph page-level annotations (although at the moment they only use this data to provide visual enhancements for a specific handful of publishers).
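As a sketch, Open Graph page-level annotations are just meta tags in the page head. The og:* property names below are standard Open Graph; the values are placeholders of our own.

```python
# Open Graph annotations for a page, as property/content pairs.
# The og:* names are standard; the values are invented examples.
og_properties = {
    "og:title": "The Beginner's Guide to Structured Data for SEO",
    "og:description": "What structured data is and how it supports SEO.",
    "og:image": "https://example.com/preview-image.png",
}

# Rendered as <meta> tags for the page's <head>:
meta_tags = "\n".join(
    '<meta property="{}" content="{}" />'.format(prop, value)
    for prop, value in og_properties.items()
)
```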

If you use any social networks for marketing, or simply want your content to look good when it’s shared on social media, make sure you correctly implement social markup and validate it using each platform’s respective testing tool.

AdWords

You can include structured data in your AdWords ads, using structured snippet extensions. These allow you to add additional information within your ad copy to help people understand more about your products or services and can also improve click-through rate (CTR) on your ads.

Email marketing

If you have Gmail, you may have gotten a confirmation email for a flight and seen the information box at the top showing your flight details, or seen a similar information box for your last Amazon order. This is possible due to structured data markup for emails. Google Inbox and Gmail support both JSON-LD and Microdata markup for emails about various types of orders, invoices and reservations.
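As a hedged sketch of what that email markup looks like (the reservation details below are entirely invented), a schema.org FlightReservation in JSON-LD is embedded in the email’s HTML much like on-page markup:

```python
import json

# A schema.org FlightReservation, the kind of item email clients can
# parse from a confirmation email. All values are placeholders.
reservation = {
    "@context": "https://schema.org",
    "@type": "FlightReservation",
    "reservationNumber": "RXJ34P",
    "underName": {"@type": "Person", "name": "Jane Doe"},
    "reservationFor": {
        "@type": "Flight",
        "flightNumber": "110",
        "departureAirport": {"@type": "Airport", "iataCode": "SEA"},
        "arrivalAirport": {"@type": "Airport", "iataCode": "SFO"},
    },
}

# Embedded in the email body the same way JSON-LD is embedded in a page:
email_markup = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(reservation)
)
```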

3 common myths about structured data & SEO

Myth #1: Implementing structured data means I will definitely get rich snippets.

Although using structured data markup is necessary to be eligible for rich snippets and rich cards, there is no guarantee that simply adding structured data markup to your site will immediately result in rich snippets or cards. Sometimes it may not show up at all, or may appear inconsistently. This doesn’t necessarily mean you’ve done anything wrong.

Myth #2: Structured data is a ranking signal.

Using structured data correctly can help search engines to better understand what your content is about and may therefore contribute to a stronger relevancy signal. In addition, studies have shown that rich snippets can improve click-through rate (CTR), which can lead to better rankings indirectly. However, the use of structured data markup on its own is not a direct ranking signal.

Myth #3: Google can figure it out without the extra work.

Sometimes it’s tempting to skip extra steps, like implementing structured data, since we know that Google is getting smarter at figuring things out and understanding content without much help. But this is a short-sighted view. Yes, Google and other search engines can understand and figure out some of this stuff on their own, but if you want them to be able to understand a specific thing about your content, you should use the correct markup. Not only will it help in the short term with the things the algorithms aren’t so good at understanding, it also ensures that your site itself is well structured and that your content serves a clear purpose. Also, Google won’t give you certain features without correct implementation, which could be costing you on a large scale over time, especially if you’re in a competitive niche. Apart from anything else, studies have shown that rich snippets can improve CTR by anywhere from 5%–30%.

Additional resources

In Part 2 of this two-part series, we’ll be looking at the practical side of structured data implementation: how to actually identify structured data opportunities for your site, and how to implement and test the markup correctly.

But for now, the resources linked throughout this post should help you get started.

In the meantime, I’d love to hear from you: Have you implemented structured data markup on your site? Share your results in the comments!





A Beginner’s Guide to Marketing Automation

Posted by Angela_Petteys

To say marketing automation is a complex subject is putting it mildly. On the surface it seems simple enough, but once you get just a little bit deeper into it, it’s overwhelming. Even if you work with marketing automation on a daily basis, it can be hard to describe.

When used correctly, marketing automation can be useful in helping sales and marketing teams do their jobs more effectively so they can reach their goals. But there are also a lot of misunderstandings about what marketing automation is and isn’t. Let’s try to get a better understanding of what marketing automation is and how it can potentially help a business.

What is marketing automation?

Marketing automation is the use of software to deliver personalized messages to customers and leads. The software allows you to create a dynamic series of messages to send to your contacts. The message a person receives is decided by factors you specify, like what their spending habits are, where they are in the buying process, and past interactions they’ve had with your site.

Delivering content that’s tailored to a person’s needs and interests helps build stronger relationships which, in turn, can help increase conversions and revenue. Marketing automation can help you accomplish all these things while streamlining your operations at the same time.

In the broad scope of things, marketing automation incorporates several different aspects of marketing and business development, including email marketing, content development, conversion rate optimization, and lead generation.

The benefits of using marketing automation

By far, one of the biggest benefits of marketing automation is that it helps sales and marketing teams work more efficiently. People love personalized content; sending out personalized emails generates six times more revenue than sending non-personalized emails. But manually sending out customized messages to contacts simply isn’t practical. Marketing automation platforms handle the mundane and repetitive work that goes into delivering personalized content, giving sales and marketing professionals more time to focus on things that are more interesting and challenging.

Not only does marketing automation make it easier to deliver messages, it makes it easier to figure out where people are in the conversion process. Marketing automation programs typically have a lead scoring feature which helps users quickly identify which leads are the most sales-ready.
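Lead scoring itself is conceptually simple. Here is a toy sketch: the point values, action names, and threshold below are our assumptions for illustration, and real platforms make these rules configurable.

```python
# Toy lead-scoring rules: actions accumulate points, and leads above
# a threshold are flagged as sales-ready. All numbers are assumptions.
ACTION_POINTS = {
    "opened_email": 2,
    "visited_pricing_page": 10,
    "downloaded_whitepaper": 15,
    "requested_demo": 30,
}
SALES_READY_THRESHOLD = 40

def lead_score(actions):
    """Sum the points for every tracked action a lead has taken."""
    return sum(ACTION_POINTS.get(action, 0) for action in actions)

lead_actions = ["opened_email", "visited_pricing_page", "requested_demo"]
sales_ready = lead_score(lead_actions) >= SALES_READY_THRESHOLD  # 42 >= 40
```

A lead who merely opens a few emails stays low-priority, while one who requests a demo crosses the threshold quickly; that is the basic mechanic behind "most sales-ready."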

One of the most common reasons why businesses consider using marketing automation in the first place is because they want to improve their conversion rates and revenues. Marketing automation is a way to encourage customers to stay engaged longer, making it more likely they’ll stick around long enough to convert. On average, companies that use marketing automation have 53% higher conversion rates and an annual revenue growth rate 3.1% higher compared to companies that don’t.

For products and services with longer conversion cycles, marketing automation can also help speed up the process. In one example cited by VentureHarbour, Thomson Reuters was able to reduce their conversion time by 72% by using marketing automation software.

What applications are there for marketing automation?

While marketing automation has several different applications, email messaging and lead generation/nurturing are among the most common.

Yes, email is still relevant as a marketing tool. While it’s easy to say things like “Everybody’s on Facebook/Twitter/Instagram,” it’s simply not true. However, most Internet users do have at least one email address. Email inboxes also tend to move at a slower pace than social media feeds, giving you the best chance at making a direct connection with your contacts. There’s a multitude of ways marketing automation can be used with email:

  • Welcome messages
  • Product retargeting
  • Abandoned cart reminders
  • Personalized product recommendations

And that’s just to name a few.

Many companies use marketing automation to solicit feedback from their contacts, regardless of whether they’ve converted. Whether it’s by sending out surveys or asking people to send comments directly, the information they garner can be extremely valuable in guiding changes that will help improve their revenues in the long run.

Given that personalized emails generate so much more revenue than non-personalized emails, marketing automation can be an effective way to nurture your leads. According to Marketo, about 50% of leads in any system are not ready to buy and nearly 80% of all new leads will never become sales. With marketing automation, the goal is to give people something of value when they need it most so that they’re more likely to convert. Effective lead nurturing generates 50% more sales-ready leads at a 33% lower cost. Nurtured leads also tend to make larger purchases than non-nurtured leads.

Marketing automation platforms are also often commonly used to manage social media campaigns, create landing pages, and conduct ongoing A/B testing.

B2B vs. B2C marketing automation

Businesses of all sizes can potentially benefit from marketing automation, but whether a business has a B2B or B2C model is going to have an impact on the type of messaging used in their campaigns. While both types of businesses would have the main goals of improving conversions and revenue, there are differences in how they’ll reach that goal.

B2B sales

B2B sales tend to have longer conversion cycles than B2C sales and often involve products or services that require a more long-term commitment. (Of course, there are some exceptions.) Because of this, B2B messaging has a greater emphasis on long-form content like whitepapers, case studies, and e-books. When major purchases are being considered for a business, multiple people are often involved in the decision-making process, so it’s not always a matter of winning over one person like it is with B2C sales. It’s important for a business with something to sell to establish itself as an authority in its industry — offering in-depth informational content is a great way to do that.

B2C sales

Since B2C sales move at a faster pace, the content used in their messaging is typically much simpler. For example, Sephora customers aren’t going to be interested in long case studies about a product, but they might appreciate a 30-second video demonstrating how to use a product instead. For B2C companies, the focus tends to be more on brand building and giving customers reasons to come back, so their messaging typically includes things like abandoned shopping cart reminders, personalized product recommendations, and offers tailored to specific types of customers.

Key concepts

Although many different aspects of marketing and business development come together in marketing automation, the whole process is ultimately driven by a few core concepts.

Conversion funnels

A conversion funnel is the process a person takes toward becoming a customer. Now that it’s so easy to find product reviews and shop around, a lot of people don’t just buy things from the first place they see it for sale. Marketing automation is a way to keep people engaged so they’re more likely to convert.

The conversion funnel can be broken down into a few basic stages:

  • Awareness: The customer initially becomes aware of a company, product, or service. It’s too soon for a person to want to make any decisions, but a business has made its way onto their radar.
  • Interest: Not everyone who is aware of a business/product/service is going to have a need for it. At this point, those who are interested will start becoming more engaged by doing things like requesting a quote, signing up for a free trial, following a business on social media, looking for reviews, or reading blog posts and other content on a company’s site.
  • Consideration: By now, a person is familiar enough with a business to know they like what’s being offered. They’re not quite ready to make a decision, but a business is in the running.
  • Action: This is the point where a person decides to convert. You’ve won them over and they’re ready to do business with you.

Ideally, after a person converts once, they’ll be so happy with their decision that they become a repeat customer. But as people move through the conversion funnel, whether they do it once or several times, some of them will always drop out at each level. On average, only 1–5% of people who enter a conversion funnel actually convert. When people drop out, it’s known as churn, and while some churn is inevitable, marketing automation can help reduce it. By understanding the needs and interests of people at each stage of the conversion funnel, you’re better able to keep them engaged by providing them with the type of content they’re most interested in.
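To see how stage-by-stage drop-off compounds, here is a toy calculation. The pass-through percentages are invented numbers, not statistics from this article; they simply show how modest churn at every stage yields a low single-digit overall conversion rate.

```python
# Invented pass-through rates for each funnel stage (percent of people
# who continue past that stage). Integer math keeps the arithmetic exact.
entered = 10_000
stage_pass_pct = {
    "awareness": 40,
    "interest": 30,
    "consideration": 50,
    "action": 35,
}

remaining = entered
for stage, pct in stage_pass_pct.items():
    remaining = remaining * pct // 100  # churn removes the rest at each stage

overall_conversion = remaining / entered  # 210 / 10,000 = 2.1%
```

Even though no single stage loses more than 70% of its audience, the compounded result lands at about 2%, squarely in the 1–5% range typical of conversion funnels.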

For example, let’s say a company installs vinyl windows and they advertise heavily in the local media. At any given time, a large percentage of the thousands of people who see their ads won’t take any action after seeing one because they either don’t need new windows or because they live in a rental property. No amount of additional messaging will win those people over. But since replacing windows can be very expensive, the people who actually do need them typically spend time doing research to make sure they choose the right type of window and get the best price. If this company were to send additional information about vinyl windows to the people who contact them to get an estimate, they may be able to convince more people to convert.

Feedback loops and metrics

One of the basic laws of physics is that for every action, there’s an equal and opposite reaction. A very similar concept also applies in the world of marketing automation, and it’s known as a feedback loop. When you send a message to a person, the recipient will have some kind of reaction to it, even if that reaction is to do nothing at all. That reaction is part of your feedback loop and you’ll need to pay attention to your metrics to get an idea of what those reactions are.

Feedback loops and metrics are a reflection of how effective your marketing automation strategy is. Whether a person converts, clicks through to your site, ignores the message, flags it as spam, or unsubscribes from your list, that tells you something about how the recipient felt about your message.

When you look at your metrics, you’ll ideally want to see high open rates, clickthrough rates, and maybe even some forwards, since those are signs your content is engaging, valuable, and not annoying to your contacts. Some unsubscribes and abuse reports are inevitable, especially since a lot of people get confused about the difference between the two. But don’t ignore those metrics just because they’re not what you want to see. An increasing number of either could be a sign your strategy is too aggressive and needs to be reworked.
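These metrics are just simple ratios over your send counts. The numbers below are hypothetical, but they show how each rate is derived and why delivered messages (not raw sends) are usually the denominator:

```python
# Hypothetical campaign numbers, for illustration only.
sent = 5000
delivered = 4800       # sent minus bounces
opened = 1200
clicked = 360
unsubscribed = 24
abuse_reports = 5

# Rates are typically computed against delivered messages.
open_rate = opened / delivered               # 25.0%
click_through_rate = clicked / delivered     # 7.5%
unsubscribe_rate = unsubscribed / delivered  # 0.5%
abuse_rate = abuse_reports / delivered

print(f"Open rate: {open_rate:.1%}")
print(f"CTR: {click_through_rate:.1%}")
# A rising unsubscribe or abuse-report rate over successive sends is the
# signal that a strategy is too aggressive and needs reworking.
print(f"Unsubscribe rate: {unsubscribe_rate:.2%}")
```

Tracking these rates over time, rather than as one-off snapshots, is what turns them into a feedback loop.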

User flow

While conversion funnels refer to the process taken toward converting, user flow refers to the series of pages a person visits before taking an action.

When you have traffic coming to your site from different sources like PPC ads, social media, and email messages, you want to direct users to pages that will make it easy for them to take the action you want them to take, whether it’s buying something, signing up for a free trial, or joining an email list.

You also have to keep in mind that people often have different needs depending on how they arrive at a page, so you’ll want to do your best to make sure people are being taken to a page that would appeal to them. For example, if a person is directly taken to a product page after doing a search for a long-tail keyword, that’s fine since they’re clearly looking for something specific and are more likely to be ready to convert. But someone who clicks on a PPC ad and fills out a form on a landing page is probably going to want more information before they make any decisions, so it’s not time to give them a hard sell.

Workflows

Workflows are where the automation part of marketing automation comes into play. Your workflow is the series of triggers you create to deliver messages. Creating a workflow involves taking yourself through the entire process and asking yourself, “If this happens, what should happen next?”

Workflows can consist of many different triggers, such as how long it’s been since a person has taken an action, interactions you’ve had with a person, or actions they’ve previously taken on your site. Some types of workflows commonly used by retailers include sending discount codes to customers who haven’t made any purchases in a while, reminding people to review products after they’ve had some time to enjoy their purchase, and sending reminders to people who have recently added items to their cart without actually making a purchase.
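The “if this happens, what should happen next?” logic of a workflow maps naturally onto simple conditional rules. Here’s a minimal sketch covering the three retail triggers above; the contact fields and message names are made up for illustration and don’t correspond to any particular platform’s API:

```python
from datetime import date, timedelta

def next_message(contact, today):
    """Walk the trigger rules in priority order and return the next
    message to send, or None if no trigger fires. Fields like
    'cart_items' and 'last_purchase' are hypothetical."""
    # Trigger 1: items sitting in the cart with no purchase yet.
    if contact.get("cart_items") and not contact.get("purchased"):
        return "cart_reminder"
    last = contact.get("last_purchase")
    # Trigger 2: no purchase in a long while -> send a discount code.
    if last and today - last > timedelta(days=90):
        return "discount_code"
    # Trigger 3: purchase was a couple weeks ago -> ask for a review.
    if last and today - last > timedelta(days=14):
        return "review_request"
    return None

today = date(2019, 8, 1)
lapsed_customer = {"last_purchase": date(2019, 3, 1)}
print(next_message(lapsed_customer, today))  # discount_code
```

Real marketing automation platforms express these rules through a visual workflow builder rather than code, but the underlying decision tree is the same.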

Important steps in creating a marketing automation strategy

1. Define your goals

This might seem like an obvious point to make, but before you do anything else, you need to decide exactly what you want marketing automation to help you achieve so you can plan your strategy accordingly. Are you trying to generate more leads? Working to build up business from return customers? Trying to boost sales during an off season? Each of those goals is going to require a different strategy, so it’s important to understand exactly what your main objectives are.

2. Identify who to target

Of course, it’s important to understand the needs of your customers at all points of the conversion process. But depending on what your main goals are, your time and energy may be best spent focusing on people who are at a specific point of the process. For instance, if you’re not really having a problem with lead generation but you want more people to convert, your time and energy would be better spent focusing on the middle and lower parts of the conversion funnel.

3. Map user flows

By using marketing automation, you’re trying to get people to take some kind of action. Mapping user flow is a way to visualize the steps people need to go through to be able to take that action.

Depending on the way a person arrives at your site, some people might need more information than others before they’re willing to take that action. You don’t want to make people go through more steps than are necessary to do something, but you don’t want to hit people with a hard sell too soon, either. By using state diagrams to map user flows, as recommended by Peep Laja of ConversionXL, you’ll see exactly how people are arriving at a page and how many steps it takes for them to take the desired action.

4. Segment and rate your leads

It’s important to remember that not all leads are necessarily equal in terms of quality. Your database of contacts is inevitably going to be a mix of people who are on the verge of buying, people who are still researching their options, and people who probably won’t convert, so it’s not possible to create broad messages that will somehow appeal to all of those types of people. Rating your leads helps you figure out exactly who needs further nurturing and who is ready to be handed over to a sales team.

The interactions a person has had with your content and the actions they’ve taken on your site can be a reflection of how ready they are to convert. A person who has viewed a pricing page is most likely going to be closer to buying than someone who has simply read a blog post on a site. A person who has visited a site multiple times over the course of a few weeks is clearly more interested than someone who has only visited once or twice in the past year. Marketing automation software lets you assign values to certain actions and interactions so that it can calculate a score for that lead.
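Lead scoring of this kind boils down to assigning point values to actions and summing them per contact. A minimal sketch, with point values that are purely hypothetical (in practice you’d tune them to your own funnel inside your automation software):

```python
# Hypothetical point values; actions closer to purchase score higher.
ACTION_POINTS = {
    "read_blog_post": 1,
    "downloaded_whitepaper": 5,
    "viewed_pricing_page": 10,
    "started_free_trial": 20,
}

def score_lead(actions):
    """Sum the points for every recorded action; unknown actions score 0."""
    return sum(ACTION_POINTS.get(action, 0) for action in actions)

researcher = ["read_blog_post", "read_blog_post", "downloaded_whitepaper"]
hot_lead = ["downloaded_whitepaper", "viewed_pricing_page", "started_free_trial"]

print(score_lead(researcher))  # 7  -> keep nurturing
print(score_lead(hot_lead))    # 35 -> hand off to sales
```

Leads above a threshold you choose get handed to the sales team; everyone else stays in the nurturing track.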

Marketing automation also lets you segment your database of contacts to a very high degree so you can deliver messages to very specific types of people. For example, when working with a B2B business, a marketer might want to target messages to people with certain job titles who work at businesses of a certain size. With B2C sales, a retailer might want to segment their lists to give special offers to people who have spent a certain amount of money with the company or send product recommendations to people who live in certain locations.
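Segmentation is essentially filtering your contact database on those attributes. A quick sketch of the B2B example above, using made-up contact fields:

```python
# Hypothetical contact records with the fields mentioned above.
contacts = [
    {"email": "a@example.com", "job_title": "CTO", "company_size": 250},
    {"email": "b@example.com", "job_title": "Designer", "company_size": 12},
    {"email": "c@example.com", "job_title": "VP Engineering", "company_size": 800},
]

# B2B segment: decision-maker titles at companies with 100+ employees.
DECISION_MAKER_TITLES = ("CTO", "CEO", "VP", "Director")
segment = [
    c for c in contacts
    if c["company_size"] >= 100
    and any(title in c["job_title"] for title in DECISION_MAKER_TITLES)
]

print([c["email"] for c in segment])  # ['a@example.com', 'c@example.com']
```

A B2C segment would filter on different fields, such as lifetime spend or location, but the mechanics are the same.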

Building and maintaining a contact database

There’s no easy way around it: Building a high-quality database of contacts takes time. Marketing automation should come into play once you already have a fairly sizeable database of contacts to work with, but you will need to keep adding new names to that database on a regular basis.

One of the most effective ways to build a database of highly qualified contacts is by creating informative content. Blog content is great for providing high-level information, and it helps businesses build trust and establish themselves as an authority in their field. On the other hand, things like whitepapers and e-books are best for attracting people who want more in-depth information on a subject and are more inclined to be interested in what a business is offering, which is why those types of content are usually gated. With gated content, a person’s contact information is essentially the price of accessing the content.

For businesses that offer a service, free trials are an excellent way to get contact information since the people who sign up for them are obviously interested in what’s being offered.

Just say “no” to purchased lists

Whatever you do, don’t be tempted to buy a list of contacts. Purchased lists may give you a quick boost up front, but they’ll work against you in the long run.

First of all, high-quality lists of contacts aren’t for sale. The kinds of lists you can buy or rent are typically full of invalid and abandoned email addresses. Even if a person actually does see your message, they likely either won’t be interested or will be skeptical about doing business with a company they’re not familiar with.

If you start sending messages to a list full of contacts of questionable quality, you’ll most likely end up with high bounce rates, lots of unsubscribes, low open rates, and a whole lot of abuse reports. Email service providers pay attention to those sorts of metrics, and if they start seeing them on a regular basis, they’ll view you as a spammer, which will only make it harder for you to get your message to more qualified leads once you have them.

Best practices for marketing automation messaging

Get to the point

Make your point quickly and make it clear. We all have a limited amount of time each day and one thing people have little patience for is long messages. People just want to know what’s in it for them. How would your product or service solve their problem? What’s unique about what you’re offering?

Keep it active

By implementing marketing automation strategies, you’re trying to keep people engaged. Therefore, your messages should be written in an active tone and encourage recipients to take some kind of action, whether it’s downloading a whitepaper, reading a blog post, watching a video, or making a purchase.

Remember where people are in the process

Don’t forget that some types of content will be more appealing than others depending on where a person is in the conversion funnel. People who are just starting to learn more about a company or product are not going to be happy if they get hit with a hard sell, but highly promotional content could potentially be effective on someone further down in the conversion funnel.

Avoid looking spammy

When used correctly, marketing automation is not spam — we’ll talk more about why that is in just a little bit. But don’t give your contacts the wrong impression. Certain things will always look spammy, such as typing in all capital letters, overusing the color red, and using too many links in the body of the message. If you’re going to use symbols in your subject lines or messages, don’t use too many of them. Avoid using words known to trigger spam filters.

If you’re unfamiliar with the CAN-SPAM Act, take some time to learn about what it means for your campaign. Subject lines need to be accurate and not misleading. Companies that send marketing messages through email need to provide a physical mailing address. (PO box addresses are allowed.) You also need to provide an unsubscribe option in all messages and make sure all opt-out requests are honored as soon as possible.

Hone your list

Bigger isn’t always better when it comes to contact lists. One of the key goals of marketing automation is to get your message to precisely the right people. Pay close attention to your metrics so you know who your most qualified leads are, and get rid of the ones who aren’t responding anymore. You’re better off with a smaller list of highly qualified leads than with a large list of contacts who don’t care. If it’s been months since a person last opened a message from you, just remove them from your list and focus on the leads who are more interested.
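Pruning can be as simple as filtering on the date of each contact’s last open. A sketch with a six-month cutoff; both the cutoff and the contact fields are illustrative choices, not a standard:

```python
from datetime import date, timedelta

# Illustrative cutoff: drop anyone who hasn't opened a message in ~6 months.
CUTOFF = timedelta(days=180)

contacts = [
    {"email": "active@example.com", "last_open": date(2019, 7, 20)},
    {"email": "cold@example.com", "last_open": date(2018, 11, 2)},
]

today = date(2019, 8, 1)
kept = [c for c in contacts if today - c["last_open"] <= CUTOFF]

print([c["email"] for c in kept])  # ['active@example.com']
```

Running a pass like this on a schedule keeps deliverability metrics healthy and keeps your reporting focused on leads who might actually convert.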

Misconceptions about marketing automation

It’s impersonal

When done correctly, marketing automation can and should feel personal. In all fairness, it’s easy to understand how people get the wrong impression here — after all, the word “automation” is usually associated with things like computerization and robots. But for a marketing automation strategy to be successful, there needs to be a human touch behind it. Marketing automation simply makes it easier for you to get your message out there. It’s up to you to come up with content that will appeal to people and to create the strategy for getting it out there.

It’s spam

We all know how obnoxious spam is — marketers included. Marketers also understand how ineffective it is. While spam is an unsolicited message promoting something irrelevant to the vast majority of its recipients, the goal of marketing automation is to deliver highly relevant messages to people who have clearly expressed an interest in them.

Unlike spam, marketing automation also frequently involves non-promotional content. Marketing automation messages absolutely can be promotional in nature, but ultimately, the goal is to foster positive relationships by offering something of value — and that doesn’t always involve a hard sell.

You can set it and forget it

This is another case where the word “automation” can give the wrong impression. When you think of something being automated, it’s easy to think you can just set it up, sit back, and let it run on its own. In reality, marketing automation is anything but a hands-off process. Marketing automation needs constant attention and refinement to make sure it’s as successful as possible. Many people use the A/B testing functionality of marketing automation software to run ongoing tests to see which sorts of content, subject lines, design variations, and CTAs people best respond to.

It’s just email marketing

Email is a significant part of marketing automation, but marketing automation isn’t just a new name for email marketing.

First of all, the types of messages involved in basic email marketing and marketing automation are distinctly different. When most people think of email marketing, they’re thinking of broad email blasts that go out to an entire list of contacts, but that’s exactly what you’re trying to avoid with marketing automation. Marketing automation messages are much more finely tuned to a user’s interests and needs. Although basic email marketing programs do allow for some list segmentation, marketing automation programs allow for much more granular, hyper-segmented targeting.

Basic email marketing and marketing automation programs also offer different functionality and insights. While regular email marketing platforms give some basic information about how people interact with your message, marketing automation programs offer more measurable, in-depth insights.

While marketing automation offers a lot of benefits, it’s not going to be an ideal solution for all businesses. For some types of businesses, basic email marketing is all they really need. Studies have shown that marketers often feel like marketing automation software isn’t worth the investment, but many marketers also fail to use it to its full potential, or businesses try using it before they have a large enough database of contacts to truly make it worthwhile. Before adopting marketing automation, the key thing to consider is whether you have the time and resources to train your team on the software so they can use it to its full potential.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog
