Tag Archive | "Pages"

Diagnosing Why a Site’s Set of Pages May Be Ranking Poorly – Whiteboard Friday

Posted by randfish

Your rankings have dropped and you don’t know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It’s an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process to follow to diagnose what went wrong to cause your rankings drop, why it happened, and how to start the recovery process.

Diagnosing why a site's pages may be ranking poorly

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about diagnosing a site and specifically a section of a site’s pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we’ve got a fairly extensive process here, so let’s get started.

Step 1: Uncover the problem

First off, our first step is uncovering the problem, or finding out whether there actually is a problem. This matters most on larger websites. If we’re talking about a site that’s 20 or 30 or even a couple hundred pages, this is not a big issue. But many websites that SEOs are working on these days are thousands, tens of thousands, or hundreds of thousands of pages. So what I like to urge folks to do is to

A. Treat different site sections as unique segments for investigation. You should look at them individually.

A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I’m going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let’s say I have a website that’s dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I’m ranking for fewer of them, and it’s my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.

B. Check traffic from search over time.

So I go to my Google Analytics or whatever analytics you’re using, and you might see something like, okay, I’m going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that’s a big traffic drop. We fell off a cliff there for these particular pages.

This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It’s going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it’s not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
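
If your analytics platform lets you export landing-page traffic, a short script can surface that section-level cliff even when the sitewide trend looks healthy. Here’s a minimal sketch, assuming a hypothetical CSV export with date, landing_page, and organic_sessions columns; adjust the names to match whatever your export actually contains.

```python
# Minimal sketch: isolate organic traffic for the /cities/ section from an
# exported CSV. Column names (date, landing_page, organic_sessions) are
# assumptions about the export, not a fixed format.
import pandas as pd

df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])

# Keep only the /cities/ section and total its sessions by week.
cities = df[df["landing_page"].str.startswith("/cities/")]
weekly = cities.groupby(pd.Grouper(key="date", freq="W"))["organic_sessions"].sum()

# Compare each week to the trailing 8-week average to spot a cliff.
baseline = weekly.rolling(8).mean().shift(1)
change = (weekly / baseline - 1).round(2)
print(change.tail(12))  # large negative values flag the weeks where traffic fell off
```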

From there, I’m going to next urge you to use Google Trends. Why? Why would I go to Google Trends? Because what I want you to do is I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the course of time where you have lost traffic, it’s almost certainly something you’ve done, not something searchers are doing. But if you see that traffic has declined, for example, maybe you were ranking really well for population data from 2015. It turns out people are now looking for population data for 2016 or ’17 or ’18. Maybe that is part of the problem, that search demand has fallen and your curve matches that.
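
If you’d rather pull that demand curve programmatically than eyeball it, pytrends, an unofficial third-party wrapper around Google Trends, can fetch it. This is a rough sketch; the library is not a supported Google API and its interface may change, and the keyword is just the whiteboard example.

```python
# Rough sketch using pytrends (pip install pytrends), an unofficial Google Trends
# wrapper. Keyword and timeframe are illustrative.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(["denver population growth"], timeframe="today 5-y")
demand = pytrends.interest_over_time()

# If interest is flat or rising across the window where your traffic fell,
# the problem is likely something you did, not a drop in search demand.
print(demand["denver population growth"].tail(26))
```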

C. Perform some diagnostic queries, or use your rank tracking data if you have it, on these types of things. (A small script for assembling these checks follows the list below.)

This is one of the reasons I like to rank track for even these types of queries that don’t get a lot of traffic.

1. Target keywords. In this case, it might be “Denver population growth,” maybe that’s one of your keywords. You would see, “Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?”

2. Check brand name plus target keyword. So, in this case, it would be my site plus the above here plus “Denver population growth,” so My Site or MySite.com Denver population growth. If you’re not ranking for that, that’s usually an indication of a more serious problem, potentially a penalty or some type of dampening that’s happening around your brand name or around your website.

3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter, only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page and search for it, in order, in Google, not in quotes. I do not want to use quotes here, and I want to see how it performs. This might be several lines of text.

4. Look for a 10 to 20-word text string with quotes. So those same lines of text, but searched in Google in quotes. If I’m not ranking for the string without quotes, but I am ranking for it in quotes, I might surmise this is probably not duplicate content. It’s probably something to do with my content quality, or maybe my link profile, or Google has penalized or dampened me in some way.

5. site: urlstring/ So I would search for “site:MySite.com/cities/Denver.” I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it’s been a month. I wonder why they haven’t come back. Maybe there’s some sort of crawl issue, robots.txt issue, meta robots issue, something. I’m preventing Google from potentially getting there. Or maybe they can’t get there at all, and this results in zero results. That means Google hasn’t even indexed the page. Now we have another type of problem.
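
If you want to run these five checks across many pages, it’s easy to generate the query URLs programmatically and paste them into a browser or feed them to your rank tracker. A minimal sketch follows; the site, keyword, and text snippet are placeholders in the spirit of the whiteboard example, and you’d want to avoid scraping Google results directly, which is against its terms.

```python
# Build the five diagnostic searches described above for a single page.
# Site, keyword, and snippet are illustrative placeholders.
from urllib.parse import quote_plus

site = "mysite.com"
page = f"https://{site}/cities/denver"
keyword = "Denver population growth"
snippet = "population growth in Denver has outpaced the rest of the metro area"

queries = {
    "target keyword": keyword,
    "brand + keyword": f"{site} {keyword}",
    "text string, no quotes": snippet,
    "text string, in quotes": f'"{snippet}"',
    "indexation check": f"site:{page}",
}

for label, q in queries.items():
    print(f"{label:25} https://www.google.com/search?q={quote_plus(q)}")
```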

D. Check your tools.

1. Google Search Console. I would start there, especially in the site issues section.

2. Check your rank tracker or whatever tool you’re using, whether that’s Moz or something else.

3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you’ve run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.

4. Site uptime. So I might check Pingdom or other things that alert me to, “Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen,” those types of things.
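
For a quick, ad-hoc availability check (as opposed to a proper monitoring service like Pingdom), something like this can run on a schedule against a sample of pages from the affected section. A minimal sketch assuming the requests library; the URLs are placeholders.

```python
# Ad-hoc uptime/status check for a few sample pages in the affected section.
# In practice you'd pull the sample from your sitemap; these URLs are placeholders.
import requests

sample_urls = [
    "https://www.mysite.com/cities/denver",
    "https://www.mysite.com/cities/milwaukee",
]

for url in sample_urls:
    try:
        resp = requests.get(url, timeout=10)
        print(url, resp.status_code, f"{resp.elapsed.total_seconds():.2f}s")
    except requests.RequestException as exc:
        print(url, "FAILED", exc)
```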

Step 2: Offer hypothesis for falling rankings/traffic

Okay, you’ve done your diagnostics. Now it’s time to offer some hypotheses. Now that we understand which problem we might have, we want to understand what could be causing that problem. There are basically two situations you can have: either rankings have stayed stable or gone up but traffic has fallen, or both rankings and traffic have fallen.

A. If rankings are up, but traffic is down…

In those cases, these are the five things that are most typically to blame.

1. New SERP features. There’s a bunch of featured snippets that have entered the population growth for cities search results, and so now number one is not what number one used to be. If you don’t get that featured snippet, you’re losing out to one of your competitors.

2. Lower search demand. Like we talked about in Google Trends. I’m looking at search demand, and there are just not as many people searching as there used to be.

3. Brand or reputation issues. I’m ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They’re choosing someone else actively because of reputation issues.

4. Snippet problems. I’m ranking in the same place I used to be, but I’m no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.

5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they’ve clicked in the past or where they’re located. Location is often a big cause for this.

So for many SEOs for many years, “SEO consultant” resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now “SEO consultant” results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.

B. If rankings and traffic are down…

If you’re seeing that rankings have fallen and traffic has fallen in conjunction, there’s a bunch of other things that are probably going on that are not necessarily these things. A few of these could be responsible still, like snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it’s things like this:

1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you’re providing just isn’t good enough.

3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it’s just not serving the new searcher intent.

4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you’re not solving the searcher’s query as well. Your user interface, your UX is not as good. Your keyword targeting isn’t as good as theirs. Your content quality and the unique value you provide isn’t as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.

5. Technical issues. So if I saw from over here that the crawl was the problem, I wasn’t getting indexed, or Google hasn’t updated my pages in a long time, I might look into accessibility things, maybe speed, maybe I’m having problems like letting Googlebot in, HTTPS problems, or indexable content, maybe Google can’t see the content on my page anymore because I made some change in the technology of how it’s displayed, or crawlability, internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff.
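
A few of those crawlability checks are easy to script. Here’s a minimal sketch, assuming the requests library and a placeholder URL, that asks two questions: does robots.txt allow Googlebot to fetch the page, and does the page carry a blocking meta robots tag? It’s a spot check, not a full crawl audit.

```python
# Spot-check indexability: robots.txt permission plus a naive meta robots scan.
# URL is a placeholder; the regex assumes name="robots" appears before content=.
import re
import requests
from urllib.robotparser import RobotFileParser

url = "https://www.mysite.com/cities/denver"

robots = RobotFileParser("https://www.mysite.com/robots.txt")
robots.read()
print("robots.txt allows Googlebot:", robots.can_fetch("Googlebot", url))

resp = requests.get(url, timeout=10)
print("status code:", resp.status_code)

meta = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)', resp.text, re.I
)
print("meta robots:", meta.group(1) if meta else "none found")
```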

Maybe at the server level, someone on the tech ops team of my website decided, “Oh, there’s this really problematic bot coming from Mountain View that’s costing us a bunch of bandwidth. Let’s block bots from Mountain View.” No, don’t do that. Bad. Those kinds of technical issues can happen.
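
If the suspicion is that your server or firewall is blocking Google’s crawler, Google’s long-standing advice is to verify a visitor claiming to be Googlebot with a reverse DNS lookup followed by a forward lookup. A minimal sketch; the IP below is just an example from a range commonly seen in logs, not a guaranteed Googlebot address.

```python
# Verify whether an IP claiming to be Googlebot really is, using the
# reverse-then-forward DNS check. Example IP is illustrative.
import socket

def is_real_googlebot(ip: str) -> bool:
    try:
        host = socket.gethostbyaddr(ip)[0]              # reverse DNS
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return ip in socket.gethostbyname_ex(host)[2]   # forward DNS must match
    except socket.gaierror:
        return False

print(is_real_googlebot("66.249.66.1"))
```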

6. Spam and penalties. We’ll talk a little bit more about how to diagnose those in a second.

7. CTR, engagement, or pogo-sticking issues. There could be click-through rate issues or engagement issues, meaning pogo sticking, like people are coming to your site, but they are clicking back because they weren’t satisfied by your results, maybe because their expectations have changed or market issues have changed.

Step 3: Make fixes and observe results

All right. Next and last in this process, what we’re going to do is make some fixes and observe the results. Hopefully, we’ve been able to correctly diagnose and form some wise hypotheses about what’s going wrong, and now we’re going to try and resolve them.

A. On-page and technical issues should resolve after a new crawl + index.

So on-page and technical issues, if we’re fixing those, they should usually resolve pretty fast, especially on small sections of sites. As soon as Google has crawled and indexed the page, you should generally see performance improve. But this can take a few weeks if we’re talking about a large section on a site, many thousands of pages, because Google has to crawl and index all of them to get a new sense that things are fixed before traffic comes back. Since that traffic is long tail, spread across many different pages, you’re not going to see an instant gain; it will rise more slowly.

B. Link issues and spam penalty problems can take months to show results.

Look, if you have crappier links or a link profile that’s not as good as your competitors’, growing that can take months or even years to fix. Penalty problems and spam problems are the same story; Google can sometimes take a long time. You’ve seen a lot of spam experts on Twitter saying, “Oh, well, all my clients who had issues over the last nine months suddenly are ranking better today,” because Google made some fix in their latest index rollout or their algorithm changed, and it’s sort of, “Okay, well, we’ll reward the people for all the fixes that they’ve made.” Sometimes that happens in batches that take months.

C. Fixing a small number of pages in a section that’s performing poorly might not show results very quickly.

For example, let’s say you go and you fix /cities/Milwaukee. You determine from your diagnostics that the problem is a content quality issue. So you go and you update these pages. They have new content. It serves the searchers much better, doing a much better job. You’ve tested it. People really love it. You fixed two cities, Milwaukee and Denver, to test it out. But you’ve left 5,000 other cities pages untouched.

Sometimes Google will sort of be like, “No, you know what? We still think your cities pages, as a whole, don’t do a good job solving this query. So even though these two that you’ve updated do a better job, we’re not necessarily going to rank them, because we sort of think of your site as this whole section and we grade it as a section or apply some grades as a section.” That is a real thing that we’ve observed happening in Google’s results.

Because of this, one of the things that I would urge you to do is if you’re seeing good results from the people you’re testing it with and you’re pretty confident, I would roll out the changes to a significant subset, 30%, 50%, 70% of the pages rather than doing only a tiny, tiny sample.
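
If you do roll the fix out to a subset rather than the whole section, picking that subset at random (rather than cherry-picking your strongest cities) gives you a cleaner read on whether the change itself is helping. A small sketch with placeholder URLs:

```python
# Pick a random half of the section's pages to receive the content update,
# so the updated and untouched groups are comparable. Slugs are illustrative.
import random

city_pages = [f"/cities/{slug}" for slug in
              ["denver", "milwaukee", "portland", "san-diego", "seattle", "austin"]]

random.seed(42)  # fixed seed so the split is reproducible
rollout = random.sample(city_pages, k=len(city_pages) // 2)
holdout = [p for p in city_pages if p not in rollout]

print("update now:", sorted(rollout))
print("hold back: ", sorted(holdout))
```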

D. Sometimes when you encounter these issues, a remove and replace strategy works better than simply upgrading old URLs.

So if Google has decided /cities, your /cities section is just awful, has all sorts of problems, not performing well on a bunch of different vectors, you might take your /cities section and actually 301 redirect them to a new URL, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that location section, such that Google now goes, “Ah, we have something new to judge. Let’s see how these location pages on MySite.com perform versus the old cities pages.”
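
If you go the remove-and-replace route, it’s worth spot-checking that every old /cities/ URL really does return a single 301 to its /locations/ equivalent, rather than a chain or a 302. A minimal sketch using requests; the domain and slugs are the whiteboard examples.

```python
# Spot-check that old /cities/ URLs 301 directly to their /locations/ equivalents.
# Domain and slugs are illustrative placeholders.
import requests

base = "https://www.mysite.com"
for slug in ["denver", "milwaukee"]:
    old = f"{base}/cities/{slug}"
    expected = f"{base}/locations/{slug}"
    resp = requests.get(old, allow_redirects=False, timeout=10)
    ok = resp.status_code == 301 and resp.headers.get("Location") == expected
    print(old, resp.status_code, resp.headers.get("Location"), "OK" if ok else "CHECK")
```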

So I know we’ve covered a ton today and there are a lot of diagnostic issues that we haven’t necessarily dug deep into, but I hope this can help you if you’re encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

PSA: Google Doesn’t Use Content On Non-Canonical Pages

Google’s John Mueller wrote on Twitter “remember that the content on ‘non-canonical’ versions generally doesn’t get used.” Meaning, if you point page A to page B using a 301 or canonical tag…


Search Engine Roundtable

FAQs on new Google Speed Update: AMP pages, Search Console notifications & desktop only pages

A page with AMP but a slow canonical URL will not be impacted by this update, assuming the AMP URL is not slow, Google told us.

The post FAQs on new Google Speed Update: AMP pages, Search Console notifications & desktop only pages appeared first on Search Engine Land.




Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Designing a Page’s Content Flow to Maximize SEO Opportunity – Whiteboard Friday

Posted by randfish

Controlling and improving the flow of your on-site content can actually help your SEO. What’s the best way to capitalize on the opportunity present in your page design? Rand covers the questions you need to ask (and answer) and the goals you should strive for in today’s Whiteboard Friday.

Designing a page's content flow to maximize SEO opportunity

Click on the whiteboard image above to open a high-resolution version in a new tab!


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about designing a page’s content flow to help with your SEO.

Now, unfortunately, somehow in the world of SEO tactics, this one has gotten left by the wayside. I think a lot of people in the SEO world are investing in things like content and solving searchers’ problems and getting to the bottom of searcher intent. But unfortunately, the page design and the flow of the elements, the UI elements, the content elements that sit in a page is discarded or left aside. That’s unfortunate because it can actually make a huge difference to your SEO.

Q: What needs to go on this page, in what order, with what placement?

So if we’re asking ourselves like, “Well, what’s the question here?” Well, it’s what needs to go on this page. I’m trying to rank for “faster home Wi-Fi.” Right now, Lifehacker and a bunch of other people are ranking in these results. It gets a ton of searches. I can drive a lot of revenue for my business if I can rank there. But what needs to go on this page in what order with what placement in order for me to perform the best that I possibly can? It turns out that sometimes great content gets buried in a poor page design and poor page flow. But if we want to answer this question, we actually have to ask some other ones. We need answers to at least these three:

A. What is the searcher in this case trying to accomplish?

When they enter “faster home Wi-Fi,” what’s the task that they want to get done?

B. Are there multiple intents behind this query, and which ones are most popular?

What’s the popularity of those intents in what order? We need to know that so that we can design our flow around the most common ones first and the secondary and tertiary ones next.

C. What’s the business goal of ranking? What are we trying to accomplish?

That’s always going to have to be balanced out with what is the searcher trying to accomplish. Otherwise, in a lot of cases, there’s no point in ranking at all. If we can’t get our goals met, we should just rank for something else where we can.

Let’s assume we’ve got some answers:

Let’s assume that, in this case, we have some good answers to these questions so we can proceed. So pretty simple. If I search for “faster home Wi-Fi,” what I want is usually it’s going to be…

A. Faster download speed at home.

That’s what the searcher is trying to accomplish. But there are multiple intents behind this. Sometimes the searcher is looking to do that…

B1. With their current ISP and their current equipment.

They want to know things they can optimize that don’t cause them to spend money. Can they place their router in different places? Can they change out a cable? Do they need to put it in a different room? Do they need to move their computer? Is the problem something else that’s interfering with their Wi-Fi in their home that they need to turn off? Those kinds of issues.

B2. With a new ISP.

Or can they get a new ISP? They might be looking for an ISP that can provide them with faster home internet in their area, and they want to know what’s available, which is a very different intent than the first one.

B3. With current ISP but new equipment.

Maybe they want to keep their ISP, but they are willing to upgrade to new equipment. So they’re asking, “What equipment could I buy that would make my current ISP faster?” In many parts of the United States, sadly, there’s only one ISP that can provide service in a given area, so they can’t change ISPs, but they can change out their equipment.

C. Affiliate revenue with product referrals.

Let’s assume that (C) is we know that what we’re trying to accomplish is affiliate revenue from product referrals. So our business is basically we’re going to send people to new routers or the Google Mesh Network home device, and we get affiliate revenue by passing folks off to those products and recommending them.

Now we can design a content flow.

Okay, fair enough. We now have enough to be able to take care of this design flow. The design flow can involve lots of things. There are a lot of things that could live on a page, everything from navigation to headline to the lead-in copy or the header image or body content, graphics, reference links, the footer, a sidebar potentially.

The elements that go in here are not actually what we’re talking about today. We can have that conversation too. I want a headline that’s going to tell people that I serve all of these different intents. I want to have a lead-in that has a potential to be the featured snippet in there. I want a header image that can rank in image results and be in the featured snippet panel. I’m going to want body content that serves all of these in the order that’s most popular. I want graphics and visuals that suggest to people that I’ve done my research and I can provably show that the results that you get with this different equipment or this different ISP will be relevant to them.

But really, what we’re talking about here is the flow that matters. The content itself, the problem is that it gets buried. What I see many times is folks will take a powerful visual or a powerful piece of content that’s solving the searcher’s query and they’ll put it in a place on the page where it’s hard to access or hard to find. So even though they’ve actually got great content, it is buried by the page’s design.

5 big goals that matter.

The goals that matter here and the ones that you should be optimizing for when you’re thinking about the design of this flow are:

1. How do I solve the searcher’s task quickly and enjoyably?

So that’s about user experience as well as the UI. I know that, for many people, they are going to want to see and, in fact, the result that’s ranking up here on the top is Lifehacker’s top 10 list for how to get your home Wi-Fi faster. They include things like upgrading your ISP, and here’s a tool to see what’s available in your area. They include maybe you need a better router, and here are the best ones. Maybe you need a different network or something that expands your network in your home, and here’s a link out to those. So they’re serving that purpose up front, up top.

2. Serve these multiple intents in the order of demand.

So if we can intuit that most people want to stick with their ISP but are willing to change equipment, we can serve that intent first (B3). We can serve the current-ISP, current-equipment intent second (B1), and the change-my-ISP intent third (B2), which is actually the ideal fit for us in this scenario. That helps us with the next goal:

3. Optimize for the business goal without sacrificing one and two.

I would urge you to design generally with the searcher in mind and if you can fit in the business goal, that is ideal. Otherwise, what tends to happen is the business goal comes first, the searcher comes second, and you come tenth in the results.

4. If possible, try to claim the featured snippet and the visual image that go up there.

That means using the lead-in up at the top. It’s usually the first paragraph or the first few lines of text in an ordered or unordered list, along with a header image or visual, in order to capture that featured snippet. That’s very powerful for search results that are still showing it. (A quick way to sanity-check your lead-in for this is sketched below, after the fifth goal.)

5. Limit our bounce back to the SERP as much as possible.

In many cases, this means limiting some of the UI or design flow elements that hamper people from solving their problems or that annoy or dissuade them. So, for example, advertising that pops up or overlays that come up before I’ve gotten two-thirds of the way down the page really tend to hamper efforts and to increase this bounce back to the SERP, what search engines call pogo-sticking, and that can harm your rankings dramatically. Design flows where the content that actually solves the problem sits below an advertising block or a promotional block are also very limiting.
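
Tying back to goal 4, one quick sanity check is whether the lead-in paragraph is concise enough to be a plausible featured snippet target. The 40-to-60-word range used below is a common rule of thumb, not anything Google guarantees; the sketch assumes the beautifulsoup4 library and a placeholder URL.

```python
# Check whether the first paragraph of a page is in a snippet-friendly length range.
# The 40-60 word window is a rough heuristic, not an official threshold.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/faster-home-wifi"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

first_para = soup.find("p")
words = len(first_para.get_text(strip=True).split()) if first_para else 0
verdict = "snippet-friendly" if 40 <= words <= 60 else "consider tightening or expanding"
print(f"Lead-in is {words} words: {verdict}")
```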

So to the degree that we can control the design of our pages and optimize for that, we can actually take existing content that you might already have and improve its rankings without having to remake it, without needing new links, simply by improving the flow.

I hope we’ll see lots of examples of those in the comments, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Google Suggests Not To Make Country Specific Pages With Hreflang Just For Currency Changes

Let’s say you have an e-commerce site that sells to multiple countries and all the pages are written in English but you sell the product in different currencies based on the country the user is buying from…


Search Engine Roundtable

Google releases a variety of Accelerated Mobile Pages Project (AMP) updates: scrolling animations, video analytics, fluid ad support

The scope and feature list of the open-source project continue to expand.

The post Google releases a variety of Accelerated Mobile Pages Project (AMP) updates: scrolling animations, video analytics, fluid ad support appeared first on Search Engine Land.




Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Accelerated Mobile Pages: Is faster better?

Google has doubled down on Accelerated Mobile Pages (AMP), its open source initiative designed to improve web page speed and performance for mobile users. More than 2 billion AMP pages have been published from over 900,000 domains, and many online publishers report significant gains in both traffic…




Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

How to Diagnose Pages that Rank in One Geography But Not Another – Whiteboard Friday

Posted by randfish

Are you ranking pretty well in one locale, only to find out your rankings tank in another? It’s not uncommon, even for sites without an intent to capture local queries. In today’s Whiteboard Friday, Rand shows you how to diagnose the issue with a few clever SEO tricks, then identify the right strategy to get back on top.

Diagnosing Why Pages Rank in One Geography But Not Another

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to this edition of Whiteboard Friday. This week we’re going to chat about rankings that differ from geography to geography. Many of you might see that you are ranking particularly well in one city, but when you perform that search in another city, or perhaps in another country that still speaks the same language and has very similar traits, you’re not performing well.

Maybe you do well in Canada, but you don’t do well in the United States. Maybe you do well in Portland, Oregon, but you do poorly in San Diego, California. Sometimes you might be thinking to yourself, “Well, wait, this search is not particularly local, or at least I didn’t think of it as being particularly local. Why am I ranking in one and not the other?” So here’s a process that you can use to diagnose.

Confirm the rankings you see are accurate:

The first thing we need to do is confirm that the rankings you see or that you’ve heard about are accurate. This is actually much more difficult than it used to be. It used to be you could scroll to the bottom of Google and change your location to whatever you wanted. Now Google will geolocate you by your IP address or by a precise location on your mobile device, and unfortunately you can’t just specify one particular location or another — unless you know some of these SEO hacks.

A. Google’s AdPreview Tool – Google has an ad preview tool, where you can specify and set a particular location. That’s at AdWords.Google.com slash a bunch of junk slash ad preview. We’ll make sure that the link is down in the notes below.

B. The ampersand-near-equals parameter (&near=) - Now, some SEOs have said that this is not perfect, and I agree it is imperfect, but it is pretty close. We’ve done some comparisons here at Moz. I’ve done them while I’m traveling. It’s not bad. Occasionally, you’ll see one or two things that are not the same. The advertisements are frequently not the same. In fact, they don’t seem to work well. But the organic results look pretty darn close. The maps results look pretty darn close. So I think it’s a reasonable tool that you can use.

That is by basically changing the Google search query — so this is the URL in the search query — from Google.com/search?q= and then you might have ice+cream or WordPress+web+design, and then you use this, &near= and the city and state here in the United States or city and province in Canada or city and region in another country. In this case, I’m going with Portland+OR. This will change my results. You can give this a try yourself. You can see that you will see the ice cream places that are in Portland, Oregon, when you perform this search query.

For countries, you can use another approach. You can either go directly to the country-code version of Google, so for the UK Google.co.uk, for New Zealand Google.co.nz, or for Canada Google.ca, and type your query in there. You can also use the parameter &gl= instead of &near=. This is “global location” equals the country code, and then you could put in CA for Canada, UK for the UK, or NZ for New Zealand. (A small sketch for building these geo-modified search URLs follows the list below.)

C. The Mozbar’s search profiles – You can also do this with the MozBar. The MozBar kind of hacks the near parameter for you, and you can just specify a location and create a search profile. Do that right inside the MozBar. That’s one of the very nice things about using it.

D. Rank tracking with a platform that supports location-specific rankings - Some of them don’t, some of them do. Moz does right now. I believe Searchmetrics does if you use the enterprise. Oh, I’m trying to remember if Rob Bucci said STAT does. Well, Rob will answer in the comments, and he’ll tell us whether STAT does. I think that they do.
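
Here’s a minimal sketch for assembling those geo-modified search URLs in bulk. The &near= and &gl= parameters are the informal hacks described above, not a supported Google API, so treat the results as spot checks rather than ground truth.

```python
# Build geo-modified Google search URLs with the unofficial &near= (city) and
# &gl= (country) parameters described above. Query and locations are examples.
from urllib.parse import urlencode

query = "wordpress web design"

for city in ["Portland, OR", "San Diego, CA"]:
    print("https://www.google.com/search?" + urlencode({"q": query, "near": city}))

for country in ["ca", "uk", "nz"]:
    print("https://www.google.com/search?" + urlencode({"q": query, "gl": country}))
```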

Look at who IS ranking and what features they may have:

So next, once you’ve figured out whether this ranking anomaly you perceive is real or not, step two is to look at who is ranking in the geography where you’re not and figure out what factors they might have going for them.

  • Have they gotten a lot of local links, location-specific links from these websites that are in that specific geography or serve that geography, local chambers of commerce, local directories, those kinds of things?
  • Do they have a more hyper-local service area? On a map, if this is the city, do they serve that specific region? You serve a broad set of locations all over the place, and maybe you don’t have a geo-specific region that you’re serving.
  • Do they have localized listings, listings in places like where Moz Local or a competitor like Yext or Whitespark might push all their data to? Those could be things like Google Maps and Bing Maps, directories, local data aggregators, Yelp, TripAdvisor, etc., etc.
  • Do they have rankings in Google Maps? If you go and look and you see that this website is ranking particularly well in Google Maps for that particular region and you are not, that might be another signal that hyper-local intent, and a hyper-local ranking algorithm, is in play there.
  • Are they running local AdWords ads? I know this might seem like, “Wait a minute. Rand, I thought ads were not directly connected to organic search results.” They’re not, but it tends to be the case that if you bid on AdWords, you tend to increase your organic click-through rate as well, because people see your ad up at the top, and then they see you again a second time, and so they’re a little more biased to click. Therefore, buying local ads can sometimes increase organic click-through rate as well. It can also brand people with your particular business. So that is one thing that might make a difference here.

Consider location-based searcher behaviors:

Now we’re not considering who is ranking, but we’re considering who is doing the searching, these location-based searchers and what their behavior is like.

  • Are they less likely to search for your brand because you’re not as well known in that region?
  • Are they less likely to click your site in the SERPs because you’re not as well known?
  • Is their intent somehow different because of their geography? Maybe there’s a language issue or a regionalism of some kind. This could be a local language thing even here in the United States, where parts of the country say “soda” and parts of the country say “pop.” Maybe those mean two different things, and “pop” means, “Oh, it’s a popcorn store in Seattle,” because there’s the Pop brand, but in the Midwest, “pop” clearly refers to types of soda beverages.
  • Are they more or less sensitive to a co-located solution? So it could be that in many geographies, a lot of your market doesn’t care about whether the solution that they’re getting is from their local region, and in others it does. A classic one on a country level is France, whose searchers tend to care tremendously more that they are getting .fr results and that the location of the business they are clicking on is in France versus other folks in Europe who might click a .com or a .co.uk with no problem.


Divide into three buckets:

You’re going to divide the search queries that you care about that have these challenges into three different types of buckets:

Bucket one: Hyper-geo-sensitive

This would be sort of the classic geo-specific search, where you see maps results right up at the top. The SERPs change completely from geo to geo. So if you perform the search in Portland and then you perform it in San Diego, you see very, very different results. Seven to nine of the top ten at least are changing up, and it’s the case that almost no non-local listings are showing in the top five results. When you see these, this is probably non-targetable without a physical location in that geography. So if you don’t have a physical location, you’re kind of out of business until you get there. If you do, then you can work on the local ranking signals that might be holding you back.

Bucket two: Semi-geo-sensitive

I’ve actually illustrated this one over here, because this can be a little bit challenging to describe. But basically, you’re getting a mix of geo-specific and global results. So, for example, I use the &near=Portland, Oregon, because I’m in Seattle and I want to see Portland’s results for WordPress web design.

WordPress web design, when I do the search all over the United States, the first one or two results are pretty much always the same. They’re always this Web Savvy Marketing link and this Creative Bloq, and they’re very broad. They are not specifically about a local provider of WordPress web design.

But then you get to number three and four and five, and the results change to be local-specific businesses. So in Portland, it’s these Mozak Design guys. Mozak, no relation to Moz, to my knowledge anyway. In San Diego, it’s Kristin Falkner, who’s ranking number three, and then other local San Diego WordPress web design businesses at four and five. So it’s kind of this mix of geo and non-geo. You can generally tell this by looking and changing your geography in this fashion seeing those different things.

Some of the top search results usually will be like this, and they’ll stay consistent from geography to geography. In these cases, what you want to do is work on boosting those local-specific signals. So if you are ranking number five or six and you want to be number three, go for that, or you can try and be in the global results, in which case you’re trying to boost the classic ranking signals, not the local ones so you can get up there.

Bucket three: Non-geo-sensitive

Those would be, “I do this search, and I don’t see any local-specific results.” It’s just a bunch of nationwide or worldwide brands. There are no maps, usually only one, maybe two geo-specific results in the top 10, and they tend to be further down, and the SERPs barely change from geo to geo. They’re pretty much the same throughout the country.

So once you put your queries into these three buckets, you know which thing to do. For non-geo-sensitive queries, pursue classic signals; you probably don’t need much of a local boost.

For semi-geo-sensitive queries, you have the option of going one way or the other: boosting local signals to get into the local rankings, or boosting the classic signals to get into the global ones.

For hyper-geo-sensitive queries, you’re going to need the physical business. A rough way to bucket queries programmatically is sketched below.
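
This is a minimal sketch of that bucketing, assuming you’ve already pulled the top-ranked URLs for the same query in two geographies from your rank tracker. The thresholds mirror the rough proportions described above and are heuristics, not an official classification.

```python
# Bucket a query by how much its top results change between two geographies.
# Thresholds (roughly 70% changed vs. 20% changed) are loose heuristics.
def geo_bucket(results_geo_a, results_geo_b, depth=10):
    n = min(len(results_geo_a), len(results_geo_b), depth)
    shared = len(set(results_geo_a[:n]) & set(results_geo_b[:n]))
    changed = n - shared
    if changed >= 0.7 * n:
        return "hyper-geo-sensitive"
    if changed <= 0.2 * n:
        return "non-geo-sensitive"
    return "semi-geo-sensitive"

# Illustrative placeholder result lists for "wordpress web design":
portland = ["websavvy.example", "creativebloq.example", "mozak.example", "pdx-4.example", "pdx-5.example"]
san_diego = ["websavvy.example", "creativebloq.example", "falkner.example", "sd-4.example", "sd-5.example"]
print(geo_bucket(portland, san_diego))  # semi-geo-sensitive
```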

All right, everyone. I hope you’ve enjoyed this edition of Whiteboard Friday, and we’ll see you again next week. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news — as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn’t any. It’s bigger, better, faster, and you won’t pay an extra dime for it.

A moment of humility, though — if you’ve used our existing site crawl, you know it hasn’t always lived up to your expectations. Truth is, it hasn’t lived up to ours, either. Over a year ago, we set out to rebuild the back end crawler, but we realized quickly that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt “Aardwolf” engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.

View/search *all* URLs

One major limitation of our old crawler is that you could only see pages with known issues. Click on “All Crawled Pages” in the new crawler, and you’ll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let’s say I just wanted to see all of the pages crawled for Moz.com in the “/blog” directory…

I just click the [+], select “URL,” enter “/blog,” and I’m on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.
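
If you do export the crawl to CSV, pandas makes the slicing easy. A minimal sketch; the column names used here (url, status_code, total_issues, fetch_time) are assumptions about the export format, so check your file’s header row and adjust.

```python
# Slice a Site Crawl CSV export on your own. Column names are assumed, not
# guaranteed to match the actual export; adjust to your file's header row.
import pandas as pd

crawl = pd.read_csv("site_crawl_export.csv")

blog = crawl[crawl["url"].str.contains("/blog", na=False)]
print(blog.sort_values("total_issues", ascending=False).head(10))
print("non-200 responses:", int((blog["status_code"] != 200).sum()))
print(blog.nlargest(5, "fetch_time")[["url", "fetch_time"]])
```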

Recrawl your site immediately

Sometimes, you just can’t wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem, just click “Recrawl my site” from the top of any page in the Site Crawl section, and you’ll be on your way…

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you’re under tight deadlines for client reviews, we understand that waiting just isn’t an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what’s critical for one site is barely a nuisance for another. For example, let’s say I don’t care about a handful of overly dynamic URLs (for many sites, it’s a minor issue). With the new Site Crawl, I can just select those issues and then “Ignore” them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We’ll also keep tracking any new issues that pop up over time. Just because you don’t care about something today doesn’t mean you won’t need to know about it a month from now.

Fix duplicate content

Under “Content Issues,” we’ve launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the “parent” page. Here’s a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we’ve misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what’s a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we’ve attempted to balance priority and scope to target your top three issues to fix:

Moving forward, we’re going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven’t helped you do as well as we could. We’re going to do our best to change that.

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we’ve been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you’ll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and be sure to register for the upcoming webinar.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Build Landing Pages that Convert with These 3 Smart Steps

step by step for landing pages that convert

It was May 2015, and I was sitting in the audience at Rainmaker Digital’s Authority Rainmaker conference in Denver, Colorado.

Sonia Simone was about to give a presentation called “Dr. Evil’s Guide to Landing Page Design and Optimization,” and I was excited to learn from one of my personal copywriting heroes.

At the time, I was familiar with certain landing page “rules” — like writing compelling headlines, testing different button colors, and eliminating distracting design elements — but other than that, writing the copy seemed like some magical activity.

But that day at the conference, Sonia broke down the entire landing page creation process into a few straightforward steps.

I had an epiphany in the middle of her talk as she gave us guidelines for writing landing pages, including the three main goals your landing page should accomplish.

Read on to find out about Sonia’s three steps and how to use them to create landing pages that convert.

What is a landing page?

Before we go over Sonia’s guidelines, let’s do a quick refresher on the term “landing page.”

A landing page is any page on your site where traffic is sent specifically to prompt a certain action or result.

The goal is to persuade your prospect to take actions like:

  • Sign up for a free account
  • Opt in to receive a free autoresponder course
  • Sign up to download a free report
  • Join your paid membership site
  • Buy your product
  • Purchase a consulting package

First identify the singular goal of your landing page. Once you’ve got that, you’re ready to roll with the following three steps.

Step #1: Present your offer

If you’re giving away a free autoresponder course or free series of downloadable interviews, make that clear. If you’re selling a product or service, explain exactly what it is.

Where to put this element

Don’t wait to state your offer; make it explicit immediately. Often, you can write exactly what you’re offering in the headline of your landing page.

If you decide not to include the offer in your headline, place it close to the top of the page. People need to know what you’re offering right away, so don’t bury the lede.

Step #2: Explain how the offer will help your prospect

Why should your prospect care about your autoresponder course, downloadable interviews, or paid product? What exactly is it going to do for them?

Describe the main benefits of your offer — and remember the difference between features and benefits before you write this copy for your landing page.

Where to put this element

Subheads and bullet points are both great spots for spelling out the benefits of your offer. Otherwise, use short paragraphs for your copy.

Step #3: Clearly state what your prospect should do next

Many landing pages fall flat here. You must explain exactly what you want the prospect to do next.

This part of the copy is called the “call to action” for a reason. You are prompting the reader to take a particular action, and if you leave any ambiguity, you’ll likely confuse people and lose conversions.

Whether you need the prospect to click a button, fill out a form, or make a phone call, explain the action as clearly as you can. Your job here is to eliminate all possibility of confusion in your prospect’s mind.

Where to put this element

Powerful calls to action can appear in a number of places on your landing page. Select the most appropriate spots for your call to action text throughout the page as well as at the very end of the page.

For example, if you need the reader to click a big, red button, put your call to action right above that button. If appropriate, you can also include an alternative version of your call to action on the button itself.

Other elements to consider when creating landing pages

Once you’ve built the foundation of your landing page with the three steps above, you’ve got the basics covered! Now you can start testing different copy variations and design elements.

You can test:

  • Headline options
  • Long copy vs. short copy
  • Button color and text
  • How you describe benefits
  • The layout of the page

Master the art of creating landing pages that convert

If you’re looking for additional ways to test and fine-tune your landing page, check out Copyblogger’s free ebook, Landing Pages: How to Turn Traffic into Money. It’s nearly 50 pages of landing page tips and techniques you can start using right away.

If you’ve been feeling hesitant to write the copy for your landing page, pull out a blank sheet of paper and get to work using the three smart steps above. You’ll love having your own landing page epiphany!

The post Build Landing Pages that Convert with These 3 Smart Steps appeared first on Copyblogger.


Copyblogger
