Tag Archive | "Insights"

Custom Extraction Using an SEO Crawler for CRO and UX Insights – Whiteboard Friday

Posted by MrLukeCarthy

From e-commerce to listings sites to real estate and myriad verticals beyond, the data you can harness using custom extraction via crawler tools is worth its weight in revenue. With a greater granularity of data at your fingertips, you can uncover CRO and user experience insights that can inform your optimizations and transform your customer experience.

In this episode of Whiteboard Friday, we’re delighted to welcome Luke Carthy to share actionable wisdom from his recent MozCon 2019 presentation, Killer CRO and UX Wins Using an SEO Crawler.

Video Transcription

Hey, Moz. What’s up? Wow, can I just say it’s incredible I’m here in Seattle right now doing a Whiteboard Friday? I can’t wait to share this cool stuff with you. So thanks for joining me.

My name is Luke Carthy. As you can probably tell, I’m from the UK, and I want to talk to you about custom extraction, specifically in the world of e-commerce. However, what I will say is this works beautifully well in many other verticals too: real estate, job listings. In fact, on pretty much any website that can spit out HTML to a web crawler, you can use custom extraction.

What is custom extraction?

Let’s get started. What is custom extraction? Well, as I just alluded to, when you’re crawling with a tool like Screaming Frog, for example, or DeepCrawl or whatever it is you want to use, it allows you to grab and extract specific parts of the HTML and export them to a file: a CSV, an Excel spreadsheet, or whatever you prefer.
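To make that concrete, here is a minimal Python sketch of the same idea. Screaming Frog and DeepCrawl take extraction rules as XPath, CSS selectors, or regex; this does the equivalent with requests and lxml. The URL and the XPath rule are hypothetical placeholders, not anything from the talk:

```python
import requests
from lxml import html

# Hypothetical URL and XPath rule; swap in selectors from your own markup.
url = "https://www.example.com/product/123"
price_xpath = "//span[@class='product-price']/text()"

response = requests.get(url, timeout=10)
tree = html.fromstring(response.content)

# The equivalent of a single crawler custom extraction rule:
prices = tree.xpath(price_xpath)
print(prices[0].strip() if prices else "no match")
```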



As a principle, okay, great, but I’m going to give you some really good examples of how you can leverage that. So e-commerce: right here we’ve got a product page that I’ve beautifully drawn, and everything in red is something that you can potentially extract. Although, as I said, you can extract anything on the page; these are just some good examples.

Product information + page performance

Think about this for a moment. You’re an e-commerce website, you’re a listing site, and of course you have listing pages, you have product pages. Wouldn’t it be great if you could very quickly, at scale, understand all of your products’ pricing, whether you’ve got stock, whether it’s got an image, whether it’s got a description, how many reviews it has, and of the reviews, what’s the aggregate score, whether it’s four stars, five stars, whatever it is?

That’s really powerful, because you can then start to understand how well pages perform based on the information they have, based on traffic, conversion, customer feedback, and all sorts of great stuff, all using custom extraction and spitting it out into, say, a CSV or an Excel spreadsheet file.
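As a rough sketch of that at-scale, out-to-a-CSV workflow, here is one way it could look in Python. Every field name and XPath expression below is an assumption about a generic product template, so treat them as placeholders:

```python
import csv
import requests
from lxml import html

# All selectors below are hypothetical; adjust them to your own templates.
RULES = {
    "price": "//span[@class='product-price']/text()",
    "stock": "//span[@class='availability']/text()",
    "review_count": "//span[@class='review-count']/text()",
    "rating": "//span[@class='rating-value']/text()",
}

urls = [
    "https://www.example.com/product/123",
    "https://www.example.com/product/456",
]

with open("products.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["url"] + list(RULES))
    writer.writeheader()
    for url in urls:
        tree = html.fromstring(requests.get(url, timeout=10).content)
        row = {"url": url}
        for field, xpath in RULES.items():
            match = tree.xpath(xpath)
            row[field] = match[0].strip() if match else ""
        writer.writerow(row)
```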

Competitive insights

But where it gets super powerful, and where you get a lot of insight, is when you start to turn the lens to your competitors and think about ways in which you can get those really good insights. You may have three competitors. You may have some aspirational competitors. You may have a site that you don’t necessarily compete with, but you use them on a day-to-day basis or you admire how easy their site is to use, and you can go and crawl them too.

You can fire up a crawl, and there’s no reason why you couldn’t extract that same information from competitors and see what’s going on: what pricing your competitors are selling an item at, whether they have it in stock or not, what the reviews are like, what FAQs people have, and whether you can then leverage that in your own content.

Examples of how to glean insights from custom extraction in e-commerce

Example 1: Price increases for products competitors don’t stock

Let me give you a perfect example of how I’ve managed to use this.

I’ve managed to identify that a competitor didn’t have a specific product in stock, and, as a result, I’ve been able to increase our prices, because they couldn’t sell it and we could at that specific time. We could identify the price point and the fact that they didn’t have any stock, and it was awesome. Think about that: really powerful insights at massive scale.

Example 2: Improving facets and filters on category pages

Another example I wanted to talk to you about: category pages, again with incredibly gorgeous illustrations. So on category pages we have filters, and just to switch things up a little bit I’ve also got a listings page as well, whether that’s, as I said, real estate, jobs, or anything in that environment.

If you think about the competition again for a second, there is no reason why you wouldn’t be able to extract, via custom extraction, the top filters and the top facets that people like to select. You can then see whether you’re using the same kinds of combinations of features and facets on your site and maybe improve that.

Equally, you can then start to understand which specific features correlate with sales and performance, and really start to improve how your website performs and behaves for your customers. The same thing applies to both environments here.

If you are a listings site and you list jobs or products or classified ads, is it location filters that competitors put at the top? Is it availability? Is it reviews? Is it scores? You can crawl a number of your competitors across a number of areas, identify whether there’s a pattern or a theme, and then see whether you can leverage and better that (see the sketch below). That’s a great way in which you can use it.
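One way to spot that pattern programmatically is to extract the facet labels from a set of competitor category pages and count which ones recur. A sketch under the same caveats, with placeholder URLs and a placeholder selector:

```python
from collections import Counter

import requests
from lxml import html

# Placeholder category URLs and a placeholder facet-label selector.
category_urls = [
    "https://www.competitor-a.com/jobs/london",
    "https://www.competitor-b.com/jobs/london",
]
facet_xpath = "//div[@class='filters']//label/text()"

facet_counts = Counter()
for url in category_urls:
    tree = html.fromstring(requests.get(url, timeout=10).content)
    facet_counts.update(label.strip() for label in tree.xpath(facet_xpath))

# The facets that recur across competitors hint at what users expect.
for facet, count in facet_counts.most_common(10):
    print(f"{facet}: seen on {count} pages")
```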

Example 3: Recommendations, suggestions, and optimization

But on top of that, the one that I am by far most fascinated with is recommendations.

In the MozCon talk I did earlier I had a statistic, and I think I can recall it. It was 35% of what people buy on Amazon comes from recommendations, and 75% of what people watch on Netflix comes from suggestions, from recommendations.

Think about how powerful that is. You can crawl your own site, understand your own recommendations at scale, identify the stock of those recommendations, the price, whether they have images, what order they’re in, and you can start to build a really vivid picture of what products people associate with your items. You can do that on a global scale: you can crawl the entirety of your product or listing portfolio and get that.
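If you want to analyze recommendations the same way, one approach is to extract each carousel into an edge list of (source product, recommended product) pairs, which makes the associations easy to aggregate later. Again, the URL and selector below are hypothetical:

```python
import requests
from lxml import html

# Placeholder selector for product names inside a recommendation carousel.
rec_xpath = "//div[@class='recommended-for-you']//h3/text()"

def recommendation_edges(product_url):
    """Return (source, recommended) pairs for one product page."""
    tree = html.fromstring(requests.get(product_url, timeout=10).content)
    return [(product_url, name.strip()) for name in tree.xpath(rec_xpath)]

edges = []
for url in ["https://www.example.com/product/123"]:
    edges.extend(recommendation_edges(url))
print(edges)
```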



But again, back to powerful intelligence about your competitors, especially when you have competitors that might have multiple facets or multiple types of recommendations. What I mean by that is we’ve all seen sites where you’ve got multiple carousels. So you’ve got Recommended for You.

You might have People Also Bought, or alternative suggestions. The more different types of recommendations you have, the more data, intelligence, and insight you have. Going back to, say, a real estate example: you might be looking at a property here, at this price. What is your main aspirational real estate competitor recommending to you that you may not be aware of?

Then you can think about whether the focus is on location, price, number of bedrooms, etc., and you can start to understand how that works and get some really powerful insights from it.

Custom extraction is all about granular data at scale

To summarize and bring it all to a close, custom extraction is all about great granular data at scale. The really powerful thing about it is that you can do all of this yourself, so there’s no need to have meetings, send elaborate emails, or get permission from somebody.

Fire up Screaming Frog, fire up DeepCrawl, fire up whatever kind of crawler you want to use, have a look at custom extraction, and see how you can make your business more efficient, find out how you can get some really cool competitive insights, and yeah, hopefully, fingers crossed that works for you guys. Thank you very much.


Video transcription by Speechpad.com


This is a meaty topic, we know — if you enjoyed this Whiteboard Friday and find yourself eager to know more, you’re in luck! Luke’s full presentation at MozCon 2019 goes even more in-depth into what custom extraction can do for you. Catch his talk along with 26 other forward-thinking topics from our amazing speakers in the MozCon video bundle:

Access the sessions now!

We recommend sharing them with your team and spreading the learning love. Happy watching!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


How to Automate Pagespeed Insights For Multiple URLs using Google Sheets

Posted by James_McNulty

Calculating individual page speed performance metrics can help you to understand how efficiently your site is running as a whole. Since Google uses the speed of a site (frequently measured by and referred to as PageSpeed) as one of the signals used by its algorithm to rank pages, it’s important to have that insight down to the page level.

One of the pain points in website performance optimization, however, is the lack of ability to easily run page speed performance evaluations en masse. There are plenty of great tools like PageSpeed Insights or the Lighthouse Chrome plugin that can help you understand more about the performance of an individual page, but these tools are not readily configured to help you gather insights for multiple URLs — and running individual reports for hundreds or even thousands of pages isn’t exactly feasible or efficient.

In September 2018, I set out to find a way to gather sitewide performance metrics and ended up with a working solution. While this method resolved my initial problem, the setup process is rather complex and requires that you have access to a server.

Ultimately, it just wasn’t an efficient method. Furthermore, it was nearly impossible to easily share with others (especially those outside of UpBuild).

In November 2018, two months after I published this method, Google released version 5 of the PageSpeed Insights API. V5 now uses Lighthouse as its analysis engine and also incorporates field data provided by the Chrome User Experience Report (CrUX). In short, this version of the API now easily provides all of the data that is provided in the Chrome Lighthouse audits.

So I went back to the drawing board, and I’m happy to announce that there is now an easier, automated method to produce Lighthouse reports en masse using Google Sheets and Pagespeed Insights API v5.

Introducing the Automated PageSpeed Insights Report

With this tool, we are able to quickly uncover key performance metrics for multiple URLs with just a couple of clicks.

All you’ll need is a copy of this Google Sheet, a free Google API key, and a list of URLs you want data for — but first, let’s take a quick tour.

How to use this tool

The Google Sheet consists of the three following tabs:

  • Settings
  • Results
  • Log

Settings

On this tab, you will be required to provide a unique Google API key in order to make the sheet work.

Getting a Google API Key

  1. Visit the Google API Credentials page.
  2. Choose the API key option from the ‘Create credentials’ dropdown.
  3. You should now see a prompt providing you with a unique API key.
  4. Next, simply copy and paste that API key into the API key section on the “Settings” tab of the Automated Pagespeed Insights spreadsheet.

Now that you have an API key, you are ready to use the tool.
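If you are curious what the sheet is doing under the hood, each lookup is a plain HTTPS GET against the PageSpeed Insights API v5 endpoint. Here is a minimal Python sketch of a single call; the key is a placeholder, and the audit key at the end assumes the Lighthouse naming used by v5:

```python
import requests

API_KEY = "YOUR_API_KEY"  # paste the key you just created
ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

params = {
    "url": "https://www.example.com/",
    "key": API_KEY,
    "strategy": "desktop",  # or "mobile"
}
data = requests.get(ENDPOINT, params=params, timeout=60).json()

# Lighthouse data lives under lighthouseResult in the v5 response.
audits = data["lighthouseResult"]["audits"]
print(audits["speed-index"]["displayValue"])
```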

Setting the report schedule

On the Settings tab, you can schedule the day and time the report should start running each week. As you can see from the screenshot below, we have set this report to begin every Wednesday at 8:00 am. This will be set to the local time as defined by your Google account.

As you can see, this setting also assigns the report to run for the following three hours on the same day. This is a workaround for the limitations set by both Google Apps Script and the Google PageSpeed API.

Limitations

Our Google Sheet uses a Google Apps Script to run all the magic behind the scenes. Each time the report runs, Google Apps Script sets a six-minute execution time limit (thirty minutes for G Suite Business / Enterprise / Education and Early Access users).

In six minutes you should be able to extract PageSpeed Insights for around 30 URLs.

Then you’ll be met with the following message:

In order to continue running the function for the rest of the URLs, we simply need to schedule the report to run again. That is why this setting runs the report three more times in the following hours, picking up exactly where it left off.

The next hurdle is the limitation set by Google Sheets itself.

If you’re doing the math, you’ll see that since we can only automate the report a total of four times, we’ll theoretically only be able to pull PageSpeed Insights data for around 120 URLs. That’s not ideal if you’re working with a site that has more than a few hundred pages!

The schedule function in the Settings tab uses the Google Sheet’s built-in Triggers feature. This tells our Google Apps script to run the report automatically at a particular day and time. Unfortunately, using this feature more than four times causes the “Service using too much computer time for one day” message.

This means that our Google Apps Script has exceeded the total allowable execution time for one day. It most commonly occurs for scripts that run on a trigger, which have a lower daily limit than scripts executed manually.

Manually?

You betcha! If you have more than 120 URLs that you want to pull data for, then you can simply use the Manual Push Report button. It does exactly what you think.

Manual Push Report

Once clicked, the ‘Manual Push Report’ button (linked from the PageSpeed Menu on the Google Sheet) will run the report. It will pick up right where it left off with data populating in the fields adjacent to your URLs in the Results tab.

For clarity, you don’t even need to schedule the report to run to use this document. Once you have your API key, all you need to do is add your URLs to the Results tab (starting in cell B6) and click ‘Manual Push Report’.

You will, of course, be met with the inevitable “Exceed maximum execution time” message after six minutes, but you can simply dismiss it, and click “Manual Push Report” again and again until you’re finished. It’s not fully automated, but it should allow you to gather the data you need relatively quickly.

Setting the log schedule

Another feature in the Settings tab is the Log Results function.

This will automatically take the data that has populated in the Results tab and move it to the Log sheet. Once it has copied over the results, it will automatically clear the populated data from the Results tab so that when the next scheduled report run time arrives, it can gather new data accordingly. Ideally, you would want to set the Log day and time after the scheduled report has run to ensure that it has time to capture and log all of the data.

You can also manually push data to the Log sheet using the ‘Manual Push Log’ button in the menu.

How to confirm and adjust the report and log schedules

Once you’re happy with the scheduling for the report and the log, be sure to set it using the ‘Set Report and Log Schedule’ option from the PageSpeed Menu.

Should you want to change the frequency, I’d recommend first setting the report and log schedule using the sheet.

Then adjust the runLog and runTool functions using Google Script Triggers.

  • runLog controls when the data will be sent to the LOG sheet.
  • runTool controls when the API runs for each URL.

Simply click the pencil icon next to each respective function and adjust the timings as you see fit.

You can also use the ‘Reset Schedule’ button in the PageSpeed Menu (next to Help) to clear all scheduled triggers. This can be a helpful shortcut if you’re simply using the interface on the ‘Settings’ tab.

PageSpeed results tab

This tab is where the PageSpeed Insights data will be generated for each URL you provide. All you need to do is add a list of URLs starting from cell B6. You can either wait for your scheduled report time to arrive or use the ‘Manual Push Report’ button.

You should now see the following data generating for each respective URL:

  • Time to Interactive
  • First Contentful Paint
  • First Meaningful Paint
  • Time to First Byte
  • Speed Index

You will also see a column for Last Time Report Ran and Status on this tab. This will tell you when the data was gathered, and whether the request was successful. A successful API request will show a status of “complete” in the Status column.
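For reference, the five metrics above correspond to audit keys in the v5 response’s lighthouseResult object. Here is a hedged sketch of pulling them out of one response; the key names are assumptions based on Lighthouse’s naming, and “time-to-first-byte” in particular has been renamed in later Lighthouse versions:

```python
# Assumed Lighthouse audit keys for the five metrics in the Results tab;
# "time-to-first-byte" became "server-response-time" in later versions.
METRIC_AUDITS = {
    "Time to Interactive": "interactive",
    "First Contentful Paint": "first-contentful-paint",
    "First Meaningful Paint": "first-meaningful-paint",
    "Time to First Byte": "time-to-first-byte",
    "Speed Index": "speed-index",
}

def extract_metrics(psi_response):
    """Pull the five Results-tab metrics out of a v5 API response dict."""
    audits = psi_response["lighthouseResult"]["audits"]
    return {label: audits[key]["displayValue"]
            for label, key in METRIC_AUDITS.items()
            if key in audits}
```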

Log tab

Logging the data is a useful way to keep a historical account of these important speed metrics. There is nothing to modify in this tab; however, you will want to ensure that there are plenty of empty rows. When the runLog function runs (controlled by the Log schedule you assign in the “Settings” tab, or via the Manual Push Log button in the menu), it will move all of the rows from the Results tab that contain a status of “complete”. If there are no empty rows available on the Log tab, it simply won’t copy over any of the data. All you need to do is add several thousand rows, depending on how often you plan to check in and maintain the Log.

How to use the log data

The scheduling feature in this tool has been designed to run on a weekly basis to allow you enough time to review the results, optimize, then monitor your efforts. If you love spreadsheets then you can stop right here, but if you’re more of a visual person, then read on.

Visualizing the results in Google Data Studio

You can also use this Log sheet as a Data Source in Google Data Studio to visualize your results. As long as the Log sheet stays connected as a source, the results should automatically publish each week. This will allow you to work on performance optimization and evaluate results using Data Studio easily, as well as communicate performance issues and progress to clients who might not love spreadsheets as much as you do.

Blend your log data with other data sources

One great Google Data Studio feature is the ability to blend data. This allows you to compare and analyze data from multiple sources, as long as they have a common key. For example, if you wanted to blend the Time to Interactive results against Google Search Console data for those same URLs, you can easily do so. You will notice that the column in the Log tab containing the URLs is titled “Landing Page”. This is the same naming convention that Search Console uses and will allow Data Studio to connect the two sources.
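Outside of Data Studio, the same blend is a simple join once both sources are exported to CSV. A sketch with pandas, where the file names are placeholders and both exports are assumed to share the “Landing Page” column described above:

```python
import pandas as pd

# Placeholder file names; both exports share a "Landing Page" column.
log = pd.read_csv("pagespeed_log.csv")
gsc = pd.read_csv("search_console_landing_pages.csv")

# "Time to Interactive" and "Clicks" are assumed column names here.
blended = log.merge(gsc, on="Landing Page", how="inner")
print(blended[["Landing Page", "Time to Interactive", "Clicks"]].head())
```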

There are several ways that you can use this data in Google Data Studio.

Compare your competitors’ performance

You don’t need to limit yourself to just your own URLs in this tool; you can use any set of URLs. This would be a great way to compare your competitors’ pages and even see if there are any clear indicators of speed affecting positions in search results.

Improve usability

Don’t immediately assume that your content is the problem. Your visitors may not be leaving the page because they don’t find the content useful; it could be slow load times or other incompatibility issues that are driving visitors away. Compare bounce rates, time on site, and device type data alongside performance metrics to see if it could be a factor.

Increase organic visibility

Compare your performance data against Search ranking positions for your target keywords. Use a tool to gather your page positions, and fix performance issues for landing pages on page two or three of Google Search results to see if you can increase their prominence.

Final thoughts

This tool is all yours.

Make a copy and use it as is, or tear apart the Google Apps Script that makes this thing work and adapt it into something bigger and better (if you do, please let me know; I want to hear all about it).

Remember, PageSpeed Insights API v5 now includes all of the same data that is provided in the Chrome Lighthouse audits, which means there are far more details available to extract beyond the five metrics that this tool generates.

Hopefully, for now, this tool helps you gather Performance data a little more efficiently between now and when Google releases their recently announced Speed report for Search Console.



Moz Blog


How to pitch to top online publishers: 10 Exclusive survey insights

How would you like to get your brand featured on major online websites like Buzzfeed, the Washington Post, or Bustle?

When you earn the attention of top-tier press, you reap the business benefits of large-scale brand exposure and the SEO benefits of high-authority backlinks. It’s a win-win.

But it’s increasingly difficult to win the attention of the online press. Any day of the week, you have tweets from Chrissy Teigen and the contentious presidential election dominating the media coverage and driving the online social discussion.

There is plenty of anecdotal evidence to suggest that there is an ideal time to pitch a writer or the best way to write a press release or an email for optimal success. The problem with a lot of digital PR advice you read online is that it’s purely situational and can vary from person to person.

That’s why my team decided to end the back and forth once and for all. In a new publisher survey, we asked 500+ online writers and editors from sites like the New York Times, CNN, Cosmopolitan, and Mashable how they want to be pitched, what types of content they prefer to cover, and what PR professionals should include (and exclude) in their outreach emails in order to gain their trust and earn a coveted space on their website.

Here are 10 major data-backed insights that you can use to optimize your outreach strategy.

The top three reasons why journalists decline your pitch

The top three reasons journalists decline your pitch are that it’s irrelevant, boring, or too self-promotional.

The most common reason a writer rejects an email pitch is irrelevance: 88% of writers have rejected a pitch because it was unrelated to their beat.


Almost 64% of writers have rejected a pitch because it was simply too boring. If you fail to explain why the content you’re pitching is exciting or newsworthy, how can you expect an online editor to envision the story?

Another 62% of online journalists have rejected a pitch because it was too self-promotional. Online editors seek to inform and entertain their audiences. A tired pitch about some new thing that’s happening at your company or some funny thing your CEO tweeted is not going to capture the attention of the masses and drive traffic – and editors know that.

Over 42% of writers reported receiving 11 to 100 pitches a day

Over 42% of writers reported receiving 11 to 100 pitches a day and almost five percent receive 100+ email pitches per day.

To a certain extent, online writers and editors rely on PR pitches to provide them with content to fill their editorial calendar. But can you imagine receiving 100 pitches a day? It’s no wonder that journalists take to Twitter so often to vent about the latest #PRFail that recently arrived in their inbox. With all of that inbox clutter, who wouldn’t be frustrated with a lazily written, irrelevant pitch?

Time = money. You’re wasting both when you reach out to a writer about content that isn’t relevant to them or their beat.

Only 22% of digital writers open every single email addressed to them

Only one out of every five people you send emails to will open every pitch addressed to them. And most people, about three in four, open an email based on the subject line alone. This places a lot of pressure on your subject line writing, which is why it’s one of the most important elements of your outreach email strategy.

Read all about how to perfect your subject lines for PR outreach in a previous post for SEMrush.

Most writers (58%) prefer to receive a pitch between 100 and 200 words

Keep it short and sweet. Given the sheer volume of pitches they receive daily, writers are too busy to sift through a complicated pitch to decipher what it’s about. If they open your email, you have about half a minute to capture their attention before they move on to the next pitch.

Here are some tips to keep the word count of your email down.

  • Include only the most relevant, interesting, and newsworthy details of your content
  • Use bullet points to list disparate details
  • Link to the full content from your email (that is, don’t attach additional info to the email)

Some content topics are more competitive than others

Our survey found that writers who cover popular topics, such as women’s interest, home and lifestyle, and entertainment, receive twice as many pitches as writers covering personal finance and business.

How can you change your content marketing strategy in light of these stats? Create content at the intersection of two verticals. For example, a piece of content that explores inter-office dating can be covered by writers on two different beats – dating and career/business.

By creating content that naturally appeals to more than one audience, you double your potential for exposure right out of the gate.

Staff editors are pitched more than staff writers or freelance contributors

According to our data, it’s safe to say staff editors have more inbox congestion than staff writers or freelancers. However, that doesn’t mean you should remove them from your outreach list. When it comes to who to pitch, the best answer is still unclear. Although the data suggests you have a better chance with freelancers and staff writers, the bottom line is that they oftentimes still have to pitch the editor their story. If you write directly to the editor, the decision on whether to assign the story is made right then and there.

There are pros and cons to pitching all people in all three of these roles, but it’s good to keep their different roles and responsibilities in mind when actively pitching a content campaign.

The best time to send pitches is 5 am to 12 pm on Monday, Tuesday, and Wednesday

After years of practicing PR for clients across all topic verticals, it’s long been a suspicion of mine that pitches sent on a Friday tend to fall on deaf ears and require a follow-up to really be seen. If you’ve felt the same way, your experience is about to be validated.

Our survey of 500+ journalists found that the best days to send email pitches are at the beginning of the workweek: Monday, Tuesday, and Wednesday. And the best time? Survey says early morning is better, between 5 am and 12 pm.

Journalists prefer zero or one follow-up emails, sent an average of three to seven days after your initial pitch

Speaking of follow-up emails, should you even send them at all? While 20% of writers believe that you should never send a follow-up email, the majority of writers (60%) consider one follow-up email to be the acceptable maximum.

When should you follow up? Data shows that most writers prefer that you follow up three to seven days after you send the first email pitch.

It may be surprising to you that some people send follow-ups to journalists who’ve already declined their pitch.

The thing is, many PR pros are still using unsophisticated mass outreach tools that are too automated. If you’re in doubt about your tool, it’s better to use a spreadsheet and focus on “one-on-one” email outreach. Automate effectively, responsibly, and at your own risk.

If you provide good content, journalists will want you to keep in touch

We asked 500+ journalists and online writers how they want to keep in touch with a PR pro after working with them on a story. They told us that the best ways to stay in contact are via email (77%) and by continuing to send the journalist relevant content (57%).

Journalists were quick to note that they do not want phone calls or to meet in person but were more open to chatting on Twitter and LinkedIn occasionally.

Over 53% of writers say they don’t subscribe to press release sources

Is the press release “dead”? While it is still a strategy that marketers and brands employ, its usefulness is slowly declining in favor of direct, targeted one-on-one outreach.

Around 20% of writers admitted that they never write a story based on a press release, while about 29% of writers we surveyed say they use press releases for their stories more than 10 times a year.

Conclusion

Offering compelling, newsworthy, data-driven content is the key to earning top tier press mentions. 10x content paired with strategic one-on-one digital PR is the winning combination to earn attention and authority for your brand.

When it comes to earning press on top-tier online websites like the NYTimes, CNN, Forbes, the Atlantic, and more, it’s not impossible, but it is increasingly hard, with countless pieces of content being created every day. Capturing and keeping a journalist’s attention is a competitive game. Keep these stats in mind to give your content the upper hand in a crowded inbox.

Domenica is a Brand Relationship Manager at Fractl. She can be found on Twitter @atdomenica.

The post How to pitch to top online publishers: 10 Exclusive survey insights appeared first on Search Engine Watch.

Search Engine Watch


5 additional data blending examples for smarter SEO insights

Once you preprocess columns to consistent formatting, additional data blending options include prioritizing pages with search clicks, mining internal site search for content gaps, analyzing traffic issues with 404 pages and more.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


The State of Local SEO: Industry Insights for a Successful 2019

Posted by MiriamEllis

A thousand thanks to the 1,411 respondents who gave of their time and knowledge in contributing to this major survey! You’ve created a vivid image of what real-life, everyday local search marketers and local business owners are observing on a day-to-day basis, what strategies are working for them right now, and where some frankly stunning opportunities for improvement reside. Now, we’re ready to share your insights into:

  • Google Updates
  • Citations
  • Reviews
  • Company infrastructure
  • Tool usage
  • And a great deal more…

This survey pooled the observations of everyone from people working to market a single small business, to agency marketers with large local business clients:

Respondents who self-selected as not marketing a local business were filtered from further survey results.

Thanks to you, this free report is a window into the industry. Bring these statistics to teammates and clients to earn the buy-in you need to effectively reach local consumers in 2019.

Get the full report

There are so many stories here worthy of your time

Let’s pick just one, to give a sense of the industry intelligence you’ll access in this report. Likely you’ve now seen the Local Search Ranking Factors 2018 Survey, undertaken by Whitespark in conjunction with Moz. In that poll of experts, we saw Google My Business signals being cited as the most influential local ranking component. But what was #2? Link building.

You might come away from that excellent survey believing that, since link building is so important, all local businesses must be doing it. But not so. The State of the Local SEO Industry Report reveals that:

When asked what’s working best for them as a method for earning links, 35% of local businesses and their marketers admitted to having no link building strategy in place at all:

And that, Moz friends, is what opportunity looks like. Get your meaningful local link building strategy in place in the new year, and prepare to leave ⅓ of your competitors behind, wondering how you surpassed them in the local and organic results.

The full report contains 30+ findings like this one. Rivet the attention of decision-makers at your agency, quote persuasive statistics to hesitant clients, and share this report with teammates who need to be brought up to industry speed. When read in tandem with the Local Search Ranking Factors survey, this report will help your business or agency understand both what experts are saying and what practitioners are experiencing.

Sometimes, local search marketing can be a lonely road to travel. You may find yourself wondering, “Does anyone understand what I do? Is anyone else struggling with this task? How do I benchmark myself?” You’ll find both confirmation and affirmation today, and Moz’s best hope is that you’ll come away a better, bolder, more effective local marketer. Let’s begin!

Download the report



Moz Blog


The Mindset and Insights that Will Bring You Wonderful Things in 2019

Great to see you again after our Thanksgiving holiday! Our Black Friday promotion was great, and the whole team is…

The post The Mindset and Insights that Will Bring You Wonderful Things in 2019 appeared first on Copyblogger.


Copyblogger


SearchCap: Google My Business app, Bing ads insights & web site trust

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


SearchCap: Google PageSpeed Insights update, exact match PPC & account analytics

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


SearchCap: Google My Business Insights, search industry honors Barry Schwartz, more

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


NEW On-Demand Crawl: Quick Insights for Sales, Prospecting, & Competitive Analysis

Posted by Dr-Pete

In June of 2017, Moz launched our entirely rebuilt Site Crawl, helping you dive deep into crawl issues and technical SEO problems, fix those issues in your Moz Pro Campaigns (tracked websites), and monitor weekly for new issues. Many times, though, you need quick insights outside of a Campaign context, whether you’re analyzing a prospect site before a sales call or trying to assess the competition.

For years, Moz had a lab tool called Crawl Test. The bad news is that Crawl Test never made it to prime-time and suffered from some neglect. The good news is that I’m happy to announce the full launch (as of August 2018) of On-Demand Crawl, an entirely new crawl tool built on the engine that powers Site Crawl, but with a UI designed around quick insights for prospecting and competitive analysis.

While you don’t need a Campaign to run a crawl, you do need to be logged into your Moz Pro subscription. If you don’t have a subscription, you can sign up for a free trial and give it a whirl.

How can you put On-Demand Crawl to work? Let’s walk through a short example together.


All you need is a domain

Getting started is easy. From the “Moz Pro” menu, find “On-Demand Crawl” under “Research Tools”:

Just enter a root domain or subdomain in the box at the top and click the blue button to kick off a crawl. While I don’t want to pick on anyone, I’ve decided to use a real site. Our recent analysis of the August 1st Google update identified some sites that were hit hard, and I’ve picked one (lilluna.com) from that list.

Please note that Moz is not affiliated with Lil’ Luna in any way. For the most part, it seems to be a decent site with reasonably good content. Let’s pretend, just for this post, that you’re looking to help this site out and determine if they’d be a good fit for your SEO services. You’ve got a call scheduled and need to spot-check for any major problems so that you can go into that call as informed as possible.

On-Demand Crawls aren’t instantaneous (crawling is a big job), but they’ll generally finish between a few minutes and an hour. We know these are time-sensitive situations. You’ll soon receive an email that looks like this:

The email includes the number of URLs crawled (On-Demand will currently crawl up to 3,000 URLs), the total issues found, and a summary table of crawl issues by category. Click on the [View Report] link to dive into the full crawl data.


Assess critical issues quickly

We’ve designed On-Demand Crawl to assist your own human intelligence. You’ll see some basic stats at the top, but then immediately move into a graph of your top issues by count. The graph only displays issues that occur at least once on your site – you can click “See More” to show all of the issues that On-Demand Crawl tracks (the top two bars have been truncated)…

Issues are also color-coded by category. Some items are warnings, and whether they matter depends a lot on context. Other issues, like “Critical Errors” (in red), almost always demand attention. So, let’s check out those 404 errors. Scroll down and you’ll see a list of “Pages Crawled” with filters. You’re going to select “4xx” in the “Status Codes” dropdown…

You can then pretty easily spot-check these URLs and find out that they do, in fact, seem to be returning 404 errors. Some appear to be legitimate content that has either internal or external links (or both). So, within a few minutes, you’ve already found something useful.

Let’s look at those yellow “Meta Noindex” errors next. This is a tricky one, because you can’t easily determine intent. An intentional Meta Noindex may be fine. An unintentional one (or hundreds of unintentional ones) could be blocking crawlers and causing serious harm. Here, you’ll filter by issue type…

Like the top graph, issues appear in order of prevalence. You can also filter by all pages that have issues (any issues) or pages that have no issues. Here’s a sample of what you get back (the full table also includes status code, issue count, and an option to view all issues)…

Notice the “?s=” common to all of these URLs. Clicking on a few, you can see that these are internal search pages. These URLs have no particular SEO value, and the Meta Noindex is likely intentional. Good technical SEO is also about avoiding false alarms because you lack internal knowledge of a site. On-Demand Crawl helps you semi-automate and summarize insights to put your human intelligence to work quickly.


Dive deeper with exports

Let’s go back to those 404s. Ideally, you’d like to know where those URLs are showing up. We can’t fit everything into one screen, but if you scroll up to the “All Issues” graph you’ll see an “Export CSV” option…

The export will honor any filters set in the page list, so let’s re-apply that “4xx” filter and pull the data. Your export should download almost immediately. The full export contains a wealth of information, but I’ve zeroed in on just what’s critical for this particular case…

Now, you know not only what pages are missing, but exactly where they link from internally, and can easily pass along suggested fixes to the customer or prospect. Some of these turn out to be link-heavy pages that could probably benefit from some clean-up or updating (if newer recipes are a good fit).
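If you prefer to slice the export in code rather than in a spreadsheet, a quick pandas pass produces the same 404 view. The column names here (“URL”, “Status Code”, “Referring URL”) are assumptions about the export layout rather than confirmed headers:

```python
import pandas as pd

# Assumed column names; check your actual On-Demand Crawl export headers.
crawl = pd.read_csv("on_demand_crawl_export.csv")

not_found = crawl[crawl["Status Code"].between(400, 499)]
# Group broken URLs by the page that links to them for easy handoff.
print(not_found.groupby("Referring URL")["URL"].apply(list))
```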

Let’s try another one. You’ve got 8 duplicate content errors. Potentially thin content could fit theories about the August 1st update, so this is worth digging into. If you filter by “Duplicate Content” issues, you’ll see the following message…

The 8 duplicate issues actually represent 18 pages, and the table returns all 18 affected pages. In some cases, the duplicates will be obvious from the title and/or URL, but in this case there’s a bit of mystery, so let’s pull that export file. In this case, there’s a column called “Duplicate Content Group,” and sorting by it reveals something like the following (there’s a lot more data in the original export file)…

I’ve renamed “Duplicate Content Group” to just “Group” and included the word count (“Words”), which could be useful for verifying true duplicates. Look at group #7 – it turns out that these “Weekly Menu Plan” pages are very image heavy and have a common block of text before any unique text. While not 100% duplicated, these otherwise valuable pages could easily look like thin content to Google and represent a broader problem.
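The same export makes it easy to eyeball those groups: sort by the group column and compare word counts to judge how close the duplicates really are. A sketch using the “Group” and “Words” columns described above; “URL” is an assumed column name:

```python
import pandas as pd

crawl = pd.read_csv("on_demand_crawl_export.csv")

# "Group" and "Words" match the renamed columns described above.
dupes = crawl.dropna(subset=["Group"]).sort_values(["Group", "Words"])
for group, pages in dupes.groupby("Group"):
    print(f"Group {group}: {len(pages)} pages, "
          f"{pages['Words'].min()}-{pages['Words'].max()} words")
    print(pages["URL"].tolist())
```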


Real insights in real-time

Not counting the time spent writing the blog post, running this crawl and diving in took less than an hour, and even that small amount of time uncovered more potential issues than I could cover in this post. In less than an hour, you can walk into a client meeting or sales call with in-depth knowledge of any domain.

Keep in mind that many of these features also exist in our Site Crawl tool. If you’re looking for long-term, campaign insights, use Site Crawl (if you just need to update your data, use our “Recrawl” feature). If you’re looking for quick, one-time insights, check out On-Demand Crawl. Standard Pro users currently get 5 On-Demand Crawls per month (with limits increasing at higher tiers).

Your On-Demand Crawls are currently stored for 90 days. When you re-enter the feature, you’ll see a table of all of your recent crawls (the image below has been truncated):

Click on any row to go back to see the crawl data for that domain. If you get the sale and decide to move forward, congratulations! You can port that domain directly into a Moz campaign.

We hope you’ll try On-Demand Crawl out and let us know what you think. We’d love to hear your case studies, whether it’s sales, competitive analysis, or just trying to solve the mysteries of a Google update.



Moz Blog

