Tag Archive | "Viable"

The Minimum Viable Knowledge You Need to Work with JavaScript & SEO Today

Posted by sergeystefoglo

If your work involves SEO at some level, you’ve most likely been hearing more and more about JavaScript and the implications it has for crawling and indexing. Frankly, Googlebot struggles with it, and many websites today use JavaScript to load in crucial content. Because of this, we need to be equipped to discuss the topic when it comes up in order to be effective.

The goal of this post is to equip you with the minimum viable knowledge required to do so. This post won’t go into the nitty-gritty details, describe the history, or give you extreme detail on specifics. There are a lot of incredible write-ups that already do this — I suggest giving them a read if you’re interested in diving deeper (I’ll link out to my favorites at the bottom).

In order to be effective consultants when it comes to the topic of JavaScript and SEO, we need to be able to answer three questions:

  1. Does the domain/page in question rely on client-side JavaScript to load/change on-page content or links?
  2. If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?
  3. If not, what is the ideal solution?

With some quick searching, I was able to find three examples of landing pages that utilize JavaScript to load in crucial content.

I’m going to use Sitecore’s Symposium landing page throughout these talking points to illustrate how to answer the questions above.

We’ll cover the “how do I do this” aspect first, and at the end I’ll expand on a few core concepts and link to further resources.

Question 1: Does the domain in question rely on client-side JavaScript to load/change on-page content or links?

The first step to diagnosing any issues involving JavaScript is to check if the domain uses it to load in crucial content that could impact SEO (on-page content or links). Ideally this will happen anytime you get a new client (during the initial technical audit), or whenever your client redesigns/launches new features of the site.

How do we go about doing this?

Ask the client

Ask, and you shall receive! Seriously though, one of the quickest/easiest things you can do as a consultant is contact your POC (or developers on the account) and ask them. After all, these are the people who work on the website day-in and day-out!

“Hi [client], we’re currently doing a technical sweep on the site. One thing we check is if any crucial content (links, on-page content) gets loaded in via JavaScript. We will do some manual testing, but an easy way to confirm this is to ask! Could you (or the team) answer the following, please?

1. Are we using client-side JavaScript to load in important content?

2. If yes, can we get a bulleted list of where/what content is loaded in via JavaScript?”

Check manually

Even on a large e-commerce website with millions of pages, there are usually only a handful of important page templates. In my experience, checking them manually should take an hour at most. I use the Web Developer plugin for Chrome to disable JavaScript, then manually check the important templates of the site (homepage, category page, product page, blog post, etc.).

In the example above, once we turn off JavaScript and reload the page, we can see that we are looking at a blank page.

As you make progress, jot down notes about content that isn’t being loaded in or is loading incorrectly, as well as any internal linking that isn’t working properly.

At the end of this step, we should know whether the domain in question relies on JavaScript to load/change on-page content or links. If the answer is yes, we should also know where this happens (homepage, category pages, specific modules, etc.).
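
If you’d rather script this first pass, a quick proxy for the JavaScript-disabled view is to fetch the raw HTML (roughly what a non-rendering crawler receives) and search it for a phrase you can see in your browser. Here’s a minimal sketch in Node.js; the URL and phrase are placeholders:

```javascript
// Quick check: does a phrase you can see in the browser exist in the
// raw, unrendered HTML? Requires Node.js 18+ for the global fetch API.
// The URL and phrase below are placeholders; swap in your own.
const url = 'https://example.com/landing-page';
const phrase = 'Whether you are in marketing';

(async () => {
  const rawHtml = await (await fetch(url)).text();
  if (rawHtml.includes(phrase)) {
    console.log('Found in raw HTML: this content is server-rendered.');
  } else {
    console.log('Missing from raw HTML: likely loaded via client-side JavaScript.');
  }
})();
```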

Crawl

You could also crawl the site (with a tool like Screaming Frog or Sitebulb) with JavaScript rendering turned off, then run the same crawl with JavaScript turned on, and compare the differences in internal links and on-page elements.

For example, it could be that when you crawl the site with JavaScript rendering turned off, the title tags don’t appear. In my mind this would trigger an action to crawl the site with JavaScript rendering turned on to see if the title tags do appear (as well as checking manually).
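
If you want to approximate that two-crawl comparison without a full crawler, a headless browser can load the same page once with JavaScript disabled and once with it enabled. A rough sketch using Puppeteer (the URL is a placeholder):

```javascript
// Compare a page with JavaScript rendering off vs. on.
// npm install puppeteer; the URL below is a placeholder.
const puppeteer = require('puppeteer');

async function snapshot(url, jsEnabled) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setJavaScriptEnabled(jsEnabled);
  await page.goto(url, { waitUntil: 'networkidle0' });
  const title = await page.title();
  const h1 = await page
    .$eval('h1', el => el.textContent)
    .catch(() => '(no <h1> found)');
  await browser.close();
  return { title, h1 };
}

(async () => {
  const url = 'https://example.com/landing-page';
  console.log('JS off:', await snapshot(url, false));
  console.log('JS on: ', await snapshot(url, true));
})();
```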

Example

For our example, I went ahead and did a manual check. As we can see from the screenshot below, when we disable JavaScript, the content does not load.

In other words, the answer to our first question for this page is “yes, JavaScript is being used to load in crucial parts of the site.”

Question 2: If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?

If your client is relying on JavaScript on certain parts of their website (in our example they are), it is our job to try and replicate how Google is actually seeing the page(s). We want to answer the question, “Is Google seeing the page/site the way we want it to?”

In order to get a more accurate depiction of what Googlebot is seeing, we need to attempt to mimic how it crawls the page.

How do we do that?

Use Google’s new mobile-friendly testing tool

At the moment, the quickest and most accurate way to replicate what Googlebot is seeing on a site is by using Google’s new mobile-friendly testing tool. My colleague Dom recently wrote an in-depth post comparing Search Console’s Fetch and Render, Googlebot, and the mobile-friendly tool. His finding was that, most of the time, Googlebot and the mobile-friendly tool produced the same output.

In Google’s mobile-friendly tool, simply input your URL, hit “run test,” and once the test is complete, click on “source code” on the right side of the window. You can search that code for any on-page content (title tags, canonicals, etc.) or links. If they appear here, Google is most likely seeing the content.

Search for visible content in Google

It’s always good to sense-check. Another quick way to check whether Googlebot has indexed content on your page is to select visible text on the page and do a site:search for it in Google, with quotation marks around the text.

In our example there is visible text on the page that reads…

“Whether you are in marketing, business development, or IT, you feel a sense of urgency. Or maybe opportunity?”

When we do a site:search for this exact phrase, for this exact page, we get nothing. This means Google hasn’t indexed the content.

Crawling with a tool

Most crawling tools now have the ability to render JavaScript. For example, in Screaming Frog you can head to Configuration > Spider > Rendering, select “JavaScript” from the dropdown, and hit save. DeepCrawl and Sitebulb both have this feature as well.

From here you can input your domain/URL and see the rendered page/code once your tool of choice has completed the crawl.

Example:

When attempting to answer this question, my preference is to start by inputting the domain into Google’s mobile-friendly tool, copying the source code, and searching for important on-page elements (think title tag, <h1>, body copy, etc.). It’s also helpful to use a tool like Diff Checker to compare the rendered HTML with the original HTML (Screaming Frog also has a function that lets you do this side by side).
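
If you’d like to script that comparison instead, the sketch below fetches the raw HTML, grabs the rendered HTML from a headless browser, and prints only the lines JavaScript added. It assumes Puppeteer and the jsdiff library, and the URL is a placeholder:

```javascript
// Diff the raw HTML against the rendered HTML to spot JS-injected content.
// npm install puppeteer diff; requires Node.js 18+ for global fetch.
const puppeteer = require('puppeteer');
const { diffLines } = require('diff');

(async () => {
  const url = 'https://example.com/landing-page';

  // Raw HTML: what a non-rendering crawler receives.
  const rawHtml = await (await fetch(url)).text();

  // Rendered HTML: the DOM after JavaScript has executed.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const renderedHtml = await page.content();
  await browser.close();

  // Print only the lines that JavaScript added to the page.
  for (const part of diffLines(rawHtml, renderedHtml)) {
    if (part.added) process.stdout.write(part.value);
  }
})();
```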

For our example, here is what the mobile-friendly tool shows us.

After a few searches, it becomes clear that important on-page elements are missing here.

We also did the second test and confirmed that Google hasn’t indexed the body content found on this page.

The implication at this point is that Googlebot is not seeing our content the way we want it to, which is a problem.

Let’s jump ahead and see what we can recommend to the client.

Question 3: If we’re confident Googlebot isn’t seeing our content properly, what should we recommend?

Now that we know the domain is using JavaScript to load in crucial content, and that Googlebot is most likely not seeing it, the final step is to recommend an ideal solution to the client. Key word: recommend, not implement. It’s 100% our job to flag the issue to our client, explain why it’s important (as well as the possible implications), and highlight an ideal solution. It is 100% not our job to try to do the developers’ job of figuring out an ideal solution within their unique stack/resources/etc.

How do we do that?

You want server-side rendering

The main reason Google is having trouble seeing Sitecore’s landing page right now is that the page asks the user (us, Googlebot) to do the heavy work of loading the JavaScript. In other words, they’re using client-side JavaScript.

Googlebot literally lands on the page, tries to execute JavaScript as best it can, and then has to leave before it gets a chance to see any content.

The fix here is to instead have Sitecore’s landing page load on their server. In other words, we want to take the heavy lifting off of Googlebot, and put it on Sitecore’s servers. This will ensure that when Googlebot comes to the page, it doesn’t have to do any heavy lifting and instead can crawl the rendered HTML.

In this scenario, Googlebot lands on the page and already sees the HTML (and all the content).
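
To make the model concrete, here’s a deliberately minimal sketch of server-side rendering using Express. The page content is hard-coded purely for illustration; a real implementation would render templates or components on the server before responding:

```javascript
// A minimal server-side rendering sketch using Express (npm install express).
// The content is hard-coded to illustrate the model: the crucial content is
// already present in the HTML the server sends, so Googlebot never has to
// execute JavaScript to see it.
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  res.send(`<!DOCTYPE html>
<html>
  <head><title>Sitecore Symposium</title></head>
  <body>
    <h1>Sitecore Symposium</h1>
    <p>Whether you are in marketing, business development, or IT,
       you feel a sense of urgency. Or maybe opportunity?</p>
  </body>
</html>`);
});

app.listen(3000, () => console.log('SSR sketch listening on port 3000'));
```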

There are more specific options (like isomorphic setups)

This is where it gets to be a bit in the weeds, but there are hybrid solutions. The best one at the moment is called isomorphic.

In this model, we’re asking the client to render the first request on their server; any future requests are then made client-side.

So Googlebot comes to the page, the client’s server has already executed the initial JavaScript needed for the page, sends the rendered HTML down to the browser, and anything after that is done on the client-side.

If you’re looking to recommend this as a solution, please read this post from the Airbnb team, which covers isomorphic setups in detail.
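
For a rough sense of the mechanics, here’s a simplified sketch of the isomorphic pattern using React’s server renderer. The App component is hypothetical, and the routing, data fetching, and build tooling a production setup needs are omitted:

```javascript
// A simplified isomorphic (universal) rendering sketch with React.
// Server side: render the first request to a complete HTML string.
// "App" is a hypothetical component shared between server and client.
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App');

function renderPage() {
  const appHtml = renderToString(React.createElement(App));
  return `<!DOCTYPE html>
<html>
  <body>
    <div id="root">${appHtml}</div>
    <script src="/bundle.js"></script>
  </body>
</html>`;
}

module.exports = renderPage;

// Client side (inside bundle.js): hydrate() attaches React to the markup
// the server already rendered; navigation after this first request
// happens client-side.
//
//   import { hydrate } from 'react-dom';
//   import App from './App';
//   hydrate(<App />, document.getElementById('root'));
```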

AJAX crawling = no go

I won’t go into details on this, but just know that Google’s previous AJAX crawling solution for JavaScript has since been deprecated and will eventually stop working. We shouldn’t be recommending this method.

(However, I am interested to hear any case studies from anyone who has implemented this solution recently. How has Google responded? Also, here’s a great write-up on this from my colleague Rob.)

Summary

At the risk of severely oversimplifying, here’s what you need to do in order to start working with JavaScript and SEO in 2018:

  1. Know when/where your client’s domain uses client-side JavaScript to load in on-page content or links.
    1. Ask the developers.
    2. Turn off JavaScript and do some manual testing by page template.
    3. Crawl using a JavaScript crawler.
  2. Check to see if Googlebot is seeing content the way we intend it to.
    1. Use Google’s mobile-friendly testing tool.
    2. Do a site:search for visible content on the page.
    3. Crawl using a JavaScript crawler.
  3. Give an ideal recommendation to the client.
    1. Server-side rendering.
    2. Hybrid solutions (isomorphic).
    3. Not AJAX crawling.

Further resources

I’m really interested to hear about any of your experiences with JavaScript and SEO. What are some examples of things that have worked well for you? What about things that haven’t worked so well? If you’ve implemented an isomorphic setup, I’m curious to hear how that’s impacted how Googlebot sees your site.



Moz Blog


Minimum Viable SEO: If You Only Have a Few Minutes Each Week… Do This! – Whiteboard Friday

Posted by randfish

Even if you know — deep down in your heart of hearts — how important SEO is, it’s hard to prioritize when you have less than 3 hours a month to devote to it. But there’s still a way to include the bare minimum, even if you run on a tight schedule. In today’s Whiteboard Friday, Rand covers a minimum viable SEO strategy to give those with limited time a plan going forward.

Minimum Viable SEO


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week, Minimum Viable SEO. So if you only have a few minutes in a month, in a week to do some SEO, and I know many of you are professional SEOs, but you work with lots of folks, like content creators, clients, web developers, who have very, very limited time, what I want to try and do is provide a path for you of “do this if you have no other time in the week to do your SEO.”

So let’s say here’s my calendar. It’s February, so 28 days. Start of the month, you have an hour to give me, sometime in the first week of the month. It doesn’t have to be, but that’s a great way to go. At the start of each week, I’m going to ask for 10 minutes just to do a little bit of planning, and then each time you publish content, a very, very small amount of time, just 3 minutes.

I know it sounds hard to believe, but you can get a fair amount of solid SEO work done. Especially if you’re in an industry that’s not hyper-competitive, or if you’re going after the right kinds of keywords that aren’t super competitive, you can really make a difference. If you’re building up a lot of content over months and years, just following this simple protocol can really take your SEO to the next level.

Start of the month: 1 hour

So, all right, let’s say we’re at the start of our month. We have our hour. I want you to do one of two things, and this is going to be based on your technical SEO. If your website is using WordPress and it’s pretty much nicely crawlable, maybe you’ve signed up for Google Search Console, you don’t see a lot of errors, there aren’t a lot of issues, and you haven’t built up a bunch of technical debt on your website in the past, great, fine, then you’re going to be focused on keywords and content. A keyword to content map, which is something we’ve discussed here on Whiteboard Friday — I’d urge you to check that video out if you haven’t yet — but I’m going to make an MVP version, a very, very small version that can help a little bit.

Keyword → content map MVP

Create a spreadsheet with valuable keywords…

For that spreadsheet, I just want a few things in it, three things really. First, the most valuable keywords: the keywords that you know you’re targeting or that you care about right now for your business. You think that people are searching for these keywords. Maybe you’ve done a little bit of keyword research. It could be for free, through Google’s AdWords tool, or you could pay for something like Keyword Explorer from Moz, but, really, just 50 to 100 keywords in there.

…current rank and SERP features…

I want the current rank and whatever SERP features appear. You could even trim this down to just your current ranking and the top SERP feature, so whether it has a featured snippet, or videos, or it shows maps or news or tweets, whatever that is.

…and the URL targeting it (or a note to create content).

Then I want the URL that’s targeting it. Or if you have no URL targeting it yet, you haven’t yet created a piece of content that targets this keyword, put a little, “Okay, that’s a ‘needs to be created.’ I need this before I can start targeting this keyword and trying to rank for it.”

You’re going to update this weekly. You can do that totally manually. Fifty keywords, you can look them up in an hour. You can check the rankings. You can see where you’re going. That’s fine. It’s a little bit of a pain in the butt, but it can totally be done. Or you could use a tool, Moz Pro, Ahrefs, SEMRush, Searchmetrics. There are all sorts of tools out there that’ll track rankings and show you which features appear and whether your URLs are in there or not.
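
If it helps to see the shape of that spreadsheet, here’s a tiny illustrative version of the map as data; every keyword, rank, SERP feature, and URL below is made up:

```javascript
// A tiny keyword → content map: the three columns described above.
// Every keyword, rank, SERP feature, and URL here is a made-up example.
const keywordMap = [
  { keyword: 'faberge eggs',         rank: 8,    topSerpFeature: 'featured snippet', url: '/faberge-eggs' },
  { keyword: 'faberge egg museums',  rank: 14,   topSerpFeature: 'maps',             url: '/faberge-eggs/museums' },
  { keyword: 'faberge egg replicas', rank: null, topSerpFeature: 'images',           url: null }, // needs to be created
];

// Flag keywords that still need a page created before they can rank.
keywordMap
  .filter(row => row.url === null)
  .forEach(row => console.log(`Needs to be created: ${row.keyword}`));
```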

Okay, this is our keyword to content map. If you have that hour but you know you have technical issues on the site, I’m going to urge you, before you focus on keywords and content, to make sure your technical SEO, your crawl, is set. That means, step one, just a basic, simple crawl analysis. For free, you can use Google Search Console. It will show you, most of the time with relative accuracy, big important errors like 404s and 500s and things that Google thought were duplicate content and that kind of stuff.

If you want to pay, you can get some more advanced features, better filters and sorting, more frequent crawling, and those kinds of things. Moz Pro is fine for that. Screaming Frog is good, and so is OnPage.org. All of these are popular in the SEO field.

Crawl/technical SEO review

Step two, you don’t need to worry about every single crawl issue. I just want you to worry about the most severe, most important ones with your one hour. Those are things like 404s and 500s, which can cause a lot of problems; duplicate content, where you potentially need to use a rel=canonical or a 301 redirect; broken links, where you just go in and point the broken link at something that’s not broken; missing or bad titles, meaning title elements that are particularly long, include misspellings, or just don’t exist (it’s very bad to have a page on the web with no title); and thin content or no crawlable content. Those are really the worst of the bunch. There are a number more that you could take care of, but if you only have that limited time, take care of these. If you’ve already done this, then we can move on.
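
To make two of those fixes concrete, here’s a hypothetical sketch in Express showing a 301 redirect for a moved URL and a rel=canonical tag on a duplicate page. All routes and URLs are made-up examples:

```javascript
// Two common crawl fixes, sketched in Express (npm install express).
// All routes and URLs are hypothetical examples.
const express = require('express');
const app = express();

// Moved or broken URL: a 301 permanently redirects the old address,
// sending visitors (and link equity) to the replacement page.
app.get('/old-category', (req, res) => {
  res.redirect(301, '/new-category');
});

// Duplicate content: rel=canonical tells search engines which version
// of the page is the preferred one to index.
app.get('/widget/print', (req, res) => {
  res.send(`<!DOCTYPE html>
<html>
  <head>
    <title>Widget</title>
    <link rel="canonical" href="https://example.com/widget">
  </head>
  <body><h1>Widget</h1></body>
</html>`);
});

app.listen(3000);
```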

Every time you publish a piece of content: 3 minutes

Finally, last thing, but not the least, every time you publish a piece of content, I’m going to ask for just three minutes of your time, and that is going to be around this minimum viable pre-publish checklist.

The minimum viable pre-publish checklist

So does the content have a keyword target? Yes, no, maybe? If it doesn’t, you’re going to need to go and refer over to your keyword content list and make sure that it does. So if you’re publishing something, I’m assuming you’re not publishing a tremendous amount of content, but a little bit. Make sure every piece has a keyword target. Make sure, if you can, that it’s targeting two to three additional keywords, related keywords. So let’s say I’m going after something like Faberge eggs. I probably also want to target Carl Faberge, or I want to target Faberge egg museums, or I want to target Faberge egg replicas, so these other terms and phrases that people are likely searching for that could have the same or similar keyword intent, that could live on the same page, that kind of thing.

Is that keyword in the title, the main one you’re targeting? Do you have a compelling meta description? Is your content doing a good job of truly answering the searchers’ queries? So if they’ve searched for this thing, are you serving up the content they need?

Then, have you used related topics? You can get those from places like the MozBar or MarketMuse or SEO Zone or Moz Pro. Related topics are essentially the words and phrases that you should also be using in addition to your keyword to indicate to the search engines, “Hey, this is really about this topic.” We’ve seen some nice bumps from that.

You’re doing this every time you publish content. It only takes three minutes.

Start of the week: 10 minutes

And the last thing, at the start of the week, I’m also asking you for these 10 minutes to do one or two actions. I just want you to plan one or two actions at the start of the week to bump your SEO. It could include some publication stuff. But let’s assume you’re just doing these three minutes every time you do that.

Take a few actions to boost your SEO

Link outreach and targeting keywords with content

At the start of the week, the last thing you’re doing is just choosing one of these, maybe two. I don’t need more. I want you to do something like link outreach. Reach out to a couple of high-potential targets. Maybe you use a link intersect tool to figure out who’s linking to two of your competitors. Or reach out to partners, to friends, do some content contributions, just a little thing to get one or two links. Or maybe create some content that’s targeting a missed keyword. When you do that, of course, you go through your pre-publish checklist.

Upgrade ranking content

Maybe you are upgrading some content that’s already ranking, like number 5 through 20. That’s where there’s a lot of opportunity for a high-value keyword to get bumped up. You could just do little things, like make sure that it’s serving all of these items, try and get it a featured snippet, identify content that might be old, that needs a refresh, that’s not serving the searcher intent as well because the information in there is old.

Contribute off-site content

Or you could try contributing some off-site content. That could be to places like YouTube (maybe you’ve seen videos show up for something), guest posts, a forum where you contribute, answering some questions on Quora, contributing something to LinkedIn or Medium, just something to get your brand, your content, and hopefully a link out there to a different audience than what’s already coming to your site.

You do these things, right: you start the month with an hour, every time you publish content you put in 3 minutes, and at the start of the week you put in 10 minutes to do a couple pieces of planning. This will take you a long way. Look, SEO professionals are going to do a lot more than this, for sure. But this can be a great start, a great way to get that SEO kicked off, to have a minimum viable SEO plan.

I look forward to your thoughts. And we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com



Moz Blog


Can You Apply The Minimum Viable Product Principle To Information Products?

I recently listened to the audiobook version of the Lean Startup by Eric Ries.

My motivation for “reading” it is to help with the development of CrankyAds, my software startup. Lean startup principles are particularly good for a software company because the unlimited features you could develop (given unlimited resources) contrast with the need to just get…

Entrepreneurs-Journey.com by Yaro Starak



