Tag Archive | "Need"

Google Home & Assistant Listings May Need Google Guaranteed Label

You’ve seen Google’s local services ads before; they carry the Google Guarantee label. But when the Google Assistant started showing these ads, people asked how a business can appear in the local listings without paying, given that the Guarantee label seems to be required there and the label only comes from local services ads.


Search Engine Roundtable


There’s no shortcut to authority: Why you need to take E-A-T seriously

Following expertise, authoritativeness and trustworthiness guidelines should be a part of every SEO strategy no matter your niche.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Get started with marketing automation–learn the terms you need to know

Your cheat sheet to demystifying marketing automation.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Why Local Businesses Will Need Websites More than Ever in 2019

Posted by MiriamEllis

“64% of 1,411 surveyed local business marketers agree that Google is becoming the new ‘homepage’ for local businesses.” – Via Moz State of Local SEO Industry Report

…but please don’t come away with the wrong storyline from this statistic.

As local brands and their marketers watch Google play Trojan horse, shifting from top benefactor to top competitor by replacing former “free” publicity with paid packs, Local Service Ads, zero-click SERPs, and related structures, it’s no surprise to see forum members asking, “Do I even need a website anymore?”

Our answer to this question is, “Yes, you’ve never needed a website more than you will in 2019.” In this post, we’ll examine:

  • Why it looks like local businesses don’t need websites
  • Statistical evidence that local businesses need websites now more than ever
  • The current status of local business websites and most-needed improvements

How Google stopped bearing so many gifts

Within recent memory, a Google query with local intent brought up a big pack of ten nearby businesses, with each entry taking the user directly to these brands’ websites for all of their next steps. A modest amount of marketing effort was rewarded with a shower of Google gifts in the form of rankings, traffic, and conversions.

Then these generous SERPs shrank to seven spots, and then three, with the mobile sea change thrown into the bargain, bringing layers and layers of Google-owned interfaces instead of direct-to-website links. In 2018, when we rustle through the wrapping paper, the presents we find from Google look cheaper, smaller, and less magnificent.

Consider these five key developments:

1) Zero-click mobile SERPs

This slide from a recent presentation by Rand Fishkin encapsulates his findings regarding the growth of no-click SERPs between 2016 and 2018. Mobile users have experienced a 20% increase in delivery of search engine results that don’t require them to go any deeper than Google’s own interface.

2) The encroachment of paid ads into local packs

When Dr. Peter J. Myers surveyed 11,000 SERPs in 2018, he found that 35% of competitive local packs feature ads.

3) Google becoming a lead gen agency

At last count, Google’s Local Service Ads program, via which Google interposes itself as the paid lead gen agent between businesses and consumers, has taken over 23 business categories in 77 US cities.

4) Even your branded SERPs don’t belong to you

When a user specifically searches for your brand and your Google Knowledge Panel pops up, you can likely cope with the long-standing “People Also Search For” set of competitors at the bottom of it. But that’s not the same as Google allowing Groupon to advertise at the top of your KP, or putting lead gen from Doordash and GrubHub front and center to nickel and dime you on your own customers’ orders.

5) Google is being called the new “homepage” for local businesses

As highlighted at the beginning of this post, 64% of marketers agree that Google is becoming the new “homepage” for local businesses. This concept, coined by Mike Blumenthal, signifies that a user looking at a Google Knowledge Panel can get basic business info, make a phone call, get directions, book something, ask a question, take a virtual tour, read microblog posts, see hours of operation, thumb through photos, see busy times, read and leave reviews. Without ever having to click through to a brand’s domain, the user may be fully satisfied.

“Nothing is enough for the man to whom enough is too little.”
- Epicurus

There are many more examples we could gather, but they can all be summed up in one way: None of Google’s most recent local initiatives are about driving customers to brands’ own websites. Local SERPs have shrunk and have been re-engineered to keep users within Google’s platforms to generate maximum revenue for Google and their partners.

You may be as philosophical as Epicurus about this and say that Google has every right to be as profitable as they can with their own product, even if they don’t really need to siphon more revenue off local businesses. But if Google’s recent trajectory causes your brand or agency to conclude that websites have become obsolete in this heavily controlled environment, please keep reading.

Your website is your bedrock

“65% of 1,411 surveyed marketers observe strong correlation between organic and local rank.” – Via Moz State of Local SEO Industry Report

What this means is that businesses which rank highly organically are very likely to have high associated local pack rankings. In the following screenshot, if you take away the directory-type platforms, you will see how the brand websites ranking on page 1 for “deli athens ga” are also the two businesses that have made it into Google’s local pack:

How often do the top 3 Google local pack results also have first-page organic rankings?

In a small study, we looked at 15 head keywords across 7 US cities and towns (105 SERPs, each with a three-spot local pack), which yielded 315 possible entries in Google’s local pack. Of those 315, 235 of the businesses ranking in the local packs also had page 1 organic rankings. That’s roughly a 75% correlation (235/315 ≈ 74.6%) between organic website rankings and local pack presence.

*It’s worth noting that where local and organic results did not correlate, it was sometimes due to the presence of spam GMB listings, or to mystery SERPs that did not make sense at first glance (perhaps, in some cases, a result of Google testing).

Additionally, many local businesses are not making it to the first page of Google anymore in some categories because the organic SERPs are inundated with best-of lists and directories. Often, local business websites were pushed down to the second page of the organic results. In other words, if spam, “best-ofs,” and mysteries were removed, the local-organic correlation would likely be much higher than 75%.

Further, one recent study found that even when Google’s Local Service Ads are present, 43.9% of clicks went to the organic SERPs. Obviously, if you can make it to the top of the organic SERPs, this puts you in very good CTR shape from a purely organic standpoint.

Your takeaway from this

The local businesses you market may not be able to stave off the onslaught of Google’s zero-click SERPs, paid SERPs, and lead gen features, but where “free” local 3-packs still exist, your very best bet for being included in them is to have the strongest possible website. Moreover, organic SERPs remain a substantial source of clicks.

Far from it being the case that websites have become obsolete, they are the firmest bedrock for maintaining free local SERP visibility amidst an increasing scarcity of opportunities.

This calls for an industry-wide doubling down on organic metrics that matter most.

Bridging the local-organic gap

“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
- Aristotle

A 2017 CNBC survey found that 45% of small businesses have no website, and, while most large enterprises have websites, many local businesses qualify as “small.”

Moreover, a recent audit of 9,392 Google My Business listings found that 27% have no website link.

When 1,411 marketers were asked which one task they want clients to devote more resources to, it’s no coincidence that 66% named a website-oriented asset. This includes local content development, on-site optimization, local link building, technical analysis of rankings/traffic/conversions, and website design, as shown in the following Moz survey graphic:

In an environment in which websites are table stakes for competitive local pack rankings, virtually all local businesses not only need one, but they need it to be as strong as possible so that it achieves maximum organic rankings.

What makes a website strong?

The Moz Beginner’s Guide to SEO offers incredibly detailed guidelines for creating the best possible website. While we recommend that everyone marketing a local business read through this in-depth guide, we can sum up its contents here by stating that strong websites combine:

  • Technical basics
  • Excellent usability
  • On-site optimization
  • Relevant content publication
  • Publicity

For our present purpose, let’s take a special look at those last three elements.

On-site optimization and relevant content publication

There was a time when on-site SEO and content development were treated almost independently of one another. And while local businesses will need to make a little extra effort to put their basic contact information in prominent places on their websites (such as the footer and Contact Us page), publication and optimization should be viewed as a single topic. A modern strategy takes all of the following into account:

  • Keyword and real-world research tell a local business what consumers want
  • These consumer desires are then reflected in what the business publishes on its website, including its homepage, location landing pages, about page, blog and other components
  • Full reflection of consumer desires includes ensuring that human language (discovered via keyword and real-world research) is implemented in all elements of each page, including its tags, headings, descriptions, text, and in some cases, markup

What we’re describing here isn’t a set of disconnected efforts. It’s a single effort that’s integral to researching, writing, and publishing the website. Rather than stuffing keywords into a tag or a page’s content, the focus has shifted to building topical authority in the eyes of search engines like Google by building an authoritative resource for a particular consumer demographic. The more closely a business is able to reflect customers’ needs (including the language of their needs) in every possible component of its website, the more relevant it becomes.

A hypothetical example of this would be a large medical clinic in Dallas. Last year, their phone staff was inundated with basic questions about flu shots, like where and when to get them, what they cost, would they cause side effects, what about side effects on people with pre-existing health conditions, etc. This year, the medical center’s marketing team took a look at Moz Keyword Explorer and saw that there’s an enormous volume of questions surrounding flu shots:

This tiny segment of the findings of the free keyword research tool, Answer the Public, further illustrates how many questions people have about flu shots:

The medical clinic need not compete nationally for these topics, but at a local level, a page on the website can answer nearly every question a nearby patient could have about this subject. The page, created properly, will reflect human language in its tags, headings, descriptions, text, and markup. It will tell all patients where to come and when to come for this procedure. It has the potential to cut down on time-consuming phone calls.

And, finally, it will build topical authority in the eyes of Google to strengthen the clinic’s chances of ranking well organically… which can then translate to improved local rankings.

It’s important to note that keyword research tools typically do not reflect location very accurately, so research is typically done at a national level, and then adjusted to reflect regional or local language differences and geographic terms, after the fact. In other words, a keyword tool may not accurately reflect exactly how many local consumers in Dallas are asking “Where do I get a flu shot?”, but keyword and real-world research signals that this type of question is definitely being asked. The local business website can reflect this question while also adding in the necessary geographic terms.

Local link building must be brought to the fore of publicity efforts

Moz’s industry survey found that more than one-third of respondents had no local link building strategy in place. Meanwhile, link building was listed as one of the top three tasks to which marketers want their clients to devote more resources. There’s clearly a disconnect going on here. Given the fundamental role links play in building Domain Authority, organic rankings, and subsequent local rankings, building strong websites means bridging this gap.

First, it might help to examine old prejudices that could cause local business marketers and their clients to feel dubious about link building. These most likely stem from link spam which has gotten so out of hand in the general world of SEO that Google has had to penalize it and filter it to the best of their ability.

Not long ago, many digital-only businesses were having a heyday with paid links, link farms, reciprocal links, abusive link anchor text, and the like. An online company might accrue thousands of links from completely irrelevant sources, all in hopes of escalating rank. Clearly, these practices aren’t ones an ethical business can feel good about investing in, but they do serve as an interesting object lesson, especially when a local marketer can point out to a client that the best local links typically result from real-world relationship-building.

Local businesses are truly special because they serve a distinct, physical community made up of their own neighbors. The more involved a local business is in its own community, the more naturally link opportunities arise from things like local:

  • Sponsorships
  • Event participation and hosting
  • Online news
  • Blogs
  • Business associations
  • B2B cross-promotions

There are so many ways a local business can build genuine topical and domain authority in a given community by dint of the relationships it develops with neighbors.

An excellent way to get started on this effort is to look at high-ranking local businesses in the same or similar business categories to discover what work they’ve put in to achieve a supportive backlink profile. Moz Link Intersect is an extremely actionable resource for this, enabling a business to input its top competitors to find who is linking to them.

In the following example, a small B&B in Albuquerque looks up two luxurious Tribal resorts in its city:

Link Intersect then lists out a blueprint of opportunities, showing which links one or both competitors have earned. Drilling down, the B&B finds that Marriott.com is linking to both Tribal resorts on an Albuquerque things-to-do page:

The small B&B can then try to earn a spot on that same page, because it hosts lavish tea parties as a thing-to-do. Outreach could depend on the B&B owner knowing someone who works at the local Marriott personally. It could include meeting with them in person, or on the phone, or even via email. If this outreach succeeds, an excellent, relevant link will have been earned to boost organic rank, underpinning local rank.

Then, repeat the process. Aristotle might well have been speaking of link building when he said we are what we repeatedly do and that excellence is a habit. Good marketers can teach customers to have excellent habits in recognizing a good link opportunity when they see it.

Taken altogether

Without a website, a local business lacks the brand-controlled publishing and link-earning platform that so strongly influences organic rankings. In the absence of this, the chances of ranking well in competitive local packs will be significantly less. Taken altogether, the case is clear for local businesses investing substantially in their websites.

Acting now is actually a strategy for the future

“There is nothing permanent except change.”
- Heraclitus

You’ve now determined that strong websites are fundamental to local rankings in competitive markets. You’ve absorbed numerous reasons to encourage local businesses you market to prioritize care of their domains. But there’s one more thing you’ll need to be able to convey, and that’s a sense of urgency.

Right now, every single customer you can still earn from a free local pack listing is immensely valuable for the future.

This isn’t a customer you’ve had to pay Google for, as you very well might six months, a year, or five years from now. Yes, you’ve had to invest plenty in developing the strong website that contributed to the high local ranking, but you haven’t paid a penny directly to Google for this particular lead. Soon, you may be having to fork over commissions to Google for a large portion of your new customers, so acting now is like insurance against future spend.

For this to work out properly, local businesses must take the leads Google is sending them right now for free, and convert them into long-term, loyal customers with an ultimate value of multiple future transactions, without Google as the middleman. And if these freely won customers can be inspired to act as word-of-mouth advocates for your brand, you will have done something substantial to develop a stream of non-Google-dependent revenue.

This offer may well expire as time goes by. When it comes to the capricious local SERPs, marketers resemble the Greek philosophers who knew that change is the only constant. The Trojan horse has rolled into every US city, and it’s a gift with a questionable shelf life. We can’t predict if or when free packs might become obsolete, but we share your concerns about the way the wind is blowing.

What we can see clearly right now is that websites will be anything but obsolete in 2019. Rather, they are the building blocks of local rankings, precious free leads, and loyal revenue, regardless of how SERPs may alter in future.

For more insights into where local businesses should focus in 2019, be sure to explore the Moz State of Local SEO industry report:

Read the State of Local SEO industry report

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


5 Social Media Tactics You Need to STOP Using (And What You Should Do Instead)

These days, it seems like everybody is using social media. You’d be hard-pressed to find someone who doesn’t have a Facebook or Instagram account. Statistics have shown that there are now 2.2 billion social media users around the world, and the numbers are expected to reach 3 billion by 2020. With such a massive reach, it’s no wonder that every year more companies use social media as part of their marketing strategy.

However, it’s not enough to have a social media account; you also need to use effective strategies to make them work. Unfortunately, a lot of companies are still behind the times and are using outdated tactics that may actually be doing them more harm than good.

Are you guilty of any of these social media faux pas?

1. Engaging Only When You Need Something

Social media is a communication tool and the interaction goes two ways. Some brands look at social media strictly as a promotional tool and only post when they need something. But today’s consumers are pretty savvy and know when they’re being used, so don’t expect this strategy to be well-received.

Better Tactic:

Engage your audience regularly. Ask questions. Join conversations and make sure you actually have something worthwhile to say. Don’t just show up, post a link, and then disappear. Personalizing your interactions with customers is time-consuming, but it’s a great way of engaging them and building rapport.

2. Using Too Many Hashtags

Hashtags are great! They make your post easy to find on social media platforms like Twitter and Instagram. Plus, it’s fun trying to come up with witty hashtags. What’s not fun is when hashtags are used excessively, so stop if you’re guilty of this. An avalanche of hashtags makes you look desperate and spammy, especially if you’re hashtagging every adjective that comes to mind, even ones that aren’t relevant to your product (e.g., #blue, #cool, #nice, #small).

Better Tactic:

Take the time to come up with an appropriate hashtag. Be deliberate in your descriptions and ensure your hashtags are relevant to your product. More importantly, make sure your post has more words than hashtags. This will ensure that your audience is focusing on your message and not on the #.

3. Jumping on the Social Media Bandwagon

Reacting to every trending topic is one social media habit that you need to let go of. Some brands jump on a popular topic or meme simply to start a conversation or to appear relevant. If it doesn’t fit your demographic or brand, then your audience doesn’t need to hear your thoughts about it. For instance, your post congratulating Prince Harry on becoming a father will fall flat when your main audience is in Southeast Asia.

Better Tactic:

If you are going to say something about a particular topic, make sure your post will bring something to the table. Ask yourself if what you’ll be sharing is relevant to the discussion, your brand and market. If not, then there’s no need to post that meme.

4. Inappropriate Tagging of People or Companies

Tagging is a great way of calling attention to your posts. But it doesn’t make sense to tag people or brands in promos or images when they’re not in them or have no clear connection to the post. This move is reminiscent of a mass email campaign. It’s obviously generic, sloppy, and just as irritating. It’s also quite rude to tag someone without making an effort to personalize the request or post.

Better Tactic:

You’ll have a higher chance of getting a brand to help you if you send a direct message or tag them in a separate post first. If the company or influencer is someone you have worked with in the past, then include their links in your post. For instance, you can thank the influencer for their article on your company and include the link. Then segue to your promo and call-to-action.

5. Limiting Posts to the “Best Time”

Studies have shown that there are best times to post on social media. However, these are calculated based on averages; on the times that the majority of users are active and engaged. But every demographic is different. What if your specific followers are not active during those reported “best times?”

Better Tactic:

Don’t rely solely on the aforementioned studies; conduct your own research as well. Utilize your social media tools and check when your audiences are really online. FB Insights will display this for your Page, and there are also tools that will tell you when your Twitter followers are active. Experiment and post at different times and days. This will help you come up with your own unique pattern of engagement.

Social media is a great marketing tool. However, a strategy that works for one brand might not work for another. So make sure that the tactics you use are relevant to your company and your market.

[Featured image via Pixabay]

The post 5 Social Media Tactics You Need to STOP Using (And What You Should Do Instead) appeared first on WebProNews – Breaking News in Tech, Search, Social, & Business.


WebProNews – Breaking News in Tech, Search, Social, & Business


Why Content Marketers Need Editors

I’m good at math. If you looked at my standardized test results from when I was back in school, you’d…

The post Why Content Marketers Need Editors appeared first on Copyblogger.


Copyblogger


Do You Need Local Pages? – Whiteboard Friday

Posted by Tom.Capper

Does it make sense for you to create local-specific pages on your website? Regardless of whether you own or market a local business, it may make sense to compete for space in the organic SERPs using local pages. Please give a warm welcome to our friend Tom Capper as he shares a 4-point process for determining whether local pages are something you should explore in this week’s Whiteboard Friday!


Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hello, Moz fans. Welcome to another Whiteboard Friday. I’m Tom Capper. I’m a consultant at Distilled, and today I’m going to be talking to you about whether you need local pages. Just to be clear right off the bat what I’m talking about, I’m not talking about local rankings as we normally think of them, the local map pack results that you see in search results, the Google Maps rankings, that kind of thing.

A 4-step process to deciding whether you need local pages

I’m talking about conventional, 10 blue links rankings but for local pages, and by local pages I mean pages from a national or international business that are location-specific. What are some examples of that? Maybe on Indeed.com they would have a page for jobs in Seattle. Indeed doesn’t have a bricks-and-mortar premises in Seattle, but they do have a page that is about jobs in Seattle.

You might get a similar thing with flower delivery. You might get a similar thing with used cars, all sorts of different verticals. I think it can actually be quite a broadly applicable tactic. There’s a four-step process I’m going to outline for you. The first step is actually not on the board. It’s just doing some keyword research.

1. Know (or discover) your key transactional terms

I haven’t done much on that here because hopefully you’ve already done that; you already know what your key transactional terms are. Whatever happens, you don’t want to end up developing location pages for too many different keyword types, because that’s going to bloat your site, so you probably just need to pick one or two key transactional terms that you’re going to make up the local variants of. For this purpose, I’m going to talk through an SEO job board as an example.

2. Categorize your keywords as implicit, explicit, or near me and log their search volumes

We might have “SEO jobs” as our core head term. We then want to figure out what the implicit, explicit, and near me versions of that keyword are and what the different volumes are. In this case, the implicit version is probably just “SEO jobs.” If you search for “SEO jobs” now, like if you open a new tab in your browser, you’re probably going to find that a lot of locally oriented results appear, because that is an implicitly local term. Actually, an awful lot of terms are using local data to affect rankings now, which does affect how you should consider your rank tracking, but we’ll get on to that later.

SEO jobs, maybe SEO vacancies, that kind of thing, those are all going to be going into your implicitly local terms bucket. The next bucket is your explicitly local terms. That’s going to be things like SEO jobs in Seattle, SEO jobs in London, and so on. You’re never going to get a complete coverage of different locations. Try to keep it simple.

You’re just trying to get a rough idea here. Lastly you’ve got your near me or nearby terms, and it turns out that for SEO jobs not many people search SEO jobs near me or SEO jobs nearby. This is also going to vary a lot by vertical. I would imagine that if you’re in food delivery or something like that, then that would be huge.
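To make the bucketing concrete, here’s a minimal Python sketch of this categorization step. It isn’t from the video; the city list and the keyword volumes are invented placeholders, and in practice both would come from your keyword research tool.

    # Hypothetical sketch: bucket keywords as implicit, explicit, or "near me".
    # CITIES and the volume figures below are made-up placeholders.
    CITIES = ["seattle", "london", "new york"]

    def bucket(keyword):
        kw = keyword.lower()
        if "near me" in kw or "nearby" in kw:
            return "near me"
        if any(city in kw for city in CITIES):
            return "explicit"
        return "implicit"  # e.g. "seo jobs", "seo vacancies"

    volumes = {"seo jobs": 5400, "seo jobs in seattle": 90, "seo jobs near me": 20}
    totals = {}
    for kw, vol in volumes.items():
        totals[bucket(kw)] = totals.get(bucket(kw), 0) + vol
    print(totals)  # {'implicit': 5400, 'explicit': 90, 'near me': 20}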

3. Examine the SERPs to see whether local-specific pages are ranking

Now we’ve categorized our keywords. We want to figure out what kind of results are going to do well for what kind of keywords, because obviously if local pages are the answer, then we might want to build some.

In this case, I’m looking at the SERP for “SEO jobs.” This is imaginary. The rankings don’t really look like this. But we’ve got SEO jobs in Seattle from Indeed. That’s an example of a local page, because this is a national business with a location-specific page. Then we’ve got SEO jobs Glassdoor. That’s a national page, because in this case they’re not putting anything on this page that makes it location specific.

Then we’ve got SEO jobs Seattle Times. That’s a local business. The Seattle Times only operates in Seattle. It probably has a bricks-and-mortar location. If you’re going to be pulling a lot of data of this type, maybe from stats or something like that, obviously tracking from the locations that you’re mentioning, then you’re probably going to want to categorize these at scale rather than going through one at a time.

I’ve drawn up a little flowchart here that you could encapsulate in an Excel formula or something like that. If the location is mentioned in the URL and in the domain, then we know we’ve got a local business; it’s just a rule of thumb, but it holds most of the time. If the location is mentioned in the URL but not in the domain, then we know we’ve got a local page, and so on.
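That flowchart reduces to a couple of string checks. Here’s a rough Python equivalent of the rule of thumb (my sketch, not Tom’s; a real version would need a proper location list and more edge-case handling):

    from urllib.parse import urlparse

    def classify_result(url, location):
        # Rule of thumb from the flowchart: location in the domain means a
        # local business; location in the path only means a local page;
        # otherwise it's a national page.
        parsed = urlparse(url.lower())
        if location.lower() in parsed.netloc:
            return "local business"   # e.g. a seattletimes.com listing
        if location.lower() in parsed.path:
            return "local page"       # e.g. indeed.com/.../seattle
        return "national page"

    print(classify_result("https://www.indeed.com/q-seo-jobs-l-seattle-jobs.html", "seattle"))
    # -> local page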

4. Compare & decide where to focus your efforts

You can just sort of categorize at scale all the different result types that we’ve got. Then we can start to fill out a chart like this using the rankings. What I’d recommend doing is finding a click-through rate curve that you are happy to use. You could go to somewhere like AdvancedWebRanking.com, download some example click-through rate curves.

Again, this doesn’t have to be super precise. We’re looking to get a proportionate directional indication of what would be useful here. I’ve got Implicit, Explicit, and Near Me keyword groups. I’ve got Local Business, Local Page, and National Page result types. Then I’m just figuring out what the visibility share of all these types is. In my particular example, it turns out that for explicit terms, it could be worth building some local pages.
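Here’s a small Python sketch of that chart-filling step. The CTR curve and the rankings data are invented placeholders; in practice you’d plug in the curve you downloaded and your own tracked rankings:

    from collections import defaultdict

    # Placeholder CTR curve by organic position (swap in one you trust).
    CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

    # (keyword group, result type, ranking position) - made-up sample data.
    rankings = [
        ("explicit", "local page", 1),
        ("explicit", "national page", 2),
        ("implicit", "local business", 1),
        ("implicit", "national page", 3),
    ]

    share = defaultdict(float)
    group_total = defaultdict(float)
    for group, rtype, pos in rankings:
        ctr = CTR.get(pos, 0.01)
        share[(group, rtype)] += ctr
        group_total[group] += ctr

    # Visibility share of each result type within each keyword group.
    for (group, rtype), s in sorted(share.items()):
        print(f"{group:9s} {rtype:15s} {s / group_total[group]:.0%}")

If local pages capture a large share of the explicit bucket, as in Tom’s example, that’s the directional signal to consider building them.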

That’s all. I’d love to hear your thoughts in the comments. Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Can’t Tell if Your Social Media Campaign is Really Working? Here’s What You Need to Know

The number of companies integrating social media into their marketing campaigns has been growing steadily over the past decade. Some businesses even rely solely on platforms like Facebook, Instagram, and Twitter to promote their goods and services. However, measuring the impact these campaigns have on their business remains a challenge.

A 2015 CMO survey underlined this difficulty, with only 15 percent of participating marketers being able to quantitatively measure the effectiveness of their social media marketing plans. Meanwhile, a recent MDG Advertising infographic shows that not much has changed with regard to measuring the effectiveness of social media marketing and its impact on a company’s ROI.

According to the accompanying MDG report, only 20 percent of companies said they were able to determine the success of their social media campaigns, while 44 percent could not determine social media’s impact on their business. This problem also affects marketing agencies, with 28 percent facing challenges in measuring the effectiveness of social media. However, 55 percent of those agencies claim they can somewhat determine the ROI generated by social media, while a mere 17 percent can measure it accurately.

[Graphic via mdgadvertising.com]

Challenges of Measuring Social Media Campaigns

Because social media is a relatively new (and constantly evolving) marketing channel, measuring its true impact on ROI remains a conundrum for many businesses. What’s more, a lot of companies remain unsure of social media’s place in the big picture.

There are other reasons why measuring social media impact remains complicated.

  • Businesses Have Different KPIs: Brands have their own goals, values, and propositions, and the Key Performance Indicators (KPIs) they want to measure depend on these. However, KPIs can change depending on the direction the company wants to take. This makes it hard to set specific metrics and data points.
  • Data is Limited: Each social media platform has its own set of analytics. Some track follower engagement while others show demographic information. Companies would also have to do a lot of data mining just to put everything together.
  • Qualitative Results are Hard to See: It’s easy to see quantitative results such as the numbers of comments, likes, and shares. But the more important question is the kind of action consumers are actually taking — the qualitative results. For instance, are they buying products or just sharing content?
  • Business Impact is Hard to Determine: ROIs are about returns and investments. Even if companies are able to tie their social media campaigns to their KPIs and business goals, most remain confused as to what it means for their bottom line. Companies would have to consider the number of people working on social media accounts and their salaries, social media software, and advertising costs and compare them against KPIs.
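To illustrate that last point: once costs and attributable returns are estimated, the ROI arithmetic itself is simple; attribution is the hard part. A toy example in Python, with every figure invented:

    # Toy social media ROI calculation - all numbers are hypothetical.
    salaries  = 4000   # monthly staff time spent on social accounts
    software  = 300    # social media management tools
    ad_spend  = 1200   # paid promotion
    investment = salaries + software + ad_spend

    attributed_revenue = 8000  # revenue traced to social (the hard part)

    roi = (attributed_revenue - investment) / investment
    print(f"ROI: {roi:.0%}")  # ROI: 45%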

Best Ways to Check Effectiveness of Social Media Drive

Despite the ambiguity, social media does have a positive influence on a company’s sales and revenue. The question now is how to measure and quantify this impact. Knowing the following metrics of your campaigns can help you measure their effectiveness:

  • Click-Through Rate: While click-throughs are a key metric, companies should do more than just track clicks. They should also focus on metrics geared towards specifically designed landing pages and content. Companies should also look at click-throughs in relation to bounce rates. High bounce rates imply that the site’s content is not delivering on the call-to-action or headline’s promise.
  • Conversions: Whether it’s a sign-up, filling out a form, or an online sale, companies should have a goal when it comes to conversions, especially when creating paid ads. This is significant as it provides direct ROI numbers. Conversions are also relatively easy to track. Some companies utilize lead generation forms while others opt for pixel codes.
  • Engagement: This metric is more than just the volume of likes a page or post has, since raw likes don’t give a clear indication of commitment. Meaningful engagement that results in brand awareness, product interest, or sales is the best testament to the impact of social media activity. Companies should put real effort into having a dialogue with their audience and influencers.
  • Traffic: Identifying the actual value of traffic is about checking the share of driven traffic and the actions generated by click-throughs. Tools like Google Analytics make tracking the impact of social media on site traffic simpler. Companies should look closely at how much of their site traffic was driven by social media, since this provides concrete numbers they can work with.

Remember, you can’t market what you can’t measure (at least not effectively). So, before you run a social media campaign, be sure to set up adequate analytic tools that measure the data that correlates with the outcome you desire. For many businesses, picking the right tools and correctly assessing the data they collect comes with a learning curve. However, once you get past that hurdle, you can use the data to grow your business by leaps and bounds.

[Featured image via Pixabay]

The post Can't Tell if Your Social Media Campaign is Really Working? Here's What You Need to Know appeared first on WebProNews.


WebProNews


The Minimum Viable Knowledge You Need to Work with JavaScript & SEO Today

Posted by sergeystefoglo

If your work involves SEO at some level, you’ve most likely been hearing more and more about JavaScript and the implications it has on crawling and indexing. Frankly, Googlebot struggles with it, and many websites utilize modern-day JavaScript to load in crucial content today. Because of this, we need to be equipped to discuss this topic when it comes up in order to be effective.

The goal of this post is to equip you with the minimum viable knowledge required to do so. This post won’t go into the nitty gritty details, describe the history, or give you extreme detail on specifics. There are a lot of incredible write-ups that already do this — I suggest giving them a read if you are interested in diving deeper (I’ll link out to my favorites at the bottom).

In order to be effective consultants when it comes to the topic of JavaScript and SEO, we need to be able to answer three questions:

  1. Does the domain/page in question rely on client-side JavaScript to load/change on-page content or links?
  2. If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?
  3. If not, what is the ideal solution?

With some quick searching, I was able to find three examples of landing pages that utilize JavaScript to load in crucial content.

I’m going to be using Sitecore’s Symposium landing page through each of these talking points to illustrate how to answer the questions above.

We’ll cover the “how do I do this” aspect first, and at the end I’ll expand on a few core concepts and link to further resources.

Question 1: Does the domain in question rely on client-side JavaScript to load/change on-page content or links?

The first step to diagnosing any issues involving JavaScript is to check if the domain uses it to load in crucial content that could impact SEO (on-page content or links). Ideally this will happen anytime you get a new client (during the initial technical audit), or whenever your client redesigns/launches new features of the site.

How do we go about doing this?

Ask the client

Ask, and you shall receive! Seriously though, one of the quickest/easiest things you can do as a consultant is contact your POC (or developers on the account) and ask them. After all, these are the people who work on the website day-in and day-out!

“Hi [client], we’re currently doing a technical sweep on the site. One thing we check is if any crucial content (links, on-page content) gets loaded in via JavaScript. We will do some manual testing, but an easy way to confirm this is to ask! Could you (or the team) answer the following, please?

1. Are we using client-side JavaScript to load in important content?

2. If yes, can we get a bulleted list of where/what content is loaded in via JavaScript?”

Check manually

Even on a large e-commerce website with millions of pages, there are usually only a handful of important page templates. In my experience, it should only take an hour max to check manually. I use the Chrome Web Developers plugin, disable JavaScript from there, and manually check the important templates of the site (homepage, category page, product page, blog post, etc.)

In the example above, once we turn off JavaScript and reload the page, we can see that we are looking at a blank page.

As you make progress, jot down notes about content that isn’t being loaded in, is being loaded in wrong, or any internal linking that isn’t working properly.

At the end of this step we should know if the domain in question relies on JavaScript to load/change on-page content or links. If the answer is yes, we should also know where this happens (homepage, category pages, specific modules, etc.)
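If you’d rather script this first pass than click through every template, a rough programmatic analog (my addition, not from the post) is to fetch the raw, unrendered HTML and search it for a phrase you can see in the rendered page. The URL and phrase below are placeholders:

    import requests

    # Placeholders: use a real template URL and a phrase visible in the browser.
    url = "https://www.example.com/landing-page"
    visible_phrase = "you feel a sense of urgency"

    # requests fetches raw HTML only - no JavaScript is executed.
    raw_html = requests.get(url, timeout=10).text
    if visible_phrase.lower() in raw_html.lower():
        print("Phrase found in raw HTML: content is in the initial response.")
    else:
        print("Phrase missing: likely loaded client-side via JavaScript.")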

Crawl

You could also crawl the site (with a tool like Screaming Frog or Sitebulb) with JavaScript rendering turned off, and then run the same crawl with JavaScript turned on, and compare the differences with internal links and on-page elements.

For example, it could be that when you crawl the site with JavaScript rendering turned off, the title tags don’t appear. In my mind this would trigger an action to crawl the site with JavaScript rendering turned on to see if the title tags do appear (as well as checking manually).
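The comparison itself can be scripted if your crawler exports to CSV. A minimal sketch (my own illustration; the column names are hypothetical and will differ by tool):

    import csv

    def load_titles(path):
        # Map URL -> title from a crawl export. "Address" and "Title" are
        # hypothetical column names; adjust them to your tool's export format.
        with open(path, newline="", encoding="utf-8") as f:
            return {row["Address"]: row.get("Title", "") for row in csv.DictReader(f)}

    no_js = load_titles("crawl_js_off.csv")
    js = load_titles("crawl_js_on.csv")

    # Pages whose title only exists once JavaScript has rendered.
    for url, title in js.items():
        if title and not no_js.get(url):
            print(f"Title depends on JS rendering: {url}")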

Example

For our example, I went ahead and did a manual check. As we can see from the screenshot below, when we disable JavaScript, the content does not load.

In other words, the answer to our first question for this page is “yes, JavaScript is being used to load in crucial parts of the site.”

Question 2: If yes, is Googlebot seeing the content that’s loaded in via JavaScript properly?

If your client is relying on JavaScript on certain parts of their website (in our example they are), it is our job to try and replicate how Google is actually seeing the page(s). We want to answer the question, “Is Google seeing the page/site the way we want it to?”

In order to get a more accurate depiction of what Googlebot is seeing, we need to attempt to mimic how it crawls the page.

How do we do that?

Use Google’s new mobile-friendly testing tool

At the moment, the quickest and most accurate way to try and replicate what Googlebot is seeing on a site is by using Google’s new mobile friendliness tool. My colleague Dom recently wrote an in-depth post comparing Search Console Fetch and Render, Googlebot, and the mobile friendliness tool. His findings were that most of the time, Googlebot and the mobile friendliness tool resulted in the same output.

In Google’s mobile friendliness tool, simply input your URL, hit “run test,” and then once the test is complete, click on “source code” on the right side of the window. You can take that code and search for any on-page content (title tags, canonicals, etc.) or links. If they appear here, Google is most likely seeing the content.

Search for visible content in Google

It’s always good to sense-check. Another quick way to check if GoogleBot has indexed content on your page is by simply selecting visible text on your page, and doing a site:search for it in Google with quotations around said text.

In our example there is visible text on the page that reads…

“Whether you are in marketing, business development, or IT, you feel a sense of urgency. Or maybe opportunity?”

When we do a site:search for this exact phrase, for this exact page, we get nothing. This means Google hasn’t indexed the content.

Crawling with a tool

Most crawling tools have the functionality to crawl JavaScript now. For example, in Screaming Frog you can head to configuration > spider > rendering > then select “JavaScript” from the dropdown and hit save. DeepCrawl and SiteBulb both have this feature as well.

From here you can input your domain/URL and see the rendered page/code once your tool of choice has completed the crawl.

Example:

When attempting to answer this question, my preference is to start by inputting the domain into Google’s mobile friendliness tool, copying the source code, and searching for important on-page elements (think title tag, <h1>, body copy, etc.). It’s also helpful to use a tool like diff checker to compare the rendered HTML with the original HTML (Screaming Frog also has a function that lets you do this side by side).
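The diff can also be run locally with Python’s standard library. Save the original page source and the rendered source (copied from the mobile friendliness tool) to two files; the filenames here are placeholders:

    import difflib

    with open("original.html", encoding="utf-8") as f:
        original = f.readlines()
    with open("rendered.html", encoding="utf-8") as f:
        rendered = f.readlines()

    # Lines present only in the rendered version, i.e. content that
    # JavaScript injected after the initial HTML response.
    diff = difflib.unified_diff(original, rendered, fromfile="original",
                                tofile="rendered", n=0)
    for line in diff:
        if line.startswith("+") and not line.startswith("+++"):
            print(line.rstrip())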

For our example, here is what the output of the mobile friendliness tool shows us.

After a few searches, it becomes clear that important on-page elements are missing here.

We also did the second test and confirmed that Google hasn’t indexed the body content found on this page.

The implication at this point is that Googlebot is not seeing our content the way we want it to, which is a problem.

Let’s jump ahead and see what we can recommend the client.

Question 3: If we’re confident Googlebot isn’t seeing our content properly, what should we recommend?

Now that we know the domain is using JavaScript to load in crucial content, and that Googlebot is most likely not seeing that content, the final step is to recommend an ideal solution to the client. Key word: recommend, not implement. It’s 100% our job to flag the issue to our client, explain why it’s important (as well as the possible implications), and highlight an ideal solution. It is 100% not our job to try to do the developers’ job of figuring out an ideal solution with their unique stack/resources/etc.

How do we do that?

You want server-side rendering

The main reason Google is having trouble seeing Sitecore’s landing page right now is that Sitecore’s landing page asks the user (us, Googlebot) to do the heavy work of loading the JavaScript on the page. In other words, they’re using client-side JavaScript.

Googlebot is literally landing on the page, trying to execute JavaScript as best as possible, and then needing to leave before it has a chance to see any content.

The fix here is to instead have Sitecore’s landing page load on their server. In other words, we want to take the heavy lifting off of Googlebot, and put it on Sitecore’s servers. This will ensure that when Googlebot comes to the page, it doesn’t have to do any heavy lifting and instead can crawl the rendered HTML.

In this scenario, Googlebot lands on the page and already sees the HTML (and all the content).
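As a toy illustration of the difference (this is not Sitecore’s actual stack, just a minimal Flask sketch with placeholder content): the first route ships an empty shell that JavaScript must fill in; the second sends finished HTML, which is what server-side rendering gives Googlebot.

    from flask import Flask

    app = Flask(__name__)
    CONTENT = "<h1>Symposium</h1><p>Dates, agenda, speakers...</p>"  # placeholder

    @app.route("/client-side")
    def client_side():
        # The crawler receives an empty shell and must execute JS to see content.
        return '<div id="app"></div><script src="/static/app.js"></script>'

    @app.route("/server-side")
    def server_side():
        # The server does the heavy lifting: the crawler gets finished HTML.
        return f"<html><body>{CONTENT}</body></html>"

    if __name__ == "__main__":
        app.run()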

There are more specific options (like isomorphic setups)

This is where it gets to be a bit in the weeds, but there are hybrid solutions. The best one at the moment is called isomorphic.

In this model, we’re asking the client to render the first request on their server, and then any future requests are made client-side.

So Googlebot comes to the page, the client’s server has already executed the initial JavaScript needed for the page, sends the rendered HTML down to the browser, and anything after that is done on the client-side.

If you’re looking to recommend this as a solution, please read this post from the AirBNB team which covers isomorphic setups in detail.

AJAX crawling = no go

I won’t go into details on this, but just know that Google’s previous AJAX crawling solution for JavaScript has since been discontinued and will eventually not work. We shouldn’t be recommending this method.

(However, I am interested to hear any case studies from anyone who has implemented this solution recently. How has Google responded? Also, here’s a great write-up on this from my colleague Rob.)

Summary

At the risk of severely oversimplifying, here’s what you need to do in order to start working with JavaScript and SEO in 2018:

  1. Know when/where your client’s domain uses client-side JavaScript to load in on-page content or links.
    1. Ask the developers.
    2. Turn off JavaScript and do some manual testing by page template.
    3. Crawl using a JavaScript crawler.
  2. Check to see if GoogleBot is seeing content the way we intend it to.
    1. Google’s mobile friendliness checker.
    2. Doing a site:search for visible content on the page.
    3. Crawl using a JavaScript crawler.
  3. Give an ideal recommendation to client.
    1. Server-side rendering.
    2. Hybrid solutions (isomorphic).
    3. Not AJAX crawling.

Further resources

I’m really interested to hear about any of your experiences with JavaScript and SEO. What are some examples of things that have worked well for you? What about things that haven’t worked so well? If you’ve implemented an isomorphic setup, I’m curious to hear how that’s impacted how Googlebot sees your site.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


If You’re Building A Personal Brand You Need A ‘Claim To Fame’ Breakthrough Result

I was recently a guest expert on a panel interview as part of a marketing summit. As I was listening to the other speakers and hearing their stories, it became clear that everyone involved had some kind of ‘claim to fame‘ result. They had experienced a breakthrough success in their past…

The post If You’re Building A Personal Brand You Need A ‘Claim To Fame’ Breakthrough Result appeared first on Entrepreneurs-Journey.com.

Entrepreneurs-Journey.com by Yaro Starak

