
Simple Spam Fighting: The Easiest Local Rankings You’ll Ever Earn

Posted by MiriamEllis

Image credit: Visit Lakeland

Reporting fake and duplicate listings to Google sounds hard. Sometimes it can be. But very often, it’s as easy as falling off a log, takes only a modest session of spam fighting, and can yield significant local ranking improvements.

If your local business, or the local brands your agency markets, isn’t using spam fighting as a ranking tactic because you feel you lack the time or skills, please sit down with me for a sec.

What if I told you I spent about an hour yesterday doing something that moved a Home Depot location up 3 spots in a competitive market in Google’s local rankings less than 24 hours later? What if, for you, moving up a spot or two would get you out of Google’s local finder limbo and into the actual local pack limelight?

Today I’m going to show you exactly what I did to fight spam, how fast and easy it was to sweep out junk listings, and how rewarding it can be to see results transform in favor of the legitimate businesses you market.

Washing up the shady world of window blinds

Image credit: Aqua Mechanical

Who knew that shopping for window coverings would lead me into a den of spammers throwing shade all over Google?

The story of Google My Business spam is now more than a decade in the making, with scandalous examples like fake listings for locksmiths and addiction treatment centers proving how unsafe and unacceptable local business platforms can become when left unguarded.

But even in non-YMYL industries, spam listings deceive the public, waste consumers’ time, inhibit legitimate businesses from being discovered, and erode trust in the spam-hosting platform. I saw all of this in action when I was shopping to replace some broken blinds in my home, and it was such a hassle trying to find an actual vendor amid the chaff of broken, duplicate, and lead gen listings that I decided to do something about it.

I selected an SF Bay area branch of Home Depot as my hypothetical “client.” I knew they had a legitimate location in the city of Vallejo, CA — a place I don’t live but sometimes travel to, thereby excluding the influence of proximity from my study. I knew that they were only earning an 8th place ranking in Google’s Local Finder, pushed down by spam. I wanted to see how quickly I could impact Home Depot’s surprisingly bad ranking.

I took the following steps, and encourage you to take them for any local business you’re marketing, too:

Step 1: Search

While located at the place of business you’re marketing, perform a Google search (or have your client perform it) for the keyword phrase for which you most desire improved local rankings. Of course, if you’re already ranking as well as you want to for the searchers nearest you, you can still follow this process to investigate somewhat more distant areas within your potential reach where you want to increase visibility.

In the results from your search, click on the “more businesses” link at the bottom of the local pack, and you’ll be taken to the interface commonly called the “Local Finder.”

The Local Finder isn’t typically 100% identical to the local pack in exact ranking order, but it’s the best place I know of to see how things stand beyond the first 3 results that make up Google’s local packs, telling a business which companies they need to surpass to move up towards local pack inclusion.

Step 2: Copy my spreadsheet

Find yourself in the local finder. In my case, the Home Depot location was at position 8. I hope you’re somewhere within the first set of 20 results Google typically gives, but if you’re not, keep paging through until you locate your listing. If you don’t find yourself at all, you may need to troubleshoot whether an eligibility issue, suspension, or filter is at play. But, hopefully that’s not you today.

Next, create a custom spreadsheet to record your findings. Or, much easier, just make a copy of mine!

Populate the spreadsheet by cutting and pasting the basic NAP (name, address, phone) for every competitor ranking above you, and include your own listing, too, of course! If you work for an agency, you’ll need to get the client to help you with this step by filling the spreadsheet out based on their search from their place of business.

In my case, I recorded everything in the first 20 results of the Local Finder, because I saw spam both above and below my “client,” and wanted to see the total movement resulting from my work in that result set.

Step 3: Identify obvious spam

We want to catch the easy fish today. You can go down rabbit holes another day, trying to ferret out weirdly woven webs of lead gen sites spanning the nation, but today, we’re just looking to weed out listings that clearly, blatantly don’t belong in the Local Finder. 

Go through these five easy steps:

  1. Look at the Google Street View image for each business outranking you.
    Do you see a business with signage that matches the name on the listing? Move on. But if you see a house, an empty parking lot, or Google is marking the listing as “location approximate,” jot that down in the Notes section of your spreadsheet. For example, I saw a supposed window coverings showroom that Street View located in an empty lot on a military base. Big red flag there.
  2. Make note of any businesses that share an address, phone number, or very similar name.
    Make note of anything with an overly long name that seems more like a string of keywords than a brand. For example, a listing in my set was called: Custom Window Treatments in Fairfield, CA Hunter Douglas Dealer.
  3. For every business you noted down in steps one and two, get on the phone.
    Is the number a working number? If someone answers, do they answer with the name of the business? Note it down. Say, “Hi, where is your shop located?” If the answer is that it’s not a shop but a mobile business, note that down. Finally, if anything seems off, check the Guidelines for representing your business on Google to see what’s allowed in the industry you’re investigating. For example, it’s perfectly okay for a window blinds dealer to operate out of their home, but if they’re operating out of 5 homes in the same city, it’s likely a violation. In my case, just a couple of minutes on the phone identified multiple listings with phone numbers that were no longer in service.
  4. Visit the iffy websites. 
    Now that you’re narrowing your spreadsheet down to a set of businesses that are either obviously legitimate or “iffy,” visit the websites of the iffy ones. Does the name on the listing match the name on the website? Does anything else look odd? Note it down.
  5. Highlight businesses that are clearly spammy.
    Your dive hasn’t been deep, but by now, it may have identified one or more listings that you strongly believe don’t belong because they have spammy names, fake addresses, or out-of-service phone numbers. My lightning-quick pass through my data set showed that six of the twenty listings were clearly junk. That’s 30% of Google’s info being worthless! I suggest marking these in red text in your spreadsheet to make the next step fast and easy.

Step 4: Report it!

If you want to become a spam-fighting ace later, you’ll need to become familiar with Google’s Business Redressal Complaint Form which gives you lots of room for sharing your documentation of why a listing should be removed. In fact, if an aggravating spammer remains in the Local Finder despite what we’re doing in this session, this form is where you’d head next for a more concerted effort.

But, today, I promised the easiness of falling off a log, so our first effort at impacting the results will simply focus on the “suggest an edit” function you’ll see on each listing you’re trying to get rid of. This is how you do it:

After you click the “suggest an edit” button on the listing, a popup will appear. If you’re reporting something like a spammy name, click the “change name or other details” option and fill out the form. If you’ve determined a listing represents a non-existent, closed, unreachable, or duplicate entity, choose the “remove this place” option and then select the dropdown entry that most closely matches the problem. You can add a screenshot or other image if you like, but in my quick pass through the data, I didn’t bother.

Record the exact action you took for each spam listing in the “Actions” column of the spreadsheet. In my case, I was reporting a mixture of non-existent buildings, out-of-service phone numbers, and one duplicate listing with a spammy name.

Finally, hit the “send” button and you’re done.

Step 5: Record the results

Within an hour of filing my reports with Google, I received an email like this for 5 of the 6 entries I had flagged:

The only entry I received no email for was the duplicate listing with the spammy name. But I didn’t let this worry me. I went about the rest of my day and checked back in the morning.

I’m not fond of calling out businesses in public. Sometimes, there are good folks who are honestly confused about what’s allowed and what isn’t. Also, I sometimes find screenshots of the local finder overwhelmingly cluttered and endlessly long to look at. Instead, I created a bare-bones representational schematic of the total outcome of my hour of spam-fighting work.

The red markers are legit businesses. The grey ones are spam. The green one is the Home Depot I was trying to positively impact. I attributed a letter of the alphabet to each listing, to better help me see how the order changed from day one to day two. The lines show the movement over the course of the 24 hours.

The results were that:

  • A stayed the same, and B and C swapped positions, which was likely not due to my work; local rankings can fluctuate like this from hour to hour.
  • Five out of six spam listings I reported disappeared. The keyword-stuffed duplicate listing which was initially at position K was replaced by the brand’s legitimate listing one spot lower than it had been.
  • The majority of the legitimate businesses enjoyed upward movement, with the exception of position I which went down, and M and R which disappeared. Perhaps new businesses moving into the Local Finder triggered a filter, or perhaps it was just the endless tide of position changes and they’ll be back tomorrow.
  • Seven new listings made it into the top 20. Unfortunately, at a glance, it looked to me like 3 of these new listings were new spam. Dang, Google!
  • Most rewardingly, my hypothetical client, Home Depot, moved up 3 spots. What a super easy win!

Fill out the final column in your spreadsheet with your results.

What we’ve learned

You battle upstream every day for your business or clients. You twist yourself like a paperclip complying with Google’s guidelines, seeking new link and unstructured citation opportunities, straining your brain to shake out new content, monitoring reviews like a chef trying to keep a cream sauce from separating. You do all this in the struggle for better, broader visibility, hoping that each effort will incrementally improve reputation, rankings, traffic, and conversions.

Catch your breath. Not everything in life has to be so hard. The river of work ahead is always wide, but don’t overlook the simplest stepping stones. Saunter past the spam listings without breaking a sweat and enjoy the easy upward progress!

I’d like to close today with three meditations:

1. Google is in over their heads with spam

Google is in over their heads with spam. My single local search for a single keyword phrase yielded 30% worthless data in their top local results. Google says they process 63,000 searches per second and that as much as 50% of mobile queries have a local intent. I don’t know any other way to look at Google than as having become an under-regulated public utility at this point.

Expert local SEOs can spot spam listings in query after query, industry after industry, but Google has yet to staff a workforce or design an algorithm sufficient to address bad data that has direct, real-world impacts on businesses and customers. I don’t know if they lack the skills or the will to take responsibility for this enormous problem they’ve created, but the problem is plain. Until Google steps up, my best advice is to do the smart and civic work of watchdogging the results that most affect the local community you serve. It’s a positive not just for your brand, but for every legitimate business and every neighbor near you.

2. You may get in over your head with spam

You may get in over your head with spam. Today’s session was as simple as possible, but GMB spam can stem from complex, global networks. The Home Depot location I randomly rewarded with a 3-place jump in Local Finder rankings clearly isn’t dedicating sufficient resources to spam fighting or they would’ve done this work themselves.

But the extent of spam is severe. If your market is one that’s heavily spammed, you can quickly become overwhelmed by the problem. In such cases, I recommend that you:

  • Read this excellent recent article by Jessie Low on the many forms spam can take, plus some great tips for more strenuous fighting than we’ve covered today.
  • Follow Joy Hawkins, Mike Blumenthal, and Jason Brown, all of whom publish ongoing information on this subject. If you wade into a spam network, I recommend reporting it to one or more of these experts on Twitter, and, if you wish to become a skilled spam fighter yourself, you will learn a lot from what these three have published.
  • If you don’t want to fight spam yourself, hire an agency with the smarts to offer this as a service.
  • You can also report listing spam to the Google My Business Community Forum, but it’s a crowded place and it can sometimes be hard to get your issue seen.
  • Finally, if the effect of spam in your market is egregious enough, your ability to publicize it may be your greatest hope. Major media have now repeatedly featured broadcasts and stories on this topic, and shame will sometimes move Google to action when no other motivation appears to.

3. Try to build a local anti-spam movement

What if you built a local movement? What if you and your friendlier competitors joined forces to knock spam out of Google together? Imagine all of the florists, hair salons, or medical practitioners in a town coming together to watch the local SERPs in shifts so that everyone in their market could benefit from bad actors being reported.

Maybe you’re already in a local business association with many hands that could lighten the work of protecting a whole community from unethical business practices. Maybe your town could then join up with the nearest major city, and that city could begin putting pressure on legislators. Maybe legislators would begin to realize the extent of the impacts when legitimate businesses face competition from fake entities and illegal practices. Maybe new anti-trust and communications regulations would ensue.

Now, I promised you “simple,” and this isn’t it, is it? But every time I see a fake listing, I know I’m looking at a single pebble and I’m beginning to think it may take an avalanche to bring about change great enough to protect both local brands and consumers. Google is now 15 years into this dynamic with no serious commitment in sight to resolve it.

At least in your own backyard, in your own community, you can be one small part of the solution with the easy tactics I’ve shared today, but maybe it’s time for local commerce to begin both doing more and expecting more in the way of protections. 

I’m ready for that. And you?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Will Google Home Hub ever become a useful local search tool?

Just a few modest improvements would make Home and Home Hub more effective for local search.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Why Local Businesses Will Need Websites More than Ever in 2019

Posted by MiriamEllis

64% of 1,411 surveyed local business marketers agree that Google is becoming the new “homepage” for local businesses. Via Moz State of Local SEO Industry Report

…but please don’t come away with the wrong storyline from this statistic.

As local brands and their marketers watch Google play Trojan horse, shifting from top benefactor to top competitor by replacing former “free” publicity with paid packs, Local Service Ads, zero-click SERPs, and related structures, it’s no surprise to see forum members asking, “Do I even need a website anymore?”

Our answer to this question is, “Yes, you’ve never needed a website more than you will in 2019.” In this post, we’ll examine:

  • Why it looks like local businesses don’t need websites
  • Statistical proofs of why local businesses need websites now more than ever
  • The current status of local business websites and most-needed improvements

How Google stopped bearing so many gifts

Within recent memory, a Google query with local intent brought up a big pack of ten nearby businesses, with each entry taking the user directly to these brands’ websites for all of their next steps. A modest amount of marketing effort was rewarded with a shower of Google gifts in the form of rankings, traffic, and conversions.

Then these generous SERPs shrank to seven spots, and then three, with the mobile sea change thrown into the bargain and consisting of layers and layers of Google-owned interfaces instead of direct-to-website links. In 2018, when we rustle through the wrapping paper, the presents we find from Google look cheaper, smaller, and less magnificent.

Consider these five key developments:

1) Zero-click mobile SERPs

This slide from a recent presentation by Rand Fishkin encapsulates his findings regarding the growth of no-click SERPs between 2016 and 2018. Mobile users have experienced a 20% increase in delivery of search engine results that don’t require them to go any deeper than Google’s own interface.

2) The encroachment of paid ads into local packs

When Dr. Peter J. Myers surveyed 11,000 SERPs in 2018, he found that 35% of competitive local packs feature ads.

3) Google becoming a lead gen agency

At last count, Google’s Local Service Ads program, via which they interpose themselves as the paid lead gen agent between businesses and consumers, has taken over 23 business categories in 77 US cities.

4) Even your branded SERPs don’t belong to you

When a user specifically searches for your brand and your Google Knowledge Panel pops up, you can likely cope with the long-standing “People Also Search For” set of competitors at the bottom of it. But that’s not the same as Google allowing Groupon to advertise at the top of your KP, or putting lead gen from Doordash and GrubHub front and center to nickel and dime you on your own customers’ orders.

5) Google is being called the new “homepage” for local businesses

As highlighted at the beginning of this post, 64% of marketers agree that Google is becoming the new “homepage” for local businesses. This concept, coined by Mike Blumenthal, signifies that a user looking at a Google Knowledge Panel can get basic business info, make a phone call, get directions, book something, ask a question, take a virtual tour, read microblog posts, see hours of operation, thumb through photos, see busy times, and read and leave reviews. Without ever having to click through to a brand’s domain, the user may be fully satisfied.

“Nothing is enough for the man to whom enough is too little.”
- Epicurus

There are many more examples we could gather, but they can all be summed up in one way: None of Google’s most recent local initiatives are about driving customers to brands’ own websites. Local SERPs have shrunk and have been re-engineered to keep users within Google’s platforms to generate maximum revenue for Google and their partners.

You may be as philosophical as Epicurus about this and say that Google has every right to be as profitable as they can with their own product, even if they don’t really need to siphon more revenue off local businesses. But if Google’s recent trajectory causes your brand or agency to conclude that websites have become obsolete in this heavily controlled environment, please keep reading.

Your website is your bedrock

“65% of 1,411 surveyed marketers observe strong correlation between organic and local rank.” – Via Moz State of Local SEO Industry Report

What this means is that businesses which rank highly organically are very likely to have high associated local pack rankings. In the following screenshot, if you take away the directory-type platforms, you will see how the brand websites ranking on page 1 for “deli athens ga” are also the two businesses that have made it into Google’s local pack:

How often do the top 3 Google local pack results also have 1st-page organic rankings?

In a small study, we looked at 15 head keywords across 7 US cities and towns. This yielded 315 possible entries in Google’s local pack. Of that 315, 235 of the businesses ranking in the local packs also had page 1 organic rankings. That’s a 75% correlation between organic website rankings and local pack presence.
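As a quick sanity check on those numbers (the counts below are taken from the study described above), a couple of lines reproduce the arithmetic:

```python
# 15 head keywords x 7 cities/towns, 3 local pack spots per SERP
possible_entries = 15 * 7 * 3
with_page1_organic = 235  # pack entries that also ranked on page 1 organically

correlation = with_page1_organic / possible_entries
print(possible_entries, f"{correlation:.0%}")  # 315 entries, rounds to 75%
```

The exact ratio is about 74.6%, which the study rounds to 75%.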

*It’s worth noting that where local and organic results did not correlate, it was sometimes due to the presence of spam GMB listings, or to mystery SERPs that did not make sense at first glance — perhaps as a result of Google testing, in some cases.

Additionally, many local businesses are not making it to the first page of Google anymore in some categories because the organic SERPs are inundated with best-of lists and directories. Often, local business websites were pushed down to the second page of the organic results. In other words, if spam, “best-ofs,” and mysteries were removed, the local-organic correlation would likely be much higher than 75%.

Further, one recent study found that even when Google’s Local Service Ads are present, 43.9% of clicks went to the organic SERPs. Obviously, if you can make it to the top of the organic SERPs, this puts you in very good CTR shape from a purely organic standpoint.

Your takeaway from this

The local businesses you market may not be able to stave off the onslaught of Google’s zero-click SERPs, paid SERPs, and lead gen features, but where “free” local 3-packs still exist, your very best bet for being included in them is to have the strongest possible website. Moreover, organic SERPs remain a substantial source of clicks.

Far from it being the case that websites have become obsolete, they are the firmest bedrock for maintaining free local SERP visibility amidst an increasing scarcity of opportunities.

This calls for an industry-wide doubling down on organic metrics that matter most.

Bridging the local-organic gap

“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
- Aristotle

A 2017 CNBC survey found that 45% of small businesses have no website, and, while most large enterprises have websites, many local businesses qualify as “small.”

Moreover, a recent audit of 9,392 Google My Business listings found that 27% have no website link.

When asked which one task 1,411 marketers want clients to devote more resources to, it’s no coincidence that 66% listed a website-oriented asset. This includes local content development, on-site optimization, local link building, technical analysis of rankings/traffic/conversions, and website design as shown in the following Moz survey graphic:

In an environment in which websites are table stakes for competitive local pack rankings, virtually all local businesses not only need one, but they need it to be as strong as possible so that it achieves maximum organic rankings.

What makes a website strong?

The Moz Beginner’s Guide to SEO offers incredibly detailed guidelines for creating the best possible website. While we recommend that everyone marketing a local business read through this in-depth guide, we can sum up its contents here by stating that strong websites combine:

  • Technical basics
  • Excellent usability
  • On-site optimization
  • Relevant content publication
  • Publicity

For our present purpose, let’s take a special look at those last three elements.

On-site optimization and relevant content publication

There was a time when on-site SEO and content development were treated almost independently of one another. And while local businesses will need to make a little extra effort to put their basic contact information in prominent places on their websites (such as the footer and Contact Us page), publication and optimization should be viewed as a single topic. A modern strategy takes all of the following into account:

  • Keyword and real-world research tell a local business what consumers want
  • These consumer desires are then reflected in what the business publishes on its website, including its homepage, location landing pages, about page, blog and other components
  • Full reflection of consumer desires includes ensuring that human language (discovered via keyword and real-world research) is implemented in all elements of each page, including its tags, headings, descriptions, text, and in some cases, markup

What we’re describing here isn’t a set of disconnected efforts. It’s a single effort that’s integral to researching, writing, and publishing the website. Far from stuffing keywords into a tag or a page’s content, focus has shifted to building topical authority in the eyes of search engines like Google by building an authoritative resource for a particular consumer demographic. The more closely a business is able to reflect customers’ needs (including the language of their needs), in every possible component of its website, the more relevant it becomes.

A hypothetical example of this would be a large medical clinic in Dallas. Last year, their phone staff was inundated with basic questions about flu shots, like where and when to get them, what they cost, would they cause side effects, what about side effects on people with pre-existing health conditions, etc. This year, the medical center’s marketing team took a look at Moz Keyword Explorer and saw that there’s an enormous volume of questions surrounding flu shots:

This tiny segment of the findings of the free keyword research tool, Answer the Public, further illustrates how many questions people have about flu shots:

The medical clinic need not compete nationally for these topics, but at a local level, a page on the website can answer nearly every question a nearby patient could have about this subject. The page, created properly, will reflect human language in its tags, headings, descriptions, text, and markup. It will tell all patients where to come and when to come for this procedure. It has the potential to cut down on time-consuming phone calls.

And, finally, it will build topical authority in the eyes of Google to strengthen the clinic’s chances of ranking well organically… which can then translate to improved local rankings.

It’s important to note that keyword research tools typically do not reflect location very accurately, so research is typically done at a national level, and then adjusted to reflect regional or local language differences and geographic terms, after the fact. In other words, a keyword tool may not accurately reflect exactly how many local consumers in Dallas are asking “Where do I get a flu shot?”, but keyword and real-world research signals that this type of question is definitely being asked. The local business website can reflect this question while also adding in the necessary geographic terms.

Local link building must be brought to the fore of publicity efforts

Moz’s industry survey found that more than one-third of respondents had no local link building strategy in place. Meanwhile, link building was listed as one of the top three tasks to which marketers want their clients to devote more resources. There’s clearly a disconnect going on here. Given the fundamental role links play in building Domain Authority, organic rankings, and subsequent local rankings, building strong websites means bridging this gap.

First, it might help to examine old prejudices that could cause local business marketers and their clients to feel dubious about link building. These most likely stem from link spam which has gotten so out of hand in the general world of SEO that Google has had to penalize it and filter it to the best of their ability.

Not long ago, many digital-only businesses were having a heyday with paid links, link farms, reciprocal links, abusive link anchor text, and the like. An online company might accrue thousands of links from completely irrelevant sources, all in hopes of escalating rank. Clearly, these practices aren’t ones an ethical business can feel good about investing in, but they do serve as an interesting object lesson, especially when a local marketer can point out to a client that the best local links typically result from real-world relationship-building.

Local businesses are truly special because they serve a distinct, physical community made up of their own neighbors. The more involved a local business is in its own community, the more naturally link opportunities arise from things like local:

  • Sponsorships
  • Event participation and hosting
  • Online news
  • Blogs
  • Business associations
  • B2B cross-promotions

There are so many ways a local business can build genuine topical and domain authority in a given community by dint of the relationships it develops with neighbors.

An excellent way to get started on this effort is to look at high-ranking local businesses in the same or similar business categories to discover what work they’ve put in to achieve a supportive backlink profile. Moz Link Intersect is an extremely actionable resource for this, enabling a business to input its top competitors to find who is linking to them.

In the following example, a small B&B in Albuquerque looks up two luxurious Tribal resorts in its city:

Link Intersect then lists out a blueprint of opportunities, showing which links one or both competitors have earned. Drilling down, the B&B finds that Marriott.com is linking to both Tribal resorts on an Albuquerque things-to-do page:

The small B&B can then try to earn a spot on that same page, because it hosts lavish tea parties as a thing-to-do. Outreach could depend on the B&B owner knowing someone who works at the local Marriott personally. It could include meeting with them in person, or on the phone, or even via email. If this outreach succeeds, an excellent, relevant link will have been earned to boost organic rank, underpinning local rank.
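Under the hood, an intersect report like this is just set arithmetic. Here’s a toy sketch with hypothetical linking domains (not real backlink data) that shows the idea:

```python
# Hypothetical linking domains for the two competitor resorts and the B&B.
resort_a_links = {"marriott.com", "visitalbuquerque.org", "yelp.com"}
resort_b_links = {"marriott.com", "tripadvisor.com", "visitalbuquerque.org"}
bnb_links = {"yelp.com"}

# Domains linking to both competitors but not yet to the B&B
# are the most promising outreach targets.
opportunities = (resort_a_links & resort_b_links) - bnb_links
print(sorted(opportunities))  # ['marriott.com', 'visitalbuquerque.org']
```

Tools like Link Intersect do this at scale across full backlink indexes, but the prioritization logic is the same: domains that already link to multiple competitors have demonstrated a willingness to link to businesses like yours.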

Then, repeat the process. Aristotle might well have been speaking of link building when he said we are what we repeatedly do and that excellence is a habit. Good marketers can teach customers to have excellent habits in recognizing a good link opportunity when they see it.

Taken altogether

Without a website, a local business lacks the brand-controlled publishing and link-earning platform that so strongly influences organic rankings. In the absence of this, the chances of ranking well in competitive local packs will be significantly less. Taken altogether, the case is clear for local businesses investing substantially in their websites.

Acting now is actually a strategy for the future

“There is nothing permanent except change.”
- Heraclitus

You’ve now determined that strong websites are fundamental to local rankings in competitive markets. You’ve absorbed numerous reasons to encourage local businesses you market to prioritize care of their domains. But there’s one more thing you’ll need to be able to convey, and that’s a sense of urgency.

Right now, every single customer you can still earn from a free local pack listing is immensely valuable for the future.

This isn’t a customer you’ve had to pay Google for, as you very well might six months, a year, or five years from now. Yes, you’ve had to invest plenty in developing the strong website that contributed to the high local ranking, but you haven’t paid a penny directly to Google for this particular lead. Soon, you may have to fork over commissions to Google for a large portion of your new customers, so acting now is like insurance against future spend.

For this to work out properly, local businesses must take the leads Google is sending them right now for free and convert them into long-term, loyal customers, with an ultimate value of multiple future transactions without Google as the middleman. And if these freely won customers can be inspired to act as word-of-mouth advocates for your brand, you will have done something substantial to develop a stream of non-Google-dependent revenue.

This offer may well expire as time goes by. When it comes to the capricious local SERPs, marketers resemble the Greek philosophers who knew that change is the only constant. The Trojan horse has rolled into every US city, and it’s a gift with a questionable shelf life. We can’t predict if or when free packs might become obsolete, but we share your concerns about the way the wind is blowing.

What we can see clearly right now is that websites will be anything but obsolete in 2019. Rather, they are the building blocks of local rankings, precious free leads, and loyal revenue, regardless of how SERPs may alter in future.

For more insights into where local businesses should focus in 2019, be sure to explore the Moz State of Local SEO industry report:

Read the State of Local SEO industry report

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

NFIB Index Shows Small Business Optimism in the US is at Its Highest Level Ever

Small businesses are feeling very positive these days. According to the National Federation of Independent Businesses (NFIB), this optimism has reached record-breaking levels.

A recent NFIB report showed that August’s Small Business Optimism Index came in at an all-time high of 108.8. The previous index record was 108 and was set 35 years ago in July 1983.

Juanita Duggan, the President and CEO of NFIB, said in a press release that the amazing number was a clear indication that business in the US is indeed booming, a claim that many small business owners have reaffirmed.

There were also several key points that the August index survey revealed, like:

  • Inventory investment plans were the most stable since 2005. Meanwhile, capital spending plans are at the highest peak since 2007.
  • New records have also been set with regards to job creation and unfilled job openings.
  • The number of small company owners who said it was a good time to expand tied with the record high seen in May 2018.

Duggan also stated that as taxes and regulations were changed, small companies also adjusted their business plans and expectations.

“We’re now seeing the tangible results of those plans as small businesses report historically high, some record-breaking, levels of increased sales, investment, earnings, and hiring,” Duggan explained.

The NFIB president also pointed out that, in an earlier report, most of the optimism seen in the index came from gains in expectation-based components, such as expectations regarding business conditions, real sales, and when it would be a good time for businesses to expand.

However, the new report highlights real industry activities, like capital spending plans, inventory investment plans, and job openings. This data, based as it is on real activity, shows that higher GDP growth is on the horizon.

While the Optimism Index confirms what people want to hear, it also highlighted problem areas. For instance, companies are still having difficulty securing qualified workers: about 90 percent of businesses trying to fill a position reported finding few to no qualified applicants. What’s more, the percentage of firms that might offer higher salaries remained unchanged at 32 percent, while the share of businesses planning to give employees a pay raise dropped to a low 21 percent.

[Featured image via Pexels]

The post NFIB Index Shows Small Business Optimism in the US is at Its Highest Level Ever appeared first on WebProNews.


WebProNews


Srinivas Rao: The Story Behind The Unmistakable Creative Podcast And Why Being Creative Matters More Than Ever

 [ Download MP3 | Transcript | iTunes | Soundcloud | Raw RSS ] Srinivas Rao (Srini) begins this podcast by sharing a story from a decade ago where he borrowed money to sign up for my Blog Mastermind course back when it was first released. In one of the lessons, I gave the task […]

The post Srinivas Rao: The Story Behind The Unmistakable Creative Podcast And Why Being Creative Matters More Than Ever appeared first on Yaro.Blog.

Entrepreneurs-Journey.com by Yaro Starak


If Content Is a Performance, Is It Ever Authentic?

Producing more effective content that helps you build an audience of interested prospects is a common theme in my articles. In the past few months, I’ve written about ways to show how likable you are and how to make your writing personal, but not self-indulgent. And I realized that neither of those posts mentioned authenticity,
Read More…

The post If Content Is a Performance, Is It Ever Authentic? appeared first on Copyblogger.


Copyblogger


The Biggest Mistake Digital Marketers Ever Made: Claiming to Measure Everything

Posted by willcritchlow

Digital marketing is measurable.

It’s probably the single most common claim everyone hears about digital, and I can’t count the number of times I’ve seen conference speakers talk about it (heck, I’ve even done it myself).

I mean, look at those offline dinosaurs, the argument goes. They all know that half their spend is wasted — they just don’t know which half.

Maybe the joke’s on us digital marketers though, who garnered only 41% of global ad spend even in 2017 after years of strong growth.

Unfortunately, while we were geeking out about attribution models and cross-device tracking, we were accidentally triggering a common human cognitive bias that kept us anchored on small amounts, leaving buckets of money on the table and fundamentally reducing our impact and access to the C-suite.

And what’s worse is that we have convinced ourselves that it’s a critical part of what makes digital marketing great. The simplest way to see the flaw is a thought experiment: if you removed all of our measurement ability, I very much doubt we’d reduce our digital marketing investment to nothing.

In truth, of course, we’re nowhere close to measuring all the benefits of most of the things we do. We certainly track the last clicks, and we’re not bad at tracking any clicks on the path to conversion on the same device, but we generally suck at capturing:

  • Anything that happens on a different device
  • Brand awareness impacts that lead to much later improvements in conversion rate, average order value, or lifetime value
  • Benefits of visibility or impressions that aren’t clicked
  • Brand affinity generally

The cognitive bias that leads us astray

All of this means that the returns we report on tend to be just the most direct returns. This should be fine — it’s just a floor on the true value (“this activity has generated at least this much value for the brand”) — but the “anchoring” cognitive bias means that it messes with our minds and our clients’ minds. Anchoring is the process whereby we fixate on the first number we hear and subsequently estimate unknowns closer to the anchoring number than we should. Famous experiments have shown that even showing people a totally random number can drag their subsequent estimates up or down.

So even if the true value of our activity was 10x the measured value, we’d be stuck on estimating the true value as very close to the single concrete, exact number we heard along the way.

This tends to result in the measured value being seen as a ceiling on the true value. Other biases, like the availability heuristic (which leads us to overstate the likelihood of things that are easy to remember), push us to factor in obvious ways the direct value measurement could be overstated, while leaving aside all the unmeasured extra value.

The mistake became a really big one because fortunately/unfortunately, the measured return in digital has often been enough to justify at least a reasonable level of the activity. If it hadn’t been (think the vanishingly small number of people who see a billboard and immediately buy a car within the next week when they weren’t otherwise going to do so) we’d have been forced to talk more about the other benefits. But we weren’t. So we lazily talked about the measured value, and about the measurability as a benefit and a differentiator.

The threats of relying on exact measurement

Not only do we leave a whole load of credit (read: cash) on the table, but it also leads to threats to measurability being seen as existential threats to digital marketing activity as a whole. We know that there are growing threats to measuring accurately, including regulatory, technological, and user-behavior shifts:

Now, imagine that the combination of these trends meant that you lost 100% of your analytics and data. Would it mean that your leads stopped? Would you immediately turn your website off? Stop marketing?

I suggest that the answer to all of that is “no.” There’s a ton of value to digital marketing beyond the ability to track specific interactions.

We’re obviously not going to see our measurable insights disappear to zero, but for all the reasons I outlined above, it’s worth thinking about all the ways that our activities add value, how that value manifests, and some ways of proving it exists even if you can’t measure it.

How should we talk about value?

There are two pieces to the brand value puzzle:

  1. Figuring out the value of increasing brand awareness or affinity
  2. Understanding how our digital activities are changing said awareness or affinity

There’s obviously a lot of research into brand valuations generally, and while it’s outside the scope of this piece to think about total brand value, it’s worth noting that some methodologies place as much as 75% of the enterprise value of even some large companies in the value of their brands:

Image source

My colleague Tom Capper has written about a variety of ways to measure changes in brand awareness, which attacks a good chunk of the second challenge. But challenge #1 remains: how do we figure out what it’s worth to carry out some marketing activity that changes brand awareness or affinity?

In a recent post, I discussed different ways of building marketing models and one of the methodologies I described might be useful for this – namely so-called “top-down” modelling which I defined as being about percentages and trends (as opposed to raw numbers and units of production).

The top-down approach

I’ve come up with two possible ways of modelling brand value in a transactional sense:

1. The Sherlock approach

“When you have eliminated the impossible, whatever remains, however improbable, must be the truth.”
- Sherlock Holmes

The outline would be to take the total new revenue acquired in a period. Subtract from this any elements that can be attributed to specific acquisition channels; whatever remains must be brand. If this is in any way stable or predictable over multiple periods, you can use it as a baseline value from which to apply the methodologies outlined above for measuring changes in brand awareness and affinity.
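The subtraction above can be sketched in a few lines. This is a minimal illustration, not a full model; the channel names and revenue figures are hypothetical.

```python
# "Sherlock" approach: total new revenue, minus everything attributable
# to specific acquisition channels; whatever remains is treated as brand.
def brand_baseline(total_revenue, attributed):
    """Revenue not attributable to any tracked channel is assumed to be brand."""
    return total_revenue - sum(attributed.values())

# Hypothetical period figures:
period_revenue = 100_000
attributed = {"paid_search": 30_000, "paid_social": 15_000, "referral": 10_000}

baseline = brand_baseline(period_revenue, attributed)  # 45_000 remains for brand
```

If that residual stays roughly stable across several periods, it can serve as the baseline from which changes in brand awareness and affinity are measured.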

2. Aggressive attribution

If you run normal first-touch attribution reports, the limitations of measurement (cleared cookies, multiple devices, etc.) mean that you will show first-touch revenue that seems somewhat implausible (email, for example, surely can’t be a first-touch source — how did they get on your email list in the first place?):


In this screenshot we see that although first-touch dramatically reduces the influence of direct, for instance, it still accounts for more than 15% of new revenue.

The aggressive attribution model takes total revenue and splits it between the acquisition channels (unbranded search, paid social, referral). A first pass on this would simply split it in the relative proportion to the size of each of those channels, effectively normalizing them, though you could build more sophisticated models.

Note that there is no way of perfectly identifying branded vs. unbranded organic search, since Google hides keyword data behind (not provided), so you’ll have to use a proxy like homepage search vs. non-homepage search.

But fundamentally, the argument here would be that any revenue coming from a “first touch” of:

  • Branded search
  • Direct
  • Organic social
  • Email

…was actually acquired previously via one of the acquisition channels and so we attempt to attribute it to those channels.
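A first pass at this reassignment can be written as a simple pro-rata split. This is a sketch of the normalization described above, with hypothetical channel names and figures; a real model would layer more sophistication on top.

```python
# Aggressive attribution: revenue whose "first touch" was a brand-dependent
# channel (branded search, direct, organic social, email) is reassigned to
# the true acquisition channels in proportion to their relative size.
def reattribute(acquisition, brand_dependent):
    """Split brand-dependent revenue across acquisition channels pro rata."""
    total_acq = sum(acquisition.values())
    pool = sum(brand_dependent.values())
    return {
        channel: revenue + pool * (revenue / total_acq)
        for channel, revenue in acquisition.items()
    }

# Hypothetical period figures:
acquisition = {"unbranded_search": 50_000, "paid_social": 30_000, "referral": 20_000}
brand_dependent = {"branded_search": 25_000, "direct": 15_000, "email": 10_000}

adjusted = reattribute(acquisition, brand_dependent)
# unbranded_search gets half the 50_000 pool: 50_000 + 25_000 = 75_000
```

The adjusted totals still sum to overall revenue; the model simply credits acquisition channels with the brand-dependent revenue they plausibly originated.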

Even this under-represents brand value

Both of those methodologies are pretty aggressive — but they might still under-represent brand value. Here are two additional mechanics where brand drives organic search volume in ways I haven’t figured out how to measure yet:

Trusting Amazon to rank

I like reading on the Kindle. If I hear of a book I’d like to read, I’ll often Google the name of the book on its own and trust that Amazon will rank first or second so I can get to the Kindle page to buy it. This is effectively a branded search for Amazon (and if it doesn’t rank, I’ll likely follow up with a [book name amazon] search or head on over to Amazon to search there directly).

But because all I’ve appeared to do is search [book name] on Google and then click through to Amazon, there is nothing to differentiate this from an unbranded search.

Spotting brands you trust in the SERPs

I imagine we all have anecdotal experience of doing this: you do a search and you spot a website you know and trust (or where you have an account) ranking somewhere other than #1 and click on it regardless of position.

One time that I can specifically recall noticing this tendency growing in myself was when I started doing tons more baby-related searches after my first child was born. Up until that point, I had effectively zero brand affinity with anyone in the space, but I quickly grew to rate the content put out by babycentre (babycenter in the US) and I found myself often clicking on their result in position 3 or 4 even when I hadn’t set out to look for them, e.g. in results like this one:

It was fascinating to me to observe this behavior in myself because I had no real interaction with babycentre outside of search, and yet, by consistently ranking well across tons of long-tail queries and providing consistently good content and user experience I came to know and trust them and click on them even when they were outranked. I find this to be a great example because it is entirely self-contained within organic search. They built a brand effect through organic search and reaped the reward in increased organic search.

I have essentially no ideas on how to measure either of these effects. If you have any bright ideas, do let me know in the comments.

Budgets will come under pressure

My belief is that total digital budgets will continue to grow (especially as TV continues to fragment), but I also believe that individual budgets are going to come under scrutiny and pressure making this kind of thinking increasingly important.

We know that there is going to be pressure on referral traffic from Facebook following the recent news feed announcements, but there is also pressure on trust in Google:

While I believe that the opportunity is large and still growing (see, for example, this slide showing Google growing as a referrer of traffic even as CTR has declined in some areas), it’s clear that the narrative is going to lead to more challenging conversations and budgets under increased scrutiny.

Can you justify your SEO investment?

What do you say when your CMO asks what you’re getting for your SEO investment?

What do you say when she asks whether the organic search opportunity is tapped out?

I’ll probably explore the answers to both these questions more in another post, but suffice it to say that I do a lot of thinking about these kinds of questions.

The first is why we have built our split-testing platform to make organic SEO investments measurable, quantifiable and accountable.

The second is why I think it’s super important to remember the big picture while the media is running around with their hair on fire. Media companies saw Facebook overtake Google as a traffic channel (and then are likely seeing that reverse right now), but most of the web has Google as the largest growing source of traffic and value.

The reality (from clickstream data) is that it’s really easy to forget how long the long-tail is and how sparse search features and ads are on the extreme long-tail:

  1. Only 3–4% of all searches result in a click on an ad, for example. Google’s incredible (and still growing) business is based on a small subset of commercial searches
  2. Google’s share of all outbound referral traffic across the web is growing (and Facebook’s is shrinking as they increasingly wall off their garden)

The opportunity is for smart brands to capitalize on a growing opportunity while their competitors sink time and money into a social space that is increasingly all about Facebook, and increasingly pay-to-play.

What do you think? Are you having these hard conversations with leadership? How are you measuring your digital brand’s value?



Moz Blog


Amazon Exceeds Analyst Predictions, Posts its Highest Q4 Profits Ever

Buoyed by strong holiday sales and the robust performance of its cloud computing division, Amazon exceeded analysts’ expectations for its fourth quarter. The eCommerce giant posted a staggering $60.5 billion in revenue, surpassing Wall Street estimates, which projected revenues for the period of only $59.83 billion.

For the fourth quarter last year, Amazon posted a net profit of $1.9 billion, which is a record for the company. By comparison, the 2017 Q4 profit is more than double its net profit for the same period the previous year.

However, Amazon’s profits got a big boost from a tax benefit. The company received a provisional $789 million boost from a new tax law passed in December.

In addition, the strong performance of its cloud computing business, Amazon Web Services (AWS), also contributed to its record quarter. AWS’s $5.11 billion revenue for the period likewise defied analysts’ expectations, which anticipated only $4.97 billion.

The biggest factor in Amazon’s stratospheric Q4 performance was still holiday shopping, especially during the period from Thanksgiving through New Year’s. Pushed by the holiday shopping rush, Amazon’s sales rose to $60.5 billion, a 38 percent increase from the year-ago level.

According to Amazon CEO Jeff Bezos, the company’s success is, in large part, a result of its AI-powered digital assistant Alexa. In fact, there are indications that Amazon could be investing more in the technology given its initial success.

“Our 2017 projections for Alexa were very optimistic, and we far exceeded them. We don’t see positive surprises of this magnitude very often—expect us to double down,” Bezos said in a statement.

For its 2017 full-year performance, Amazon posted a 31 percent rise in sales, with revenue of $177.9 billion compared to 2016 sales of only $136 billion. However, its operating profit was only $4.1 billion, a 2 percent decrease from the previous year due to reinvestments.

Wall Street remains overwhelmingly positive on Amazon’s future prospects. Recently, its stock rose by 70 percent, which resulted in Jeff Bezos overtaking Bill Gates as the world’s richest man.

The post Amazon Exceeds Analyst Predictions, Posts its Highest Q4 Profits Ever appeared first on WebProNews.


WebProNews


New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news — as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn’t any. It’s bigger, better, faster, and you won’t pay an extra dime for it.

A moment of humility, though — if you’ve used our existing site crawl, you know it hasn’t always lived up to your expectations. Truth is, it hasn’t lived up to ours, either. Over a year ago, we set out to rebuild the back end crawler, but we realized quickly that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt “Aardwolf” engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.

View/search *all* URLs

One major limitation of our old crawler is that you could only see pages with known issues. Click on “All Crawled Pages” in the new crawler, and you’ll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let’s say I just wanted to see all of the pages crawled for Moz.com in the “/blog” directory…

I just click the [+], select “URL,” enter “/blog,” and I’m on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.
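For those who do export, even the standard library is enough to slice the file. A minimal sketch, assuming column headers like "URL" and "Status Code" — check the header row of your own export, as the exact names may differ:

```python
import csv
import io

def pages_with_errors(rows, prefix="/blog"):
    """Return crawled URLs under a path prefix that returned 4XX/5XX."""
    return [
        row["URL"]
        for row in rows
        if prefix in row["URL"] and row["Status Code"].startswith(("4", "5"))
    ]

# Hypothetical two-row export; a real file would come from open("export.csv").
sample = io.StringIO(
    "URL,Status Code,Crawl Depth\n"
    "https://example.com/blog/post,404,2\n"
    "https://example.com/,200,0\n"
)
errors = pages_with_errors(csv.DictReader(sample))
```

The same pattern extends to sorting by fetch time or grouping by redirect target once those columns are in the export.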

Recrawl your site immediately

Sometimes, you just can’t wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem, just click “Recrawl my site” from the top of any page in the Site Crawl section, and you’ll be on your way…

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you’re under tight deadlines for client reviews, we understand that waiting just isn’t an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what’s critical for one site is barely a nuisance for another. For example, let’s say I don’t care about a handful of overly dynamic URLs (for many sites, it’s a minor issue). With the new Site Crawl, I can just select those issues and then “Ignore” them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We’ll also keep tracking any new issues that pop up over time. Just because you don’t care about something today doesn’t mean you won’t need to know about it a month from now.

Fix duplicate content

Under “Content Issues,” we’ve launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the “parent” page. Here’s a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we’ve misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what’s a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we’ve attempted to balance priority and scope to target your top three issues to fix:
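One way to picture "priority balanced against scope" is a severity weight multiplied by the number of affected pages. This is a hypothetical sketch, not Moz’s actual scoring; the issue names and weights are illustrative only.

```python
# Hypothetical severity weights per issue type (higher = more severe).
SEVERITY = {
    "5xx_error": 5,
    "redirect_to_404": 4,
    "duplicate_content": 2,
    "missing_meta_description": 1,
}

def top_issues(counts, n=3):
    """Rank issues by severity weight x number of affected pages."""
    scored = {issue: SEVERITY[issue] * pages for issue, pages in counts.items()}
    return sorted(scored, key=scored.get, reverse=True)[:n]

# A handful of severe errors vs. a flood of milder ones:
counts = {
    "5xx_error": 3,
    "redirect_to_404": 10,
    "duplicate_content": 10_000,
    "missing_meta_description": 500,
}
ranked = top_issues(counts)
# 10,000 duplicate pages outrank three one-off server errors here.
```

The point of the example is exactly the one above: per-issue severity alone would put the 5XX errors first, but scope flips the ordering.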

Moving forward, we’re going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven’t helped you do as well as we could. We’re going to do our best to change that.

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we’ve been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you’ll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and be sure to register for the upcoming webinar.



Moz Blog


One of the Largest DDoS Attack Ever Seen Kills Krebs Security Site

One of the largest distributed denial-of-service (DDoS) attacks ever seen on the internet has caused Akamai to drop a site it protected, KrebsOnSecurity.com. The attack was apparently in retaliation for journalist Brian Krebs’ recent article about vDOS, an alleged cyberattack-for-hire service. According to Business Insider, two Israeli men were arrested following Krebs’ reporting, and the site was taken down.

One Twitter post noted the irony in a security expert having his site taken down because of a DDoS attack. “Brian Krebs, the man who gives cybercriminals nightmares, has been hit with a Godzilla-sized DDoS attack,” noted cybercrime researcher, blogger and speaker, Graham Cluley, “Sad news, hope he’s back soon.”

The Attack Was Huge

Before his site was taken down, Krebs posted about the attack, saying that KrebsOnSecurity.com was the target of an extremely large and unusual distributed denial-of-service (DDoS) attack designed to knock the site offline. “The attack did not succeed thanks to the hard work of the engineers at Akamai, the company that protects my site from such digital sieges. But according to Akamai, it was nearly double the size of the largest attack they’d seen previously, and was among the biggest assaults the Internet has ever witnessed.”

Later Akamai did take down the site and Krebs was understanding:

“The attack began around 8 p.m. ET on Sept. 20, and initial reports put it at approximately 665 Gigabits of traffic per second,” writes Krebs. “Additional analysis on the attack traffic suggests the assault was closer to 620 Gbps in size, but in any case this is many orders of magnitude more traffic than is typically needed to knock most sites offline.”

Krebs said that Martin McKeay, Akamai’s senior security advocate, told him that this was the largest attack that they had seen. Earlier this year they clocked an attack at 363 Gbps, but there was a major difference: This attack was launched by a “very large” botnet of hacked devices, where typical DDoS attacks use the common amplifying technique that bulks up a small attack into a large one.


The post One of the Largest DDoS Attack Ever Seen Kills Krebs Security Site appeared first on WebProNews.


WebProNews

