
The Bot Plan: Your Guide to Making Conversations Convert

Posted by purna_v

Let’s start off with a quick “True or False?” game:

“By 2020, the average person will have more conversations with their bot than with their spouse.”

True, or false? You may be surprised to learn that speaking more with bots than with our spouses is precisely what Gartner predicts.

And when Facebook’s Mark Zuckerberg says “messaging is one of the few things that people do more than social networking,” it requires no leap of faith to see that chatbots are an integral part of marketing’s future.

But you don’t need to stock up on canned peaches and head for the hills because “the robots are coming.” The truth is, the robots aren’t coming because they’re already here, and they love us from the bottom of their little AI-powered hearts.

Bots aren’t a new thing in many parts of the world, such as China and India. As reported by Business Insider, 67% of consumers worldwide have used a chatbot for customer support in the last year.

Within the United States, an impressive 60% of millennials have used chatbots, with 70% of those reporting positive experiences, according to Forbes.

There’s no putting bots back in the box.

And it’s not just that brands have to jump on board to keep up with those pesky new generations, either. Bots are great for them, too.

Bots offer companies:

  1. A revolutionary way to reach consumers. For the first time in history, brands of any size can reach consumers on a personal level. Note my emphasis on “of any size.” You can be a company of one and your bot army can give your customers a highly personal experience. Bots are democratizing business!
  2. Snackable data. This “one-to-one” communication gives you personal insights and specificity, plus a whole feast of snackable data that is actionable.
  3. Non-robot-like interaction. An intelligent bot can keep up with back-and-forth customer messages in a natural, contextual, human way.
  4. Savings. According to Juniper Research, the average time saving per chatbot inquiry compared to traditional call centers is over four minutes, which has the potential to make a truly extraordinary impact on a company’s bottom line (not to mention the immeasurable impact it has on customers’ feelings about the company).
  5. Always on. It doesn’t matter what time zone your customer is in. Bots don’t need to sleep, or take breaks. Your company can always be accessible via your friendly bot.

Here in the West, we are still in the equivalent of the Jurassic Period for bots. What they can be used for is truly limited only by our imagination.

One of my most recent favorites is an innovation from the BBC News Labs and Visual Journalism teams, who have launched a bot-builder app designed to, per Nieman Lab, “make it as easy as possible for reporters to build chatbots and insert them in their stories.”

So, in a story about President Trump from earlier this year, you see this:

(Screenshot of the in-article chatbot; source: BBC.com)

It’s one of my favorites not just because it’s innovative and impressive, but because it neatly illustrates how bots can add to and improve our lives… not steal our jobs.

Don’t be a dinosaur

A staggering 80% of brands will use chatbots for customer interactions by 2020, according to research. That means that if you don’t want to get left behind, you need to join the bot arms race right now.

“But where do I start?” you wonder.

I’m happy you asked that. Building a bot may seem like an endeavor that requires lots of tech savvy, but it’s surprisingly low-risk to get started.

Many websites allow you to build bots for free, and then there’s QNAMaker.ai (created by Microsoft, my employer), which does a lot of the work for you.

You simply input your company’s FAQ section; it uses natural language processing to parse the FAQ and anticipate the questions your customers are likely to ask, building the foundation for a simple chatbot that can be taken live on almost any platform.
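
If you’re curious what that looks like under the hood, here’s a minimal sketch of the general idea behind FAQ-driven bots: score an incoming message against each FAQ question and return the closest answer. To be clear, this is a toy illustration (the spa-style FAQ and the 0.2 fallback threshold are invented), not how QnAMaker works internally.

```python
# A toy sketch of the core idea behind FAQ-driven bots: match an incoming
# question to the closest FAQ entry. Illustration only -- this is not how
# QnAMaker is implemented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

faq = {  # hypothetical spa FAQ
    "What are your opening hours?": "We're open 9am-7pm, Monday to Saturday.",
    "Do you take walk-ins?": "Yes, but booked appointments are seen first.",
    "How do I cancel a booking?": "Reply CANCEL to your confirmation text.",
}

questions = list(faq)
vectorizer = TfidfVectorizer().fit(questions)
question_vectors = vectorizer.transform(questions)

def answer(user_message: str) -> str:
    """Return the answer whose FAQ question best matches the message."""
    scores = cosine_similarity(vectorizer.transform([user_message]),
                               question_vectors)[0]
    best = scores.argmax()
    if scores[best] < 0.2:  # arbitrary threshold: hand off to a human
        return "I'm not sure -- let me connect you with a person."
    return faq[questions[best]]

print(answer("what are your hours"))  # -> the opening-hours answer
```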

This is just the beginning — the potential for bots is wow-tastic.

That’s what I’m going to show you today — how you can harness bot-power to build strong, lasting relationships with your customers.

Your 3-step plan to make conversations convert

Step 1: Find the right place to start

The first step isn’t to build a bot straightaway. After all, you can build the world’s most elaborate bot and it is worth exactly nothing to you or your customer if it does not address their needs.

That’s why the first step is figuring out the ways bots can be most helpful to your customers. You need to find their pain points.

You can do this by pretending you’re one of your customers and navigating through your purchase funnel. Or, better yet, find data within your CRM system and analytics tools that can help you answer key questions about how your audience interacts with your business.

Here’s a handy checklist of questions you should get answers to during this research phase:

  • How do customers get information or seek help from your company? ☑
  • How do they make a purchase? ☑
  • Do pain points differ across channels and devices? ☑
  • How can we reduce the number of steps in each interaction? ☑

Next, you’ll want to build your hypothesis. And here’s a template to help you do just that:

I believe [type of person] needs to solve [problem] which happens while [situation], which will allow them to [get value].

For example, say you’re the manager of a small spa whose biggest time-suck is people calling to ask simple questions, which leaves other customers on hold for a long time. If those callers can ask a bot these simple questions instead, you get three important results:

  1. The hold time for customers overall will diminish
  2. The customer-facing staff in your spa will be able to pay more attention to clients who are physically in front of them
  3. Customers with lengthier questions will be helped sooner

Everybody wins.

Finally, now that you’ve identified and prioritized the situations where conversation can help, you’ll be ready to build a bot as well as a skill.

Wait a minute — what’s a skill in this context, and how do they relate to bots? Here’s a great explanation from Chris Messina:

  • A bot is an autonomous program on a network
  • A chatbot is a bot that uses human language to communicate
  • An AI assistant is a chatbot that performs tasks or services for an individual
  • A skill is a capability that an AI assistant can learn

Each of them can help look things up, place orders, solve problems, and make things happen more easily, faster, and better.

Free bot-builder websites and Microsoft’s QnAMaker.ai, both mentioned above, are handy resources to get you started.

Step 2: Add conversation across the entire customer journey

There are three distinct areas of the customer decision journey where bots and skills can make a big difference.

Bot as introducer

Bots can help your company by being present at the very first event in a purchase path.

Adidas did this wonderfully when they designed a chatbot for their female-focused community Studio LDN, to help create an interactive booking process for the free fitness sessions offered. To drive engagement further, as soon as a booking was made the user would receive reminders and messages from influencer fitness instructors.

The chatbot was the only way for people to book these sessions and it worked spectacularly well.

In the first two weeks, 2,000 people signed up to participate, with repeat use at 80%. Retention after week one was 60%, which the brand claims is far better than a typical app.

Adidas did something really clever: they advertised the bot across many of their other channels to promote it and aid its discoverability.

You can do the same.

There are countless examples where bots can put their best suit on and act as the first introduction to your company:

  • Email marketing: According to MailChimp research, average email open rates are between 15% and 26%, with click rates at just a fraction of that, approximately 2%–5%. That’s pretty low when you compare it to Messenger messages, which can have an open rate of well over 90%. Why not make the call-to-action within your email an incentive to engage with your chatbot? For example, something like “message us for 10% off” could be a compelling reason for people to engage with your chatbot.
  • Social media: Instead of running Facebook ads that direct people to websites, how about running ads that connect people to bots? For example, in the ad, invite people to “chat to see the latest styles” or “chat now to get 20% off,” and then have your bot start a conversation. Instant engagement! Plus, it’s a gentler call-to-action than a hard sell like “buy now.”
  • Video: How about creating instructional YouTube videos on how to use your bot? These are especially helpful since one of the barriers to using this new technology is a lack of awareness about how to use it. A short, quick video that demonstrates what your skill can do could be very impactful; FitBit’s video with Cortana is a great example of this.

  • Search: As you’ve likely seen by now, Bing has been integrating chatbots directly within the SERPs. You can search for bots across different platforms and add relevant bots to your preferred platform right from the search results themselves:

(Screenshot: travel bots surfaced in Bing search results)

  • You can engage with local businesses such as restaurants via the Bing Business bot that shows up as part of the local listings:

(Screenshot: Bing local listing for Monsoon Seattle, with its chatbot)

The key lesson here is that when your bot is acting as an introducer, give your audience plenty of ways and reasons to chat. Use conversation to tell people about new stuff, and get them to kick off that conversation.

Bot as influencer

To see a bot acting as an effective influencer, let’s turn to Chinese giant Alibaba. They developed a customizable chatbot store concierge that they offer free to brands and merchants.

Cutely named dian xiao mi, or “little shop bee,” the concierge is designed to be the most helpful store assistant you could wish for.

For example, if a customer interacting with a clothing brand uploads a photograph of a t-shirt, the bot buzzes in with suggestions of pants to match. Or, if a customer provides his height and weight, the bot can offer suggested sizing. Anyone who has ever shopped online for clothing knows exactly how much pain the latter offering could eliminate.

This helpful style is essentially changing the conversation from “BUY NOW!” to “What do you need right now?”

We should no longer ask: “How should we sell to customers?” The gazillion-dollar question instead is: How can we connect with them?

An interesting thing about this change is that, when you think about it for a second, it seems like common sense. How much more trust would you have for a brand that was only trying to help you? If you bought a red dress, how much more helpful would it be if the brand showed you a pic of complementary heels and asked if you want to “complete the look”?

For the chatbot to be truly helpful as an influencer, it needs to learn from each conversation. It needs to remember what you shared from the last conversation, and use it to shape future conversations.

So, say a chatbot from my favorite shoe store knew all about my shoe addiction (is there a cure? Would I even want to be cured of it?), then it could be more helpful via its remarketing efforts.

Imagine how much more effective it would be if we could have an interaction like this:

Shoestore Chatbot: Hi Purna! We’re launching a new collection of boots. Would you like a sneak peek?

Me: YES please!!!

Shoestore Chatbot: Great! I’ll email pics to you. You can also save 15% off your next order with code “MozBlog”. Hurry, code expires in 24 hours.

Me: *buys all the shoes, obvs*

This is Bot-topia. Your brand is being helpful, not pushy. Your bot is cultivating relationships with your customers, not throwing ads at them.

The key lesson here? For your bot to be a successful influencer, you must always consider how they can be helpful and how they can add value.

Bot as closer

Bot: “A, B, C. Always be closing.”

Imagine you want to buy flowers for Mother’s Day, but you have very little interest in flowers; scrolling through the endless options on a website, and then facing a long checkout form, just leaves you feeling overwhelmed.

1-800-Flowers found your pain point, and acted on it by creating a bot for Facebook Messenger.

It asks you whether you want to select a bunch from one of their curated collections, instantly eliminating the choice paralysis that could see consumers leave the website without purchasing anything.

And once you’ve chosen, you can easily complete the checkout process using your phone’s payment system (e.g. Apple Pay) to make checkout a cinch. So easy, and so friction-free.

The result? According to Digiday, within two months of launch, 70% of the orders through the bot came from brand-new customers. By building a bot, 1-800-Flowers slam-dunked its way into the hearts of a whole new, young demographic.

Can you think of a better, less expensive way to unlock a big demographic? I can’t.

To quote Mr. Zuckerberg again: “It’s pretty ironic. To order from 1-800-Flowers, you never have to call 1-800-Flowers again.”

Think back to that handy checklist of questions from Step 1, especially this one: “How can we reduce the number of steps in each interaction?”

Your goal is to make every step easy and empathetic.

Think of what people would want or need to know as they complete their tasks. For example, if you’re looking to transfer money from your bank account, a banking chatbot could save you from overdraft fees by warning you, before you make the transfer, that it would overdraw your account.

The key lesson here: Leverage your bots to remove any friction and make the experience super relevant and empathetic.

Step 3: Measure the conversation with the right metrics

One of my favorite quotes around how we view metrics versus how we should view metrics comes from Automat CEO Andy Mauro, who says:

“Rather than tracking users with pixels and cookies, why not actually engage them, learn about them, and provide value that actually meets their needs?”

Again, this is common sense once you’ve read it. Of course it makes sense to engage our users and provide value that meets their needs!

We can do this because the bots and skills give us information in our customers’ own words.

Here’s a short list of KPIs that you should look at (let’s call it “bot-alytics”); a quick sketch of computing a few of them follows the list:

  • Delivery and open rates: If the bot starts a conversation, did your customer open it?
  • Click rates: If your bot delivered a link in a chat, did your customer click on it?
  • Retention: How often do they come back and chat with you?
  • Top messages: What messages are resonating with your customers more than others?
  • Conversion rates: Do they buy?
  • Sentiment analysis: Do your customers express happiness and enthusiasm in their conversation with the bot, or frustration and anger?
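
Here’s what computing a few of these might look like, using a hypothetical event log (your bot platform’s export will have its own schema):

```python
# A minimal sketch of computing a few "bot-alytics" KPIs from a chat log.
# The event schema is hypothetical; adapt it to your platform's export.
from collections import Counter

events = [
    {"user": "a", "type": "message_delivered"},
    {"user": "a", "type": "message_opened"},
    {"user": "a", "type": "link_clicked"},
    {"user": "b", "type": "message_delivered"},
    {"user": "b", "type": "message_opened"},
    {"user": "a", "type": "purchase"},
]

counts = Counter(e["type"] for e in events)
delivered = counts["message_delivered"]

open_rate = counts["message_opened"] / delivered
click_rate = counts["link_clicked"] / delivered
conversion_rate = counts["purchase"] / delivered

print(f"Opens: {open_rate:.0%}, clicks: {click_rate:.0%}, "
      f"conversions: {conversion_rate:.0%}")
```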

Using bot-alytics, you can easily build up a clear picture of what is working for you, and more importantly, what is working for your customer.

And don’t forget to ask: What can you learn from bot-alytics that can help other channels?

The future’s bright, the future’s bots

What were once dumb machines are now smart enough that we can engage with them in a very human way. It presents the opportunity of a generation for businesses of all shapes and sizes.

Our customers are beginning to trust bots and digital personal assistants for recommendations, needs, and more. They are the friendly neighborhood machines that the utopian vision of a robotic future presents. They should be available to people anywhere: from any device, in any way.

And if that hasn’t made you pencil in a “we need to talk about bots” meeting with your company, here’s a startling prediction from Accenture. They believe that in five years, more than half of your customers will select your services based on your AI instead of your traditional brand.

In three steps, you can start your journey toward bot-topia and having your conversations convert. What are you waiting for?


The Guide to Local Sponsorship Marketing – The 2018 Edition

Posted by Claudia0428

For most Moz readers, local marketing means content, reviews, AdWords, local listings, and of course citations. If you’re a larger brand, you might be doing outdoor, radio, print, and television advertising as well. Today we’re here to humbly submit that local sponsorships remain the most-overlooked and opportunity-rich channel, and they build real local connections for both large brands and small business alike.

This article is the second edition of the ZipSprout team’s guide to local sponsorships. We wrote the first edition in 2016, after a few months of securing local sponsorship campaigns for a handful of clients. Since then, we’ve tripled our client roster, worked with more than 8,000 local organizations, and donated nearly $1,000,000 in local sponsorships across 1,300+ opportunities. We’ve also learned how to build campaigns for local presence.

So we knew the guide was due for a reboot.

One of the most significant lessons of the past two years is that local sponsorships are a channel in their own right. They can be directed toward local SEO or local marketing campaigns, but sponsorships are their own breed of local connection — and just like content campaigns, local PR campaigns, or review management, local sponsorships have their own set of conventions and best practices.

This article is meant for anyone with an eye toward local sponsorships as a marketing channel. Agencies and enterprise organizations may find it particularly helpful, but we’re big believers in encouraging smaller local businesses to engage in sponsorships too. Get out there and meet your neighbors!


The what & why of local sponsorships

Local events, nonprofits, and associations constitute a disjointed but very real network of opportunities. Unlike other channels, local sponsorships aren’t accessible from a single platform, but we’ve found that many sponsorships share similarities. This makes it possible to develop processes that work for campaigns in any metro area.

Local sponsorships are also a unique channel in that the benefits can range from the digital to the analog: from local links to a booth, from social posts to signage on a soccer field. The common thread is joining the community by partnering with local organizations, but the benefits themselves vary widely.

We’ve identified, and now track, 24 unique benefits of sponsorships related to local marketing:

  1. Ad (full or partial)
  2. Advertising on event app
  3. Blog post featuring sponsor
  4. Booth, tent, or table at event
  5. Event named for sponsor
  6. Guest post on organization blog
  7. Inclusion in press release
  8. Link in email newsletter
  9. Link on website
  10. Logo on event t-shirt or other swag
  11. Logo on signage
  12. Logo or name on website
  13. Media spots (television/radio/newspaper)
  14. Mention in email newsletter
  15. Mention in publicity materials, such as programs & other printed materials
  16. Networking opportunity
  17. Physical thing (building, etc.) named for sponsor
  18. Social media mention
  19. Speaking opportunity at event
  20. Sponsor & sponsor’s employees receive discounts on services/products/events
  21. Sponsor can donate merchandise for goodie bags
  22. Sponsored post (on blog or online magazine)
  23. Tickets to event
  24. Verbal recognition

There are probably more, but in our experience most benefits fall into these core categories. That said, these benefits aren’t necessarily for everyone…

Who shouldn’t do local sponsorships?

1. Don’t do local sponsorships if you need fast turnaround.

Campaigns can take 1–3 months from launch until fulfillment. If you’re in a hurry to see a return, just increase your search ad budget.

2. Don’t do local sponsorships if you’re not okay with the branding component.

Local link building can certainly be measured, as can coupon usage, email addresses gathered for a drawing, and the like. But measuring local brand lift still isn’t an exact science. Leave pure attribution to digital ads.

3. Don’t do local sponsorships with a “one size fits all” expectation.

The great thing about local events and opportunities is their diversity. While some components can be scaled, others require high-touch outreach, more like a PR campaign.

Considerations for agencies vs brands in local sponsorship campaigns

Agencies, especially those creating sponsorship campaigns for multiple clients, can cast a wide net and select from the best opportunities that return. Even if a potential partnership isn’t a good fit for a current client, it may work for a client down the road. Brands, on the other hand, need to be more goal- and mission-focused during prospecting and outreach. If they’re reaching out to organizations that are clearly a bad fit, they’re wasting everyone’s time.

Brands also need to be more careful because they have a consumer-facing image to protect. As with any outreach campaign, there are dos and don’ts and best practices that all should follow (DO be respectful; DON’T over-email), but brands especially have more to lose from an outreach faux pas.


Our process

Outreach

Once we’ve identified local organizations in a given metro area, we recommend reaching out with an email to introduce ourselves and learn more about sponsorship opportunities. In two years, the ZipSprout team has A/B tested 100 different email templates.

With these initial emails, we’re trying to inform without confusing or scaring away potential new partners. Some templates have resulted in local organizations thinking we’re asking them for sponsorship money or that we want to charge them for a service. Oops! A/B tests have helped to find the best wording for clarity and, in turn, response rate.

Here are some of our learnings:

1. Mentioning location matters.

We reached out to almost 1,000 Chicago organizations in the spring of 2017. When we mentioned Chicago in the email, the response rate increased by 20%.

2. Emails sent to organizations that already had sponsorship info on their websites were most successful when the email acknowledged the onsite sponsorship info and asked for confirmation.

These are also our most successful outreach attempts, likely because these organizations are actively looking for sponsors (as signified by having sponsorship info on their site). Further, by demonstrating that we’ve been on their site, we’re signaling a higher level of intent.

3. Whether or not we included an outreacher’s phone number in the email signature had no effect on response rate.

If anything, response rates were slightly higher for emails with no phone number in the signature: 41%, compared with 40.2%.

4. Shorter is better when it comes to outreach emails.

Consider the following two emails:

EMAIL A


Hi [NAME],

I sent an email last week, but in case you missed it, I figured I’d follow up. :)

I work to help corporate clients find local sponsorships. We’re an agency that helps our business clients identify and sponsor local organizations like [ORG NAME]. We’re paid by businesses who are looking for local sponsorships.

Often, local organizations are overlooked, so my company, ZipSprout, works for businesses who want to sponsor locally, but aren’t sure who to partner with. To that end, I’d love to learn more about [ORG NAME] and see what sponsorship opportunities you have available. Is there a PDF or list of cost and benefits you can share over email or a phone call?


Thanks,

___

EMAIL B

Hi [NAME],

I sent an email last week, but in case you missed it, I figured I’d follow up. :)

I’d love to learn more about [ORG NAME] and see what sponsorships you have available. Is there a PDF or list of cost and benefits you can share over email or a phone call?


Thanks,

___

In an 800-email test, Email B performed 30% better than Email A.
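
If you run similar tests, it’s worth checking that a winner like this isn’t just noise. A two-proportion z-test is one simple way to do that; the counts below are hypothetical, chosen to mirror a roughly 30% lift in an 800-email test:

```python
# A quick sketch of checking whether an email A/B test result is
# statistically meaningful. The reply counts below are hypothetical.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# e.g. Email A: 124/400 replies; Email B: 161/400 replies (~30% better)
z, p = two_proportion_z(124, 400, 161, 400)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 suggests a real difference
```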

Matchmaking: How can I choose a sponsorship opportunity that fits my brand?

There are many ways to evaluate potential sponsorships.

These are the questions that help us match organizations with clients:

  • Who is your brand targeting (women, senior citizens, family-friendly, dog owners, new parents)?
  • Do you want to tie your brand with a particular cause (eco-friendly, professional associations, awareness foundations, advocacy groups)?
  • Is your campaign based on location? Are you launching your brand in a particular city? A particular zip code?
  • What is your total budget and per-sponsorship range? A top max price or a price range is a useful parameter — and perhaps the most important.

Once the campaign goals are determined, we filter through opportunities based partially on their online presence. We look at Domain Authority, location, website aesthetics, and other sponsors (competitors and non-competitors) in addition to Reach Score (details below).

Further, we review backlinks, organic traffic, and referring domains. We make sure the nonprofit partner isn’t spammy or funky from an SEO perspective and that its website is frequently visited. A small organization may not have all the juicy digital metrics, but by gauging event attendance or measuring organic traffic, we can identify solid prospects that might otherwise have been missed.

We also look at social media presence, event attendance, event dates, and how responsive the organizations or event organizers are. Responsiveness, we have learned, is a CRITICAL variable. It can be the difference between your link going live in 48 hours or less and waiting 6+ months after payment.

Reach Score

From a numbers perspective, Domain Authority is a good way to appreciate the value of a website, but it doesn’t tell the whole story when it comes to local marketing. To help fill in the gaps we created Reach Score, which combines virtual measures (like Domain Authority) with social measures (friends/followers) and physical measures (event attendance). The score ranks entities based on their metro area, so we’re not comparing the reach of an organization in Louisville, KY to one in NYC.
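
ZipSprout’s actual formula and weights aren’t public, but as a rough sketch of the concept, a Reach-Score-style metric might blend the three kinds of measures like this, normalizing social and physical reach against the largest organizations in the same metro:

```python
# A sketch of a Reach-Score-style metric. The weights and normalization
# here are assumptions for illustration, not ZipSprout's actual formula.
def reach_score(domain_authority, social_followers, event_attendance,
                metro_max_followers, metro_max_attendance):
    """Blend digital, social, and physical reach into a 0-100 score,
    normalized against the largest values in the same metro area."""
    virtual = domain_authority / 100                       # DA is already 0-100
    social = min(social_followers / metro_max_followers, 1.0)
    physical = min(event_attendance / metro_max_attendance, 1.0)
    return round(100 * (0.4 * virtual + 0.3 * social + 0.3 * physical))

# e.g. a mid-sized nonprofit in a metro whose biggest org has 50k followers
# and whose biggest event draws 5k attendees
print(reach_score(35, 12_000, 800, 50_000, 5_000))  # -> 26
```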

As of March 2018, we have about 8,000 organizations with valid Reach Scores across four metro areas — Raleigh/Durham, Boston, Houston, and Chicago. The average Reach Score is 37 out of 100. Of the 34 types of organizations that we track, the most common is Event Venue/Company (average Reach Score of 38), followed by Advocacy Groups (43) and Sports Teams/Clubs/Leagues (22). The types of organizations with the highest Reach Scores are Local Government (64), Museums (63), and Parks and Recreation (55).

Thanks to Reach Score, we’ve found differences between organizations from city to city as well. In Raleigh-Durham, the entities with the highest reach tend to be government-related organizations, such as Chambers of Commerce and Parks & Rec Departments.

In Boston, the highest reach tends to fall to arts organizations, such as music ensembles, as well as professional associations. This score serves as a good reminder that each metro area has a unique community of local organizations. (Read more about our Reach Score findings here.)

Fulfillment

Our campaigns used to take several months to complete, from contract to final sponsorship. Now our average fulfillment time is 18.7 days, regardless of project size! Staying (politely) on top of communication with the nonprofit organizations was the main driver of this improvement.

We’ve also found that the first 48 hours after notifying an organization of a sponsorship on behalf of your brand are crucial to a speedy campaign. Be ready to award the sponsorship funds in a timely manner, and follow up with a phone call or an email to check that the funds have been received.

It’s okay to ask when you can expect the digital sponsorship benefits to go live, and how to streamline delivery of any other items needed to complete the sponsorship.

Applying these simple best practices, our team has been able to run a campaign in a week or less.

Two important concepts to remember about the sponsorship channel from the fulfillment perspective:

  1. It’s difficult to fulfill. If your city project involves more than two or three sponsorships, you’re in for multiple hours of follow-ups, reminders, phone calls, etc. Most local organizations genuinely want to honor their sponsors and keep them happy. Even so, we’ve learned that keeping the momentum going serves as an important reminder for the nonprofit. This can involve phone calls and emails reminding them about links going live and other benefits coming through. Again, be polite and respectful.
  2. It’s SO worth all the effort, though! It shows that your brand cares. A sponsorship campaign is a fantastic way to get in front of your target audience around causes that carry personal meaning, and not in a broad, general scope, but locally. Sponsoring a beach cleanup in Santa Monica gives you the opportunity to impact a highly localized audience around a very particular cause that affects their everyday life, as opposed to partnering with a huge foundation advocating for clean oceans.

Enhancing a local campaign

Some prefer to use local sponsorships as a link building effort, but there are ways to go far beyond the link — and ample benefit in doing so.

Local event attendance

So, so many local sponsorship campaigns come with the opportunity for event attendance. We currently have 11,345 opportunities in our database (62.2% of our total inventory) that feature events: 5Ks, galas, performances, parades, and even a rubber ducky derby or two! If you’re able to send local team members, find opportunities that match your target audience and test it out — and bring your camera so your social and brand team will have material for publication. If local team members aren’t an option, consider working with a notable and ambitious startup such as Field Day, which can send locals out on behalf of your brand. We’ve spoken with them on several occasions and found them adaptable and wonderful to work with.

Coupons/invitations

One client, FunBrands, used local sponsorships as a way to reach out to locals ahead of stores’ grand re-openings (read the full case study here).

For another client, we created unique coupons for each local organization, using print and social media posts for distribution.

An example coupon — use codes to track attribution back to an event.
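
For the curious, here’s a minimal sketch of that attribution mechanic: one code per organization, then a tally of redemptions from your order export. The organizations, codes, and orders are all hypothetical.

```python
# A minimal sketch of per-organization coupon codes for event attribution.
# Organizations, codes, and orders below are hypothetical.
from collections import Counter

org_codes = {
    "Raleigh Rubber Ducky Derby": "DUCKY15",
    "Boston Chamber Gala": "GALA15",
    "Houston Beach Cleanup": "CLEAN15",
}
code_to_org = {code: org for org, code in org_codes.items()}

# In practice this would come from your e-commerce order export.
orders = [
    {"id": 1001, "coupon_code": "DUCKY15"},
    {"id": 1002, "coupon_code": ""},
    {"id": 1003, "coupon_code": "ducky15"},
    {"id": 1004, "coupon_code": "CLEAN15"},
]

redemptions = Counter()
for order in orders:
    code = order.get("coupon_code", "").upper()
    if code in code_to_org:
        redemptions[code_to_org[code]] += 1

for org, count in redemptions.most_common():
    print(f"{org}: {count} orders")  # Ducky Derby: 2, Beach Cleanup: 1
```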


Conclusion: Local sponsorships are a channel

Sponsorships are an actionable strategy that contributes to your local rankings while providing unprecedented opportunities for community engagement and neighborly branding. We hope this updated guide provides a strong operational overview, along with realistic expectations — and even inspiration — for a local sponsorship campaign in your target cities.

Last but not least: As with all outreach campaigns, please remember to be human. Keep in mind that local engagements are the living extension of your brand in the real world. And if somehow this article wasn’t enough, we just finished up The Local Sponsorship Playbook. Every purchase comes with a 30-minute consultation with the author. We hope everyone chooses to get out, get local, and join the community in the channel that truly benefits everyone.


Just How Much is Your Website Worth, Anyhow? An Easy Guide to Valuation

Posted by efgreg

We all work hard building our businesses.

We put in the sweat equity and all the tears that can come with it to build something truly great. After another day hustling at the office or typing furiously on your keyboard, you might be wondering… what is the end game here?

What are you really going for? Is there a glowing neon sign with the word “Exit” marking the path to your ultimate goal?

For the majority of businesses, the end goal is to eventually sell that business to another entrepreneur who wants to take the reins and simply enjoy the profits from the sale. Alas, most of us don’t even know what our business is worth, much less how to go about selling it — or if it’s even sellable to begin with.

That’s where Empire Flippers comes in. We’ve been brokering deals for years in the online business space, serving a quiet but hungry group of investors who are looking to acquire digital assets. The demand for profitable digital assets has been growing so much that our brokerage was able to get on the Inc. 5000 list two years in a row, both times under the 500 mark.

We can say with confidence that, yes, there is indeed an exit for your business.

By the end of this article you’re going to know more about how online businesses are valued, what buyers are looking for, and how you can get the absolute top dollar for your content website, software as a service (SaaS), or e-commerce store.

(You might have noticed I didn’t include the word “agency” in the last paragraph. Digital agencies are incredibly hard to sell; to do so, you need to have streamlined your process as much as possible. Even though having clients is great, other digital assets are far easier to sell.)

If you’ve built a digital asset you’re looking to exit from, the first question you likely have is, “This sounds fantastic, but how do I go about putting an actual price tag on what I’ve created?”

We’ll dive into those answers below, but first let’s talk about why you’re already in a great position just by being a reader of the Moz Blog.

Why is SEO the most valuable traffic for a digital asset?

SEO is by far the most attractive traffic source for people looking at purchasing online businesses.

The beauty of SEO is that once you’ve put in the work to achieve the rankings, they can hold and keep bringing in traffic for months without significant upkeep. That’s in stark contrast with pay-per-click (PPC) campaigns, such as Facebook ads, which require daily monitoring to make sure nothing strange is happening with your conversions or that you’re not overspending.

For someone who has no experience with traffic generation but wants to purchase a profitable online business, an SEO-fueled website just makes sense. They can earn while they learn. When they purchase the asset (typically a content website for people just starting out), they can play around with adding new high-quality pieces of content and learn about more complicated SEO techniques down the road.

Even someone who is a master at paid traffic loves SEO. They might buy an e-commerce store that has some real potential with Facebook ads that’s currently driving the majority of its traffic through SEO, and treat the SEO as gravy on top of the paid traffic they plan to drive toward that e-commerce store.

Whether the buyer is a newbie or a veteran, SEO has one of the widest appeals of any traffic strategy. While SEO itself does not increase the value of the business in most cases, it does attract more buyers than other forms of traffic.

Now, let’s get down to what your business is worth.

How are online businesses actually valued?

How businesses are valued is such a common question we get at our brokerage that we created an automated valuation tool that gives a free estimate of your business’s value, which our audience uses with all of their different projects.

At the heart of any valuation is a fairly basic formula:

You take your average monthly net profit over the trailing 12 months and multiply it by a multiple. Typically, that multiple ranges between 20x and 50x the average monthly net profit for healthy, profitable online businesses. As you get closer to 50x, you have to be able to show that your business is growing in a BIG way month over month and that it is truly defensible (something we’ll talk about later in this article).

You might see some brokers using a 2x or 3x EBITDA, which stands for earnings before interest, tax, depreciation, and amortization.

When you see this formula, they’re using an annual multiple, whereas at Empire Flippers we use a monthly multiple. There’s really not much of a difference between the two formulas; it mainly depends on your preference, but if you’re brand new to buying and selling online businesses, then it’s helpful to know how different brokers price businesses.

We prefer the monthly multiple since it shows a more granular picture of the business and where it’s trending.
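
Here’s the formula as a worked example with hypothetical profit figures; it also shows how a monthly multiple translates into an annual, EBITDA-style one:

```python
# The basic valuation formula above, worked with hypothetical numbers.
last_12_months_net_profit = [8_200, 7_900, 8_500, 9_100, 8_800, 9_400,
                             9_000, 9_600, 10_100, 9_800, 10_400, 10_700]

avg_monthly_profit = sum(last_12_months_net_profit) / 12
monthly_multiple = 30  # healthy, modestly growing business (~20-50x range)

valuation = avg_monthly_profit * monthly_multiple
print(f"Valuation: ${valuation:,.0f}")  # -> $278,750

# The same deal expressed as an annual (EBITDA-style) multiple:
annual_profit = sum(last_12_months_net_profit)
print(f"Equivalent annual multiple: {valuation / annual_profit:.1f}x")  # 2.5x
```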

Just as you can influence Google’s SERPs with SEO knowledge, you can work this formula in your favor, as long as you know what you’re looking at.

How to move the multiple needle in your favor

There are various things you can do to get a higher multiple. A lot of it comes down to just common sense and really putting yourself in the buyer’s shoes.

A useful thing to ask: “Would I ever buy my business? Why? Why not?”

This exercise can lead you to change a lot of things about your business for the better.

The two areas that most affect the multiple come down to your actual average net profit and how long the business has been around making money.

Average net profit

The higher your average net profit, the higher your multiple will tend to be because it’s a bigger cash-flowing asset. It makes sense then to look at various ways you can increase that net profit and decrease your total amount of expenses.

Every digital asset is a little different in where their expenses are coming from. For content sites, content creation costs are typically the lion’s share of expenses. As you approach the time of sale, you might want to scale back your content. In other cases, you may want to move to an agency solution where you can scale or minimize your content expenses at will rather than having in-house writers on the payroll.

There are also expenses that you might be applying to the business but aren’t really “needed” in operating the business, known as add-backs.

Add-backs

Add-backs are where you add certain expenses BACK into the net profit. These are items that you might’ve charged on the business account but aren’t really relevant to running the business.

These could be drinks, meals, or vacations put on the business account, and sometimes even business conferences. For example, going to a conference about email marketing might not be considered a “required” expense to running a health content site, whereas going to a sourcing conference like the Canton Fair would be a harder add-back to justify when it comes to running an e-commerce store.

Other things, such as SEO tools you’re using on a monthly basis, can likely be added back to the business. Most people won’t need them constantly to run and grow their business. They might subscribe for a month, get all the keyword data they need for a while, cancel, and then come back when they’re ready to do more keyword research.

Most of your expenses won’t be add-backs, but it is good to keep these in mind as they can definitely increase the ultimate sales price of your business.
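
Since the multiple applies to the adjusted profit, even small add-backs compound. A quick sketch with hypothetical numbers:

```python
# A sketch of how add-backs raise the profit figure a valuation is based
# on. All numbers are hypothetical.
reported_monthly_net_profit = 9_000

monthly_add_backs = {
    "conference unrelated to operations": 250,
    "SEO tool used only occasionally": 99,
    "owner meals on the business card": 120,
}

adjusted_profit = reported_monthly_net_profit + sum(monthly_add_backs.values())
multiple = 30
print(f"Before add-backs: ${reported_monthly_net_profit * multiple:,}")  # $270,000
print(f"After add-backs:  ${adjusted_profit * multiple:,}")              # $284,070
```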

When not to cut expenses

While there’s usually a lot of fat you can cut from your business, you need to be reasonable about it. Cutting some things might improve your overall net profit but vastly decrease the attractiveness of your business.

One common thing we see in the e-commerce space is solopreneurs starting to package and ship all of the items themselves to their customers. The thinking goes that they’re saving money by doing it themselves. While this may be true, it’s not an attractive solution to a potential buyer.

It’s far more attractive to spend money on a third-party solution that can store and ship the product for you as orders come in. After all, many buyers are busy traveling the world while having an online business. Forcing them to settle down just so they can ship products versus hanging out on the beaches of Bali for a few months during winter is a tough ask.

When selling a business, you don’t want to worry only about expenses, but also how easy it is to plug into and start running that business for a buyer.

Even if the systems you create to do that add extra expenses, like using a third party to handle fulfillment, they’re often more than worth keeping around because they make the business look more attractive to buyers.

Length of history

The more history you can show, the more attractive your business will be, as long as it’s holding at a steady profit level or showing an upward trend.

The more your business is trending upward, the higher multiple you’re going to get.

While you can’t do much in terms of lengthening the business’s history, you can prepare yourself for the eventual sale by investing in needed items early on in your business. For example, if you know your website needs a big makeover and you’re 24 months out from selling, it’s better to do that big website redesign now instead of during the 12-month average your business will be priced on.

Showing year-over-year growth is also beneficial in getting a better multiple, because it shows your business can weather growing pains. This is especially true for a business whose primary traffic source is organic Google search: surviving several big algorithm updates over a few years is evidence of quality SEO.

On the flip side, a business trending downward is going to get a much worse multiple, likely in the 12–18x range. A business in decline can still be sold, though. There are specific buyers who only want distressed assets, because they can get them at deep discounts and often have the skill sets needed to fix the site.

You just have to be willing to take a lower sales price due to the decline, and since a buyer pool on distressed assets is smaller, you’ll likely have a longer sales cycle before you find someone willing to acquire the asset.

Other factors that lead to a higher multiple

While profit and length of history are the two main factors, there are a bunch of smaller factors that can add up to a significant increase in your multiple and ultimate valuation price.

You’ll have a fair amount of control with a lot of these, so they’re worth maximizing as much as possible in the 12–24 month window where you are preparing your online business for sale.

1. Minimize critical points of failure

Critical points of failure are anything in your business that has the power to be a total deal breaker. It’s not rare to sell a business that has one or two critical points, but even so you want to try to minimize this as much as possible.

An example of a critical point of failure could be where all of your website traffic is purely Google-organic. If the site gets penalized by a Google algorithm update, it could kill all of your traffic and revenue overnight.

Likewise, if you’re an Amazon affiliate and Amazon suddenly changes their Terms of Service, you could get banned for reasons you don’t understand or even have time to react to, ending up with a highly trafficked site that makes zero money.

In the e-commerce space, we see situations where the entrepreneur only has one supplier that can make their product. What happens if that supplier wants to jack up the prices or suddenly goes out of business completely?

It’s worth your while to diversify your traffic sources, have multiple monetization strategies for a content site, or investigate having backup or even competing suppliers for your e-commerce products.

Every business has some kind of weakness; your job is to minimize those weaknesses as much as possible to get the most value out of your business from a potential buyer.

2. High amounts of traffic

Higher traffic tends to correlate with higher revenue, which ultimately should increase your net profit. That all goes without saying; however, high traffic also can be an added bonus to your multiple on top of helping create a solid net profit.

Many buyers look for businesses they can optimize to the extreme at every point of the marketing funnel. When you have a high amount of traffic, you give them a lot of room to play with different conversion rate optimization levers, like increasing email opt-ins, crafting a better abandoned-cart sequence, and changing the various calls to action on the site.

While many sellers might be fantastic at driving traffic, they might not exactly be the biggest pro at copywriting or CRO in general; this is where a big opportunity lies for the right buyer who might be able to increase conversions with their own copywriting or CRO skill.

3. Email subscribers

It’s almost a cliché in the Internet marketing space to say “the money is in the list.” Email has often been one of the biggest drivers of revenue for companies, but there’s a strange pattern we’ve discovered after selling hundreds of online businesses.

Telling someone they should use an email list is pretty similar to telling someone to go to the gym: they agree it’s useful and they should do it, but often they do nothing about it. Then there are those who do build an email list because they understand its power, but then never do anything useful with it.

This results in email lists being hit or miss in terms of whether they actually add any value to your business’s final valuation.

If you can prove the email list is adding value to your business, then your email list CAN improve your overall multiple. If you use good email automation sequences to up-sell your traffic and routinely email the list with new offers and pieces of high-quality content, then your email list has real value associated with it, which will reflect on your final valuation.

4. Social media following

Social media has become more and more important as time goes on, but it can also be an incredibly fickle beast.

It’s best to think of your social media following as a “soft” email list. The reach of your social media following compared to your email list will tend to be lower, especially as social organic reach keeps declining on bigger social platforms like Facebook. In addition, you don’t own the platform that following is built off of, meaning it can be taken away from you anytime for reasons outside of your control.

Plus, it’s just too easy to fake followers and likes.

However, if you can wade through all that and prove that your social following and social media promotion are driving real traffic and sales to your business, it will definitely help in increasing your multiple.

5. How many product offerings you have

Earning everything from a single product is somewhat risky.

What happens if that product goes out of style? Or gets discontinued?

Whether you’re running an e-commerce store or a content site monetizing through affiliate links, you want to have several different product offerings.

When you have several products earning good money through your website, then a buyer will find the business ultimately more attractive and value it more because you won’t be hurt in a big way if one of the “flavors of the month” disappears on you.

6. Hours required

Remember, the majority of buyers are not looking at acquiring a job. They want a leveraged cash-flowing investment they can ideally scale up.

While there’s nothing wrong with working 40–50+ hours per week on a business that is really special, it will narrow your overall buyer pool and make the business less attractive. The truth is, most of the digital assets we’re creating don’t really require this amount of work from the owner.

What we typically see is that there are a lot of areas for improvement that the seller can use to minimize their weekly hour allotment to the business. We recommend that everyone looking to sell their business first consider how they can minimize their actual involvement.

The three most effective ways to cut down on your time spent are:

  • Systemization: Automating as much of your business as possible
  • Developing a team: The biggest wins we see here tend to be in content creation, customer service, general operations, and hiring a marketing agency to do the majority of the heavy lifting for you. While these add costs that drive down the average net profit, they also make your business far more attractive.
  • Creating standard operating procedures (SOPs): SOPs should outline the entire process of a specific function of the business and should be good enough that if you handed them off to someone, they could do the job 80 percent as well as you.

You should always be in a position where you’re working ON your business, not IN it.

7. Dig a deeper moat

At Empire Flippers, we’re always asking people if they built a deep enough moat around their business. A deep moat means your business is harder to copy. A copycat can’t just go buy a domain and some hosting and copy your business in an afternoon.

A drop-shipping store that can be copied in a single day is not going to be nearly as attractive as one that has built up a real following and a community around their brand, even if they sell the same products.

This fact becomes more and more important as your business valuation goes into the multiple six-figure and seven-figure valuation ranges because buyers are looking to buy a real brand at this point, not just a niche site.

Here are a few actions you can take to deepen this moat:

  • Niche down and own the market with your brand (a woodworking website might focus specifically on benches, for example, where you’re hiring expert artisans to write content on the subject).
  • Source your products and make them unique, rather than another “me too” product.
  • Negotiate special terms with your affiliate managers or suppliers. If you’ve been sending profitable traffic to an affiliate offer, often you can just email the affiliate manager asking for a pay bump and they’ll gladly give it. Likewise, if you’re doing good business for a drop-shipping supplier, they might be open to doing an exclusivity agreement with you. Make sure all of these special terms are transferable to the buyer, though.

The harder it is to copy what you’ve built, the higher the multiple you’ll get.

But why would you EVER sell your online business in the first place?

You’re now well-equipped with knowledge on how to increase your business’s ultimate value, but why would you actually sell it?

The reasons are vast and numerous — too many to list in this post. However, there are a few common reasons you might resonate with.

Here are a few business reasons why people sell their businesses:

  • Starting a new business or wanting to focus on other current projects
  • Seeking to use the capital to leverage themselves into a more competitive (and lucrative) space
  • Having lost interest in running the business and wanting to sell the asset before their waning attention shows up as declining revenue
  • Wanting to cash out of the business to invest in offline investments like real estate, stocks, bonds, etc.

Just as there are a ton of business reasons to sell, there are also a ton of personal reasons why people sell their business:

  • Getting a divorce
  • Purchasing a home for their family (selling one digital asset can be a hefty down payment for a home, or even cover the entirety of the home)
  • Having medical issues
  • Other reasons: We had one seller on our marketplace whose reason for selling his business was to get enough money to adopt a child.

When you can collect 20–50 months of your net profit upfront, you can do a lot of things that just weren’t options before.

When you have a multiple six-figure or even seven-figure war chest, you can often outspend the competition, invest in infrastructure and teams you couldn’t before, and in general jumpstart your next project or business idea far faster, without ever having to worry about whether a Google update will tank your earnings or some other unforeseen market change will hit.

That begs the question…

When should you sell?

Honestly, it depends.

The answer to this question is more of an art than a science.

As a rule of thumb, you should ask yourself if you’re excited by the kind of money you’ll get from the successful sale of your online business.

You can use our valuation tool to get a ballpark estimate, or do some back-of-the-napkin math using the basic multiple formula I outlined. I prefer to be conservative with my estimations, so your napkin math might apply a 25x multiple to your average monthly net profit.

Does that number raise your eyebrows? Is it even interesting?

If it is, then you might want to start asking yourself if you really are ready to part with your business to focus on other things. Remember, you should always set a MINIMUM sales price that you’d be willing to walk away from the business with, something that would still make you happy if you went through with it.

Most of us Internet marketers are always working on multiple projects at once. Sadly, some projects just don’t get the love they deserve or used to get from us.

Instead of letting those projects just die off in the background, consider selling your online business instead to a very hungry market of investors starting to flood our digital realm.

Selling a business, even if it’s a side project that you’re winding down, is always going to be an intimate process. When you’re ready to pull the trigger, we’ll be there to help you every step of the way.

Have you thought about selling your online business, or gone through a sale in the past? Let us know your advice, questions, or anecdotes in the comments.


A Step-by-Step Guide to Setting Up and Growing Your YouTube Presence

Posted by AnnSmarty

When was the last time you saw a video on YouTube? I bet you’ve seen one today. YouTube is too huge and too popular for marketers to ignore.

If you don’t have a YouTube channel, now’s the time to start one.

If you have a channel and you never got it off the ground, now’s the time to take action.

This article will take you through the process of setting up your YouTube presence, listing steps, tools, and important tips to get you started and moving forward.

1. Define your goals

If your goal is to become a YouTube star, you might be a bit late to the party: it’s really hard to get noticed these days. With the number of channels users have to choose from, stardom will take years of hard work to achieve.

Even back in 2014, when I was reading about YouTube celebrity bloggers, one quote really stood out to me:

“We think, if we were coming to YouTube today, it would be too hard. We couldn’t do it.”

That’s not to say, however, that you cannot achieve other, more tangible goals on YouTube. It’s an excellent venue for business owners and marketers.

Here are three achievable goals that make more sense than fame from a business perspective:

1.1. YouTube for reputation management

Here’s one thing about reputation management on Google: You’re never finished.

Even if your reputation is fabulous and you love every single result that comes up in the SERPs for your business name, you may still want to publish more content around your brand.

The thing is, for reputation management purposes, the more navigational queries you can control, the better.


YouTube is the perfect platform for reputation management. YouTube videos rank incredibly well in Google, especially when it comes to low-competition navigational queries that include your brand name.

Furthermore, YouTube videos almost always get that rich snippet treatment (meaning that Google shows the video thumbnail, author, and length of the video in the SERPs). This means you can more easily attract attention to your video search result.

With that in mind, think about putting videos on YouTube that:

  • Give an overview of your product/service
  • Show happy customers
  • Visualize customer feedback (for example, visual testimonials beautifully collected and displayed in a video)
  • Offer a glimpse inside your team (show the people behind the brand, publish videos from events or conferences, etc.)

1.2 YouTube videos for improved conversions

Videos improve conversions for a clear reason: They offer a low-effort way for your customer to see why they need your product. Over the years, there have been numerous case studies proving the point:

  • An older study (dating back to 2011) states that customers are 144% more likely to add products to a shopping cart after watching the product video
  • Around 1 in 3 millennials state they have bought a product directly as a result of watching a how-to video on it
  • This Animoto survey found that almost all the participants (96%) considered videos “helpful when making purchasing decisions online”
  • Wistia found that visitors who engage with a video are much more likely to convert than those who don’t

With all of that in mind, YouTube is a perfect platform to host your video product overviews: it’s free, it offers the additional benefit of ranking well in Google, and it provides additional exposure to your products through YouTube’s huge community, allowing people to discover your business via native search and suggested videos.

1.3 YouTube for creating alternative traffic and exposure channels

YouTube has huge marketing potential that businesses in most niches just cannot afford to ignore: it serves as a great discovery engine.

Imagine your video being suggested right after your competitor’s product review. Imagine your competitors’ customers stumbling across your video comparison when searching for an alternative service on YouTube.

Just being there increases your chances of getting found.

Again, it’s not easy to reach the YouTube Top 10, but for specific low-competition queries it’s quite doable.

Note: To be able to build traffic from inside your YouTube videos, you need to build up your channel to 10,000 public overall views to qualify to become a YouTube partner. Once approved, you’ll be able to add clickable links to your site from within your videos using cards and actually build up your own site traffic via video views.

2. Develop a video editorial calendar

As with any type of content, video content requires a lot of brainstorming, organizing, and planning.

My regular routine when it comes to creating an editorial calendar is as follows:

  1. Start with keyword research
  2. Use question research to come up with more specific ideas
  3. Use seasonality to come up with timing for each piece of content
  4. Allocate sufficient time for production and promotion

You can read about my exact editorial process here. Here’s a sample of my content roadmap laying out a major content asset for each month of the year, based on keyword research and seasonality:

Content roadmap

For keyword and question research I use Serpstat because they offer a unique clustering feature. For each keyword list you provide, they use the Google search results page to identify overlapping and similar URLs, evaluate how related different terms in your list are, and based on that, cluster them into groups.

Keyword clustering

This grouping makes content planning easier, allowing you to see the concepts behind keyword groups and put them into your roadmap based on seasonality or other factors that come into play (e.g. is there a slot/gap you need to fill? Are there company milestones or events coming up?).
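If you’d like to experiment with this kind of SERP-based clustering yourself, here’s a minimal Python sketch of the underlying idea, assuming you’ve already collected the top-ranking URLs for each keyword from whatever SERP data source you use (the keywords, URLs, and the 0.3 similarity threshold below are all illustrative):

# Toy SERP-overlap clustering: keywords whose top results share
# many URLs likely belong to the same topic cluster.

def jaccard(a, b):
    # Overlap between two sets of result URLs (0 to 1)
    return len(a & b) / len(a | b)

def cluster_keywords(serps, threshold=0.3):
    clusters = []
    for keyword, urls in serps.items():
        for cluster in clusters:
            # Join the first cluster whose seed keyword has a similar enough SERP
            if jaccard(urls, serps[cluster[0]]) >= threshold:
                cluster.append(keyword)
                break
        else:
            clusters.append([keyword])  # start a new cluster
    return clusters

serps = {
    "video marketing tips": {"a.com/1", "b.com/2", "c.com/3"},
    "video marketing ideas": {"a.com/1", "b.com/2", "d.com/4"},
    "youtube seo": {"e.com/5", "f.com/6", "g.com/7"},
}
print(cluster_keywords(serps))
# [['video marketing tips', 'video marketing ideas'], ['youtube seo']]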

Depending on how much video content you plan to create, you can set up a separate calendar or include videos in your overall editorial calendar.

When creating your roadmap, keep your goals in mind, as well. Some videos, such as testimonials and product reviews, won’t be based on your keyword research but still need to be included in the roadmap.

3. Proceed to video production

Video production can be intimidating, especially if you have a modest budget, but these days it’s much easier and more affordable than you’d imagine.

Keeping lower-budget campaigns in mind, here are a few types of videos and tools you can try out:

3.1 In-house video production

You can actually handle much of your video production in-house without the need to set up a separate room or purchase expensive gadgets.

Here are a few ideas:

  • Put together high-quality explanatory videos using Animatron (starts at $15/month): It takes a day or so to get to know all the available tools and options, but after that production goes quite smoothly.
  • Create beautiful visual testimonials, promo videos, and visual takeaways using Animoto ($8/month): You don’t need much time to learn it; it’s very easy and fun.
  • Create video tutorials using iMovie (free for Mac users): It will take you or your team about a week to properly figure out all its options, but you’ll get there eventually.
  • Create video interviews with niche influencers using Blue Jeans (starts at $12.49/month).
  • Create (whiteboard) presentations using ClickMeeting (starts at $25/month): Host a webinar first, then use the recording as a permanent brand asset. ClickMeeting will save your whiteboard notes and let you reuse them in your article, and you can brand your room to show your logo and brand colors in the video. Record your entire presentation using presentation mode, then upload the recording to your channel.

Clickmeeting

3.2 How to affordably outsource video production

The most obvious option for outsourcing video production is a site like Fiverr. Searching its gigs will actually give you even more ideas as to what kinds of videos you might create. While you may get burned there a few times, don’t let it discourage you — there are plenty of creative people who can put together awesome videos for you.

Another great idea is to reach out to YouTube bloggers in your niche. Some of them will be happy to work for you, and as a bonus you’ll be rewarded with additional exposure from their personal branding and social media channels.

I was able to find a great YouTube blogger to work for my client for as little as $75 per video; those videos were top quality and upload-ready.

There’s lots of talent out there: just spend a few weeks searching and reaching out!

4. Optimize each video page

When uploading your videos to YouTube, spend some time optimizing each one. Add ample content to each video page, including a descriptive title, a detailed description (at least 300–500 characters), and plenty of tags.

  • Title of the video: An eye-catching, descriptive title that includes:
    • Your core term/focus keyword (if any)
    • Product name and your brand name
    • The speaker’s name when applicable (for example, when you post interviews). This may include their other identifiable personal brand elements, such as their Twitter handle
    • Event name and hashtag (when applicable)
    • City, state, country (especially if you’re managing a local business)
  • Description of the video: The full transcript of the video. This can be obtained via services such as Speechpad.
  • A good readable and eye-catching thumbnail: These can be created easily using a tool like Canva.

Use a checklist:

Youtube SEO checklist
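If you upload videos regularly, much of this checklist can be applied programmatically. Here’s a minimal sketch using the YouTube Data API v3 via the google-api-python-client library, assuming you’ve already obtained OAuth credentials with a YouTube scope; the function name and all values below are placeholders:

# Minimal sketch: set a video's title, description, and tags
# via the YouTube Data API v3.
from googleapiclient.discovery import build

def optimize_video_page(creds, video_id, title, description, tags):
    youtube = build("youtube", "v3", credentials=creds)
    youtube.videos().update(
        part="snippet",
        body={
            "id": video_id,
            "snippet": {
                "title": title,              # focus keyword + brand/speaker/event
                "description": description,  # the full transcript works well here
                "tags": tags,
                # categoryId is required when updating the snippet;
                # "22" is the generic "People & Blogs" category.
                "categoryId": "22",
            },
        },
    ).execute()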

5. Generate clicks and engagement

Apart from basic keyword matching using video title and description, YouTube uses other video-specific metrics to determine how often the video should be suggested next to related videos and how high it should rank in search results.

Here’s an example of how that might work:

The more people that view more than the first half of your video, the better. If more than 50% of all your video viewers watched more than 50% of the video, YouTube would assume your video is high quality, and so it could pop up in “suggested” results next to or at the end of other videos. (Please note: These numbers are examples, made up using my best judgment. No one knows the exact percentage points YouTube is using, but you get the general idea of how this works.)

That being said, driving “deep” views to your videos is crucial when it comes to getting the YouTube algorithm to favor you.

5.1 Create a clickable table of contents to drive people in

Your video description and/or the pinned comment should have a clickable table of contents to draw viewers into the video. This will improve deep views into the video, which are a crucial factor in YouTube rankings.

Table of contents
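YouTube automatically turns plain MM:SS timestamps in a video’s description into clickable links that jump to that point in the video, so the table of contents itself can be as simple as this (the timings and topics below are hypothetical):

0:00 Intro
1:25 Keyword research for video topics
4:10 Turning questions into video ideas
7:45 Scheduling production and promotion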

5.2 Use social media to generate extra views

Promoting your videos on social media is an easy way to bring in some extra clicks and positive signals.

5.2.1 First, embed the video to your site

Important: Embed videos on your web pages and promote your own URL instead of the actual YouTube page. This approach has two important benefits:

  • Avoid auto-plays: Don’t screw up your YouTube stats! YouTube pages auto-play videos by default, so if you share a YouTube URL on Twitter, many people will click through and immediately leave (social media users are mostly lurkers), and those abandoned auto-plays drag down your average view duration. If you share your own page with the video embedded on it instead, the video won’t play until the user clicks play. This way you’ll ensure the video is played only by people who seriously want to watch it.
  • Invest time and effort into promoting your own site instead of the youtube.com page: By promoting your own URL with the video embedded on it, you can rest assured that more people will keep interacting with your brand rather than leave to watch other people’s videos from YouTube’s suggested results.

There are also plenty of ways to embed YouTube videos naturally in your blog and offer more exposure. Look at some of these themes, for example, for ideas to display videos in ways that invite views and engagement.

Video sharing WordPress

5.2.2 Use tools to partially scale social media promotion

For better, easier social media exposure, consider these options:

  • Invest in paid social media ads, especially Facebook ads, as they work best for engagement.
  • Use recurring tweets to scale video promotion. There are a few tools you can try, such as DrumUp: schedule the same update to go live several times on your chosen social media channels, generating more YouTube views from each repeated share. This is especially helpful on Twitter, because the lifespan of a tweet is just several minutes (between two and ten, depending on how active and engaged your audience is). With recurring tweets, you’ll make sure that more of your followers see your update.

  • A project I co-founded, Viral Content Bee, can put your videos in front of niche influencers on the lookout for more content to share on their social media accounts.

5.3 Build playlists

By sorting your videos into playlists, you achieve two important goals:

  • Keeping your viewers engaged with your brand videos longer: Videos within one playlist keep playing on autopilot until stopped
  • Creating separate brand assets of their own: Playlist URLs are able to rank both in YouTube and Google search results, driving additional exposure to your videos and brand overall, as well as allowing you to control more of those search results:

Playlists

Using playlists, you can also customize the look and feel of your YouTube channel more effectively to give your potential subscribers a glimpse into additional topics you cover:

Customize Youtube channel

Furthermore, by customizing the look of your YouTube channel, you transform it into a more effective landing page, highlighting important content that might otherwise get lost in the archives.

6. Monitor your progress

6.1 Topvisor

Topvisor is the only rank tracker I am aware of that monitors YouTube rankings. You’ll have to create a new project for each of your videos (which is somewhat of a pain), but you can monitor multiple keywords you’re targeting for each video. I always monitor my focus keyword, my brand name, and any other specific information I’m including in the video title (like location and the speaker’s name):

Topvisor

6.2 YouTube Analytics

YouTube provides a good deal of insight into how your channel and each individual video are performing, allowing you to build on your past success.

  • You’ll see traffic sources, i.e. where the views are coming from: suggested videos, YouTube search, external (traffic from websites and apps that embed your videos or link to them on YouTube), etc.
  • The number of times your videos were included in viewers’ playlists, including favorites, for the selected date range, region, and other filters. This is equal to additions minus removals.
  • Average view duration for each video.
  • How many interactions (subscribers, likes, comments) every video brought.

Youtube Analytics

You can see the stats for each individual video, as well as for each of your playlists.
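If you’d rather pull these numbers into your own reports, the YouTube Analytics API exposes most of them. Here’s a minimal sketch using google-api-python-client against v2 of that API, assuming authorized credentials; the dates and metric list are placeholders:

# Minimal sketch: per-video performance from the YouTube Analytics API v2.
from googleapiclient.discovery import build

def channel_video_stats(creds, start_date, end_date):
    analytics = build("youtubeAnalytics", "v2", credentials=creds)
    response = analytics.reports().query(
        ids="channel==MINE",
        startDate=start_date,  # e.g. "2018-01-01"
        endDate=end_date,      # e.g. "2018-03-31"
        dimensions="video",
        metrics="views,averageViewDuration,likes,comments,subscribersGained",
        sort="-views",         # top videos first
        maxResults=25,
    ).execute()
    return response["rows"]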

6.3 Using a dashboard for the full picture

If you produce at least one video a month, you may want to set up a dashboard to get an overall picture of how your YouTube channel is growing.

Cyfe (disclaimer: Cyfe recently became a content marketing client of mine) is a tool that offers a great way to stay organized when tracking your stats across multiple platforms and assets. I have a separate dashboard there which I use to keep an eye on my YouTube channels.

Cyfe Youtube

Conclusion

Building a YouTube channel is hard work. You’re likely to see little or no activity for weeks at a time, maybe even months after you start working on it. Don’t let this discourage you. It’s a big platform with lots of opportunity, and if you keep working consistently, you’ll see your views and engagement steadily growing.

Do you have a YouTube channel? What are you doing to build it up and increase its exposure? Let us know in the comments.


The Website Migration Guide: SEO Strategy, Process, & Checklist

Posted by Modestos

What is a site migration?

A site migration is a term broadly used by SEO professionals to describe any event whereby a website undergoes substantial changes in areas that can significantly affect search engine visibility — typically substantial changes to the site structure, content, coding, site performance, or UX.

Google’s documentation on site migrations doesn’t cover them in great depth and downplays the fact that they so often result in significant traffic and revenue loss, which can last from a few weeks to several months, depending on the extent to which search engine ranking signals have been affected, as well as how long it may take the affected business to roll out a successful recovery plan.

Quick access links

Site migration examples
Site migration types
Common site migration pitfalls
Site migration process
1. Scope & planning
2. Pre-launch preparation
3. Pre-launch testing
4. Launch day actions
5. Post-launch testing
6. Performance review
Site migration checklist
Appendix: Useful tools


Site migration examples

The following section discusses how both successful and unsuccessful site migrations look and explains why it is 100% possible to come out of a site migration without suffering significant losses.

Debunking the “expected traffic drop” myth

Anyone who has been involved with a site migration has probably heard the widespread theory that it will inevitably result in traffic and revenue loss. Even though this assertion holds some truth for certain specific cases (e.g. moving from an established domain to a brand-new one), it shouldn’t be treated as gospel. It is entirely possible to migrate without losing any traffic or revenue; you can even enjoy significant growth right after launching a revamped website. However, this can only be achieved if every single step has been well-planned and executed.

Examples of unsuccessful site migrations

The following graph illustrates a big UK retailer’s botched site migration where the website lost 35% of its visibility two weeks after switching from HTTP to HTTPS. It took them about six months to fully recover, which must have had a significant impact on revenue from organic search. This is a typical example of a poor site migration, possibly caused by poor planning or implementation.

Example of a poor site migration — recovery took 6 months!

But recovery may not always be possible. The below visibility graph is from another big UK retailer, where the HTTP to HTTPS switchover resulted in a permanent 20% visibility loss.

Another example of a poor site migration — no signs of recovery 6 months on!

In fact, it’s entirely possible to migrate from HTTP to HTTPS without losing that much traffic for such a long period, aside from the first few weeks, where there is high volatility as Google discovers the new URLs and updates search results.

Examples of successful site migrations

What does a successful site migration look like? This largely depends on the site migration type, the objectives, and the KPIs (more details later). But in most cases, a successful site migration shows at least one of the following characteristics:

  1. Minimal visibility loss during the first few weeks (short-term goal)
  2. Visibility growth thereafter — depending on the type of migration (long-term goal)

The following visibility report is taken from an HTTP to HTTPS site migration, which was also accompanied by significant improvements to the site’s page loading times.

The following visibility report is from a complete site overhaul, which I was fortunate to be involved with several months in advance and supported during the strategy, planning, and testing phases, all of which were equally important.

As commonly occurs on site migration projects, the launch date had to be pushed back a few times due to the risks of launching the new site prematurely and before major technical obstacles were fully addressed. But as you can see on the below visibility graph, the wait was well worth it. Organic visibility not only didn’t drop (as most would normally expect) but in fact started growing from the first week.

Visibility growth one month after the migration reached 60%, whilst organic traffic growth two months post-launch exceeded 80%.

Example of a very successful site migration — instant growth following new site launch!

This was a rather complex migration as the new website was re-designed and built from scratch on a new platform with an improved site taxonomy that included new landing pages, an updated URL structure, lots of redirects to preserve link equity, plus a switchover from HTTP to HTTPS.

In general, introducing too many changes at the same time can be tricky because if something goes wrong, you’ll struggle to figure out what exactly is at fault. But at the same time, leaving major changes for a later time isn’t ideal either as it will require more resources. If you know what you’re doing, making multiple positive changes at once can be very cost-effective.

Before getting into the nitty-gritty of how you can turn a complex site migration project into a success, it’s important to run through the main site migration types as well as explain the main reasons so many site migrations fail.


Site migration types

There are many site migration types. It all depends on the nature of the changes that take place to the legacy website.

Google’s documentation mostly covers migrations with site location changes, which are categorised as follows:

  • Site moves with URL changes
  • Site moves without URL changes

Site move migrations


These typically occur when a site moves to a different URL due to any of the below:

Protocol change

A classic example is when migrating from HTTP to HTTPS.

Subdomain or subfolder change

Very common in international SEO, where a business decides to move one or more ccTLDs into subdomains or subfolders. Another common example is when a mobile site that sits on a separate subdomain or subfolder becomes responsive and the desktop and mobile URLs are unified.

Domain name change

Commonly occurs when a business is rebranding and must move from one domain to another.

Top-level domain change

This is common when a business decides to launch international websites and needs to move from a ccTLD (country code top-level domain) to a gTLD (generic top-level domain) or vice versa, e.g. moving from .co.uk to .com, or moving from .com to .co.uk and so on.

Site structure changes

These are changes to the site architecture that usually affect the site’s internal linking and URL structure.

Other types of migrations

There are other types of migration which are triggered by changes to the site’s content, structure, design, or platform.

Replatforming

This is the case when a website is moved from one platform/CMS to another, e.g. migrating from WordPress to Magento or just upgrading to the latest platform version. Replatforming can, in some cases, also result in design and URL changes because of technical limitations that often occur when changing platforms. This is why replatforming migrations rarely result in a website that looks exactly the same as the previous one.

Content migrations

Major content changes such as content rewrites, content consolidation, or content pruning can have a big impact on a site’s organic search visibility, depending on the scale. These changes can often affect the site’s taxonomy, navigation, and internal linking.

Mobile setup changes

With so many options now available for a site’s mobile setup, changes such as enabling app indexing, building an AMP site, or building a PWA can also be considered partial site migrations, especially when an existing mobile site is being replaced by an app, AMP, or PWA.

Structural changes

These are often caused by major changes to the site’s taxonomy that impact the site navigation, internal linking, and user journeys.

Site redesigns

These can vary from major design changes in the look and feel to a complete website revamp that may also include significant media, code, and copy changes.

Hybrid migrations

In addition to the above, there are several hybrid migration types that can be combined in practically any way possible. The more changes that get introduced at the same time the higher the complexity and the risks. Even though making too many changes at the same time increases the risks of something going wrong, it can be more cost-effective from a resources perspective if the migration is very well-planned and executed.


Common site migration pitfalls

Even though every site migration is different, there are a few common themes behind the most typical site migration disasters, the biggest being the following:

Poor strategy

Some site migrations are doomed to failure way before the new site is launched. A strategy that is built upon unclear and unrealistic objectives is much less likely to bring success.

Establishing measurable objectives is essential in order to measure the impact of the migration post-launch. For most site migrations, the primary objective should be the retention of the site’s current traffic and revenue levels. In certain cases the bar could be raised higher, but in general anticipating or forecasting growth should be a secondary objective. This will help avoid creating unrealistic expectations.

Poor planning

Coming up with a detailed project plan as early as possible will help avoid delays along the way. Factor in additional time and resources to cope with any unforeseen circumstances that may arise. No matter how well thought out and detailed your plan is, it’s highly unlikely everything will go as expected. Be flexible with your plan and accept the fact that there will almost certainly be delays. Map out all dependencies and make all stakeholders aware of them.

Avoid planning to launch the site near your seasonal peaks, because if anything goes wrong you won’t have enough time to rectify the issues. For instance, retailers should avoid launching a site close to September/October to avoid putting the busy pre-Christmas period at risk. In this case, it would be much wiser launching during the quieter summer months.

Lack of resources

Before committing to a site migration project, estimate the time and effort required to make it a success. If your budget is limited, make a call as to whether it is worth going ahead with a migration that is likely to fail in meeting its established objectives and cause revenue loss.

As a rule of thumb, include a buffer of at least 20% more resources than you initially think the project will require. This additional buffer will later allow you to quickly address any issues as soon as they arise, without jeopardizing success. If your resources are too tight or you start cutting corners at this early stage, the site migration will be at risk.

Lack of SEO/UX consultation

When changes are taking place on a website, every single decision needs to be weighed from both a UX and SEO standpoint. For instance, removing great amounts of content or links to improve UX may damage the site’s ability to target business-critical keywords or result in crawling and indexing issues. In either case, such changes could damage the site’s organic search visibility. On the other hand, having too much text copy and few images may have a negative impact on user engagement and damage the site’s conversions.

To avoid risks, appoint experienced SEO and UX consultants so they can discuss the potential consequences of every single change with key business stakeholders who understand the business intricacies better than anyone else. The pros and cons of each option need to be weighed before making any decision.

Late involvement

Site migrations can span several months and require careful planning and enough time for testing. Seeking professional support late is very risky because crucial steps may have been missed.

Lack of testing

In addition to a great strategy and thoughtful plan, dedicate some time and effort to thorough testing before launching the site. It’s far preferable to delay the launch if testing has identified critical issues than to rush a sketchy implementation into production. It goes without saying that you should not launch a website if it hasn’t been tested by both expert SEO and UX teams.

Attention to detail is also very important. Make sure that the developers are fully aware of the risks associated with poor implementation. Educating the developers about the direct impact of their work on a site’s traffic (and therefore revenue) can make a big difference.

Slow response to bug fixing

There will always be bugs to fix once the new site goes live. However, some bugs are more important than others and may need immediate attention. For instance, launching a new site only to find that search engine spiders have trouble crawling and indexing the site’s content would require an immediate fix. A slow response to major technical obstacles can sometimes be catastrophic and take a long time to recover from.

Underestimating scale

Business stakeholders often do not anticipate site migrations to be so time-consuming and resource-heavy. It’s not uncommon for senior stakeholders to demand that the new site launch on the planned-for day, regardless of whether it’s 100% ready or not. The motto “let’s launch ASAP and fix later” is a classic mistake. What most stakeholders are unaware of is that it can take just a few days for organic search visibility to tank, but recovery can take several months.

It is the responsibility of the consultant and project manager to educate clients, run them through all the different phases and scenarios, and explain what each one entails. Business stakeholders are then able to make more informed decisions and their expectations should be easier to manage.


Site migration process

The site migration process can be split into six essential phases. They are all equally important, and skipping any of the below tasks could hinder the migration’s success to varying extents.


Phase 1: Scope & Planning

Work out the project scope

Regardless of the reasons behind a site migration project, you need to be crystal clear about the objectives right from the beginning because these will help to set and manage expectations. Moving a site from HTTP to HTTPS is very different from going through a complete site overhaul, hence the two should have different objectives. In the first instance, the objective should be to retain the site’s traffic levels, whereas in the second you could potentially aim for growth.

A site migration is a great opportunity to address legacy issues. Including as many of these as possible in the project scope should be very cost-effective because addressing these issues post-launch will require significantly more resources.

However, in every case, identify the most critical aspects for the project to be successful. Identify all risks that could have a negative impact on the site’s visibility and consider which precautions to take. Ideally, prepare a few forecasting scenarios based on the different risks and growth opportunities. It goes without saying that the forecasting scenarios should be prepared by experienced site migration consultants.

Including as many stakeholders as possible at this early stage will help you acquire a deeper understanding of the biggest challenges and opportunities across divisions. Ask for feedback from your content, SEO, UX, and Analytics teams and put together a list of the biggest issues and opportunities. You then need to work out what the potential ROI of addressing each one of these would be. Finally, choose one of the available options based on your objectives and available resources, which will form your site migration strategy.

You should now be left with a prioritized list of activities which are expected to have a positive ROI, if implemented. These should then be communicated and discussed with all stakeholders so that you set realistic targets, agree on the project scope, and set the right expectations from the outset.

Prepare the project plan

Planning is equally important because site migrations can often be very complex projects that can easily span several months. During the planning phase, each task needs an owner (i.e. SEO consultant, UX consultant, content editor, web developer) and an expected delivery date. Any dependencies should be identified and included in the project plan so everyone is aware of any activities that cannot be fulfilled due to being dependent on others. For instance, the redirects cannot be tested unless the redirect mapping has been completed and the redirects have been implemented on staging.

The project plan should be shared with everyone involved as early as possible so there is enough time for discussions and clarifications. Each activity needs to be described in great detail, so that stakeholders are aware of what each task would entail. It goes without saying that flawless project management is necessary in order to organize and carry out the required activities according to the schedule.

A crucial part of the project plan is getting the anticipated launch date right. Ideally, the new site should be launched during a time when traffic is low. Again, avoid launching ahead of or during a peak period because the consequences could be devastating if things don’t go as expected. One thing to bear in mind is that as site migrations never go entirely to plan, a certain degree of flexibility will be required.


Phase 2: Pre-launch preparation

This phase includes any activities that need to be carried out while the new site is still under development. By this point, the new site’s SEO requirements should have been captured already. You should be liaising with the designers and information architects, providing feedback on prototypes and wireframes well before the new site becomes available on a staging environment.

Wireframes review

Review the new site’s prototypes or wireframes before development commences. Reviewing the new site’s main templates can help identify both SEO and UX issues at an early stage. For example, you may find that large portions of content have been removed from the category pages, which should be instantly flagged. Or you may discover that some high traffic-driving pages no longer appear in the main navigation. Any radical changes in the design or copy of the pages should be thoroughly reviewed for potential SEO issues.

Preparing the technical SEO specifications

Once the prototypes and wireframes have been reviewed, prepare a detailed technical SEO specification. The objective of this vital document is to capture all the essential SEO requirements developers need to be aware of before working out the project’s scope in terms of work and costs. It’s during this stage that budgets are signed off on; if the SEO requirements aren’t included, it may be impossible to include them later down the line.

The technical SEO specification needs to be very detailed, yet written in such a way that developers can easily turn the requirements into actions. This isn’t a document to explain why something needs to be implemented, but how it should be implemented.

Make sure to include specific requirements that cover at least the following areas:

  • URL structure
  • Meta data (including dynamically generated default values)
  • Structured data
  • Canonicals and meta robots directives
  • Copy & headings
  • Main & secondary navigation
  • Internal linking (in any form)
  • Pagination
  • XML sitemap(s)
  • HTML sitemap
  • Hreflang (if there are international sites)
  • Mobile setup (including the app, AMP, or PWA site)
  • Redirects
  • Custom 404 page
  • JavaScript, CSS, and image files
  • Page loading times (for desktop & mobile)

The specification should also cover the CMS functionality that allows users to:

  • Specify custom URLs and override default ones
  • Update page titles
  • Update meta descriptions
  • Update any h1–h6 headings
  • Add or amend the default canonical tag
  • Set the meta robots attributes to index/noindex/follow/nofollow
  • Add or edit the alt text of each image
  • Include Open Graph fields for description, URL, image, type, sitename
  • Include Twitter Open Graph fields for card, URL, title, description, image
  • Bulk upload or amend redirects
  • Update the robots.txt file

It is also important to make sure that when updating a particular attribute (e.g. an h1), other elements are not affected (i.e. the page title or any navigation menus).

Identifying priority pages

One of the biggest challenges with site migrations is that the success will largely depend on the quantity and quality of pages that have been migrated. Therefore, it’s very important to make sure that you focus on the pages that really matter. These are the pages that have been driving traffic to the legacy site, pages that have accrued links, pages that convert well, etc.

In order to do this, you need to:

  1. Crawl the legacy site
  2. Identify all indexable pages
  3. Identify top performing pages

How to crawl the legacy site

Crawl the old website so that you have a copy of all URLs, page titles, meta data, headers, redirects, broken links etc. Regardless of the crawler application of choice (see Appendix), make sure that the crawl isn’t too restrictive. Pay close attention to the crawler’s settings before crawling the legacy site and consider whether you should:

  • Ignore robots.txt (in case any vital parts are accidentally blocked)
  • Follow internal “nofollow” links (so the crawler reaches more pages)
  • Crawl all subdomains (depending on scope)
  • Crawl outside start folder (depending on scope)
  • Change the user agent to Googlebot (desktop)
  • Change the user agent to Googlebot (smartphone)

Pro tip: Keep a copy of the old site’s crawl data (in a file or on the cloud) for several months after the migration has been completed, just in case you ever need any of the old site’s data once the new site has gone live.

How to identify the indexable pages

Once the crawl is complete, work on identifying the legacy site’s indexable pages. These are any HTML pages with the following characteristics:

  • Return a 200 server response
  • Either do not have a canonical tag or have a self-referring canonical URL
  • Do not have a meta robots noindex
  • Aren’t excluded from the robots.txt file
  • Are internally linked from other pages (non-orphan pages)

The indexable pages are the only pages that have the potential to drive traffic to the site and therefore need to be prioritized for the purposes of your site migration. These are the pages worth optimizing (if they will exist on the new site) or redirecting (if they won’t exist on the new site).
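Once the crawl data is exported to a spreadsheet, this filtering is easy to script. Here’s a minimal pandas sketch, with hypothetical column names you’d adjust to match your crawler’s export (robots.txt-blocked and orphan pages are assumed to be flagged by the crawler):

# Minimal sketch: filter a crawl export down to indexable pages.
import pandas as pd

crawl = pd.read_csv("legacy_site_crawl.csv")  # hypothetical export

indexable = crawl[
    (crawl["status_code"] == 200)
    # canonical tag is either missing or self-referencing
    & ((crawl["canonical_url"].isna()) | (crawl["canonical_url"] == crawl["url"]))
    # no noindex in the meta robots
    & (~crawl["meta_robots"].str.contains("noindex", case=False, na=False))
    # not blocked by robots.txt, and linked from at least one other page
    & (crawl["robots_txt"] != "blocked")
    & (crawl["inlinks"] > 0)
]

indexable.to_csv("indexable_pages.csv", index=False)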

How to identify the top performing pages

Once you’ve identified all indexable pages, you may have to carry out more work, especially if the legacy site consists of a large number of pages and optimizing or redirecting all of them is impossible due to time, resource, or technical constraints.

If this is the case, you should identify the legacy site’s top performing pages. This will help with the prioritization of the pages to focus on during the later stages.

It’s recommended to prepare a spreadsheet that includes the below fields:

  • Legacy URL (include only the indexable ones from the crawl data)
  • Organic visits during the last 12 months (Analytics)
  • Revenue, conversions, and conversion rate during the last 12 months (Analytics)
  • Pageviews during the last 12 months (Analytics)
  • Number of clicks from the last 90 days (Search Console)
  • Top linked pages (Majestic SEO/Ahrefs)

With the above information in one place, it’s now much easier to identify your most important pages: the ones that generate organic visits, convert well, contribute to revenue, have a good number of referring domains linking to them, etc. These are the pages that you must focus on for a successful site migration.

The top performing pages should ideally also exist on the new site. If for any reason they don’t, they should be redirected to the most relevant page so that users requesting them do not land on 404 pages and the link equity they previously had remains on the site. If any of these pages cease to exist and aren’t properly redirected, your site’s rankings and traffic will be negatively affected.

Benchmarking

Once the launch of the new website is getting close, you should benchmark the legacy site’s performance. Benchmarking is essential, not only to compare the new site’s performance with the previous one but also to help diagnose which areas underperform on the new site and to quickly address them.

Keywords rank tracking

If you don’t track the site’s rankings frequently, you should do so just before the new site goes live. Otherwise, you will later struggle figuring out whether the migration has gone smoothly or where exactly things went wrong. Don’t leave this to the last minute in case something goes awry — a week in advance would be the ideal time.

Spend some time working out which keywords are most representative of the site’s organic search visibility and track them across desktop and mobile. Because monitoring thousands of head, mid-, and long-tail keyword combinations is usually unrealistic, the bare minimum you should monitor are the keywords driving traffic to the site (keywords ranking on page one) that have decent search volume (head/mid-tail focus).

If you get traffic from both brand and non-brand keywords, you should also decide which type to focus on more from a tracking point of view. In general, non-brand keywords tend to be more competitive and volatile, so for most sites it makes sense to focus mostly on those.

Don’t forget to track rankings across desktop and mobile. This will make it much easier to diagnose problems post-launch should there be performance issues on one device type. If you receive a high volume of traffic from more than one country, consider rank tracking keywords in other markets, too, because visibility and rankings can vary significantly from country to country.

Site performance

The new site’s page loading times can have a big impact on both traffic and sales. Several studies have shown that the longer a page takes to load, the higher the bounce rate. Unless the old site’s page loading times and site performance scores have been recorded, it will be very difficult to attribute any traffic or revenue loss to site performance related issues once the new site has gone live.

It’s recommended that you review all major page types using Google’s PageSpeed Insights and Lighthouse tools. You could use summary tables like the ones below to benchmark some of the most important performance metrics, which will be useful for comparisons once the new site goes live.

MOBILE (FCP = First Contentful Paint; DCL = DOM Content Loaded)

  • Homepage: Speed: Fast | FCP: 0.7s | DCL: 1.4s | Optimization: Good | Optimization score: 81/100
  • Category page: Speed: Slow | FCP: 1.8s | DCL: 5.1s | Optimization: Medium | Optimization score: 78/100
  • Subcategory page: Speed: Average | FCP: 0.9s | DCL: 2.4s | Optimization: Medium | Optimization score: 69/100
  • Product page: Speed: Slow | FCP: 1.9s | DCL: 5.5s | Optimization: Good | Optimization score: 83/100

DESKTOP

  • Homepage: Speed: Good | FCP: 0.7s | DCL: 1.4s | Optimization: Average | Optimization score: 81/100
  • Category page: Speed: Fast | FCP: 0.6s | DCL: 1.2s | Optimization: Medium | Optimization score: 78/100
  • Subcategory page: Speed: Fast | FCP: 0.6s | DCL: 1.3s | Optimization: Medium | Optimization score: 78/100
  • Product page: Speed: Good | FCP: 0.8s | DCL: 1.3s | Optimization: Good | Optimization score: 83/100
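If you’d like to collect these benchmarks programmatically, here’s a minimal sketch against Google’s PageSpeed Insights API (v5 at the time of writing), assuming you have an API key; response field names vary between API versions, so verify them before relying on the output:

# Minimal sketch: page speed benchmarks via the PageSpeed Insights API (v5).
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def benchmark(url, strategy="mobile", api_key="YOUR_API_KEY"):
    data = requests.get(PSI_ENDPOINT, params={
        "url": url, "strategy": strategy, "key": api_key,
    }).json()
    lighthouse = data["lighthouseResult"]
    return {
        "url": url,
        "performance_score": lighthouse["categories"]["performance"]["score"],
        "fcp": lighthouse["audits"]["first-contentful-paint"]["displayValue"],
    }

for page in ["https://www.example.com/", "https://www.example.com/category/"]:
    print(benchmark(page))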

Old site crawl data

A few days before the new site replaces the old one, run a final crawl of the old site. Doing so could later prove invaluable, should there be any optimization issues on the new site. A final crawl will allow you to save vital information about the old site’s page titles, meta descriptions, h1–h6 headings, server status, canonical tags, noindex/nofollow pages, inlinks/outlinks, level, etc. Having all this information available could save you a lot of trouble if, say, the new site isn’t well optimized or suffers from technical misconfiguration issues. Try also to save a copy of the old site’s robots.txt and XML sitemaps in case you need these later.

Search Console data

Also consider exporting as much of the old site’s Search Console data as possible. These are only available for 90 days, and chances are that once the new site goes live the old site’s Search Console data will disappear sooner or later. Data worth exporting includes:

  • Search analytics queries & pages
  • Crawl errors
  • Blocked resources
  • Mobile usability issues
  • URL parameters
  • Structured data errors
  • Links to your site
  • Internal links
  • Index status
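The search analytics queries and pages are the most valuable of these, and they can be exported via the Search Console API rather than copy-pasting from the UI. Here’s a minimal sketch with google-api-python-client, assuming authorized credentials and a verified property:

# Minimal sketch: export query/page data via the Search Console API
# (webmasters v3) before the old property's data disappears.
from googleapiclient.discovery import build

def export_search_analytics(creds, site_url, start_date, end_date):
    service = build("webmasters", "v3", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=site_url,  # e.g. "https://www.example.com/"
        body={
            "startDate": start_date,  # the API only holds ~90 days of data
            "endDate": end_date,
            "dimensions": ["query", "page"],
            "rowLimit": 5000,
        },
    ).execute()
    return response.get("rows", [])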

Redirects preparation

The redirects implementation is one of the most crucial activities during a site migration. If the legacy site’s URLs cease to exist and aren’t correctly redirected, the website’s rankings and visibility will simply tank.

Why are redirects important in site migrations?

Redirects are extremely important because they help both search engines and users find pages that may no longer exist, have been renamed, or moved to another location. From an SEO point of view, redirects help search engines discover and index a site’s new URLs quicker but also understand how the old site’s pages are associated with the new site’s pages. This association will allow for ranking signals to pass from the old pages to the new ones, so rankings are retained without being negatively affected.

What happens when redirects aren’t correctly implemented?

When redirects are poorly implemented, the consequences can be catastrophic. Users will either land on Not Found pages (404s) or irrelevant pages that do not meet the user intent. In either case, the site’s bounce and conversion rates will be negatively affected. The consequences for search engines can be equally catastrophic: they’ll be unable to associate the old site’s pages with those on the new site if the URLs aren’t identical. Ranking signals won’t be passed over from the old to the new site, which will result in ranking drops and organic search visibility loss. In addition, it will take search engines longer to discover and index the new site’s pages.

301, 302, JavaScript redirects, or meta refresh?

When the URLs between the old and new version of the site are different, use 301 (permanent) redirects. These will tell search engines to index the new URLs as well as forward any ranking signals from the old URLs to the new ones. Therefore, you must use 301 redirects if your site moves to/from another domain/subdomain, if you switch from HTTP to HTTPS, or if the site or parts of it have been restructured. Despite some of Google’s claims that 302 redirects pass PageRank, indexing the new URLs would be slower and ranking signals could take much longer to be passed on from the old to the new page.

302 (temporary) redirects should only be used in situations where a redirect does not need to live permanently and therefore indexing the new URL isn’t a priority. With 302 redirects, search engines will initially be reluctant to index the content of the redirect destination URL and pass any ranking signals to it. However, if the temporary redirects remain for a long period of time without being removed or updated, they could end up behaving similarly to permanent (301) redirects. Use 302 redirects when a redirect is likely to require updating or removal in the near future, as well as for any country-, language-, or device-specific redirects.

Meta refresh and JavaScript redirects should be avoided. Even though Google is getting better and better at crawling JavaScript, there are no guarantees these will get discovered or pass ranking signals to the new pages.

If you’d like to find out more about how Google deals with the different types of redirects, please refer to John Mueller’s post.

Redirect mapping process

If you are lucky enough to work on a migration that doesn’t involve URL changes, you could skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.

The redirect mapping file is a spreadsheet that includes the following two columns:

  • Legacy site URL –> a page’s URL on the old site.
  • New site URL –> a page’s URL on the new site.

When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and because of this won’t be passing any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.

Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part in the site migration cycle where things can often go wrong.

Increasing efficiencies during the redirect mapping process

Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes between the legacy and new site can be a massive time-saver. Such attributes may include the page titles, H1 headings, or other unique page identifiers such as product codes, SKUs etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
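As an illustration of attribute-based mapping, here’s a minimal pandas sketch that pairs legacy and new URLs by page title; the file and column names are hypothetical, and note how non-unique titles get dropped first, precisely because repeated attributes produce incorrect mappings:

# Minimal sketch: auto-map legacy URLs to new URLs via a shared attribute.
import pandas as pd

old = pd.read_csv("legacy_crawl.csv")[["url", "title"]]
new = pd.read_csv("new_site_crawl.csv")[["url", "title"]]

# Drop titles that appear more than once on either site:
# non-unique attributes lead to incorrect mappings.
old = old.drop_duplicates(subset="title", keep=False)
new = new.drop_duplicates(subset="title", keep=False)

mapping = old.merge(new, on="title", suffixes=("_legacy", "_new"))
mapping[["url_legacy", "url_new"]].to_csv("redirect_mapping.csv", index=False)

# Whatever didn't match still needs manual mapping.
unmatched = old[~old["title"].isin(mapping["title"])]
print(f"{len(unmatched)} legacy URLs left to map manually")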

Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping. There’s nothing riskier than mapping URLs that will be updated before the new site goes live. When URLs are updated after the redirect mapping is completed, you may have to deal with undesired situations upon launch, such as broken redirects, redirect chains, and redirect loops. A content-freeze should be placed on the old site well in advance of the migration date, so there is a cut-off point for new content being published on the old site. This will make sure that no pages will be missed from the redirect mapping and guarantee that all pages on the old site get redirected.

Don’t forget the legacy redirects!

You should get hold of the old site’s existing redirects to ensure they’re considered when preparing the redirect mapping for the new site. Unless you do this, it’s likely that the site’s current redirect file will get overwritten by the new one on the launch date. If this happens, all legacy redirects that were previously in place will cease to exist and the site may lose a decent amount of link equity, the extent of which will largely depend on the site’s volume of legacy redirects. For instance, a site that has undergone a few migrations in the past should have a good number of legacy redirects in place that you don’t want getting lost.

Ideally, preserve as many of the legacy redirects as possible, making sure these won’t cause any issues when combined with the new site’s redirects. It’s strongly recommended to eliminate any potential redirect chains at this early stage, which can easily be done by checking whether the same URL appears both as a “Legacy URL” and “New site URL” in the redirect mapping spreadsheet. If this is the case, you will need to update the “New site URL” accordingly.

Example:

URL A redirects to URL B (legacy redirect)

URL B redirects to URL C (new redirect)

Which results in the following redirect chain:

URL A –> URL B –> URL C

To eliminate this, amend the existing legacy redirect and create a new one so that:

URL A redirects to URL C (amended legacy redirect)

URL B redirects to URL C (new redirect)

Pro tip: Check your redirect mapping spreadsheet for redirect loops. These occur when the “Legacy URL” is identical to the “New site URL.” Redirect loops must be eliminated because they result in infinitely loading pages that are inaccessible to users and search engines, making them instant traffic, conversion, and ranking killers!
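Both chains and loops are easy to catch programmatically once the mapping lives in a spreadsheet. Here’s a minimal sketch that treats the mapping as a dictionary of legacy URL to new URL:

# Minimal sketch: collapse redirect chains and flag redirect loops
# in a {legacy_url: new_url} mapping.
def flatten_redirects(mapping):
    flattened, loops = {}, []
    for source, target in mapping.items():
        seen = {source}
        # Follow the chain until the target is a final destination
        while target in mapping:
            if target in seen:  # we've come back around: a loop
                loops.append(source)
                break
            seen.add(target)
            target = mapping[target]
        else:
            flattened[source] = target
    return flattened, loops

mapping = {"/url-a": "/url-b", "/url-b": "/url-c"}
print(flatten_redirects(mapping))
# ({'/url-a': '/url-c', '/url-b': '/url-c'}, [])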

Implement blanket redirect rules to avoid duplicate content

It’s strongly recommended to try working out redirect rules that cover as many URL requests as possible. Implementing redirect rules on a web server is much more efficient than relying on numerous one-to-one redirects. If your redirect mapping document consists of a very large number of redirects that need to be implemented as one-to-one redirect rules, site performance could be negatively affected. In any case, double check with the development team the maximum number of redirects the web server can handle without issues.

In any case, there are some standard redirect rules that should be in place to avoid generating duplicate content issues, typically including:

  • HTTP URLs redirecting to their HTTPS counterparts (or vice versa, depending on the canonical protocol)
  • Non-www URLs redirecting to the www ones (or vice versa, depending on the canonical hostname)
  • URLs with trailing slashes redirecting to ones without (or vice versa)
  • Uppercase or mixed-case URLs redirecting to lowercase ones

Even if some of these standard redirect rules exist on the legacy website, do not assume they’ll necessarily exist on the new site unless they’re explicitly requested.
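Once the rules are implemented on staging, a quick script can verify that each URL variant returns a single 301 straight to the canonical version. Here’s a minimal sketch using the requests library; the canonical host, sample path, and variants below are placeholders for your own rules:

# Minimal sketch: check that duplicate-content URL variants
# return one direct 301 to the canonical URL (no chains, no 200s).
import requests

CANONICAL = "https://www.example.com/page/"

variants = [
    "http://www.example.com/page/",   # wrong protocol
    "https://example.com/page/",      # missing www
    "https://www.example.com/PAGE/",  # uppercase path
    "https://www.example.com/page",   # trailing-slash variant
]

for url in variants:
    r = requests.get(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    ok = r.status_code == 301 and location == CANONICAL
    print(f"{url} -> {r.status_code} {location} {'OK' if ok else 'CHECK'}")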

Avoid internal redirects

Try updating the site’s internal links so they don’t trigger internal redirects. Even though search engines can follow internal redirects, these are not recommended because they add additional latency to page loading times and could also have a negative impact on search engine crawl time.

Don’t forget your image files

If the site’s images have moved to a new location, Google recommends redirecting the old image URLs to the new image URLs to help Google discover and index the new images quicker. If it’s not easy to redirect all images, aim to redirect at least those image URLs that have accrued backlinks.


Phase 3: Pre-launch testing

The earlier you can start testing, the better. Certain things need to be fully implemented to be tested, but others don’t. For example, user journey issues could be identified as early as the prototype or wireframe designs, and content-related issues between the old and new site or content inconsistencies (e.g. between the desktop and mobile site) could also be identified at an early stage. The more technical components, such as redirects, canonical tags, or XML sitemaps, should only be tested once fully implemented.

The earlier issues get identified, the more likely it is that they’ll be addressed before launching the new site; identifying certain types of issues at a later stage isn’t cost-effective, requires more resources, and causes significant delays. Poor testing, and not allowing the time required to thoroughly test all the building blocks that can affect SEO and UX performance, can have disastrous consequences soon after the new site has gone live.

Making sure search engines cannot access the staging/test site

Before making the new site available on a staging/testing environment, take precautions to ensure that search engines do not index it. There are a few different ways to do this, each with different pros and cons.

Site available to specific IPs (most recommended)

Making the test site available only to specific (whitelisted) IP addresses is a very effective way to prevent search engines from crawling it. Anyone trying to access the test site’s URL won’t be able to see any content unless their IP has been whitelisted. The main advantage is that whitelisted users could easily access and crawl the site without any issues. The only downside is that third-party web-based tools (such as Google’s tools) cannot be used because of the IP restrictions.

Password protection

Password protecting the staging/test site is another way to keep search engine crawlers away, but this solution has two main downsides. Depending on the implementation, it may not be possible to crawl and test a password-protected website if the crawler application doesn’t make it past the login screen. The other downside: password-protected websites that use forms for authentication can be crawled using third-party applications, but there is a risk of causing severe and unexpected issues. This is because the crawler clicks on every link on a page (when you’re logged in) and could easily end up clicking on links that create or remove pages, install/uninstall plugins, etc.

Robots.txt blocking

Adding the following lines of code to the test site’s robots.txt file will prevent search engines from crawling the test site’s pages.

User-agent: *
Disallow: /

One downside of this method is that even though the content that appears on the test server won’t get indexed, the disallowed URLs may appear on Google’s search results. Another downside is that if the above robots.txt file moves into the live site, it will cause severe de-indexing issues. This is something I’ve encountered numerous times and for this reason I wouldn’t recommend using this method to block search engines.

User journey review

If the site has been redesigned or restructured, chances are that the user journeys will be affected to some extent. Reviewing the user journeys as early as possible and well before the new site launches is difficult due to the lack of user data. However, an experienced UX professional will be able to flag any concerns that could have a negative impact on the site’s conversion rate. Because A/B testing at this stage is hardly ever possible, it might be worth carrying out some user testing and try to get some feedback from real users. Unfortunately, user experience issues can be some of the harder ones to address because they may require sitewide changes that take a lot of time and effort.

On full site overhauls, not all UX decisions can always be backed up by data and many decisions will have to be based on best practice, past experience, and “gut feeling,” hence getting UX/CRO experts involved as early as possible could pay dividends later.

Site architecture review

A site migration is often a great opportunity to improve the site architecture. In other words, you have a great chance to reorganize your keyword targeted content and maximize its search traffic potential. Carrying out extensive keyword research will help identify the best possible category and subcategory pages so that users and search engines can get to any page on the site within a few clicks — the fewer the better, so you don’t end up with a very deep taxonomy.

Identifying new keywords with decent traffic potential and mapping them into new landing pages can make a big difference to the site’s organic traffic levels. On the other hand, enhancing the site architecture needs to be done thoughtfully. It could cause problems if, say, important pages move deeper into the new site architecture or there are too many similar pages optimized for the same keywords. Some of the most successful site migrations are the ones that allocate significant resources to enhance the site architecture.

Meta data & copy review

Make sure that the site’s page titles, meta descriptions, headings, and copy have been transferred from the old to the new site without issues. If you’ve created any new pages, make sure these are optimized and don’t target keywords that have already been targeted by other pages. If you’re re-platforming, be aware that the new platform may have different default values when new pages are being created. Launching the new site without properly optimized page titles or any kind of missing copy will have an immediate negative impact on your site’s rankings and traffic. Do not forget to review whether any user-generated content (i.e. user reviews, comments) has also been uploaded.

Internal linking review

Internal links are the backbone of a website. No matter how well optimized and structured the site’s copy is, it won’t be sufficient to succeed unless it’s supported by a flawless internal linking scheme. Internal links must be reviewed throughout the entire site, including links found in:

  • Main & secondary navigation
  • Header & footer links
  • Body content links
  • Pagination links
  • Horizontal links (related articles, similar products, etc)
  • Vertical links (e.g. breadcrumb navigation)
  • Cross-site links (e.g. links across international sites)

Technical checks

A series of technical checks must be carried out to make sure the new site’s technical setup is sound and to avoid coming across major technical glitches after the new site has gone live.

Robots.txt file review

Prepare the new site’s robots.txt file on the staging environment. This way you can test it for errors or omissions and avoid experiencing search engine crawl issues when the new site goes live. A classic mistake in site migrations is when the robots.txt file prevents search engine access using the following directive:

Disallow: /

If this gets accidentally carried over into the live site (and it often does), it will prevent search engines from crawling the site. And when search engines cannot crawl an indexed page, the keywords associated with the page will get demoted in the search results and eventually the page will get de-indexed.

But if the staging robots.txt file is populated with the new site’s intended directives (rather than a blanket Disallow), this mishap can be avoided.

When preparing the new site’s robots.txt file, make sure that:

  • It doesn’t block search engine access to pages that are intended to get indexed.
  • It doesn’t block any JavaScript or CSS resources search engines require to render page content.
  • The legacy site’s robots.txt file content has been reviewed and carried over if necessary.
  • It references the new XML sitemap(s) rather than any legacy ones that no longer exist (see the example below).
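As a hypothetical illustration (the paths and sitemap URL are placeholders, not recommendations for any specific site), a healthy live-site robots.txt might look like this:

User-agent: *
Disallow: /checkout/
Disallow: /internal-search/
Sitemap: https://www.example.com/sitemap.xml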

Canonical tags review

Review the site’s canonical tags. Look for pages that either do not have a canonical tag or have a canonical tag that is pointing to another URL, and question whether this is intended. Don’t forget to crawl the canonical tags to find out whether they return a 200 server response. If they don’t, you’ll need to update them to eliminate any 3xx, 4xx, or 5xx server responses. You should also look for pages that have a canonical tag pointing to another URL combined with a noindex directive; these two are conflicting signals and you’ll need to eliminate one of them.
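For reference, a canonical tag is a link element in the page’s <head>, and its href should return a 200 response. A minimal sketch with a placeholder URL:

<link rel="canonical" href="https://www.example.com/category/product-page/" />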

Meta robots review

Once you’ve crawled the staging site, look for pages with the meta robots properties set to “noindex” or “nofollow.” Review each one to make sure the directive is intentional, and remove the “noindex” or “nofollow” directive if it isn’t.

XML sitemaps review

Prepare two different types of sitemaps: one that contains all the new site’s indexable pages, and another that includes all the old site’s indexable pages. The former will help make Google aware of the new site’s indexable URLs. The latter will help Google become aware of the redirects that are in place and the fact that some of the indexed URLs have moved to new locations, so that it can discover them and update search results quicker.

You should check each XML sitemap to make sure that:

  • It validates without issues
  • It is encoded as UTF-8
  • It does not contain more than 50,000 rows
  • Its size does not exceed 50 MB when uncompressed

If there are more than 50,000 rows or the file size exceeds 50 MB, you must break the sitemap down into smaller ones, referenced from a sitemap index file (see the sketch below). This also prevents the server from becoming overloaded if Google requests the sitemap too frequently.
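A minimal sketch of a sitemap index file referencing two smaller sitemaps (the file names are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>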

In addition, you must crawl each XML sitemap to make sure it only includes indexable URLs. Any non-indexable URLs should be excluded from the XML sitemaps, such as:

  • 3xx, 4xx, and 5xx pages (e.g. redirected, not found pages, bad requests, etc)
  • Soft 404s. These are pages with no content that return a 200 server response, instead of a 404.
  • Canonicalized pages (apart from self-referring canonical URLs)
  • Pages with a meta robots noindex directive
<!DOCTYPE html>
<html><head>
<meta name="robots" content="noindex" />
(…)
</head>
<body>(…)</body>
</html>
  • Pages with a noindex X-Robots-Tag in the HTTP header
HTTP/1.1 200 OK
Date: Tue, 10 Nov 2017 17:12:43 GMT
(…)
X-Robots-Tag: noindex
(…)
  • Pages blocked from the robots.txt file

Building clean XML sitemaps will help you monitor the true indexing levels of the new site once it goes live; without them, it will be very difficult to spot any indexing issues.

Pro tip: Download and open each XML sitemap in Excel to get a detailed overview of any additional attributes, such as hreflang or image attributes.

HTML sitemap review

Depending on the size and type of site that is being migrated, having an HTML sitemap can in certain cases be beneficial. An HTML sitemap that consists of URLs that aren’t linked from the site’s main navigation can significantly boost page discovery and indexing. However, avoid generating an HTML sitemap that includes too many URLs. If you do need to include thousands of URLs, consider building a segmented HTML sitemap.

The number of nested sitemaps as well as the maximum number of URLs you should include in each sitemap depends on the site’s authority. The more authoritative a website, the higher the number of nested sitemaps and URLs it could get away with.

For example, the NYTimes.com HTML sitemap consists of three levels, where each one includes over 1,000 URLs per sitemap. These nested HTML sitemaps aid search engine crawlers in discovering articles published since 1851 that otherwise would be difficult to discover and index, as not all of them would have been internally linked.

The NYTimes HTML sitemap (level 1)

The NYTimes HTML sitemap (level 2)

Structured data review

Errors in the structured data markup need to be identified early so there’s time to fix them before the new site goes live. Ideally, you should test every single page template (rather than every single page) using Google’s Structured Data Testing tool.

Be sure to check the markup on both the desktop and mobile pages, especially if the mobile website isn’t responsive.

Google’s Structured Data Testing Tool in action

The tool will only report existing errors, not omissions. For example, if your product page template doesn’t include the Product structured data schema, the tool won’t report any errors. So, in addition to checking for errors, you should also make sure that each page template includes the appropriate structured data markup for its content type.

Please refer to Google’s documentation for the most up-to-date details on structured data implementation and supported content types.

JavaScript crawling review

You must test every single page template of the new site to make sure Google will be able to crawl content that requires JavaScript parsing. If you’re able to use Google’s Fetch and Render tool on your staging site, you should definitely do so. Otherwise, carry out some manual tests, following Justin Briggs’ advice.

As Bartosz Góralewicz’s tests proved, even though Google can crawl and index JavaScript-generated content, it does not handle all major JavaScript frameworks equally well. The following table summarizes Bartosz’s findings, showing that some JavaScript frameworks are not SEO-friendly, with AngularJS currently being the most problematic of all.

Bartosz also found that other search engines (such as Bing, Yandex, and Baidu) really struggle with indexing JavaScript-generated content, which is important to know if your site’s traffic relies on any of these search engines.

Hopefully, this is something that will improve over time, but with the increasing popularity of JavaScript frameworks in web development, this must be high up on your checklist.

Finally, you should check whether any external resources are being blocked. Unfortunately, this isn’t something you can control 100% because many resources (such as JavaScript and CSS files) are hosted by third-party websites which may be blocking them via their own robots.txt files!

Again, the Fetch and Render tool can help diagnose this type of issue that, if left unresolved, could have a significant negative impact.

Mobile site SEO review

Assets blocking review

First, make sure that the robots.txt file isn’t accidentally blocking any JavaScript, CSS, or image files that are essential for the mobile site’s content to render. This could have a negative impact on how search engines render and index the mobile site’s page content, which in turn could negatively affect the mobile site’s search visibility and performance.
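As a hypothetical example (the path is a placeholder), a directive like the following would stop Googlebot from fetching scripts and stylesheets needed for rendering and should be removed or narrowed:

User-agent: Googlebot
Disallow: /assets/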

Mobile-first index review

In order to avoid any issues associated with Google’s mobile-first index, thoroughly review the mobile website and make sure there aren’t any inconsistencies between the desktop and mobile sites in the following areas:

  • Page titles
  • Meta descriptions
  • Headings
  • Copy
  • Canonical tags
  • Meta robots attributes (i.e. noindex, nofollow)
  • Internal links
  • Structured data

A responsive website should serve the same content, links, and markup across devices, and the above SEO attributes should be identical across the desktop and mobile websites.

In addition to the above, you must carry out a few further technical checks depending on the mobile site’s set up.

Responsive site review

A responsive website must serve all devices the same HTML code, which is adjusted (via the use of CSS) depending on the screen size.

Googlebot is able to automatically detect this mobile setup as long as it’s allowed to crawl the page and its assets. It’s therefore extremely important to make sure that Googlebot can access all essential assets, such as images, JavaScript, and CSS files.

To signal to browsers that a page is responsive, a meta viewport tag should be in place within the <head> of each HTML page.

<meta name="viewport" content="width=device-width, initial-scale=1.0">

If the meta viewport tag is missing, font sizes may appear in an inconsistent manner, which may cause Google to treat the page as not mobile-friendly.

Separate mobile URLs review

If the mobile website uses separate URLs from desktop, make sure that:

  1. Each desktop page has a rel=”alternate” tag pointing to the corresponding mobile URL (see the sketch after this list).
  2. Each mobile page has a rel=”canonical” tag pointing to the corresponding desktop URL.
  3. When desktop URLs are requested on mobile devices, they’re redirected to the respective mobile URL.
  4. Redirects work across all mobile devices, including Android, iPhone, and Windows phones.
  5. There aren’t any irrelevant cross-links between the desktop and mobile pages. In other words, internal links found on a desktop page should only link to desktop pages, and those found on a mobile page should only link to other mobile pages.
  6. The mobile URLs return a 200 server response.
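A minimal sketch of the annotations from points 1 and 2, assuming a desktop page at www.example.com and its mobile equivalent at m.example.com (both placeholders):

On the desktop page (https://www.example.com/page):
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">

On the mobile page (https://m.example.com/page):
<link rel="canonical" href="https://www.example.com/page">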

Dynamic serving review

Dynamic serving websites serve different code to each device, but on the same URL.

On dynamic serving websites, review whether the Vary HTTP header has been correctly set up. This is necessary because dynamic serving websites alter the HTML for mobile user agents, and the Vary: User-Agent header helps Googlebot discover the mobile content.
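A sketch of what the response headers for a dynamically served page might look like; everything apart from the Vary line is illustrative:

HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
(…)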

Mobile-friendliness review

Regardless of the mobile site set-up (responsive, separate URLs or dynamic serving), review the pages using a mobile user-agent and make sure that:

  1. The viewport has been set correctly. Using a fixed width viewport across devices will cause mobile usability issues.
  2. The font size isn’t too small.
  3. Touch elements (i.e. buttons, links) aren’t too close.
  4. There aren’t any intrusive interstitials, such as ads, mailing list sign-up forms, or app download pop-ups. To avoid any issues, use a small HTML or image banner instead.
  5. Mobile pages aren’t too slow to load (see next section).

Google’s mobile-friendly test tool can help diagnose most of the above issues:

Google’s mobile-friendly test tool in action

AMP site review

If the site has both an AMP and a desktop version available, make sure that:

  • Each non-AMP page (i.e. desktop, mobile) has a rel=”amphtml” tag pointing to the corresponding AMP URL (see the sketch after this list).
  • Each AMP page has a rel=”canonical” tag pointing to the corresponding desktop page.
  • Any AMP page that does not have a corresponding desktop URL has a self-referring canonical tag.
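A minimal sketch of the AMP annotations from the first two points, using placeholder URLs:

On the non-AMP page (https://www.example.com/page):
<link rel="amphtml" href="https://www.example.com/page/amp/">

On the AMP page (https://www.example.com/page/amp/):
<link rel="canonical" href="https://www.example.com/page">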

You should also make sure that the AMPs are valid. This can be tested using Google’s AMP Test Tool.

Mixed content errors

With Google pushing hard for sites to be fully secure and Chrome becoming the first browser to flag HTTP pages as not secure, aim to launch the new site on HTTPS, making sure all resources such as images, CSS, and JavaScript files are requested over secure HTTPS connections. This is essential in order to avoid mixed content issues.

Mixed content occurs when a page that’s loaded over a secure HTTPS connection requests assets over insecure HTTP connections. Most browsers either block dangerous HTTP requests or just display warnings that hinder the user experience.
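As a hypothetical example, the following image reference would trigger a mixed content warning on a page served over HTTPS; changing the src to https (or hosting the asset on your own secure domain) fixes it:

<!-- page loaded over https://www.example.com/page -->
<img src="http://cdn.example.com/logo.png" alt="Logo"> <!-- insecure request -->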

Mixed content errors in Chrome’s JavaScript Console

There are many ways to identify mixed content errors, including the use of crawler applications, Google’s Lighthouse, etc.

Image assets review

Google crawls images less frequently than HTML pages. If you’re migrating a site’s images from one location to another (e.g. from your domain to a CDN), there are ways to help Google discover the migrated images quicker. Building an image XML sitemap will help, but you also need to make sure that Googlebot can reach the site’s images when crawling the site. The tricky part with image indexing is that both the web page on which an image appears and the image file itself have to get indexed.

Site performance review

Last but not least, measure the old site’s page loading times and see how these compare with the new site’s when this becomes available on staging. At this stage, focus on the network-independent aspects of performance such as the use of external resources (images, JavaScript, and CSS), the HTML code, and the web server’s configuration. More information about how to do this is available further down.

Analytics tracking review

Make sure that analytics tracking is properly set up. This review should ideally be carried out by specialist analytics consultants who will look beyond the implementation of the tracking code. Make sure that Goals and Events are properly set up, e-commerce tracking is implemented, enhanced e-commerce tracking is enabled, etc. There’s nothing more frustrating than having no analytics data after your new site is launched.

Redirects testing

Testing the redirects before the new site goes live is critical and can save you a lot of trouble later. There are many ways to check the redirects on a staging/test server, but the bottom line is that you should not launch the new website without having tested the redirects.

Once the redirects become available on the staging/testing environment, crawl the entire list of redirects and check for the following issues:

  • Redirect loops (a URL that infinitely redirects to itself)
  • Redirects with a 4xx or 5xx server response.
  • Redirect chains (a URL that redirects to another URL, which in turn redirects to another URL, etc).
  • Canonical URLs that return a 4xx or 5xx server response.
  • Canonical loops (page A has a canonical pointing to page B, which has a canonical pointing to page A).
  • Canonical chains (a canonical that points to another page that has a canonical pointing to another page, etc).
  • Protocol/host inconsistencies e.g. URLs are redirected to both HTTP and HTTPS URLs or www and non-www URLs.
  • Leading/trailing whitespace characters. Use Excel’s TRIM() function to eliminate them.
  • Invalid characters in URLs.

Pro tip: Make sure each of the old site’s URLs redirects to the correct URL on the new site. At this stage, because the new site doesn’t exist yet, you can only test whether the redirect destination URL is the intended one, but it’s definitely worth it. The fact that a URL redirects does not mean it redirects to the right page.
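One way to automate this check is with a short script. Here’s a minimal sketch in Python, assuming a hypothetical redirect_map.csv file with old_url and expected_url columns and the third-party requests library installed:

import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Follow the full redirect chain, as a crawler would
        resp = requests.get(row["old_url"], timeout=10)
        hops = len(resp.history)
        if hops == 0:
            print("NO REDIRECT: %s (%s)" % (row["old_url"], resp.status_code))
        elif resp.url.rstrip("/") != row["expected_url"].rstrip("/"):
            print("WRONG TARGET: %s -> %s" % (row["old_url"], resp.url))
        elif hops > 1:
            print("CHAIN (%d hops): %s -> %s" % (hops, row["old_url"], resp.url))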


Phase 4: Launch day activities

When the site is down…

While the new site is replacing the old one, chances are that the live site is going to be temporarily down. The downtime should be kept to a minimum, but while this happens the web server should respond to any URL request with a 503 (service unavailable) server response. This will tell search engine crawlers that the site is temporarily down for maintenance so they come back to crawl the site later.
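A sketch of the response the server should return for every URL while the site is down (the Retry-After value, in seconds, is illustrative):

HTTP/1.1 503 Service Unavailable
Retry-After: 3600
(…)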

If the site is down for too long without serving a 503 server response and search engines crawl the website, organic search visibility will be negatively affected and recovery won’t be instant once the site is back up. In addition, while the website is temporarily down it should also serve an informative holding page notifying users that the website is temporarily down for maintenance.

Technical spot checks

As soon as the new site has gone live, take a quick look at:

  1. The robots.txt file to make sure search engines are not blocked from crawling
  2. Top pages redirects (e.g. do requests for the old site’s top pages redirect correctly?)
  3. Top pages canonical tags
  4. Top pages server responses
  5. Noindex/nofollow directives, in case they are unintentional

The spot checks need to be carried out across both the mobile and desktop sites, unless the site is fully responsive.

Search Console actions

The following activities should take place as soon as the new website has gone live:

  1. Test & upload the XML sitemap(s)
  2. Set the Preferred location of the domain (www or non-www)
  3. Set the International targeting (if applicable)
  4. Configure the URL parameters to tackle any potential duplicate content issues early.
  5. Upload the Disavow file (if applicable)
  6. Use the Change of Address tool (if switching domains)

Pro tip: Use the “Fetch as Google” feature for each different type of page (e.g. the homepage, a category, a subcategory, a product page) to make sure Googlebot can render the pages without any issues. Review any reported blocked resources and do not forget to use Fetch and Render for desktop and mobile, especially if the mobile website isn’t responsive.

Blocked resources prevent Googlebot from rendering the content of the page


Phase 5: Post-launch review

Once the new site has gone live, a new round of in-depth checks should be carried out. These are largely the same ones as those mentioned in the “Phase 3: Pre-launch Testing” section.

However, the main difference during this phase is that you now have access to a lot more data and tools. Don’t underestimate the amount of effort you’ll need to put in during this phase, because any issues you encounter now directly impact the site’s performance in the SERPs. On the other hand, the sooner an issue gets identified, the quicker it will get resolved.

In addition to repeating the same testing tasks that were outlined in the Phase 3 section, in certain areas things can be tested more thoroughly, accurately, and in greater detail. You can now take full advantage of the Search Console features.

Check crawl stats and server logs

Keep an eye on the crawl stats available in Search Console to make sure Google is crawling the new site’s pages. In general, when Googlebot comes across new pages it tends to increase the average number of pages it crawls per day. But if you can’t spot a spike around the launch date, something may be negatively affecting Googlebot’s ability to crawl the site.

Crawl stats on Google’s Search Console

Reviewing the server log files is by far the most effective way to spot any crawl issues or inefficiencies. Tools like Botify and On Crawl can be extremely useful because they combine crawls with server log data and can highlight pages search engines do not crawl, pages that are not linked to internally (orphan pages), low-value pages that are heavily internally linked, and a lot more.

Review crawl errors regularly

Keep an eye on the reported crawl errors, ideally daily during the first few weeks. Downloading these errors daily, crawling the reported URLs, and taking the necessary actions (i.e. implement additional 301 redirects, fix soft 404 errors) will aid a quicker recovery. It’s highly unlikely you will need to redirect every single 404 that is reported, but you should add redirects for the most important ones.

Pro tip: In Google Analytics you can easily find out which are the most commonly requested 404 URLs and fix these first!

Other useful Search Console features

Other Search Console features worth checking include the Blocked Resources, Structured Data errors, Mobile Usability errors, HTML Improvements, and International Targeting (to check for hreflang reported errors).

Pro tip: Keep a close eye on the URL parameters in case they’re causing duplicate content issues. If this is the case, consider taking some urgent remedial action.

Measuring site speed

Once the new site is live, measure site speed to make sure the site’s pages are loading fast enough on both desktop and mobile devices. With site speed being a ranking signal across devices, and because slow pages lose users and customers, comparing the new site’s speed with the old site’s is extremely important. If the new site’s page loading times appear to be higher, you should take immediate action; otherwise your site’s traffic and conversions will almost certainly take a hit.

Evaluating speed using Google’s tools

Two tools that can help with this are Google’s Lighthouse and PageSpeed Insights.

The PageSpeed Insights tool measures page performance on both mobile and desktop devices and shows real-world page speed data based on user data Google collects from Chrome. It also checks whether a page has applied common performance best practices and provides an optimization score. The tool includes the following main categories:

  • Speed score: Categorizes a page as Fast, Average, or Slow using two metrics: The First Contentful Paint (FCP) and DOM Content Loaded (DCL). A page is considered fast if both metrics are in the top one-third of their category.
  • Optimization score: Categorizes a page as Good, Medium, or Low based on performance headroom.
  • Page load distributions: Categorizes a page as Fast (fastest third), Average (middle third), or Slow (bottom third) by comparing against all FCP and DCL events in the Chrome User Experience Report.
  • Page stats: Can indicate if the page might be faster if the developer modifies the appearance and functionality of the page.
  • Optimization suggestions: A list of best practices that could be applied to a page.

Google’s PageSpeed Insights in action

Google’s Lighthouse is very handy for mobile performance, accessibility, and Progressive Web Apps audits. It provides various useful metrics that can be used to measure page performance on mobile devices, such as:

  • First Meaningful Paint, which measures when the primary content of a page is visible.
  • Time to Interactive, the point at which the page is ready for a user to interact with.
  • Speed Index, which shows how quickly a page is visibly populated.

Both tools provide recommendations to help improve any reported site performance issues.

Google’s Lighthouse in action

You can also use this Google tool to get a rough estimate of the percentage of users you may be losing from your mobile site’s pages due to slow page loading times.

The same tool also provides an industry comparison so you get an idea of how far you are from the top performing sites in your industry.

Measuring speed from real users

Once the site has gone live, you can start evaluating site speed based on the users visiting your site. If you have Google Analytics, you can easily compare the new site’s average load time with the previous one.

In addition, if you have access to a Real User Monitoring tool such as Pingdom, you can evaluate site speed based on the users visiting your website. The below map illustrates how different visitors experience very different loading times depending on their geographic location. In the below example, the page loading times appear to be satisfactory to visitors from the UK, US, and Germany, but to users residing in other countries they are much higher.


Phase 6: Measuring site migration performance

When to measure

Has the site migration been successful? This is the million-dollar question everyone involved would like to know the answer to as soon as the new site goes live. In reality, the longer you wait the clearer the answer becomes, as visibility during the first few weeks or even months can be very volatile depending on the size and authority of your site. For smaller sites, a 4–6 week period should be sufficient before comparing the new site’s visibility with the old site’s. For large websites you may have to wait for at least 2–3 months before measuring.

In addition, if the new site is significantly different from the previous one, users will need some time to get used to the new look and feel and acclimatize themselves to the new taxonomy, user journeys, etc. Such changes can initially have a significant negative impact on the site’s conversion rate, which should improve after a few weeks as returning visitors become accustomed to the new site. In any case, drawing data-driven conclusions about the new site’s UX during this period can be risky.

But these are just general rules of thumb and need to be taken into consideration along with other factors. For instance, if a few days or weeks after the new site launch significant additional changes were made (e.g. to address a technical issue), the migration’s evaluation should be pushed further back.

How to measure

Performance measurement is very important. Even though business stakeholders may only be interested in hearing about the revenue and traffic impact, there are many other metrics you should pay attention to. For example, there can be several reasons for revenue going down following a site migration, including seasonal trends, lower brand interest, UX issues that have significantly lowered the site’s conversion rate, poor mobile performance, poor page loading times, etc. So, in addition to the organic traffic and revenue figures, also pay attention to the following:

  • Desktop & mobile visibility (from SearchMetrics, SEMrush, Sistrix)
  • Desktop and mobile rankings (from any reliable rank tracking tool)
  • User engagement (bounce rate, average time on page)
  • Sessions per page type (i.e. are the category pages driving as many sessions as before?)
  • Conversion rate per page type (i.e. are the product pages converting the same way as before?)
  • Conversion rate by device (i.e. has the desktop/mobile conversion rate increased/decreased since launching the new site?)

Reviewing the below could also be very handy, especially from a technical troubleshooting perspective:

  • Number of indexed pages (Search Console)
  • Submitted vs indexed pages in XML sitemaps (Search Console)
  • Pages receiving at least one visit (analytics)
  • Site speed (PageSpeed Insights, Lighthouse, Google Analytics)

It’s only after you’ve looked into all the above areas that you could safely conclude whether your migration has been successful or not.

Good luck and if you need any consultation or assistance with your site migration, please get in touch!


Site migration checklist

An up-to-date site migration checklist is available to download from our site. Please note that the checklist is regularly updated to include all critical areas for a successful site migration.


Appendix: Useful tools

Crawlers

  • Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
  • Sitebulb: Very intuitive crawler application with a neat user interface, nicely organized reports, and many useful data visualizations.
  • Deep Crawl: Cloud-based crawler that can crawl staging sites, compare different crawls, and cope well with large websites.
  • Botify: Another powerful cloud-based crawler supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
  • On-Crawl: Crawler and server log analyzer for enterprise SEO audits with many handy features to identify crawl budget, content quality, and performance issues.

Handy Chrome add-ons

  • Web developer: A collection of developer tools including easy ways to enable/disable JavaScript, CSS, images, etc.
  • User agent switcher: Switch between different user agents including Googlebot, mobile, and other agents.
  • Ayima Redirect Path: A great header and redirect checker.
  • SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
  • Scraper: An easy way to scrape website data into a spreadsheet.

Site monitoring tools

  • Uptime Robot: Free website uptime monitoring.
  • Robotto: Free robots.txt monitoring tool.
  • Pingdom tools: Monitors site uptime and page speed from real users (RUM service)
  • SEO Radar: Monitors all critical SEO elements and fires alerts when these change.

Site performance tools

  • PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks to see if a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
  • Lighthouse: Handy Chrome extension for performance, accessibility, Progressive Web Apps audits. Can also be run from the command line, or as a Node module.
  • Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.

Structured data testing tools

  • Google’s Structured Data Testing Tool: Tests pages and code snippets for structured data errors, as covered earlier in this guide.

Mobile testing tools

  • Google’s mobile-friendly test tool: Diagnoses mobile usability issues such as viewport, font size, and touch element problems.
  • Google’s AMP Test Tool: Validates AMP pages.

Backlink data sources



Moz Blog

Posted in IM News | Comments Off

Reading Between the Lines: A 3-Step Guide to Reviewing Web Page Content

Posted by Jackie.Francis

In SEO, reviewing content is an unavoidable yet extremely important task. As the driving factor that brings people to a page, best practice dictates that we do what we can to ensure that the work we’ve invested hours and resources into creating remains impactful and relevant over time. This requires occasionally going back and re-evaluating our content to identify areas that can be improved.

That being said, if you’ve ever done a content review, you know how surprisingly challenging this is. A large variety of formats and topics alongside the challenge of defining “good” content makes it hard to pick out the core elements that matter. Without these universal focus areas, you may end up neglecting an element (e.g. tone of voice) in one instance but paying special attention to that same element in another.

Luckily there are certain characteristics — like good spelling, appealing layouts, and relevant keywords — that are universally associated with what we would consider “good” content. In this three-step guide, I’ll show you how to use these characteristics (or elements, as I like to call them) to define your target audience, measure the performance of your content using a scorecard, and assess your changes for quality assurance as part of a review process that can be applied to nearly all types of content across any industry.


Step 1: Know your audience

Arguably the most important step mentioned in this post, knowing your target reader will identify the details that should make up the foundation of your content. This includes insight into the reader’s intent, the ideal look and feel of the page, and the goals your content’s message should be trying to achieve.

To get to this point, however, you first need to answer these two questions:

  1. What does my target audience look like?
  2. Why are they reading my content?

What does my target audience look like?

The first question relies on general demographic information such as age, gender, education, and job title. This gives a face to the ideal audience member(s) and the kind of information that would best suit them. For example, if targeting stay-at-home mothers between the ages of 35 and 40 with two or more kids under the age of 5, we can guess that she has a busy daily schedule, travels frequently for errands, and constantly needs to stay vigilant over her younger children. So, a piece that is personable, quick, easy to read on-the-go, and includes inline imagery to reduce eye fatigue would be better received than something that is lengthy and requires a high level of focus.

Why are they reading my content?

Once you have a face to your reader, the second question must be answered to understand what that reader wants from your content and if your current product is effectively meeting those needs. For example, senior-level executives of mid- to large-sized companies may be reading to become better informed before making an important decision, to become more knowledgeable in their field, or to use the information they learn to teach others. Other questions you may want to consider asking:

  • Are they reading for leisure or work?
  • Would they want to share this with their friends on social media?
  • Where will they most likely be reading this? On the train? At home? Waiting in line at the store?
  • Are they comfortable with long blocks of text, or would inline images be best?
  • Do they prefer bite-sized information or are they comfortable with lengthy reports?

You can find the answers to these questions and collect valuable demographic and psychographic information by using a combination of internal resources, like sales scripts and surveys, and third-party audience insight tools such as Google Analytics and Facebook Audience Insights. With these results you should now have a comprehensive picture of your audience and can start identifying the parts of your content that can be improved.


Step 2: Tear apart your existing content

Now that you understand who your audience is, it’s time to get to the real work: assessing your existing content. This stage requires breaking everything apart to identify the components you should keep, change, or discard. However, this task can be extremely challenging because the performance of most components — such as tone of voice, design, and continuity — can’t simply be bucketed into binary categories like “good” or “bad.” Rather, they fall into a spectrum where the most reasonable level of improvement falls somewhere in the middle. You’ll see what I mean by this statement later on, but one of the most effective ways to evaluate and measure the degree of optimization needed for these components is to use a scorecard. Created by my colleague, Ben Estes, this straightforward, reusable, and easy to apply tool can help you objectively review the performance of your content.

Make a copy of the Content Review Grading Rubric

Note: The card sampled here, and the one I personally use for similar projects, is a slightly altered version of the original.

As you can see, the card is divided into two categories: Writing and Design. Listed under each category are elements that are universally needed to create good content and should be examined. Each element is assigned a grading scale ranging from 1–5, with 1 being the worst score and 5 being the best.

To use it, start by choosing a part of your page to look at first. Order doesn’t matter, so whether you first check “spelling and grammar” or “continuity” is up to you. Next, assign it a score on a separate Excel sheet (or mark it directly on the rubric) based on its current performance. For example, if the copy has no spelling errors but some minor grammar issues, you would rank “spelling and grammar” as a four (4).

Finally, repeat this process until all elements are graded. Remember to stay impartial to give an honest assessment.

Once you’re done, look at each grade and see where it falls on the scale. Ideally, each element should score 4 or greater, although a grade of 5 should only be given out sparingly. Tying back to my spectrum comment from earlier, a 5 is exclusively reserved for top-level work: something to strive for, but typically requiring more effort than it’s worth. A grade of 4 is the highest and most reasonable goal to aim for in most instances.

A grade of 3 or below indicates an opportunity for improvement and that significant changes need to be made.

If working with multiple pieces of content at once, the grading system can also be used to help prioritize your workload. Just collect the average writing or design score and sort them in ascending/descending order. Pages with a lower average indicate poorer performance and should be prioritized over pages whose averages are higher.

Whether you choose to use this scorecard or make your own, what you review, the span of the grading scale, and the criteria for each grade should be adjusted to fit your specific needs and result in a tool that will help you honestly assess your content across multiple applications.

Don’t forget the keywords

With most areas of your content covered by the scorecard, the last element to check before moving to the editing stage is your keywords.

Before I get flak for this, I’m aware that the general rule of creating content is to do your keyword research first. But I’ve found that when it comes to reviews, evaluating keywords last feels more natural and makes the process a lot smoother. When first running through a page, you’re much more likely to notice spelling and design flaws before you pick up on whether a keyword is used correctly, so why not make note of those details first?

Depending on the outcomes stemming from the re-evaluation of your target audience and content performance review, you will notice one of two things about your currently targeted keywords:

  1. They have not been impacted by the outcomes of the prior analyses and do not need to be altered
  2. They no longer align with the goals of the page or needs of the audience and should be changed

In the first example, the keywords you originally target are still best suited for your content’s message and no additional research is needed. So, your only remaining task is to determine whether or not your keywords are effectively used throughout the page. This means assessing things like title tag, image alt attributes, URL, and copy.

In an attempt to stay on track, I won’t go into further detail on how to optimize keywords but if you want a little more insight, this post by Ken Lyons is a great resource.

If, however, your target keywords are no longer relevant to the goals of your content, before moving to the editing stage you’ll need to re-do your keyword research to identify the terms you should rank for. For insight into keyword research this chapter in Moz’s Beginner’s Guide to SEO is another invaluable resource.


Step 3: Evaluate your evaluation

At this point your initial review is complete and you should be ready to edit.

That’s right. Your initial review.

The interesting thing about assessing content is that it never really ends. As you make edits you’ll tend to deviate more and more from your initial strategy. And while not always a bad thing, you must continuously monitor these changes to ensure that you are on the right track to create a highly valued piece of content.

The best approach would be to reassess all your material when:

  • 50% of the edits are complete
  • 85% of the edits are complete
  • You have finished editing

At the 50% and 85% marks, keep the assessment quick and simple. Look through your revisions and ask the following questions:

  • Am I still addressing the needs of my target audience?
  • Are my target keywords properly integrated?
  • Am I using the right language and tone of voice?
  • Does it look like the information is structured correctly (hierarchically)?

If your answer is “Yes” to all four questions, then you’ve effectively made your changes and should proceed. For any question you answer “No” to, go back and make the necessary corrections. The areas targeted here become more difficult to fix the closer you are to completion, and ensuring they’re correct throughout this stage will save a lot of time and stress in the long run.

When you’ve finished and think you’re ready to publish, run one last comprehensive review to check the performance status of all related components. This means confirming you’ve properly addressed the needs of your audience, optimized your keywords, and improved the elements highlighted in the scorecard.


Moving forward

No two pieces of content are the same, but that doesn’t mean there aren’t important commonalities. Being able to identify these similarities and understand the role they play across formats and topics will lead the way to creating your own review process for evaluating subjective material.

So, when you find yourself gearing up for your next project, give these steps a try and always keep the following in mind:

  1. Your audience is what makes or breaks you, so keep them happy
  2. Consistent quality is key! Ensure all components of your content are performing at their best
  3. Keep your keywords optimized and be prepared to do additional research if necessary
  4. Unplanned changes will happen. Just remember to remain observant so you can keep yourself on track



Moz Blog

Posted in IM News | Comments Off

Rewriting the Beginner’s Guide to SEO, Chapter 1: SEO 101

Posted by BritneyMuller

Back in mid-November, we kicked off a campaign to rewrite our biggest piece of content: the Beginner’s Guide to SEO. You offered up a huge amount of helpful advice and insight with regards to our outline, and today we’re here to share our draft of the first chapter.

In many ways, the Beginner’s Guide to SEO belongs to each and every member of our community; it’s important that we get this right, for your sake. So without further ado, here’s the first chapter — let’s dive in!


Chapter 1: SEO 101

What is it, and why is it important?

Welcome! We’re excited that you’re here!

If you already have a solid understanding of SEO and why it’s important, you can skip to Chapter 2 (though we’d still recommend skimming the best practices from Google and Bing at the end of this chapter; they’re useful refreshers).

For everyone else, this chapter will help build your foundational SEO knowledge and confidence as you move forward.

What is SEO?

SEO stands for “search engine optimization.” It’s the practice of increasing both the quality and quantity of website traffic, as well as exposure to your brand, through non-paid (also known as “organic”) search engine results.

Despite the acronym, SEO is as much about people as it is about search engines themselves. It’s about understanding what people are searching for online, the answers they are seeking, the words they’re using, and the type of content they wish to consume. Leveraging this data will allow you to provide high-quality content that your visitors will truly value.

Here’s an example. Frankie & Jo’s (a Seattle-based vegan, gluten-free ice cream shop) has heard about SEO and wants help improving how and how often they show up in organic search results. In order to help them, you need to first understand their potential customers:

  • What types of ice cream, desserts, snacks, etc. are people searching for?
  • Who is searching for these terms?
  • When are people searching for ice cream, snacks, desserts, etc.?
    • Are there seasonality trends throughout the year?
  • How are people searching for ice cream?
    • What words do they use?
    • What questions do they ask?
    • Are more searches performed on mobile devices?
  • Why are people seeking ice cream?
    • Are individuals looking for health conscious ice cream specifically or just looking to satisfy a sweet tooth?
  • Where are potential customers located — locally, nationally, or internationally?

And finally — here’s the kicker — how can you help provide the best content about ice cream to cultivate a community and fulfill what all those people are searching for?

Search engine basics

Search engines are answer machines. They scour billions of pieces of content and evaluate thousands of factors to determine which content is most likely to answer your query.

Search engines do all of this by discovering and cataloguing all available content on the Internet (web pages, PDFs, images, videos, etc.) via a process known as “crawling and indexing.”

What are “organic” search engine results?

Organic search results are search results that aren’t paid for (i.e. not advertising). These are the results that you can influence through effective SEO. Traditionally, these were the familiar “10 blue links.”

Today, search engine results pages — often referred to as “SERPs” — are filled with both more advertising and more dynamic organic results formats (called “SERP features”) than we’ve ever seen before. Some examples of SERP features are featured snippets (or answer boxes), People Also Ask boxes, image carousels, etc. New SERP features continue to emerge, driven largely by what people are seeking.

For example, if you search for “Denver weather,” you’ll see a weather forecast for the city of Denver directly in the SERP instead of a link to a site that might have that forecast. And, if you search for “pizza Denver,” you’ll see a “local pack” result made up of Denver pizza places. Convenient, right?

It’s important to remember that search engines make money from advertising. Their goal is to better solve searchers’ queries (within SERPs), to keep searchers coming back, and to keep them on the SERPs longer.

Some SERP features on Google are organic and can be influenced by SEO. These include featured snippets (a promoted organic result that displays an answer inside a box) and related questions (a.k.a. “People Also Ask” boxes).

It’s worth noting that there are many other search features that, even though they aren’t paid advertising, can’t typically be influenced by SEO. These features often have data acquired from proprietary data sources, such as Wikipedia, WebMD, and IMDb.

Why SEO is important

While paid advertising, social media, and other online platforms can generate traffic to websites, the majority of online traffic is driven by search engines.

Organic search results cover more digital real estate, appear more credible to savvy searchers, and receive way more clicks than paid advertisements. For example, of all US searches, only ~2.8% of people click on paid advertisements.

In a nutshell: SEO has ~20X more traffic opportunity than PPC on both mobile and desktop.

SEO is also one of the only online marketing channels that, when set up correctly, can continue to pay dividends over time. If you provide a solid piece of content that deserves to rank for the right keywords, your traffic can snowball over time, whereas advertising needs continuous funding to send traffic to your site.

Search engines are getting smarter, but they still need our help.

Optimizing your site will help deliver better information to search engines so that your content can be properly indexed and displayed within search results.

Should I hire an SEO professional, consultant, or agency?

Depending on your bandwidth, willingness to learn, and the complexity of your website(s), you could perform some basic SEO yourself. Or, you might discover that you would prefer the help of an expert. Either way is okay!

If you end up looking for expert help, it’s important to know that many agencies and consultants “provide SEO services,” but can vary widely in quality. Knowing how to choose a good SEO company can save you a lot of time and money, as the wrong SEO techniques can actually harm your site more than they will help.

White hat vs black hat SEO

“White hat SEO” refers to SEO techniques, best practices, and strategies that abide by search engine rules, with a primary focus on providing more value to people.

“Black hat SEO” refers to techniques and strategies that attempt to spam/fool search engines. While black hat SEO can work, it puts websites at tremendous risk of being penalized and/or de-indexed (removed from search results) and has ethical implications.

Penalized websites have bankrupted businesses. It’s just another reason to be very careful when choosing an SEO expert or agency.

Search engines share similar goals with the SEO industry

Search engines want to help you succeed. They’re actually quite supportive of efforts by the SEO community. Digital marketing conferences, such as Unbounce, MNsearch, SearchLove, and Moz’s own MozCon, regularly attract engineers and representatives from major search engines.

Google assists webmasters and SEOs through their Webmaster Central Help Forum and by hosting live office hour hangouts. (Bing, unfortunately, shut down their Webmaster Forums in 2014.)

While webmaster guidelines vary from search engine to search engine, the underlying principles stay the same: Don’t try to trick search engines. Instead, provide your visitors with a great online experience.

Google webmaster guidelines

Basic principles:

  • Make pages primarily for users, not search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging.

Things to avoid:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content (i.e. copied from somewhere else)
  • Cloaking — the practice of showing search engine crawlers different content than visitors.
  • Hidden text and links
  • Doorway pages — pages created to rank well for specific searches to funnel traffic to your website.

Full Google Webmaster Guidelines version here.

Bing webmaster guidelines

Basic principles:

  • Provide clear, deep, engaging, and easy-to-find content on your site.
  • Keep page titles clear and relevant.
  • Links are regarded as a signal of popularity and Bing rewards links that have grown organically.
  • Social influence and social shares are positive signals and can have an impact on how you rank organically in the long run.
  • Page speed is important, along with a positive, useful user experience.
  • Use alt attributes to describe images, so that Bing can better understand the content.

Things to avoid:

  • Thin content, pages showing mostly ads or affiliate links, or that otherwise redirect visitors away to other sites will not rank well.
  • Abusive link tactics that aim to inflate the number and nature of inbound links such as buying links, participating in link schemes, can lead to de-indexing.
  • URL structures that aren’t clean, concise, and keyword-inclusive. Dynamic parameters can dirty up your URLs and cause duplicate content issues, so make URLs descriptive, short, keyword-rich when possible, and free of non-letter characters.
  • Burying links in JavaScript/Flash/Silverlight; keep content out of these as well.
  • Duplicate content
  • Keyword stuffing
  • Cloaking — the practice of showing search engine crawlers different content than visitors.

Guidelines for representing your local business on Google

These guidelines govern what you should and shouldn’t do in creating and managing your Google My Business listing(s).

Basic principles:

  • Be sure you’re eligible for inclusion in the Google My Business index; you must have a physical address, even if it’s your home address, and you must serve customers face-to-face, either at your location (like a retail store) or at theirs (like a plumber).
  • Honestly and accurately represent all aspects of your local business data, including its name, address, phone number, website address, business categories, hours of operation, and other features.

Things to avoid

  • Creation of Google My Business listings for entities that aren’t eligible
  • Misrepresentation of any of your core business information, including “stuffing” your business name with geographic or service keywords, or creating listings for fake addresses
  • Use of PO boxes or virtual offices instead of authentic street addresses
  • Abuse of the review portion of the Google My Business listing, via fake positive reviews of your business or fake negative ones of your competitors
  • Costly, novice mistakes stemming from failure to read the fine details of Google’s guidelines

Fulfilling user intent

Understanding and fulfilling user intent is critical. When a person searches for something, they have a desired outcome. Whether it’s an answer, concert tickets, or a cat photo, that desired content is their “user intent.”

If a person performs a search for “bands,” is their intent to find musical bands, wedding bands, band saws, or something else?

Your job as an SEO is to quickly provide users with the content they desire in the format in which they desire it.

Common user intent types:

Informational: Searching for information. Example: “How old is Issa Rae?”

Navigational: Searching for a specific website. Example: “HBOGO Insecure”

Transactional: Searching to buy something. Example: “where to buy ‘We got y’all’ Insecure t-shirt”

You can get a glimpse of user intent by Googling your desired keyword(s) and evaluating the current SERP. For example, if there’s a photo carousel, it’s very likely that people searching for that keyword search for photos.

Also evaluate what content your top-ranking competitors are providing that you currently aren’t. How can you provide 10X the value on your website?

Providing relevant, high-quality content on your website will help you rank higher in search results, and more importantly, it will establish credibility and trust with your online audience.

Before you do any of that, you have to first understand your website’s goals to execute a strategic SEO plan.

Know your website/client’s goals

Every website is different, so take the time to really understand a specific site’s business goals. This will not only help you determine which areas of SEO you should focus on, where to track conversions, and how to set benchmarks, but it will also help you create talking points for negotiating SEO projects with clients, bosses, etc.

What will your KPIs (Key Performance Indicators) be to measure the return on SEO investment? More simply, what is your barometer to measure the success of your organic search efforts? You’ll want to have it documented, even if it’s this simple:

For the website ________________________, my primary SEO KPI is _______________.

Here are a few common KPIs to get you started:

  • Sales
  • Downloads
  • Email signups
  • Contact form submissions
  • Phone calls

And if your business has a local component, you’ll want to define KPIs for your Google My Business listings, as well. These might include:

  • Clicks-to-call
  • Clicks-to-website
  • Clicks-for-driving-directions

Notice how “Traffic” and “Ranking” are not on the above lists? This is because, for most websites, ranking well for keywords and increasing traffic won’t matter if the new traffic doesn’t convert (to help you reach the site’s KPI goals).

You don’t want to send 1,000 people to your website a month and have only 3 people convert (to customers). You want to send 300 people to your site a month and have 40 people convert.

This guide will help you become more data-driven in your SEO efforts. Rather than haphazardly firing arrows all over the place (and getting lucky every once in a while), you’ll put more wood behind fewer arrows.

Grab a bow (and some coffee); let’s dive into Chapter 2 (Crawlers & Indexation).


We’re looking forward to hearing your thoughts on this draft of Chapter 1. What works? Anything you feel could be added or explained differently? Let us know your suggestions, questions, and thoughts in the comments.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


The Content Marketer’s Guide to Starting a Meditation Practice Today

You and I are storytellers. We’re content creators and copywriters. Our livelihoods depend on spinning creative yarns that compel our readers to action. For the execution of our craft, we depend on some key inner resources every day. Creativity and focus are two biggies. And I’m sure you’ve noticed that — like gold and platinum
Read More…

The post The Content Marketer’s Guide to Starting a Meditation Practice Today appeared first on Copyblogger.


Copyblogger


The Complete Guide to Direct Traffic in Google Analytics

Posted by tombennet

When it comes to direct traffic in Analytics, there are two deeply entrenched misconceptions.

The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.

In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.

What is direct traffic?

In short, Google Analytics will report a traffic source of “direct” when it has no data on how the session arrived at your website, or when the referring source has been configured to be ignored. You can think of direct as GA’s fall-back option for when its processing logic has failed to attribute a session to a particular source.

To properly understand the causes and fixes for direct traffic, it’s important to understand exactly how GA processes traffic sources. The following flow-chart illustrates how sessions are bucketed — note that direct sits right at the end as a final “catch-all” group.

Broadly speaking, and disregarding user-configured overrides, GA’s processing follows this sequence of checks:

AdWords parameters > Campaign overrides > UTM campaign parameters > Referred by a search engine > Referred by another website > Previous campaign within timeout period > Direct

Note the penultimate processing step (previous campaign within timeout), which has a significant impact on the direct channel. Consider a user who discovers your site via organic search, then returns via direct a week later. Both sessions would be attributed to organic search. In fact, campaign data persists for up to six months by default. The key point here is that Google Analytics is already trying to minimize the impact of direct traffic for you.
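
To make that catch-all behavior concrete, here’s a deliberately simplified sketch of the waterfall in JavaScript. To be clear, this is a mental model only and not Google’s actual processing code; the gclid and utm_source parameters are real, but the attributeSession function and its hit object are hypothetical:

// A simplified, illustrative model of GA's source-attribution waterfall.
// Not Google's actual code; the function and "hit" object are hypothetical,
// and the campaign-override step is omitted for brevity.
function attributeSession(hit, previousCampaign) {
  if (hit.gclid) return 'paid search';              // AdWords parameters
  if (hit.utm_source) return hit.utm_source;        // UTM campaign parameters
  if (/google\.|bing\.|yahoo\./.test(hit.referrer || '')) {
    return 'organic search';                        // referred by a search engine
  }
  if (hit.referrer) return 'referral';              // referred by another website
  if (previousCampaign) return previousCampaign;    // previous campaign within timeout
  return '(direct) / (none)';                       // the final catch-all
}

// A session with no referrer and no campaign data falls all the way through:
console.log(attributeSession({}, null)); // logs "(direct) / (none)"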

What causes direct traffic?

Contrary to popular belief, there are actually many reasons why a session might be missing campaign and traffic source data. Here we will run through some of the most common.

1. Manual address entry and bookmarks

The classic direct-traffic scenario, this one is largely unavoidable. If a user types a URL into their browser’s address bar or clicks on a browser bookmark, that session will appear as direct traffic.

Simple as that.

2. HTTPS > HTTP

When a user follows a link on a secure (HTTPS) page to a non-secure (HTTP) page, no referrer data is passed, meaning the session appears as direct traffic instead of as a referral. Note that this is intended behavior. It’s part of how the secure protocol was designed, and it does not affect other scenarios: HTTP to HTTP, HTTPS to HTTPS, and even HTTP to HTTPS all pass referrer data.

So, if your referral traffic has tanked but direct has spiked, it could be that one of your major referrers has migrated to HTTPS. The inverse is also true: If you’ve migrated to HTTPS and are linking to HTTP websites, the traffic you’re driving to them will appear in their Analytics as direct.

If your referrers have moved to HTTPS and you’re stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data which is being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like Let’s Encrypt, though you should still explore the potentially significant SEO implications of a site migration. Remember, HTTPS and HTTP/2 are the future of the web.

If, on the other hand, you’ve already migrated to HTTPS and are concerned about your users appearing to partner websites as direct traffic, you can implement the meta referrer tag. Cyrus Shepard has written about this on Moz before, so I won’t delve into it now. Suffice to say, it’s a way of telling browsers to pass some referrer data to non-secure sites, and can be implemented as a <meta> element or HTTP header.
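
For reference, the element itself is a one-liner. Here’s a minimal example using the origin policy (one of several valid values, alongside the likes of origin-when-cross-origin and unsafe-url):

<!-- Place in the <head>. Sends only the origin (e.g. https://example.com/)
     as the referrer, even on secure-to-non-secure links. -->
<meta name="referrer" content="origin">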

3. Missing or broken tracking code

Let’s say you’ve launched a new landing page template and forgotten to include the GA tracking code. Or, to use a scenario I’m encountering more and more frequently, imagine your GTM container is a horrible mess of poorly configured triggers, and your tracking code is simply failing to fire.

Users land on this page without tracking code. They click on a link to a deeper page which does have tracking code. From GA’s perspective, the first hit of the session is the second page visited, meaning that the referrer appears as your own website (i.e. a self-referral). If your domain is on the referral exclusion list (as per default configuration), the session is bucketed as direct. This will happen even if the first URL is tagged with UTM campaign parameters.

As a short-term fix, you can try to repair the damage by simply adding the missing tracking code. To prevent it happening again, carry out a thorough Analytics audit, move to a GTM-based tracking implementation, and promote a culture of data-driven marketing.
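
When auditing, the first check is simply whether every template includes the snippet at all. For reference, the standard analytics.js tracking code looks like the following, where UA-XXXXX-Y is a placeholder for your own property ID:

<!-- Standard Google Analytics (analytics.js) snippet; place just before </head>. -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-XXXXX-Y', 'auto'); // placeholder property ID
ga('send', 'pageview');
</script>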

4. Improper redirection

This is an easy one. Don’t use meta refreshes or JavaScript-based redirects — these can wipe or replace referrer data, leading to direct traffic in Analytics. You should also be meticulous with your server-side redirects, and — as is often recommended by SEOs — audit your redirect file frequently. Complex chains are more likely to result in a loss of referrer data, and you run the risk of UTM parameters getting stripped out.

Once again, control what you can: use carefully mapped (i.e. non-chained) server-side 301 redirects to preserve referrer data wherever possible.
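
For clarity, these are the client-side patterns to avoid, both of which can wipe or replace referrer data (the URLs below are placeholders):

<!-- Avoid: meta refresh redirect -->
<meta http-equiv="refresh" content="0; url=https://example.com/new-page/">

<!-- Avoid: JavaScript redirect -->
<script>window.location.replace('https://example.com/new-page/');</script>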

5. Non-web documents

Links in Microsoft Word documents, slide decks, or PDFs do not pass referrer information. By default, users who click these links will appear in your reports as direct traffic. Clicks from native mobile apps (particularly those with embedded “in-app” browsers) are similarly prone to stripping out referrer data.

To a degree, this is unavoidable. Much like so-called “dark social” visits (discussed in detail below), non-web links will inevitably result in some quantity of direct traffic. However, you also have an opportunity here to control the controllables.

If you publish whitepapers or offer downloadable PDF guides, for example, you should be tagging the embedded hyperlinks with UTM campaign parameters. You’d never even contemplate launching an email marketing campaign without campaign tracking (I hope), so why would you distribute any other kind of freebie without similarly tracking its success? In some ways this is even more important, since these kinds of downloadables often have a longevity not seen in a single email campaign. Here’s an example of a properly tagged URL which we would embed as a link:

https://builtvisible.com/embedded-whitepaper-url/?utm_source=whitepaper&utm

The same goes for URLs in your offline marketing materials. For major campaigns it’s common practice to select a short, memorable URL (e.g. moz.com/tv/) and design an entirely new landing page. It’s possible to bypass page creation altogether: simply redirect the vanity URL to an existing page URL which is properly tagged with UTM parameters.
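
As a sketch, the mapping might look like this (the landing page and UTM values here are hypothetical):

example.com/tv  >  301  >  https://example.com/landing-page/?utm_source=offline&utm_medium=vanity-url&utm_campaign=tv-spot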

So, whether you tag your URLs directly, use redirected vanity URLs, or — if you think UTM parameters are ugly — opt for some crazy-ass hash-fragment solution with GTM (read more here), the takeaway is the same: use campaign parameters wherever it’s appropriate to do so.

6. “Dark social”

This is a big one, and probably the least well understood by marketers.

The term “dark social” was coined back in 2012 by Alexis Madrigal in an article for The Atlantic. Essentially it refers to methods of social sharing which cannot easily be attributed to a particular source, like email, instant messaging, Skype, WhatsApp, and Facebook Messenger.

Recent studies have found that upwards of 80% of consumers’ outbound sharing from publishers’ and marketers’ websites now occurs via these private channels. In terms of numbers of active users, messaging apps are outpacing social networking apps. All the activity driven by these thriving platforms is typically bucketed as direct traffic by web analytics software.

People who use the ambiguous phrase “social media marketing” are typically referring to advertising: you broadcast your message and hope people will listen. Even if you overcome consumer indifference with a well-targeted campaign, any subsequent interactions are affected by their very public nature. The privacy of dark social, by contrast, represents a potential goldmine of intimate, targeted, and relevant interactions with high conversion potential. Nebulous and difficult to track though it may be, dark social has the potential to let marketers tap into the elusive power of word of mouth.

So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.
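
As one possible implementation (everything below, from the domain to the UTM values, is a placeholder), share links for private channels can simply carry a pre-tagged, percent-encoded URL:

<!-- Hypothetical share buttons. The shared URL is percent-encoded so that
     its UTM parameters survive intact. -->
<a href="mailto:?subject=Worth%20a%20read&body=https%3A%2F%2Fexample.com%2Fpost%2F%3Futm_source%3Demail%26utm_medium%3Dshare-button">
  Share via email
</a>
<a href="https://api.whatsapp.com/send?text=https%3A%2F%2Fexample.com%2Fpost%2F%3Futm_source%3Dwhatsapp%26utm_medium%3Dshare-button">
  Share on WhatsApp
</a>

Tagged this way, a WhatsApp share would surface in your reports as whatsapp / share-button rather than vanishing into direct.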

Checklist: Minimizing direct traffic

To summarize what we’ve already discussed, here are the steps you can take to minimize the level of unnecessary direct traffic in your reports:

  1. Migrate to HTTPS: Not only is the secure protocol your gateway to HTTP/2 and the future of the web, it will also have an enormously positive effect on your ability to track referral traffic.
  2. Manage your use of redirects: Avoid chains and eliminate client-side redirection in favour of carefully-mapped, single-hop, server-side 301s. If you use vanity URLs to redirect to pages with UTM parameters, be meticulous.
  3. Get really good at campaign tagging: Even amongst data-driven marketers, I encounter the belief that UTM tagging begins and ends with switching on automatic tagging in your email marketing software. Others go to the other extreme, doing silly things like tagging internal links. Control what you can, and your ability to carry out meaningful attribution will markedly improve.
  4. Conduct an Analytics audit: Data integrity is vital, so consider this essential when assessing the success of your marketing. It’s not simply a case of checking for missing tracking code: good audits involve a review of your measurement plan and rigorous testing at page and property level.

Adhere to these principles, and it’s often possible to achieve a dramatic reduction in the level of direct traffic reported in Analytics. One such cleanup involved an HTTPS migration, a GTM migration (as part of an Analytics review), and an overhaul of internal campaign tracking processes over the course of about six months.

But the saga of direct traffic doesn’t end there! Once this channel is “clean” — that is, once you’ve minimized the number of avoidable pollutants — what remains might actually be one of your most valuable traffic segments.

Analyze! Or: why direct traffic can actually be pretty cool

For reasons we’ve already discussed, traffic from bookmarks and dark social is an enormously valuable segment to analyze. These are likely to be some of your most loyal and engaged users, and it’s not uncommon to see a notably higher conversion rate for a clean direct channel compared to the site average. You should make the effort to get to know them.

The number of potential avenues to explore is infinite, but here are some good starting points:

  • Build meaningful custom segments, defining a subset of your direct traffic based on their landing page, location, device, repeat visit or purchase behavior, or even enhanced e-commerce interactions.
  • Track meaningful engagement metrics using modern GTM triggers such as element visibility and native scroll tracking. Measure how your direct users are using and viewing your content.
  • Watch for correlations with your other marketing activities, and use it as an opportunity to refine your tagging practices and segment definitions. Create a custom alert which watches for spikes in direct traffic.
  • Familiarize yourself with flow reports to get an understanding of how your direct traffic is converting. By using Goal Flow and Behavior Flow reports with segmentation, it’s often possible to glean actionable insights which can be applied to the site as a whole.
  • Ask your users for help! If you’ve isolated a valuable segment of traffic which eludes deeper analysis, add a button to the page offering visitors a free downloadable ebook if they tell you how they discovered your page.
  • Start thinking about lifetime value, if you haven’t already — overhauling your attribution model or implementing User ID are good steps towards overcoming the indifference or frustration felt by marketers towards direct traffic.

I hope this guide has been useful. With any luck, you arrived looking for ways to reduce the level of direct traffic in your reports, and left with some new ideas for how to better analyze this valuable segment of users.

Thanks for reading!



Moz Blog


Rewriting the Beginner’s Guide to SEO

Posted by BritneyMuller


Many of you reading likely cut your teeth on Moz’s Beginner’s Guide to SEO. Since it was launched, it’s easily been our top-performing piece of content:

Most months see 100k+ views (the reverse plateau in 2013 is when we changed domains).

While Moz’s Beginner’s Guide to SEO still gets well over 100k views a month, the current guide itself is fairly outdated. This big update has been on my personal to-do list since I started at Moz, and we need to get it right because — let’s get real — you all deserve a bad-ass SEO 101 resource!

However, updating the guide is no easy feat. Thankfully, I have the help of my fellow Mozzers. Our content team has been a collective voice of reason, wisdom, and organization throughout this process and has kept this train on its tracks.

Despite the effort we’ve put into this already, it felt like something was missing: your input! We’re writing this guide to be a go-to resource for all of you (and everyone who follows in your footsteps), and want to make sure that we’re including everything that today’s SEOs need to know. You all have a better sense of that than anyone else.

So, in order to deliver the best possible update, I’m seeking your help.

This is similar to the way Rand did it back in 2007. Upon re-reading the many “more examples” requests in that feedback, we’ve made a point of integrating more examples throughout.

The plan:

  • Over the next 6–8 weeks, I’ll be updating sections of the Beginner’s Guide and posting them, one by one, on the blog.
  • I’ll solicit feedback from you incredible people and implement top suggestions.
  • The guide will be reformatted/redesigned, and I’ll 301 all of the blog entries that will be created over the next few weeks to the final version.
  • It’s going to remain 100% free to everyone — no registration required, no premium membership necessary.

To kick things off, here’s the revised outline for the Beginner’s Guide to SEO:

Chapter 1: SEO 101
What is it, and why is it important?

Chapter 2: Crawlers & Indexing
First, you need to show up.

Chapter 3: Keyword Research
Next, know what to say and how to say it.

Chapter 4: On-Page SEO
Next, structure your message to resonate and get it published.

Chapter 5: Technical SEO
Next, translate your site into Google’s language.

Chapter 6: Establishing Authority
Finally, turn up the volume.

Chapter 7: Measuring and Tracking SEO
Pivot based on what’s working.

Appendix A: Glossary of Terms

Appendix B: List of Additional Resources

Appendix C: Contributors & Credits


What did you struggle with most when you were first learning about SEO? What would you have benefited from understanding from the get-go?

Are we missing anything? Any section you wish wouldn’t be included in the updated Beginner’s Guide? Leave your suggestions in the comments!

Thanks in advance for contributing.



Moz Blog

