Tag Archive | "wrong"

Zuckerberg Connecting Whatsapp, Instagram, and Facebook. What Could Go Wrong?

“Typically, you separate great brands to create enterprise value,” says Scott Galloway, a Professor of Marketing at NYU Stern School of Business. “Mark Zuckerberg is trying to encrypt the backbone between WhatsApp, Instagram, and the core platform Facebook, such that he has one communications network across 2.7 billion people or the population of the southern hemisphere plus India.  What could go wrong? I actually, and I’ve said this before, I think Mark Zuckerberg is the most dangerous person in the world.”

Scott Galloway, a well-known and popular Professor of Marketing at NYU Stern School of Business, discusses Facebook’s possible implementation of a single communications platform for all of its apps, which are used by 2.7 billion people. Galloway was interviewed on Bloomberg Technology.

Connecting Whatsapp, Instagram, and Facebook – What Could Go Wrong?

What we have here is the mother of all conjoinings of triplets (referring to Facebook’s plan to use the same messaging backend on all of its platforms). That is, typically, you separate great brands to create enterprise value. Mark Zuckerberg is trying to encrypt the backbone between WhatsApp, Instagram, and the core platform Facebook, such that he has one communications network across 2.7 billion people, or the population of the southern hemisphere plus India. What could go wrong?

I actually, and I’ve said this before, I think Mark Zuckerberg is the most dangerous person in the world. If you look at key moments in our history where we moved to tyranny, one of the key steps is that someone consolidates the media. The notion that we’re going to have one individual deciding the algorithms for an encrypted backbone of 2.7 billion people is frightening, regardless of that person’s intentions. They’re even talking about putting the Facebook brand on each of these.

Is This a Prophylactic Move Against Antitrust Action?

I think what Mark Zuckerberg is doing is taking prophylactic moves against any sort of antitrust such that he could say, “It’d be impossible to unwind us now.” This is absolutely bad for the planet and bad for society. It’s clear where they’re going, an encrypted backbone, conjoin the triplets, and claim that if you do anything you’re going to kill all of us. 

Typically, antitrust plays out over the course of years or even decades. The idea is to conjoin the companies as quickly as possible, such that they can make a nationalist argument, and they’re making it now. They are arguing that the Chinese are coming for us with their AI-weaponized companies and that you need a big company (to combat them). In fact, we’re the only ones that can do a stable currency coin.

They’re going to try and make the same argument around encrypting the backbone. The fact is the FTC and the DOJ, as they’ve shown at least some stomach for, should go on background and say, “This is not going to prevent us from splitting you up, so be careful.” There has never been a greater failure in FTC or DOJ history than approving the acquisition of Instagram. I think we all probably regret that now.

Zuckerberg Connecting Whatsapp, Instagram, and Facebook. What Could Go Wrong? – Scott Galloway

The post Zuckerberg Connecting Whatsapp, Instagram, and Facebook. What Could Go Wrong? appeared first on WebProNews.


WebProNews

Posted in IM News | Comments Off

3 Proofreading Pointers, So Your Writing Isn’t Shared for the Wrong Reason

"Want to know how I find and correct errors in my own writing as well as every article we publish on Copyblogger?" – Stefanie Flaxman

Whenever someone questions the importance of proofreading, my go-to response is:

“Pubic relations is quite different from public relations.”

We all sometimes make a typo that omits or changes a letter in a word. A typo like that is difficult to spot when the mistake is still an actual word (or words). Just last week, I wrote “head lice” instead of “headline.” Again, two completely different things.

But I have an effective proofreading process that helps me find and correct errors before they are published. (Except, of course, when the error is a joke.)

Do you want to know techniques I use on my own writing as well as every article we publish on Copyblogger?

Walk the line

I’ve witnessed two different attitudes when it comes to how people feel about typos.

Some find them unacceptable and a reason to stop reading a publication. Others aren’t bothered by them at all and don’t understand why anyone would make an effort to prevent them.

I’m sure you’re not surprised that my outlook falls in the middle between those two extremes. I walk the line.

It’s a bit excessive to call a website “untrustworthy” if there is a typo in a piece of content or if an author doesn’t strictly follow grammar rules, but publishing your writing with a number of mistakes isn’t wise either. It can even lead to customer service headaches.

Established publications might be able to “get away with” occasional typos. Their audiences (for the most part) will be forgiving.

But if your website isn’t well-known and trusted yet, you want to demonstrate that you treat your content with care and aim to create the best possible experience for your readers.

Try one of the three methods below when you’re ready to polish your writing before you publish it.

1. Peek-a-boo proofreading

For this first method, you’ll need an opaque object that you don’t mind holding while you proofread.

It could be a note card, your phone, a slab of smoky quartz … whatever is handy and near your desk. Speaking of “handy,” your hand also works as this “object,” if nothing else feels right.

Start at the beginning of your text and cover the second word with the object so that you only concentrate on the first word in the document. Once you make sure it’s the correct word, surrounded by the correct punctuation if any is needed, shift your focus to the second word and cover the third word with the object.

When you’re satisfied with the second word, cover the fourth word with the object, review the third word, and repeat until you reach the end of your draft.

Blocking out the next word in your text forces you to slow down and examine your writing with a critical eye.

Names of companies, products, and people will stand out so that you can fact-check them. You’ll also be able to quickly see if you’ve accidentally left out a word, repeated a word, or chosen the wrong word.

2. Deep-tissue “word” massage

The tool I use for this method is a Rainmaker Platform pen I got at one of our company meetings. (You can buy the Platform, but I don’t think we sell the pen.)

I like proofreading with this retractable pen because when the ink cartridge is inside the external frame, a spongy material becomes the tip of the pen. The spongy part can make contact with my computer screen without scratching it.

You can use an eraser on the end of a pencil, a cotton swab, or another pointed object that is soft.

Start at the beginning of your text and physically underline each word with your soft, pointed object as you proofread. My pen actually touches my screen and presses into it as I observe each letter and word.

You don’t need to spend more than a few seconds on each word — just enough time to give it your full attention.

You’ll be able to easily spot “you’re/your/you” and “their/they’re/there” mistakes. Focusing on each letter of a word also helps you notice if you’ve accidentally made a word plural when it is supposed to be singular, or vice versa.

3. My all-time favorite proofreading technique, using one of the tips above

After I edit and proofread an article, the review process still feels a little incomplete — mistakes could be hiding in the content.

So, the technique I use as a final step before publishing is reading from the last sentence to the first sentence.
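If you review on screen, a tiny script can hand you the sentences in reverse order so you don’t have to scroll backwards yourself. This is just an illustrative sketch (the file name is a placeholder, and the naive period-based splitting won’t handle abbreviations gracefully):

```python
import re

def sentences_in_reverse(text):
    """Split text into rough sentences and return them last-to-first."""
    # Naive split on ., !, or ? followed by whitespace -- good enough for a proofing pass.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return list(reversed(sentences))

with open("draft.txt") as f:  # placeholder file name
    draft = f.read()

for sentence in sentences_in_reverse(draft):
    print(sentence)
    input("Press Enter for the previous sentence...")
```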

No matter how many times you’ve already reviewed an article, proofreading in this way helps you, at the very least, identify weaknesses you may have overlooked while editing.

During this stage, I sometimes notice a word has been overused or a lot of sentences begin with the same word. I’ll then vary the language so the text is more interesting.

You’ll also often find legitimate mistakes, such as:

  • The incorrect use of an apostrophe
  • The misinterpretation of a phrase, such as “beckon call” rather than “beck and call”
  • Subtle typos, such as “top” instead of “stop” or “in” instead of “it”

Read from the end to the beginning with either of the methods above to give every detail of your content extra special attention. Your job is to verify the accuracy of the words and phrases you present to your audience.

The luxury of digital content

When I discovered content marketing, I loved the concept but didn’t think it was something I could do.

Writing on a regular basis seemed like an impossible goal. Since I’m an editor, I thought an accidental writing mistake would tarnish my reputation. I couldn’t risk it.

Do you see what was really going on?

I was lacking confidence at the time. A confident person feels good about the work they’ve carefully produced and realizes mistakes still sometimes happen anyway.

With digital content on your own site, it’s especially easy to make corrections and move on.

So now that you’re equipped with smart ways to proofread, what are you going to publish today?

Image source: Joshua Ness via Unsplash.

The post 3 Proofreading Pointers, So Your Writing Isn’t Shared for the Wrong Reason appeared first on Copyblogger.


Copyblogger

Posted in IM News | Comments Off

Recovering Your Organic Search Traffic from a Web Migration Gone Wrong

Posted by Aleyda

[Estimated read time: 9 minutes]

I know you would never change a URL without identifying where to 301-redirect it and making sure that the links, XML sitemaps, and/or canonical tags are also updated. But if you’ve been doing SEO for a while, I bet you’ve also had a few clients — even big ones — coming to you after they’ve tried to do structural web changes or migrations of any type without taking SEO best practices into consideration.

Whenever this happens, your new client comes to you for help in an “emergency” type of situation, which has two defining characteristics when it comes to the required SEO analysis:

  1. You need to prioritize:
    Your client is likely very nervous about the situation. You don’t have a lot of time to invest at the beginning to do a full audit right away. You’ll need to focus on identifying what hasn’t been done during the migration to make sure that the fundamental causes of the traffic loss are fixed — then you can move on with the rest.
  2. You might not have all the data:
    You might have only the basics — like Google Analytics & Google Search Console — and the information that the client shares with you about the steps they took when doing the changes. There are usually no previous rankings, crawls, or access to logs. You’ll need to make the most out of these two fairly easy-to-get data sources, new crawls that you can do yourself, and third-party “historical” ranking data. In this analysis we’ll work from this existing situation as a “worst-case scenario,” so anything extra that you can get will be an added benefit.

How can you make the most out of your time and basic data access to identify what went wrong and fix it — ASAP?

Let’s go through the steps for a “minimum viable” web migration validation to identify the critical issues to fix:

1. Verify that the web migration is the cause of the traffic loss.

To start, it’s key to:

  • Obtain from the client the specific changes that were made and actions taken during the migration, so you can identify those that were likely missed and prioritize their validation when doing the analysis.
  • Check that the timing of the traffic loss coincides with that of the migration to validate that the migration was actually the cause, or whether there were other or coinciding factors at play at the same time that you can later take into consideration when doing the full audit and analysis.

Screenshot: traffic dropping shortly after a web migration.

To identify this, compare the before and after with other traffic sources, per device & the migrated areas of your site (if not all of them changed), etc.
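If you’d rather check those numbers outside of the Google Analytics interface, a quick pandas comparison of average daily organic sessions before and after the migration date does the job. This is only a sketch: the CSV name, the column names (date, channel, device, sessions), and the migration date are assumptions based on a typical GA export.

```python
import pandas as pd

# Assumed export: one row per date/channel/device with a "sessions" column.
ga = pd.read_csv("ga_sessions_export.csv", parse_dates=["date"])
organic = ga[ga["channel"] == "Organic Search"]

MIGRATION_DATE = pd.Timestamp("2016-02-15")  # hypothetical migration date
before = organic[organic["date"] < MIGRATION_DATE]
after = organic[organic["date"] >= MIGRATION_DATE]

# Average daily sessions per device, before vs. after, plus the percentage change.
summary = pd.DataFrame({
    "before_avg": before.groupby("device")["sessions"].mean(),
    "after_avg": after.groupby("device")["sessions"].mean(),
})
summary["pct_change"] = (summary["after_avg"] / summary["before_avg"] - 1) * 100
print(summary.round(1))
```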

Use the “Why My Web Traffic Dropped” checklist to quickly verify that the loss has nothing to do with, for example, incorrect Google Analytics settings after the migration or a Google update happening at the same time.

Screenshot from Google Analytics of web traffic dropping.

I’ve had situations where the organic search traffic loss coincided not only with the web migration but also with the date of a Phantom update (and the site had the type of characteristics that update targeted).

Screenshot: Traffic loss after web migration and Google algo update.

If this is the case, you can’t expect to regain all the traffic after fixing the migration-related issues. There will be further analysis and implementations needed to fix the other causes of traffic loss.

2. Identify the pages that dropped the most in traffic, conversions, & rankings.

Once you verify that the traffic loss is due (completely or partially) to the web migration, then the next step is to focus your efforts on analyzing and identifying the issues in those areas that were hit the most from a traffic, conversions, & rankings perspective. You can do this by comparing organic search traffic per page before and after the migration in Google Analytics:

Screenshot: comparing organic search traffic per page before and after the migration in Google Analytics.

Select those that previously had the highest levels of traffic & conversions and that lost the highest percentages of traffic.

You can also do something similar with those pages with the highest impressions, clicks, & positions that have also had the greatest negative changes from the Google Search Console “Search Analytics” report:

Screenshot: the Google Search Console "Search Analytics" report.

After gathering this data, consolidate all of these pages (and related metrics) in an Excel spreadsheet. Here you’ll have the most critical pages that have lost the most from the migration.
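If the list of pages is long, the consolidation itself can be scripted. A minimal sketch, assuming you’ve exported per-page organic sessions for the two periods as before.csv and after.csv, each with page and sessions columns:

```python
import pandas as pd

before = pd.read_csv("before.csv")  # columns: page, sessions (pre-migration period)
after = pd.read_csv("after.csv")    # columns: page, sessions (post-migration period)

merged = before.merge(after, on="page", how="left", suffixes=("_before", "_after"))
merged["sessions_after"] = merged["sessions_after"].fillna(0)
merged["pct_change"] = (merged["sessions_after"] / merged["sessions_before"] - 1) * 100

# The critical pages: meaningful traffic before the migration, a big percentage loss after it.
critical = merged[(merged["sessions_before"] >= 100) & (merged["pct_change"] <= -50)]
critical.sort_values("pct_change").to_csv("pages_to_investigate.csv", index=False)
```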

Pages and related metrics consolidated in an Excel sheet

3. Identify the keywords these pages were ranking for and start monitoring them.

In most cases the issues will be technical (though sometimes they may be due to structural content issues). However, it’s important to identify the keywords these pages had been ranking for that lost visibility post-migration, start tracking them, and verify their improvement after the issues are fixed.

Screenshot: identifying which keywords the page was ranking for.

This can be done by gathering data from tools with historical keyword ranking features — like SEMrush, Sistrix, or SearchMetrics — that also show you which pages have lost rankings during a specific period of time.

This can be a bit time-consuming, so you can also use URLProfiler to discover those keywords that were ranking in the past. It easily connects with your Google Search Console “Search Analytics” data via API to obtain their queries from the last 3 months.
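If you’d rather pull the queries straight from the source, the Search Console API exposes the same “Search Analytics” data. The sketch below is only a starting point: the service account file, site URL, and dates are placeholders, and it assumes you’ve already granted that account access to the verified property.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder credentials file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2016-01-01",
        "endDate": "2016-03-31",
        "dimensions": ["page", "query"],
        "rowLimit": 5000,
    },
).execute()

for row in response.get("rows", []):
    page, query = row["keys"]
    print(page, query, row["clicks"], row["impressions"], round(row["position"], 1))
```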

Connecting URL Profiler to Google Search Console

As a result, you’ll have your keyword data and selected critical pages to assess in one spreadsheet:

Keyword data and selected critical pages to assess in one spreadsheet.

Now you can start tracking these keywords with your favorite keyword monitoring tool. You can even track the entire SERPs for your keywords with a tool like SERPwoo.

4. Crawl both the list of pages with traffic drops & the full website to identify issues and gaps.

Now you can crawl the list of pages you’ve identified using the “list mode” of an SEO crawler like Screaming Frog, then crawl your site with the “crawler mode,” comparing the issues in the pages that lost traffic versus the new, currently linked ones.

Uploading a list into Screaming Frog

You can also integrate your site crawl with Google Analytics to identify gaps (ScreamingFrog and Deepcrawl have this feature) and verify crawling, indexation, and even structural content-related issues that might have been caused by the migration. The following are some of the fundamentals that I recommend you take a look at, answering these questions:

Verifying against various issues your site may have.

A.) Which pages aren’t in the web crawl (because they’re not linked anymore) but were receiving organic search traffic?

Do these coincide with the pages that have lost traffic, rankings, & conversions? Have these pages been replaced? If so, why haven’t they been 301-redirected to their new versions? Do it.
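A quick way to surface these pages is to diff the crawler’s URL export against the pages that were earning organic traffic before the migration. A rough sketch, assuming two one-column exports (crawled_urls.csv from the crawler and organic_landing_pages.csv from Google Analytics, both with a url column):

```python
import pandas as pd

crawled = set(pd.read_csv("crawled_urls.csv")["url"].str.strip())
landing = set(pd.read_csv("organic_landing_pages.csv")["url"].str.strip())

# Pages that earned organic traffic but are no longer reachable through the site's links.
orphaned = sorted(landing - crawled)
for url in orphaned:
    print(url)
print(f"{len(orphaned)} pages to redirect or re-link")
```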

B.) Is the protocol inconsistent in the crawls?

Especially if the migration was from one version to the other (like HTTP to HTTPS), verify whether there are pages still being crawled with their HTTP version because links or XML sitemaps were not updated… then make sure to fix it.

C.) Are canonicalized pages pointing towards non-relevant URLs?

Check whether the canonical tags of the migrated pages are still pointing to the old URLs, or if the canonical tags were changed and are suddenly pointing to non-relevant URLs (such as the home page, as in the example below). Make sure to update them to point to their relevant, original URL if this is the case.

A page's source code with canonicalization errors.

D.) Are the pages with traffic loss now blocked via robots.txt, or are they non-indexable?

If so, why? Unblock all pages that should be crawled, indexed, and ranking as well as they were before.

E.) Verify whether the redirect logic is correct.

Just because the pages were redirected doesn’t mean that those redirects were correct. Identify these types of issues by asking the following questions (a quick redirect-chain check is sketched after this list):

  • Are the redirects going to relevant new page-versions of the old ones?
    Verify if the redirects are going to the correct page destination that features similar content and has the same purpose as the one redirected. If they’re not, make sure to update the redirects.
  • Are there any 302 redirects that should become 301s (because the moves are permanent, not temporary)?
    Update them.
  • Are there any redirect loops that might be interfering with search crawlers reaching the final page destination?
    Update those as well.

    Especially if you have an independent mobile site version (under an “m” subdomain, for example), you’ll want to verify its redirect logic specifically versus the desktop one.

Checking redirect logic.

    • Are there redirects going towards non-indexable, canonicalized, redirected or error pages?
      Prioritize their fixing.

      To facilitate this analysis, you can use OnPage.org’s “Redirects by Status Code” report.

OnPage.org's Redirects by Status Code report

    • Why are these redirected pages still being crawled?

      Update the links and XML sitemaps still pointing to the pages that are now redirecting to others, so that they go directly to the final page to crawl, index, and rank.

  • Are there duplicate content issues among the lost traffic pages?
    The configuration of redirects, canonicals, noindexation, or pagination might have changed and therefore these pages might now be featuring content that’s identified as duplicated and should be fixed.

Duplicate content issues shown on OnPage.org.
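To answer several of the redirect questions above in one pass, you can walk the redirect chain of every old URL and note where it ends up. A minimal sketch using the requests library (the input file of old URLs is an assumption):

```python
import requests

with open("old_urls.txt") as f:  # one old URL per line
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    try:
        # allow_redirects follows the whole chain; resp.history holds every hop.
        resp = requests.get(url, allow_redirects=True, timeout=10)
        hops = [(r.status_code, r.url) for r in resp.history]
        flags = []
        if any(code == 302 for code, _ in hops):
            flags.append("302 in chain")
        if len(hops) > 1:
            flags.append(f"{len(hops)} hops")
        if resp.status_code != 200:
            flags.append(f"final status {resp.status_code}")
        print(url, "->", resp.url, "|", ", ".join(flags) or "OK")
    except requests.RequestException as exc:
        print(url, "-> request failed:", exc)
```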

5. It’s time to implement fixes for the identified issues.

Once you ask these questions and update the configuration of your lost traffic pages as mentioned above, it’s important to:

  1. Update all of your internal links to go to the final URL destinations directly.
  2. Update all of your XML sitemaps to eliminate the old URLs, leaving only the new ones, and resubmit them to Google Search Console (a quick sitemap check is sketched after this list).
  3. Verify whether there are any external links still going to non-existent pages that should now redirect. This way, in the future and with more time, you can perform outreach to the most authoritative sites linking to them so they can be updated.
  4. Submit your lost traffic pages to be recrawled with the Google Search Console “Fetch as Google” section.
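For point 2, a quick sanity check before resubmitting is to fetch the new sitemap and confirm that every URL in it answers with a 200 directly, with no redirects or errors left inside. A rough sketch (the sitemap URL is a placeholder):

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code != 200:
        # Anything redirected or broken should not be in the sitemap you resubmit.
        print(url, resp.status_code, resp.headers.get("Location", ""))
```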

After resubmitting, start monitoring the search crawlers’ behavior through your web logs (you can use the Screaming Frog Log Analyzer), as well as your pages’ indexation, rankings, & traffic trends. You should start seeing a positive move after a few days:

Regaining numbers after implementing the fixes.

Remember that if the migration required drastic changes (if you’ve migrated to another domain, for example), it’s natural to see a short-term rankings and traffic loss. This can be true even if the migration is now correctly implemented and the new domain has higher authority. You should take this into consideration; however, if the change has improved the former optimization status, the mid- to long-term results should be positive.

In the short term results dip, but as time goes on they rise again.

As you can see above, you can recover from this type of situation if you make sure to prioritize and fix the issues with negative effects before moving on to change anything else that’s not directly related. Once you’ve done this and see a positive trend, you can then begin a full SEO audit and process to improve what you’ve migrated, maximizing the optimization and results of the new web structure.

I hope this helps you have a quicker, easier web migration disaster recovery!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Google “Lithuanian Flag” & Get The Wrong Flag

If you go to Google and ask it to show you the flag of Lithuania by searching for [Lithuanian Flag], Google will show you the wrong flag.

What Google shows you is three horizontal lines of green, followed by white…


Search Engine Roundtable

Posted in IM News | Comments Off

Are Your Solar Powered Lights Not Working – What Could Be Wrong?




style="display:inline-block;width:250px;height:250px"
data-ad-client="ca-pub-7815236958543991"
data-ad-slot="8717335615">

Solar powered lights are increasing in popularity throughout the world, providing a cost-effective and environmentally friendly lighting solution that you can rely on and trust. There are times, though, when your solar powered lights don’t perform as expected. This may be anything from them not giving off any light to them giving off too little light. Knowing the potential problems and what you need to do to rectify the situation can help you enjoy these energy-saving lighting solutions now and in the future.

Before you start trying to identify why your solar powered lights are not working properly, you need to understand what they are made of and how they work. These lights rely on four important components: the battery, the sun, a controller, and an LED light. The sun shines onto a highly absorbent solar panel, which soaks up the natural light and converts it into electricity; the controller manages that charge, which is stored in the battery. In turn, when you turn on the lights, the battery powers them. This enables you to have light throughout the day and night, as and when needed.

When you have been using your lights for some time, you may notice that they start to dim, or that they only work for a short period of time. This can be a result of the battery no longer holding a charge. It is common for these batteries to last between one and two years; after this time you need to change them to continue enjoying your solar powered lights. Think of them like your phone battery: they need to charge and then discharge, and a battery can only manage so much of this demanding cycle before it starts slowing down.

This can also be a result of the battery not charging properly. It is advisable to speak to the company where you purchased your solar powered lights to see if they have any ideas on what the issue could be. It may be something as simple as changing the battery or the controller without having to replace the entire unit.

It is also important to remember that the earth is constantly moving, which means the sun doesn’t stay in the same position in the sky at all times. You need to carefully place the panel to ensure it gets maximum sun exposure on a daily basis, throughout summer and winter. As you know from getting sunburnt, the sun’s rays can still get through clouds, so even on a cloudy day your batteries will charge and your solar powered lights will work.

Another important thing to remember when you are storing your solar powered lights is to remove the batteries. This will reduce the strain on the battery and help the item last that bit longer. Think of it like your car: when going away for long periods, you disconnect the car battery to ensure it doesn’t suffer unnecessary drain, reducing the risk of the battery not working on your return.

Always ensure that when buying solar powered lights you only buy from a reputable and reliable company with years of knowledge and experience in the industry. The quality of the solar powered lights determines how long they last and how well they work.

When you buy lower quality while trying to get the most for your budget, you may find that the batteries don’t hold charge and that you are constantly running out of power when you need it most.

Latest solar news

Posted in IM News | Comments Off

Have We Been Wrong About Panda All Along?

Posted by MarieHaynes

Thin content! Duplicate content! Everyone knows that these are huge Panda factors. But are they really? In this article, I will explore the possibility that Panda is about so much more than thin and duplicate content. I don’t have a list of ten steps to follow to cure your Panda problems. But, I do hope that this article provokes some good discussion on how to improve our websites in the eyes of Google’s Panda algorithm.

The duplicate content monster

Recently, Google employee John Mueller ran a webmaster help hangout that focused on duplicate content issues. It was one of the best hangouts I have seen in a while—full of excellent information. John commented that almost every website has some sort of duplicate content. Some duplicate content could be there because of a CMS that sets up multiple tag pages. Another example would be an eCommerce store that carries several sizes of a product and has a unique URL for each size.

He also said that when Google detects duplicate content, it generally does not do much harm, but rather, Google determines which page they think is the best and they display that page.

But wait! Isn’t duplicate content a Panda issue? This is well believed in the SEO world. In fact, the Moz Q&A has almost 1800 pages indexed that ask about duplicate content and Panda!

I asked John Mueller whether duplicate content issues could be Panda issues. I wondered if perhaps duplicate content reduced crawl efficiency and this, in turn, would be a signal of low quality in the eyes of the Panda algorithm. He responded saying that these were not related, but were in fact two separate issues:

The purpose of this post is not to instruct you on how to deal with duplicate content. Google has some good guidelines here. Cleaning up your duplicate content can, in many cases, improve your crawl efficiency—which in some cases can result in an improvement in rankings. But I think that, contrary to what many of us have believed, duplicate content is NOT a huge component to the Panda algorithm.

Where duplicate content can get you in trouble is if you are purposely duplicating content in a spammy way in order to manipulate Google. For example, if a huge portion of your site consisted of articles duplicated from other sources, or if you are purposely trying to duplicate content with the intent of manipulating Google, then this can get you a manual penalty and can cause your site to be removed from the Google index:

These cases are not common, though. Google isn’t talking about penalizing sites that have duplicate product pages or a boatload of WordPress tag pages. While it’s always good to have as clean a site as possible, I’m going to make a bold statement here and say that this type of issue likely is not important when it comes to Panda.

What about thin content?

This is where things can get a little bit tricky. Recently, Google employee Gary Illyes caused a stir when he stated that Google doesn’t recommend removing thin content but rather, beefing up your site to make it “thick” and full of value.

Jen Slegg from The SEM Post had a great writeup covering this discussion. If you’re interested in reading more, I wrote a long post discussing why I believe that we should indeed remove thin content when trying to recover from a Panda hit, along with a case study showing a site that made a nice Panda recovery after removing thin content.

The current general consensus amongst SEOs who work with Panda-hit sites is that thin content should be improved upon wherever possible. But, if a site has a good deal of thin, unhelpful pages, it does make sense to remove those pages from Google’s index.

The reason for this is that Panda is all about quality. In the example which I wrote about where a site recovered from Panda after removing thin content, the site had hosted thousands of forum posts that contained unanswered questions. A user landing on one of these questions would not have found the page helpful and would likely have found another site to read in order to answer their query.

I believe that thin content can indeed be a Panda factor if that content consistently disappoints searchers who land on that page. If you have enough pages like this on your site, then yes, by all means, clean it up.

Panda is about so much MORE than duplicate and thin content

While some sites can recover from Panda after clearing out pages and pages of thin content, for most Panda-hit sites, the issues are much deeper and more complex. If you have a mediocre site that contains thousands of thin pages, removing those thin pages will not make the site excellent.

I believe Panda is entirely about excellence.

At Pubcon in Vegas, Rand Fishkin gave an excellent keynote speech in which he talked about living in a two-algo world. Rand spoke about the “regular algorithm,” which, in years past, we’ve worked hard to figure out and conquer by optimizing our title tags, improving our page speed, and gaining good links. But then he also spoke of a machine learning algorithm.

When Rand said “We’re talking about algorithms that build algorithms,” something clicked in my head and I realized that this very well could be what’s happening with Panda. Google has consistently said that Panda is about showing users the highest-quality sites. Rand suggested that machine learning algos may classify a site as a high quality one if they’re able to do some of the following things:

  • Consistently garner a higher click-through rate than their competitors.
  • Get users to engage more with your site than others in your space.
  • Answer more questions than other sites.
  • Earn more shares and clicks that result in loyal users.
  • Be the site that ultimately fulfills the searcher’s task.

There are no quick ways to fulfill these criteria. Your site ultimately has to be the best in order for Google to consider it the best.

I believe that Google is getting better and better at determining which sites are the most helpful ones to show users. If your site has been negatively affected by Panda, it may not be because you have technical on-site issues, but because your competitors’ sites are of higher overall quality than yours.

Is this why we’re not seeing many Panda recoveries?

In mid- to late 2014, Google was still refreshing Panda monthly. Then, after October of 2014, we had nine months of Panda silence. We all rejoiced when we heard that Google was refreshing Panda again in July of 2015. Google told us it would take a while for this algo to roll out. At the time of writing this, Panda has been supposedly rolling out for three months. I’ve seen some sporadic reports of mild recoveries, but I would say that probably 98% of the sites that have made on-site quality changes in hopes of a Panda recovery have seen no movement at all.

While it’s possible that the slow rollout still hasn’t affected the majority of sites, I think that there’s another frightening possibility.

It’s possible that sites that saw a Panda-related ranking demotion will only be able to recover if they can drastically improve the site to the point where users GREATLY prefer this site over their competitors’ sites.

It is always good to do an on-site quality audit. I still recommend a thorough site audit for any website that has suffered a loss in traffic that coincides with a Panda rerun date. In many cases, fixing quality issues—such as page speed problems, canonical issues, and confusing URL structures—can result in ranking improvement. But I think that we also need to put a HUGE emphasis on making your site the best of its kind.

And that’s not easy.

I’ve reviewed a lot of eCommerce sites that have been hit by Panda over the years. I have seen few of these recover. Many of them have had site audits done by several of the industry’s recognized experts. In some cases, the sites haven’t recovered because they have not implemented the recommended changes. However, there are quite a few sites that have made significant changes, yet still seem to be stuck under some type of ranking demotion.

In many cases like this, I’ve spent some time reviewing competitors’ sites that are currently ranking well. What I’ll do is try to complete a task, such as searching for and reaching the point of purchase of a particular product, on both the Panda-hit site and the competitors’ sites. In most cases, I’ll find that the competitors offer a vastly better search experience. They may have a number of things that the Panda-hit site doesn’t, such as the following:

  • A better search interface.
  • Better browsing options (i.e. search by color, size, etc.)
  • Pictures that are much better and more descriptive than the standard stock product photos.
  • Great, helpful reviews.
  • Buying guides that help the searcher determine which product is best to buy.
  • Video tutorials on using their products.
  • More competitive pricing.
  • A shopping cart that’s easier to use.

The question that I ask myself is, “If I were buying this product, would I want to search for it and buy it on my clients’ site, or on one of these competitors’ sites?” The answer is almost always the latter.

And this is why Panda recovery is difficult. It’s not easy for a site to simply improve their search interface, add legitimate reviews that are not just scraped from another source, or create guides and video tutorials for many of their products. Even if the site did add these features, this is only going to bring them to the level where they are perhaps just as good as their competitors. I believe that in order to recover from Panda, you need to show Google that by far, users prefer your website over any other one.

This doesn’t just apply to eCommerce sites. I have reviewed a number of informational sites that have been hit by Panda. In some cases, clearing up thin content can result in Panda recoveries. But often, when an informational site is hit by Panda, it’s because the overall quality of the content is sub-par.

If you run a news site and you’re pushing out fifty stories a day that contain the same information as everyone else in your space, it’s going to be hard to convince Google’s algorithms that they should be showing your site’s pages first. You’ve got to find a way to make your site the one that everyone wants to visit. You want to be the site that when people see you in the SERPS, even if you’re not sitting at position #1, they say, “Oh…I want to get my news from THAT site…I know them and I trust them…and they always provide good information.”

In the past, a mediocre site could be propelled to the top of the SERPS by tweaking things like keywords in title tags, improving internal linking, and building some links. But, as Google’s algorithms get better and better at determining quality, the only sites that are going to rank well are the ones that are really good at providing value. Sure, they’re not quite there yet, but they keep improving.

So should I just give up?

No! I still believe that Panda recovery is possible. In fact, I would say that we’re in an age of the Internet where we have much potential for improvement. If you’ve been hit by Panda, then this is your opportunity to dig in deep, work hard, and make your site an incredible site that Google would be proud to recommend.

The following posts are good ones to read for people who are trying to improve their sites in the eyes of Panda:

How the Panda Algorithm Might Evaluate Your Site – A thorough post by Michael Martinez that looks at each of Amit Singhal’s 23 Questions for Panda-hit sites in great detail.

Leveraging Panda To Get Out Of Product Feed Jail – An excellent post on the Moz blog in which Michael Cottam gives some tips to help make your product pages stand out and be much more valuable than your competitors’ pages.

Google’s Advice on Making a High-Quality Site – This is short, but contains many nuggets.

Case Study – One Site’s Recovery from an Ugly SEO Mess – Alan Bleiweiss gives thorough detail on how implementing advice from a strong technical audit resulted in a huge Panda recovery.

Glenn Gabe’s Panda 4.0 Analysis – This post contains a fantastic list of things to clean up and improve upon for Panda-hit sites.

If you have been hit by Panda, you absolutely must do the following:

  • Start with a thorough on-site quality audit.
  • Find and remove any large chunks of thin content.
  • Deal with anything that annoys users, such as huge popups or navigation that doesn’t work.

But then we have to do more. In the first few years of Panda’s existence, making significant changes in on-site quality could result in beautiful Panda recoveries. I am speculating though that now, as Google gets better at determining which sites provide the most value, this may not be enough for many sites.

If you have been hit by Panda, it is unlikely that there is a quick fix. It is unlikely that you can tweak a few things or remove a chunk of content and see a dramatic recovery. Most likely, you will need to DRAMATICALLY improve the overall usefulness of the site to the point where it’s obvious to everyone that your pages are the best choices for Google to present to searchers.

What do you think?

I am seriously hoping that I’m wrong in predicting that the only sites we’ll see make significant Panda recoveries are ones that have dramatically overhauled all of their content. Who knows…perhaps one day soon we’ll start seeing awesome recoveries as this agonizingly slow iteration of Panda rolls out. But if we don’t, then we all need to get working on making our sites far better than anyone else’s site!

Do you think that technical changes alone can result in Panda recoveries? Or is vastly improving upon all of your content necessary as well?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Google Apologizes For World Cup Logo With Wrong Flag, Not “Ghana” Happen Again

Google scored a goal against itself by mixing up the flags of Cameroon and Ghana in a special logo highlighting the Mexico versus Cameroon match happening today in the World Cup. Well, it is Friday the 13th. Google’s already fixed the logo and tweeted an apology with a pun: p.s. thanks to…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

What’s Wrong With A/B Testing

A/B testing is an internet marketing standard. In order to optimize response rates, you compare one page against another. You run with the page that gives you the best response rates.

But anyone who has tried A/B testing will know that whilst it sounds simple in concept, it can be problematic in execution. For example, it can be difficult to determine if what you’re seeing is a tangible difference in customer behaviour or simply a result of chance. Is A/B testing an appropriate choice in all cases? Or is it best suited to specific applications? Does A/B testing obscure what customers really want?

In this article, we’ll look at some of the gotchas for those new to A/B testing.

1. Insufficient Sample Size

You set up a test. You’ve got one page featuring call-to-action A and one page featuring call-to-action B. You enable your PPC campaign and leave it running for a day.

When you stop the test, you find that call-to-action A converted at twice the rate of call-to-action B. So call-to-action A is the winner; we should run with it and eliminate option B.

But this would be a mistake.

The sample size may be insufficient. If we only tested one hundred clicks, we might get a significant difference in results between two pages, but that change doesn’t show up when we get to 1,000 clicks. In fact, the result may even be reversed!

So, how do we determine a sample size that is statistically significant? This excellent article explains the maths. However, there are various online sample size calculators that will do the calculations for you, including Evan’s. Most A/B tracking tools will include sample size calculators, but it’s a good idea to understand what they’re calculating, and how, to ensure the accuracy of your tests.

In short, make sure you’ve tested enough of the audience to determine a trend.
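If you want to sanity-check what those calculators are doing, the standard two-proportion formula is easy to reproduce. A sketch, assuming (purely for illustration) a 5% baseline conversion rate and that you want to detect a lift to 6% at the usual 95% confidence and 80% power:

```python
from math import ceil, sqrt
from scipy.stats import norm

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed in each variant to detect a change from rate p1 to rate p2."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance threshold
    z_beta = norm.ppf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

print(sample_size_per_variant(0.05, 0.06))  # roughly 8,000+ visitors per variant
```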

2. Collateral Damage

We might want to test a call-to-action metric: the number of people who click on the “find out more” link on a landing page. We find that a lot more people click on this link when we use the term “find out more” than when we use the term “buy now.”

Great, right?

But what if the conversion rate for those who actually make a purchase falls as a result? We achieved higher click-thrus on one landing page at the expense of actual sales.

This is why it’s important to be clear about the end goal when designing and executing tests. Also, ensure we look at the process as a whole, especially when we’re chopping the process up into bits for testing purposes. Does a change in one place affect something else further down the line?

In this example, you might A/B test the landing page whilst keeping an eye on your total customer numbers, deeming the change effective only if customer numbers also rise. If your aim was only to increase click-thrus, say to boost quality scores, then the change was effective.

3. What, Not Why

In the example above, we know the “what”: we changed the wording of a call-to-action link and achieved higher click-thrus, although we’re still in the dark as to why. We’re also in the dark as to why the change of wording resulted in fewer sales.

Was it because we attracted more people who were information seekers? Were buyers confused about the nature of the site? Did visitors think they couldn’t buy from us? Were they price shoppers who wanted to compare price information up front?

We don’t really know.

But that’s good, so long as we keep asking questions. These types of questions lead to more ideas for A/B tests. By turning testing into an ongoing process, supported by asking more and hopefully better questions, we’re more likely to discover a whole range of “why’s”.

4. Small Might Be A Problem

If you’re a small company competing directly with big companies, you may already be on the back foot when it comes to A/B testing.

It’s clear that A/B testing’s very modularity can cause problems. But what about cases where the number of tests that can be run at once is low? While A/B testing makes sense on big websites where you can run hundreds of tests per day and have hundreds of thousands of hits, only a few offers can be tested at one time in cases like direct mail. The variance that these tests reveal is often so low that any meaningful statistical analysis is impossible.

Put simply, you might not have the traffic to generate statistically significant results. There’s no easy way around this problem, but the answer may lie in getting tricky with the maths.

Experimental design massively and deliberately increases the amount of variance in direct marketing campaigns. It lets marketers project the impact of many variables by testing just a few of them. Mathematical formulas use a subset of combinations of variables to represent the complexity of all the original variables. That allows the marketing organization to more quickly adjust messages and offers and, based on the responses, to improve marketing effectiveness and the company’s overall economics.

Another thing to consider: if you’re certain the bigger company is running A/B tests and achieving good results, then “steal” their landing page.* Take their ideas for landing pages and use them as a test against your existing pages. (*Of course, you can’t really steal their landing page, but you can be “influenced by” their approach.)

What your competitors do is often a good starting point for your own tests. Try taking their approach and refine it.

5. Might There Be A Better Way?

Are there alternatives to A/B testing?

Some swear by the Multi Armed Bandit methodology:

The multi-armed bandit problem takes its terminology from a casino. You are faced with a wall of slot machines, each with its own lever. You suspect that some slot machines pay out more frequently than others. How can you learn which machine is the best, and get the most coins in the fewest trials?
Like many techniques in machine learning, the simplest strategy is hard to beat. More complicated techniques are worth considering, but they may eke out only a few hundredths of a percentage point of performance.

Then again…

What the multi-armed bandit algorithm does is aggressively (and greedily) optimize for the currently best-performing variation, so the worse-performing versions end up receiving very little traffic (mostly in the explorative 10% phase). This little traffic means that when you try to calculate statistical significance, there’s still a lot of uncertainty about whether the variation is “really” performing worse or whether the current worse performance is due to random chance. So, with a multi-armed bandit algorithm, it takes a lot more traffic to declare statistical significance compared to the simple randomization of A/B testing. (But, of course, in a multi-armed bandit campaign, the average conversion rate is higher.)
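Whichever side you take, it helps to see what a bandit actually does. The “simplest strategy” referred to above is usually epsilon-greedy: send most traffic to the variation that currently looks best and keep a small exploration slice. A toy simulation with made-up conversion rates, just to show the mechanics (including the explorative 10% phase mentioned in the quote above):

```python
import random

TRUE_RATES = [0.040, 0.050, 0.046]  # hypothetical conversion rates per variation
EPSILON = 0.10                      # explore 10% of the time

trials = [0] * len(TRUE_RATES)
successes = [0] * len(TRUE_RATES)

for _ in range(100_000):
    if random.random() < EPSILON or 0 in trials:
        arm = random.randrange(len(TRUE_RATES))  # explore a random variation
    else:
        arm = max(range(len(TRUE_RATES)),
                  key=lambda i: successes[i] / trials[i])  # exploit the current best
    trials[arm] += 1
    successes[arm] += random.random() < TRUE_RATES[arm]

for i, (t, s) in enumerate(zip(trials, successes)):
    print(f"variation {i}: {t} visitors, observed rate {s / t:.3%}")
```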

Multivariate testing may be suitable if you’re testing a combination of variables, as opposed to just one, e.g.:

  • Product Image: Big vs. Medium vs. Small
  • Price Text Style: Bold vs. Normal
  • Price Text Color: Blue vs. Black vs. Red

There would be 3 × 2 × 3 = 18 different versions to test.
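Enumerating the combinations makes the growth concrete. A quick sketch:

```python
from itertools import product

images = ["big", "medium", "small"]
price_styles = ["bold", "normal"]
price_colors = ["blue", "black", "red"]

variants = list(product(images, price_styles, price_colors))
for i, variant in enumerate(variants, start=1):
    print(i, variant)
print(f"{len(variants)} versions to test")  # 3 x 2 x 3 = 18
```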

The problem with multivariate tests is they can get complicated pretty quickly and require a lot of traffic to produce statistically significant results. One advantage of multivariate testing over A/B testing is that it can tell you which part of the page is most influential. Was it a graphic? A headline? A video? If you’re testing a page using an A/B test, you won’t know. Multivariate testing will tell you which page sections influence the conversion rate and which don’t.

6. Methodology Is Only One Part Of The Puzzle

So is A/B testing worthwhile? Are the alternatives better?

The methodology we choose will only be as good as the test design. If tests are poorly designed, then the maths, the tests, the data and the software tools won’t be much use.

To construct good tests, you should first take a high level view:

Start the test by first asking yourself a question, something along the lines of, “Why is the engagement rate of my site lower than that of my competitors?” … Collect information about your product from customers before setting up any big test. If you plan to test your tagline, run a quick survey among your customers asking how they would define your product.

Secondly, consider the limits of testing. Testing can be a bit of a heartless exercise. It’s cold. We can’t really test how memorable and how liked one design is over the other, and typically have to go by instinct on some questions. Sometimes, certain designs just work for our audience, and other designs don’t. How do we test if we’re winning not just business, but also hearts and minds?

Does it mean we really understand our customers if they click this version over that one? We might see how they react to an offer, but that doesn’t mean we understand their desires and needs. If we’re getting click-backs most of the time, then it’s pretty clear we don’t understand the visitors. Changing a graphic here, and wording there, isn’t going to help if the underlying offer is not what potential customers want. No amount of testing ad copy will sell a pink train.

The understanding of customers is gained in part by tests, and in part by direct experience with customers and the market we’re in. Understanding comes from empathy. From asking questions. From listening to, and understanding, the answers. From knowing what’s good, and bad, about your competitors. From providing options. From open communication channels. From reassuring people. You’re probably armed with this information already, and that information is highly useful when it comes to constructing effective tests.

Do you really need A/B testing? Used well, it can markedly improve and hone offers. It isn’t a magic bullet. Understanding your audience is the most important thing. Google, a company that uses testing extensively, seems to be most vulnerable in areas that require a more intuitive understanding of people. Google Glass is a prime example of failing to understand social context. Apple, on the other hand, was driven more by an intuitive approach. Jobs: “We built [the Mac] for ourselves. We were the group of people who were going to judge whether it was great or not. We weren’t going to go out and do market research.”

A/B testing can work wonders, just so long as it isn’t used as a substitute for understanding people.


SEO Book

Posted in IM News | Comments Off

Google Is Wrong! The Link Is Natural!

There are two common sayings you’ve all heard in this field. The first is, “no one is perfect” and the second is, “never tell Google they are wrong.”

As you can imagine, they don’t go well together. No one, including no algorithm or person…


Search Engine Roundtable

Posted in IM News | Comments Off

Chitika: We Got Google’s Local Search Number Wrong

Last week I wrote an article with the headline: Study: 43 Percent Of Total Google Search Queries Are Local. This was based on Chitika network data. The article was widely cited and linked to. Unfortunately it was wrong. Earlier today Chitika contacted me with the correction. The original…



Please visit Search Engine Land for the full article.




Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off
