Tag Archive | "Them"

Don’t Let Go of Expectations … Put Them to Work in Your Marketing

We all know the potential trouble associated with expectations: If you expect something to turn out a certain way —…

The post Don’t Let Go of Expectations … Put Them to Work in Your Marketing appeared first on Copyblogger.


Copyblogger

Posted in IM News | Comments Off

The silent killers of loading time and how to fix them

Imagine visiting a website that takes more than ten, no, two seconds to load. We know the mouse is going to head for the top right corner, because honestly, no one has the time to wait nowadays.

A Forbes article mentioned that a mere one-second delay in page load time means 11% fewer page views, a 16% decrease in customer satisfaction, and a 7% loss in conversions.

Your website may be a work of art with awesome features. It can have lightning-fast chat responses, but with a slow loading time, none of that matters.

Attention spans are growing shorter and patience is thinner than ever. On top of that, slow-loading sites hurt your SEO, because speed is a ranking factor that shapes how Google evaluates your page. Sure, content may be king, but speed can change how your content performs in search.

We’ll dig deep and find the silent killers of loading time – both common and uncommon causes.

1. Uncompressed images and bizarre image dimensions

The quality and size of an image affect its loading time. Putting a high-resolution image on every page means your site will load slower.

How you can fix this

A couple of fixes we found involve installing plugins. The first is a jQuery Lazy Load plugin, which defers images so that only the ones “above the fold”, in the part of the page a visitor is currently viewing, actually load.

The second option is to use an image optimizer such as Yahoo!’s Smush.it, or the WP Smush.it plugin, which compacts images without visibly altering their quality. With the WP plugin, this happens automatically whenever you add graphics to your site.
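
If you’d rather shrink images before they ever reach your site, you can do the same job in a build step. Here’s a minimal sketch, assuming Node.js with the third-party “sharp” image library installed (npm install sharp); the file names are placeholders:

```typescript
// compress-images.ts: resize and recompress an image before it is uploaded.
import sharp from "sharp";

async function compress(input: string, output: string): Promise<void> {
  await sharp(input)
    .resize({ width: 1200, withoutEnlargement: true }) // cap the width; never upscale
    .jpeg({ quality: 75, mozjpeg: true })              // recompress at a web-friendly quality
    .toFile(output);
}

// Placeholder file names for illustration.
compress("hero-original.jpg", "hero-web.jpg").catch(console.error);
```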

2. Unnecessary plugins

If you have a WordPress site, you’ll know there are tons of plugins floating around, and sometimes you might feel the need to install every one of them because they all seem “helpful” to your site.

Before you know it, you’ll have plugins running your site and you might even have a plugin for your plugin.

Plugin overload can be a problem because the more plugins your site has, the more work it has to do when it loads. Also, not all plugins are as awesome as they claim to be. Beware of outdated plugins that can slow down your site instead of improving its performance.

What you can do to solve this problem is evaluate your current plugins to figure out which ones you actually need. You might have multiple plugins that share the same function, or some that you’re no longer using.

When you’re deciding which plugins to delete, check:

– Whether the plugin is relevant and kept up to date

– Whether another plugin already provides the same functions

– Whether you’re still actually using it

You can also check the performance of your plugins using P3 (Plugin Performance Profiler), which shows you the impact each plugin has on your WordPress site’s load time.

3. An excessive homepage

Your homepage is the face of your brand, so we get it if you want it to look its best. However, when you try to impress new visitors with a pile of widgets, content, and state-of-the-art imagery, it’s going to compromise your loading time.

When you want to make an impressive site, keep in mind that a clean design can do wonders. We’re not telling you to ban widgets completely (save them for the end of your blog posts or site pages) but we’re just telling you to keep it simple.

Another thing you can do to speed up load times is alter your WordPress options to show excerpts instead of full posts, and to limit each page to five to seven posts.

4. Free third-party WordPress themes

Free WordPress themes may sound like the best thing since sliced bread but free things come with a price tag. When you’re looking for a theme on WordPress, you’re likely to click on those free ones made by a third-party. They’re free anyway, so what can go wrong? Right?

Apparently, a lot of things. Like how free music and movies can come with spyware or malware, free third-party WordPress themes may be one of the causes for your slow website.

How you can fix this

One of the best ways is to only use themes from the official WordPress theme repository. If you want something more personalized, consider spending less than $100 on a premium theme you can customize to your heart’s desire.

5. Unreliable web hosting

Having a web hosting server that’s not properly configured can harm your loading times. When picking a web host, more often than not, we’ll choose the most budget-friendly option. That may be fine when you’re just starting out.

However, once the amount of traffic you’re receiving suddenly spikes, a bargain host and server won’t be able to handle a huge number of users at once. Sudden spikes are especially likely when you launch a new online marketing campaign or a new product.

Instead of looking for a free or cheap web hosting solution, it’s best to use a well-known host that usually runs between four and eight dollars a month, which isn’t so bad.

Other than the price, you should also keep in mind how fast the host responds when problems arise. Sometimes your site will have an emergency, and filling in a support form just won’t cut it. Do your research thoroughly and read reviews about the company and its support.

6. Invisible loading images or videos

When you’re scrolling through a page, there is some content you can’t see immediately. Some of it sits further down the page and only becomes visible once a visitor scrolls to that spot.

So, how is this a problem? The more files the browser has to fetch, the slower your site loads, and in reality the browser usually fetches all of these images and videos up front (even the ones the visitor can’t see yet). This is a huge factor for mobile devices, since they have limited speed and data.

This can be fixed with “lazy loading”, which means fetching a file only when it’s needed, that is, when it’s about to appear on the screen. A couple of plugins you can use for your WordPress site are BJ Lazy Load and LazyLoad; if you’d rather not add another plugin, see the sketch below.
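
Lazy loading is also simple to hand-roll with the browser’s IntersectionObserver API. A minimal sketch, assuming your markup keeps the real image URL in a data-src attribute (a common convention, not a standard):

```typescript
// lazy-load.ts: swap in the real image URL only when the placeholder
// is about to scroll into view.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src ?? ""; // fetch the real file now
      obs.unobserve(img);              // each image only needs this once
    }
  },
  { rootMargin: "200px" } // start fetching shortly before the image is visible
);

document
  .querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => observer.observe(img));
```

Modern browsers also support a native loading="lazy" attribute on img tags, which gets you most of the way there with no JavaScript at all.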

7. Coding issue

Your website is made of code. The more elaborate your site is, the more coding is necessary. Just because you want your website to be ideal, that doesn’t mean the coding should be over the top. Irrelevant or unnecessary code will only slow down your site since the server has to work through more data in order to get to a page.

An example of a coding issue

Unnecessary redirects, which happen when your code refers to two different forms of the website URL (say, the www and non-www versions). Although this seems like something trivial, it makes a huge difference.

When a redirect takes place, a user effectively has to wait for the page to load twice. Chain several redirects together and you multiply that wait.

To fix this, you need to review your code in detail and make sure every internal link and asset reference uses one consistent form of your URL, so requests don’t bounce through redirects on their way to the page.
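
One quick way to spot a chain is to follow the redirects yourself and count the hops. A minimal sketch, assuming Node.js 18+ (where fetch is built in); the URL is a placeholder:

```typescript
// redirect-chain.ts: count how many redirect hops a URL passes through.
async function countRedirects(url: string, maxHops = 10): Promise<number> {
  let hops = 0;
  let current = url;
  while (hops < maxHops) {
    const res = await fetch(current, { redirect: "manual" });
    const location = res.headers.get("location");
    if (res.status < 300 || res.status >= 400 || !location) break; // final page reached
    current = new URL(location, current).toString(); // resolve relative Location headers
    hops++;
  }
  return hops;
}

countRedirects("http://example.com/").then((hops) =>
  console.log(`${hops} redirect hop(s) before the final page`)
);
```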

8. Not using a content delivery network (CDN)

A CDN is a network of independent servers deployed in different geographic locations that serve web content to visitors. Depending on the location of your website visitors, the requested content gets served by the node at the nearest data center.

The problem with not using a CDN is that your site can be slow for far-away visitors, especially if you have an audience from around the world. Although a CDN isn’t strictly necessary, it can help serve your web content much faster and reduce loading time.

Now that you’re aware of some of the most and least obvious loading time killers, it’s time to get cracking with fixing them for your website.

Got some more load time killers that you wish to add to this list? Share them in the comments.

Nat McNeely is Digital Marketing Manager of Breadnbeyond, an award-winning explainer video company. 

The post The silent killers of loading time and how to fix them appeared first on Search Engine Watch.

Search Engine Watch

Posted in IM News | Comments Off

If Google says H1s don’t matter for rankings, why should you use them? Here’s why

Even if Google says its ranking systems work just fine without them, accessibility and readability are good reasons to use headings correctly.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

Google doesn’t pass PageRank on nofollow links. Here’s why you still see them in GSC

Nofollow links will be included in your Link Report.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

The 5 Golden Rules Of Expectation Management And Why You Can’t Ignore Them

If you have a good memory, you may recall that a few weeks before Steve Jobs passed away, Apple stock dropped a good few percentage points. It wasn’t Steve’s death that brought the valuation down, because that had been factored into the stock market years earlier, when he first began to get sick, it […]

The post The 5 Golden Rules Of Expectation Management And Why You Can’t Ignore Them appeared first on Yaro.Blog.

Entrepreneurs-Journey.com by Yaro Starak

Posted in IM News | Comments Off

3 Ways Marketing Automation Can Mess Up Perfectly Good Copy (and How to Fix Them)

Pretty much every online business on the planet uses marketing automation in one way or another. Honestly, it would be…

The post 3 Ways Marketing Automation Can Mess Up Perfectly Good Copy (and How to Fix Them) appeared first on Copyblogger.


Copyblogger

Posted in IM News | Comments Off

5 Reasons Legacy Brands Struggle With SEO (and What to Do About Them)

Posted by Tom.Capper

Given the increasing importance of brand in SEO, it seems a cruel irony that many household name-brands seem to struggle with managing the channel. Yet, in my time at Distilled, I’ve seen just that: numerous name-brand sites in various states of stagnation and even more frustrated SEO managers attempting to prevent said stagnation. 

Despite global brand recognition and other established advantages that ought to drive growth, the reality is that having a household name doesn’t ensure SEO success. In this post, I’m going to explore why large, well-known brands can run into difficulties with organic performance, the patterns I’ve noticed, and some of the recommended tactics to address those challenges.

What we talk about when we talk about a legacy brand

For the purposes of this post, the term “legacy brand” applies to companies that have a very strong association with the product they sell, and may well have, in the past, been the ubiquitous provider for that product. This could mean that they were household names in the 20th century, or it could be that they pioneered and dominated their field in the early days of mass consumer web usage. A few varied examples (that Distilled has never worked with or been contacted by) include:

  • Wells Fargo (US)
  • Craigslist (US)
  • Tesco (UK)

These are cherry-picked, potentially extreme examples of legacy brands, but all three of the above, and most brands that fit this description, have shown a marked decline in organic visibility over the last five years (confirmed by Sistrix, my tool of choice; your tool of choice may vary). It’s a common issue for large, well-established sites: peaking in 2013 and 2014 and never again reaching those highs.

It’s worth noting that stagnation is not the only possible state. Sometimes these brands are even growing, just at a level far beneath the potential you would expect from their offline ubiquity.

The question is: why does it keep happening?

Reason 1: Brand

Quite possibly the biggest hurdle standing in the way of a brand’s performance is the brand itself. This may seem like a bit of an odd one: we’ve already established that the companies we’re talking about are big, recognized, household names. That in and of itself should help them in SEO, right?

The thing is, though, a lot of these big household names are recognized, but they’re not the one-stop shops that they used to be.

Here’s how the above name-brand examples are performing on search:

Other dominant, clearly vertical-leading brands in the UK are, in general, also not doing so well in branded search:

There’s a lot of potential reasons for why this may be — and we’ll even address some of them later — but a few notable ones include:

  • Complacency — particularly for brands that were early juggernauts of the web, they may have forgotten the need to reinforce their brand image and recognition.
  • More and more credible competitors. When you were the only competent operator, as many of these brands once were, you had the whole pie. Now, you have to share it.
  • People trust search engines. In a lot of cases, ubiquitous brands decline, while the generic term is on the rise.

Check out this real estate example from the UK:

Rightmove and Zoopla are the two biggest brands in this space and have been for some time. There’s only one line there that’s trending upwards, though, and it’s the generic term, “houses for sale.”

What can I do about this?

Basically, get a move on! A lot of incumbents have been very slow to take action on things like top-of-funnel content, or only produce low-effort, exceptionally dry social media posts (I’ve posted before about some of these tactics here.) In fairness, it’s easy to see why — these channels and approaches likely have the least measurable returns. However, leaving a vacuum higher in your funnel is playing with fire, especially when you’re a recognized name. It opens an opportunity for smaller players to close the gap in recognition — at almost no cost.

Reason 2: Tech debt

I’m sure many people reading this will have experienced how hard it can be to get technical changes — particularly higher effort ones — implemented by larger, older organizations. This can stem from complex bureaucracy, aging and highly bespoke platforms, risk aversion, and, particularly for SEO, an inability to get senior buy-in for what can often be fairly abstract changes with little guaranteed reward.

What can I do about this?

At Distilled, we run into these challenges fairly often. I’ve seen dev queues that span, literally, years. I’ve also seen organizations that are completely unable to change the most basic information on their sites, such as opening times or title tags. In fact, it was this exact issue that prompted the development of our ODN platform a few years ago, as a way to circumvent technical limitations and prove the benefits when we did so.

There are less heavy-duty options available: GTM can be used for a range of changes as a last resort, albeit without the measurement component, and CDN-level solutions like Cloudflare’s edge workers are also starting to gain traction within the SEO community.
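
To make the edge-worker option concrete, here’s a minimal sketch of a Cloudflare Worker that rewrites a page’s title tag on its way through the CDN, without touching the origin site. It assumes the Cloudflare Workers runtime (where fetch and HTMLRewriter are globals), and the replacement title is a placeholder:

```typescript
// A Cloudflare Worker sketch: rewrite the <title> element at the edge.
// HTMLRewriter streams the HTML response through the transformation,
// so the page is never buffered whole in memory.
export default {
  async fetch(request: Request): Promise<Response> {
    const response = await fetch(request); // pass the request through to the origin

    return new HTMLRewriter()
      .on("title", {
        element(el) {
          // Placeholder copy; in practice this might come from a
          // lookup table keyed on the request URL.
          el.setInnerContent("New, Tested Title | Example Brand");
        },
      })
      .transform(response);
  },
};
```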

Eventually, though, it’s necessary to tackle the problem at the source — by making headway within the politics of the organization. There’s a whole other post to be had there, if not several, but basically, it comes down to making yourself heard without undermining anyone. I’ve found that focusing on the downside is actually the most effective angle within big, risk-averse bureaucracies — essentially preying on the risk-aversion itself — as well as shouting loudly about any successes, however small.

Reason 3: Not updating tactics due to long-standing, ingrained practices

In a way, this comes back to risk aversion and politics — after all, legacy brands have a lot to lose. One particular manifestation I’ve often noticed in larger organizations is ongoing campaigns and tactics that haven’t been linked to improved rankings or revenue in years.

One conversation with a senior SEO at a major brand left me quite confused. I recall he said to me something along the lines of “we know this campaign isn’t right for us strategically, but we can’t get buy-in for anything else, so it’s this or lose the budget”. Fantastic.

This type of scenario can become commonplace when senior decision-makers don’t trust their staff — often, it’s a CMO, or similar executive leader, that hasn’t dipped their toe in SEO for a decade or more. When they do, they are unpleasantly surprised to discover that their SEO team isn’t buying any links this week and, actually, hasn’t for quite some time. Their reaction, then, is predictable: “No wonder the results are so poor!”

What can I do about this?

Unfortunately, you may have to humor this behavior in the short term. That doesn’t mean you should start (or continue) buying links, but it might be a good idea to ensure there’s similar-sounding activity in your strategy while you work on proving the ROI of your projects.

Medium-term, if you can get senior stakeholders out to conferences (I highly recommend SearchLove, though I may be biased), softly share articles and content “they may find interesting”, and drown them in news of the success of whatever other programs you’ve managed to get headway with, you can start to move them in the right direction.

Reason 4: Race to the bottom

It’s fair to say that, over time, it’s only become easier to launch an online business with a reasonably well-sorted site. I’ve observed in the past that new entrants don’t necessarily have to match tenured juggernauts like-for-like on factors like Domain Authority to hit the top spots.

As a result, it’s become commonplace to see plucky, younger businesses rising quickly and, at the very least, increasing the apparent level of choice where historically a legacy business might have had a monopoly on basic competence.

This is even more complicated when price is involved. Most SEOs agree that SERP behavior factors into rankings, so it’s easy to imagine legacy businesses, which disproportionately have a premium angle, struggling for clicks vs. attractively priced competitors. Google does not understand or care that you have a premium proposition — they’ll throw you in with the businesses competing purely on price all the same.

What can I do about this?

As I see it, there are two main approaches. One is abusing your size to crowd out smaller players (for instance, disproportionately targeting the keywords where they’ve managed to find a gap in your armor), and the second is, essentially, Conversion Rate Optimization.

Simple tactics, like sorting a landing page by price (ascending) by default, writing clicky titles with a value-focused USP (e.g. free delivery), or sending well-targeted (and not overdone) post-sales retention emails, all go a long way toward mitigating the temptation of a cheaper or hackier competitor.

Reason 5: Super-aggregators (Amazon, Google)

In a lot of verticals, the pie is getting smaller, so it stands to reason the dominant players will be facing a diminishing slice.

A few obvious examples:

  • Local packs eroding local landing pages
  • Google Flights, Google Jobs, etc. eroding specialist sites
  • Amazon taking a huge chunk of e-commerce search

What can I do about this?

Again, there are two separate angles here, and one is a lot harder than the other. The first is similar to some of what I’ve mentioned above — move further up the funnel and lock in business before this ever comes to your prospective client Googling your head term and seeing Amazon and/or Google above you. This is only a mitigating tactic, however.

The second, which will be impossible for many or most businesses, is to jump into bed with the devil. If you ever do have the opportunity to be a data partner behind a Google or Amazon product, you may do well to swallow your pride and take it. You may be the only one of your competitors left in a few years, and if you don’t, it’ll be someone else.

Wrapping up

While a lot of the issues relate to complacency, and a lot of my suggested solutions relate to reinvesting as if you weren’t a dominant brand that might win by accident, I do think it’s worth exploring the mechanisms by which this translates into poorer performance.

This topic is unavoidably very tinted by my own experiences and opinions, so I’d love to hear your thoughts in the comments below. Similarly, I’m conscious that any one of my five reasons could have been a post in its own right — which ones would you like to see more fleshed out?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off

Tim Draper: These Guys Transformed the World and We Should Thank Them

Legendary investor and political activist Tim Draper says that instead of getting on the case of Elon Musk, we should be thanking him and other transformational entrepreneurs such as Steve Jobs and Travis Kalanick.

Draper also suggests that Elon Musk probably should have just taken Tesla private in order to avoid the myriad of rules and regulations imposed on public companies.

Venture capitalist Tim Draper was interviewed at the Web Summit in Lisbon, Portugal by CNBC:

These Guys Transformed the World, We Should Thank Them

Every time I pull out my iPhone I think, thank you, Steve Jobs, this is awesome. Every time I hit the Uber key, I think, thank you, Travis, that is so cool. Every time I get in my Tesla I think, thank you, Elon. These guys have really transformed the world and we should just thank them everywhere we go. And if they are having trouble, support them. What can we do to help? How can we support you? How can we make you happier? We want to make you happier, look what you have done for us! It’s so cool!

He Probably Should Have Just Taken the Whole Thing Private

Every human in the world has made a mistake. There are so many laws that you have to follow if you are a public company; he probably should have just taken the whole thing private. When you are a public company you’ve got to follow so many rules. If you step one little piece out of line, you guys in the press are like… oh my gosh, our hero has done something wrong. I think we have got to say, hey look, he’s a human being, he’s doing the best he can. He’s running two amazing, huge, multi-billion-dollar companies that he started. Well, he started one and jumped in very early and saved the other. This guy is awesome, let’s do what we can to support him.

All of Us Should Really Focus on Making SpaceX Successful

I invest in early-stage startups and then I will ride them as long as I feel it’s the right thing to do. Have you driven a Tesla? It’s so much better than any other car out there. And SpaceX, all of us should really focus on making SpaceX successful. If Tesla doesn’t save this earth, he will at least get some of us off the earth so that we can move our species somewhere else. Elon was amazing… we are all going to Mars. People looked at him and said, oh, he’s crazy.

But then all of the best engineers in the world said, how would we get there? Then they thought, how would we have human life succeed there? And then, how can we get there faster? All those questions happen with an engineer and so Elon gets the best rocket scientists in the world working for his company and so, of course, it becomes a big success. He’s going to get us closer and closer to Mars and maybe to Alpha Centauri and other places.

About Tim Draper

Tim Draper helps entrepreneurs change the world, driving their visions through funding, education, media, and government reform. He has founded thirty Draper venture funds, Draper University, Bizworld, and two statewide initiatives to improve governance and education.

The post Tim Draper: These Guys Transformed the World and We Should Thank Them appeared first on WebProNews.


WebProNews

Posted in IM News | Comments Off

Email Marketing: Why phishing emails (unfortunately) work … and what marketers can learn from them

Phishing emails are just plain thievery. While phishing emails don’t ultimately deliver value, they do communicate value. Not to everyone, but to a specific audience. And that is why some people act on them.
MarketingSherpa Blog

Posted in IM News | Comments Off

Don’t Be Fooled by Data: 4 Data Analysis Pitfalls & How to Avoid Them

Posted by Tom.Capper

Digital marketing is a proudly data-driven field. Yet, as SEOs especially, we often have such incomplete or questionable data to work with, that we end up jumping to the wrong conclusions in our attempts to substantiate our arguments or quantify our issues and opportunities.

In this post, I’m going to outline 4 data analysis pitfalls that are endemic in our industry, and how to avoid them.

1. Jumping to conclusions

Earlier this year, I conducted a ranking factor study around brand awareness, and I posted this caveat:

“…the fact that Domain Authority (or branded search volume, or anything else) is positively correlated with rankings could indicate that any or all of the following is likely:

  • Links cause sites to rank well
  • Ranking well causes sites to get links
  • Some third factor (e.g. reputation or age of site) causes sites to get both links and rankings”
    ~ Me

However, I want to go into this in a bit more depth and give you a framework for analyzing these yourself, because it still comes up a lot. Take, for example, this recent study by Stone Temple, which you may have seen in the Moz Top 10 or Rand’s tweets, or this excellent article discussing SEMRush’s recent direct traffic findings. To be absolutely clear, I’m not criticizing either of the studies, but I do want to draw attention to how we might interpret them.

Firstly, we do tend to suffer a little confirmation bias — we’re all too eager to call out the cliché “correlation vs. causation” distinction when we see successful sites that are keyword-stuffed, but all too approving when we see studies doing the same with something we think is or was effective, like links.

Secondly, we fail to critically analyze the potential mechanisms. The options aren’t just causation or coincidence.

Before you jump to a conclusion based on a correlation, you’re obliged to consider various possibilities:

  • Complete coincidence
  • Reverse causation
  • Joint causation
  • Linearity
  • Broad applicability

If those don’t make any sense, then that’s fair enough — they’re jargon. Let’s go through an example:

Before I warn you not to eat cheese because you may die in your bedsheets, I’m obliged to check that it isn’t any of the following:

  • Complete coincidence - Is it possible that so many datasets were compared, that some were bound to be similar? Why, that’s exactly what Tyler Vigen did! Yes, this is possible.
  • Reverse causation - Is it possible that we have this the wrong way around? For example, perhaps your relatives, in mourning for your bedsheet-related death, eat cheese in large quantities to comfort themselves? This seems pretty unlikely, so let’s give it a pass. No, this is very unlikely.
  • Joint causation - Is it possible that some third factor is behind both of these? Maybe increasing affluence makes you healthier (so you don’t die of things like malnutrition), and also causes you to eat more cheese? This seems very plausible. Yes, this is possible.
  • Linearity - Are we comparing two linear trends? A linear trend is a steady rate of growth or decline. Any two statistics which are both roughly linear over time will be very well correlated. In the graph above, both our statistics are trending linearly upwards. If the graph was drawn with different scales, they might look completely unrelated, like this, but because they both have a steady rate, they’d still be very well correlated. Yes, this looks likely.
  • Broad applicability - Is it possible that this relationship only exists in certain niche scenarios, or, at least, not in my niche scenario? Perhaps, for example, cheese does this to some people, and that’s been enough to create this correlation, because there are so few bedsheet-tangling fatalities otherwise? Yes, this seems possible.

So we have 4 “Yes” answers and one “No” answer from those 5 checks.

If your example doesn’t get 5 “No” answers from those 5 checks, it’s a fail, and you don’t get to say that the study has established either a ranking factor or a fatal side effect of cheese consumption.
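
The linearity check is easy to demonstrate for yourself. A minimal sketch, with invented numbers: two series that merely both trend steadily upward come out almost perfectly correlated, despite having nothing to do with each other.

```typescript
// pearson.ts: two unrelated but steadily rising series correlate strongly.
function pearson(x: number[], y: number[]): number {
  const n = x.length;
  const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / n;
  const mx = mean(x);
  const my = mean(y);
  let num = 0;
  let dx = 0;
  let dy = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - mx) * (y[i] - my);
    dx += (x[i] - mx) ** 2;
    dy += (y[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Invented stand-ins for "cheese consumption" and "bedsheet deaths":
// both are linear trends plus a little noise.
const years = [...Array(10).keys()];
const cheese = years.map((t) => 30 + 1.2 * t + Math.random());
const deaths = years.map((t) => 400 + 15 * t + 5 * Math.random());

console.log(pearson(cheese, deaths).toFixed(3)); // typically 0.95+
```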

A similar process should apply to case studies, which are another form of correlation — the correlation between you making a change, and something good (or bad!) happening. For example, ask:

  • Have I ruled out other factors (e.g. external demand, seasonality, competitors making mistakes)?
  • Did I increase traffic by doing the thing I tried to do, or did I accidentally improve some other factor at the same time?
  • Did this work because of the unique circumstance of the particular client/project?

This is particularly challenging for SEOs, because we rarely have data of this quality, but I’d suggest an additional pair of questions to help you navigate this minefield:

  • If I were Google, would I do this?
  • If I were Google, could I do this?

Direct traffic as a ranking factor passes the “could” test, but only barely — Google could use data from Chrome, Android, or ISPs, but it’d be sketchy. It doesn’t really pass the “would” test, though — it’d be far easier for Google to use branded search traffic, which would answer the same questions you might try to answer by comparing direct traffic levels (e.g. how popular is this website?).

2. Missing the context

If I told you that my traffic was up 20% week on week today, what would you say? Congratulations?

What if it was up 20% this time last year?

What if I told you it had been up 20% year on year, up until recently?

It’s funny how a little context can completely change this. This is another problem with case studies and their evil inverted twin, traffic drop analyses.

If we really want to understand whether to be surprised at something, positively or negatively, we need to compare it to our expectations, and then figure out what deviation from our expectations is “normal.” If this is starting to sound like statistics, that’s because it is statistics — indeed, I wrote about a statistical approach to measuring change way back in 2015.

If you want to be lazy, though, a good rule of thumb is to zoom out, and add in those previous years. And if someone shows you data that is suspiciously zoomed in, you might want to take it with a pinch of salt.
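
As a minimal sketch of that rule of thumb, with invented traffic figures: the same week-on-week number reads very differently once last year is in view.

```typescript
// context.ts: "up 20% week on week" reads very differently in context.
// All figures are invented for illustration.
const thisWeek = 12000;
const lastWeek = 10000;
const sameWeekLastYear = 13000;

const weekOnWeek = (thisWeek - lastWeek) / lastWeek;
const yearOnYear = (thisWeek - sameWeekLastYear) / sameWeekLastYear;

console.log(`WoW: ${(weekOnWeek * 100).toFixed(1)}%`); // +20.0%: congratulations?
console.log(`YoY: ${(yearOnYear * 100).toFixed(1)}%`); // -7.7%: perhaps not
```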

3. Trusting our tools

Would you make a multi-million dollar business decision based on a number that your competitor could manipulate at will? Well, chances are you do, and the number can be found in Google Analytics. I’ve covered this extensively in other places, but there are some major problems with most analytics platforms around:

  • How easy they are to manipulate externally
  • How arbitrarily they group hits into sessions
  • How vulnerable they are to ad blockers
  • How they perform under sampling, and how obvious they make this

For example, did you know that the Google Analytics API v3 can heavily sample data whilst telling you that the data is unsampled, above a certain amount of traffic (~500,000 within date range)? Neither did I, until we ran into it whilst building Distilled ODN.

Similar problems exist with many “Search Analytics” tools. My colleague Sam Nemzer has written a bunch about this — did you know that most rank tracking platforms report completely different rankings? Or how about the fact that the keywords grouped by Google (and thus tools like SEMRush and STAT, too) are not equivalent, and don’t necessarily have the volumes quoted?

It’s important to understand the strengths and weaknesses of tools that we use, so that we can at least know when they’re directionally accurate (as in, their insights guide you in the right direction), even if not perfectly accurate. All I can really recommend here is that skilling up in SEO (or any other digital channel) necessarily means understanding the mechanics behind your measurement platforms — which is why all new starts at Distilled end up learning how to do analytics audits.

One of the most common solutions to the root problem is combining multiple data sources, but…

4. Combining data sources

There are numerous platforms out there that will “defeat (not provided)” by bringing together data from two or more of:

  • Analytics
  • Search Console
  • AdWords
  • Rank tracking

The problems here are that, firstly, these platforms do not have equivalent definitions, and secondly, ironically, (not provided) tends to break them.

Let’s deal with definitions first, with an example. Consider how traffic to a single landing page, from a single channel, gets reported across these platforms:

  • In Search Console, these are reported as clicks, and can be vulnerable to heavy, invisible sampling when multiple dimensions (e.g. keyword and page) or filters are combined.
  • In Google Analytics, these are reported using last non-direct click, meaning that your organic traffic includes a bunch of direct sessions, time-outs that resumed mid-session, etc. That’s without getting into dark traffic, ad blockers, etc.
  • In AdWords, most reporting uses last AdWords click, and conversions may be defined differently. In addition, keyword volumes are bundled, as referenced above.
  • Rank tracking is location specific, and inconsistent, as referenced above.

Fine, though — it may not be precise, but you can at least get to some directionally useful data given these limitations. However, about that “(not provided)”…

Most of your landing pages get traffic from more than one keyword. It’s very likely that some of these keywords convert better than others, particularly if they are branded, meaning that even the most thorough click-through rate model isn’t going to help you. So how do you know which keywords are valuable?

The best answer is to generalize from AdWords data for those keywords, but it’s very unlikely that you have analytics data for all those combinations of keyword and landing page. Essentially, the tools that report on this make the very bold assumption that a given page converts identically for all keywords. Some are more transparent about this than others.

Again, this isn’t to say that those tools aren’t valuable — they just need to be understood carefully. The only way you could reliably fill in these blanks created by “not provided” would be to spend a ton on paid search to get decent volume, conversion rate, and bounce rate estimates for all your keywords, and even then, you’ve not fixed the inconsistent definitions issues.

Bonus peeve: Average rank

I still see this way too often. Three questions:

  1. Do you care more about losing rankings for ten very low volume queries (10 searches a month or less) than for one high volume query (millions plus)? If the answer isn’t “yes, I absolutely care more about the ten low-volume queries”, then this metric isn’t for you, and you should consider a visibility metric based on click through rate estimates.
  2. When you start ranking at 100 for a keyword you didn’t rank for before, does this make you unhappy? If the answer isn’t “yes, I hate ranking for new keywords,” then this metric isn’t for you — because that will lower your average rank. You could of course treat all non-ranking keywords as position 100, as some tools allow, but is a drop of 2 average rank positions really the best way to express that 1/50 of your landing pages have been de-indexed? Again, use a visibility metric, please.
  3. Do you like comparing your performance with your competitors? If the answer isn’t “no, of course not,” then this metric isn’t for you — your competitors may have more or fewer branded keywords or long-tail rankings, and these will skew the comparison. Again, use a visibility metric.
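
To make the contrast concrete, here’s a minimal sketch of a click-through-rate-weighted visibility score next to average rank. The CTR curve and the keyword data are invented; note that gaining a brand-new position-100 ranking craters the average rank while, correctly, leaving visibility essentially unchanged.

```typescript
// visibility.ts: average rank vs. a CTR-weighted visibility metric.
interface Keyword {
  volume: number; // monthly searches
  rank: number;   // current position
}

// A toy click-through-rate curve: positions 1-10 get clicks, the rest get ~none.
const ctrAt = (rank: number): number =>
  rank >= 1 && rank <= 10
    ? [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.025, 0.02, 0.018][rank - 1]
    : 0;

const avgRank = (kws: Keyword[]): number =>
  kws.reduce((s, k) => s + k.rank, 0) / kws.length;

// Share of available clicks captured across the whole keyword set.
const visibility = (kws: Keyword[]): number => {
  const captured = kws.reduce((s, k) => s + k.volume * ctrAt(k.rank), 0);
  const available = kws.reduce((s, k) => s + k.volume * ctrAt(1), 0);
  return captured / available;
};

const before: Keyword[] = [
  { volume: 100000, rank: 2 }, // one high-volume head term
  { volume: 10, rank: 5 },     // a couple of long-tail terms
  { volume: 10, rank: 7 },
];
const after: Keyword[] = [...before, { volume: 10, rank: 100 }]; // new keyword enters at 100

console.log(avgRank(before).toFixed(1), "->", avgRank(after).toFixed(1));       // 4.7 -> 28.5: looks terrible
console.log(visibility(before).toFixed(3), "->", visibility(after).toFixed(3)); // ~0.500 -> ~0.500: unchanged
```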

Conclusion

Hopefully, you’ve found this useful. To summarize the main takeaways:

  • Critically analyse correlations & case studies by seeing if you can explain them as coincidences, as reverse causation, as joint causation, through reference to a third mutually relevant factor, or through niche applicability.
  • Don’t look at changes in traffic without looking at the context — what would you have forecasted for this period, and with what margin of error?
  • Remember that the tools we use have limitations, and do your research on how that impacts the numbers they show. “How has this number been produced?” is an important component in “What does this number mean?”
  • If you end up combining data from multiple tools, remember to work out the relationship between them — treat this information as directional rather than precise.

Let me know what data analysis fallacies bug you, in the comments below.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

Posted in IM News | Comments Off
