Tag Archive | "Faster"

The Pace of Digitalization in China is Much Faster Than Anywhere in the World, Says AXA China CEO

“The pace of digitalization in China is much faster than anywhere in the world and in a sense, it’s much deeper than anywhere else,” says AXA China CEO Xavier Veyry. “In China we really see an acceleration in the way companies leverage digital tools. I think in China digitalization is accelerating and I believe that in a lot of ways China is really leading the innovation in terms of worldwide interaction with the customers on the digital front.”

Xavier Veyry, CEO of AXA China, discusses how China is leading the world in digitalization and how digitalization is impacting the insurance industry and customers in an interview on CNBC:

The Pace of Digitalization in China is Much Faster Than Anywhere in the World

AXA has been one of the pioneers in terms of digital insurance in many places in the world. This is an industry, this is a trend, this is a fundamental shift in our industry that we have been pioneering in many geographies, most specifically in Europe and sometimes in Asia. Here in China, I would say that the landscape is very different. The pace of digitalization in China is much faster than anywhere in the world and in a sense, it’s much deeper than anywhere else. 

In China we really see an acceleration in the way companies leverage digital tools. The fact is that China has a very unique ecommerce platform and a very unique e-payment platform. In our analysis e-payment is the key driver toward facilitating the purchase of insurance products. It’s true that some insurance products can be designed for ecommerce for digital interactions. Others require a more personal touch and personal interaction with customers. It really depends on the product that we are manufacturing and the product that we are presenting to the customers. 

I think in China this is accelerating and I believe that in a lot of ways China is really leading the innovation in terms of worldwide interaction with the customers on the digital front.

The post The Pace of Digitalization in China is Much Faster Than Anywhere in the World, Says AXA China CEO appeared first on WebProNews.


WebProNews

Posted in IM News | Comments Off

10 Basic SEO Tips to Index + Rank New Content Faster – Whiteboard Friday

Posted by Cyrus-Shepard

In SEO, speed is a competitive advantage.

When you publish new content, you want users to find it ranking in search results as fast as possible. Fortunately, there are a number of tips and tricks in the SEO toolbox to help you accomplish this goal. Sit back, turn up your volume, and let Cyrus Shepard show you exactly how in this week’s Whiteboard Friday.

[Note: #3 isn't covered in the video, but we've included it in the post below. Enjoy!]

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. I’m Cyrus Shepard, back in front of the whiteboard. So excited to be here today. We’re talking about ten tips to index and rank new content faster.

You publish some new content on your blog, on your website, and you sit around and you wait. You wait for it to be in Google’s index. You wait for it to rank. It’s a frustrating process that can take weeks or months to see those rankings increase. There are a few simple things we can do to help nudge Google along, to help them index it and rank it faster. Some very basic things and some more advanced things too. We’re going to dive right in.

Indexing

1. URL Inspection / Fetch & Render

So basically, indexing content is not that hard in Google. Google provides us with a number of tools. The simplest and fastest is probably the URL Inspection tool. It’s in the new Search Console, previously Fetch and Render. As of this filming, both tools still exist, but they are deprecating Fetch and Render. The new URL Inspection tool allows you to submit a URL and tell Google to crawl it. When you do that, they put it in their priority crawl queue. That just simply means Google has a list of URLs to crawl. Your URL goes into the priority queue, and it’s going to get crawled faster and indexed faster.

2. Sitemaps!

Another common technique is simply using sitemaps. If you’re not using sitemaps, they’re one of the easiest, quickest ways to get your URLs indexed. When you have URLs in your sitemap, you want to let Google know that they’re actually there. There are a number of different techniques that can optimize this process a little bit more.

The first and the most basic one that everybody talks about is simply putting it in your robots.txt file. In your robots.txt, you have a list of directives, and at the end of your robots.txt, you simply say sitemap and you tell Google where your sitemaps are. You can do that for sitemap index files. You can list multiple sitemaps. It’s really easy.

Sitemap in robots.txt
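As a minimal sketch of that directive (the domain here is a placeholder), the end of a robots.txt file might look like this:

```text
# robots.txt
User-agent: *
Disallow:

# Sitemap directives: full URLs, one per line (sitemap index files work too)
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-index.xml
```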

You can also do it using the Search Console Sitemap Report, another report in the new Search Console. You can go in there and you can submit sitemaps. You can remove sitemaps, validate. You can also do this via the Search Console API.

But a really cool way of informing Google of your sitemaps, one that a lot of people don’t use, is simply pinging Google. You can do this right in your browser: you type in google.com/ping and append your sitemap URL as a parameter. You can try this out right now with your current sitemaps. Type it into the browser bar and Google will instantly queue that sitemap for crawling, and all the URLs in there should get indexed quickly if they meet Google’s quality standard.

Example: https://www.google.com/ping?sitemap=https://example.com/sitemap.xml
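As a small sketch (example.com is a placeholder), the sitemap parameter should be URL-encoded, which a few lines of Python handle for you:

```python
from urllib.parse import urlencode

def build_sitemap_ping_url(sitemap_url: str) -> str:
    """Build the google.com/ping URL for a sitemap, with the parameter URL-encoded."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(build_sitemap_ping_url("https://example.com/sitemap.xml"))
# → https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml
```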

3. Google Indexing API

(BONUS: This wasn’t in the video, but we wanted to include it because it’s pretty awesome)

Within the past few months, both Google and Bing have introduced new APIs to help speed up and automate the crawling and indexing of URLs.

Both of these solutions allow for the potential of massively speeding up indexing by submitting hundreds or thousands of URLs via an API.

While the Bing API is intended for any new/updated URL, Google states that their API is specifically for “either job posting or livestream structured data.” That said, many SEOs like David Sottimano have experimented with the Google Indexing API and found it to work with a variety of content types.

If you want to use these indexing APIs yourself, you have a number of potential options:

Yoast announced they will soon support live indexing across both Google and Bing within their SEO WordPress plugin.
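If you’d rather call the Google Indexing API directly, the sketch below builds the notification body a publish request expects. The endpoint and type values come from Google’s API documentation; authentication (an OAuth 2.0 service-account token) is assumed and omitted here:

```python
INDEXING_ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str, deleted: bool = False) -> dict:
    """Build the JSON body for a single Indexing API publish request."""
    return {"url": url, "type": "URL_DELETED" if deleted else "URL_UPDATED"}

# Sending it requires an OAuth 2.0 bearer token (token acquisition not shown):
#
#   import requests
#   requests.post(INDEXING_ENDPOINT,
#                 json=build_notification("https://example.com/jobs/123"),
#                 headers={"Authorization": f"Bearer {token}"})
```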

Indexing & ranking

That’s talking about indexing. Now there are some other ways that you can get your content indexed faster and help it to rank a little higher at the same time.

4. Links from important pages

When you publish new content, the most basic step, if you do nothing else, is to make sure that you are linking to it from important pages. Important pages may be your homepage, your blog, or your resources page. This is a basic step that you want to do. You don’t want to orphan those pages on your site with no incoming links.

Adding the links tells Google two things. It says we need to crawl this link sometime in the future, and it gets put in the regular crawling queue. But it also makes the new page more important. Google can say, “Well, we have important pages linking to this. We have some quality signals to help us determine how to rank it.” So link from important pages.

5. Update old content 

But a step that people oftentimes forget is to not only link from your important pages, but also to go back to your older content and find relevant places to add those links. A lot of people add a link on their homepage or link to older articles, but they forget the step of going back to the older articles on their site and adding links to the new content.

Now which pages should you add links from? One of my favorite techniques is to use a search operator: you type in the keywords that your content is about and then you add site:example.com. This finds relevant pages on your site that are about your target keywords, and those make really good pages to add links to the new content from.
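As a small illustration (the keywords and domain are placeholders), you can build that operator query programmatically:

```python
from urllib.parse import urlencode

def internal_link_prospects_url(keywords: str, domain: str) -> str:
    """Build a Google search URL combining target keywords with a site: operator,
    to find your own pages that could link to the new content."""
    return "https://www.google.com/search?" + urlencode({"q": f"{keywords} site:{domain}"})

print(internal_link_prospects_url("technical seo tips", "example.com"))
# → https://www.google.com/search?q=technical+seo+tips+site%3Aexample.com
```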

6. Share socially

A really obvious step: sharing socially. There’s a high correlation between social shares and content ranking. Especially when you share on content aggregators like Reddit and Hacker News, which add actual links, those shares create links for Google to crawl. Google can see those signals and that social activity. That does the same thing as adding links from your own content, except it’s even a little better, because these are external links, external signals.

7. Generate traffic to the URL

This is kind of an advanced technique, which is a little controversial in terms of its effectiveness, but we see it anecdotally working time and time again. That’s simply generating traffic to the new content. 

Now there is some debate whether traffic is a ranking signal. There are some old Google patents that talk about measuring traffic, and Google can certainly measure traffic using Chrome. They can see where those sites are coming from. But as an example, Facebook ads, you launch some new content and you drive a massive amount of traffic to it via Facebook ads. You’re paying for that traffic, but in theory Google can see that traffic because they’re measuring things using the Chrome browser. 

When they see all that traffic going to a page, they can say, “Hey, maybe this is a page that we need to have in our index and maybe we need to rank it appropriately.”

Ranking

Once we get our content indexed, let’s talk about a few ideas for ranking your content faster.

8. Generate search clicks

Along with generating traffic to the URL, you can actually generate search clicks.

Now what do I mean by that? Imagine you share a URL on Twitter. Instead of sharing the URL directly, you share a link to a Google search result for the keywords you’re trying to rank for. People click the link, land on that search result, and then click on your result.

You see television commercials do this; in a Super Bowl commercial they’ll say, “Go to Google and search for Toyota cars 2019.” What this does is let Google see that searcher behavior. Instead of going directly to the page, people are searching on Google and choosing your result.

  1. Instead of this: https://moz.com/link-explorer
  2. Share this: https://www.google.com/search?q=link+tool+moz

This does a couple of things. It helps increase your click-through rate, which may or may not be a ranking signal. But it also helps you rank for auto-suggest queries. So when Google sees people search for “best cars 2019 Toyota,” that might appear in the suggest bar, which also helps you to rank if you’re ranking for those terms. So generating search clicks instead of linking directly to your URL is one of those advanced techniques that some SEOs use.

9. Target query deserves freshness

When you’re creating the new content, you can help it to rank sooner if you pick terms that Google thinks deserve freshness. It’s best maybe if I just use a couple of examples here.

Consider a user searching for the term “cafes open Christmas 2019.” That’s a query for which Google wants to deliver a very fresh result. Users want the freshest news about cafes and restaurants that will be open on Christmas 2019, so Google is going to give preference to pages created more recently. When you target those queries, you can maybe rank a little faster.

Compare that to a query like “history of the Bible.” If you Google that right now, you’ll probably find a lot of very old pages, Wikipedia pages. Those results don’t update much, and that’s going to be harder for you to crack into those SERPs with newer content.

The way to tell is simply to type in the queries that you’re trying to rank for and see how old the most recent results are. That will give you an indication of how much freshness Google thinks the query deserves. Choose queries that deserve a little more freshness and you might be able to get in a little sooner.

10. Leverage URL structure

Finally, the last tip. This is something a lot of sites do, and a lot of sites don’t do simply because they’re not aware of it: leverage URL structure. When Google sees a new URL, a new page to index, they don’t have all the signals yet to rank it. They have a lot of algorithms that try to guess where they should rank it. They’ve indicated in the past that they leverage the URL structure to determine some of that.

Consider how The New York Times puts all its book reviews under the same URL structure, nytimes.com/book-reviews. They have a lot of established ranking signals for all of these URLs. When a new URL is published using the same structure, they can assign it some temporary signals to rank it appropriately.

If you have URLs that are high authority, maybe it’s your blog, maybe it’s your resources on your site, and you’re leveraging an existing URL structure, new content published using the same structure might have a little bit of a ranking advantage, at least in the short run, until Google can figure these things out.

These are only a few of the ways to get your content indexed and ranking quicker. It is by no means a comprehensive list. There are a lot of other ways. We’d love to hear some of your ideas and tips. Please let us know in the comments below. If you like this video, please share it for me. Thanks, everybody.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Try This System to Manage Your Blog Comments Faster (and with Less Stress)

I think the best way to introduce the topic of this post is to remind you that my favorite word is No. At the risk of sounding no-fun, I like rules. If you’re in a position to set rules for any given situation, they can help you reach solutions to issues faster and avoid future
Read More…

The post Try This System to Manage Your Blog Comments Faster (and with Less Stress) appeared first on Copyblogger.


Copyblogger


Faster, Fresher, Better: Announcing Link Explorer, Moz’s New Link Building Tool

Posted by SarahBird

More link data. Fresher link data. Faster link data.

Today, I’m delighted to share that after eons of hard work, blood, sweat, tears, and love, Moz is taking a major step forward on our commitment to provide the best SEO tools money can buy.

We’ve rebuilt our link technology from the ground up and the data is now broadly available throughout Moz tools. It’s bigger, fresher, and much, much faster than our legacy link tech. And we’re just getting started! The best way to quickly understand the potential power of our revolutionary new link tech is to play with the beta of our Link Explorer.

Introducing Link Explorer, the newest addition to the Moz toolset!

We’ve heard your frustrations with Open Site Explorer and we know that you want more from Moz and your link building tools. OSE has done more than put in its time. Groundbreaking when it launched in 2008, it’s worked long and hard to bring link data to the masses. It deserves the honor of a graceful retirement.

OSE represents our past; the new Link Explorer is our fast, innovative, ambitious future.

Here are some of my favorite things about the Link Explorer beta:

  • It’s 20x larger and 30x fresher than OSE (RIP)
  • Despite its huge index size, the app is lightning fast! I can’t stand waiting so this might be my number-one fav improvement.
  • We’re introducing Link Tracking Lists to make managing your link building efforts a breeze. Sometimes the simple things make the biggest difference, like when they started making vans with doors on each side. You’ll never go back.
  • Link Explorer includes historic data, a painful gap in OSE. Studying your gained/lost linking domains is fast and easy.
  • The new UX surfaces competitive insights much more quickly
  • Increasing the size and freshness of the index improved the quality of Domain Authority and Spam Score. Voilà.

All this, and we’re only in beta.

Dive into your link data now!

Here’s a deeper dive into my favorites:

#1: The sheer size, quality, and speed of it all

We’re committed to data quality. Here are some ways that shows up in the Moz tools:

  • When we collect rankings, we evaluate the natural first page of rankings to ensure that the placement and content of featured snippets and other SERP features are correctly situated (which can go wrong when rankings are collected in 50- or 100-page batches). This is more expensive, but we think the tradeoff is worth it.
  • We were the first to build a hybrid search volume model using clickstream data. We still believe our model is the most accurate.
  • Our SERP corpus, which powers Keywords by Site, is completely refreshed every two weeks. We actively update up to 15 million of the keywords each month to remove keywords that are no longer being searched and replace them with trending keywords and terms. This helps keep our keyword data set fresh and relevant.

The new Link Explorer index extends this commitment to data quality. OSE wasn’t cutting it and we’re thrilled to unleash this new tech.

Link Explorer is over 20x larger and 30x fresher than our legacy link index. Bonus points: the underlying technology is very cost-efficient, making it much less expensive for us to scale over time. This frees up resources to focus on feature delivery. BOOM!

One of my top pet peeves is waiting. I feel physical pain while waiting in lines and for apps to load. I can’t stand growing old waiting for a page to load (amirite?).

The new Link Explorer app is delightfully, impossibly fast. It’s like magic. That’s how link research should be. Magical.

#2: Historical data showing discovered and lost linking domains

If you’re a visual person, this report gives you an immediate idea of how your link building efforts are going. A spike you weren’t expecting could be a sign of spam network monkey business. Deep-dive effortlessly on the links you lost and gained so you can spend your valuable time doing thoughtful, human outreach.

#3: Link Tracking Lists

Folks, this is a big one. Throw out (at least one of… ha. ha.) those unwieldy spreadsheets and get on board with Link Tracking Lists, because these are the future. Have you been chasing a link from a particular site? Wondering if your outreach emails have borne fruit yet? Want to know if you’ve successfully placed a link, and how you’re linking? Link Tracking Lists cut out a huge time-suck when it comes to checking back on which of your target sites have actually linked back to you.

Why announce the beta today?

We’re sharing this now for a few reasons:

  • The new Link Explorer data and app have been available in beta to a limited audience. Even with a quiet, narrow release, the SEO community has been talking about it and asking good questions about our plans. Now that the Link Explorer beta is in broad release throughout all of Moz products and the broader Moz audience can play with it, we’re expecting even more curiosity and excitement.
  • If you’re relying on our legacy link technology, this is further notice to shift your applications and reporting to the new-and-improved tech. OSE will be retired soon! We’re making it easier for API customers to get the new data by providing a translation layer for the legacy API.
  • We want and need your feedback. We are committed to building the very best link building tool on the planet. You can expect us to invest heavily here. We need your help to guide our efforts and help us make the most impactful tradeoffs. This is your invitation to shape our roadmap.

Today’s release of our new Link Explorer technology is a revolution in Moz tools, not an evolution. We’ve made a major leap forward in our link index technology that delivers a ton of immediate value to Moz customers and the broader Moz Community.

Even though there are impactful improvements around the corner, this ambitious beta stands on its own two feet. OSE wasn’t cutting it and we’re proud of this new, fledgling tech.

What’s on the horizon for Link Explorer?

We’ve got even more features coming in the weeks and months ahead. Please let us know if we’re on the right track.

  • Link Building Assistant: a way to quickly identify new link acquisition opportunities
  • A more accurate and useful Link Intersect feature
  • Link Alerts to notify you when you get a link from a URL you were tracking in a list
  • Changes to how we count redirects: Currently we don’t count links to a redirect as links to the target of the redirect (that’s a lot of redirects), but we have this planned for the future.
  • Significantly scaling up our crawling to further improve freshness and size

Go forth, and explore:

Try the new Link Explorer!

Tomorrow Russ Jones will be sharing a post that discusses the importance of quality metrics when it comes to a link index, and don’t miss our pinned Q&A post answering questions about Domain Authority and Page Authority changes or our FAQ in the Help Hub.

We’ll be releasing early and often. Watch this space, and don’t hold back your feedback. Help us shape the future of Links at Moz. We’re listening!



Moz Blog


Accelerated Mobile Pages: Is faster better?

Google has doubled down on Accelerated Mobile Pages (AMP), its open source initiative designed to improve web page speed and performance for mobile users. More than 2 billion AMP pages have been published from over 900,000 domains, and many online publishers report significant gains in both traffic…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Avoid This Rookie Marketing Habit to Get New Customers Faster

A component of my publishing philosophy is: “Wanting to write something does not guarantee that someone will want to read it.” And it comes into play when you write the first marketing materials for your business — many new marketers get excited about a type of writing that doesn’t turn out to be engaging. The
Read More…

The post Avoid This Rookie Marketing Habit to Get New Customers Faster appeared first on Copyblogger.


Copyblogger


New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news — as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn’t any. It’s bigger, better, faster, and you won’t pay an extra dime for it.

A moment of humility, though — if you’ve used our existing site crawl, you know it hasn’t always lived up to your expectations. Truth is, it hasn’t lived up to ours, either. Over a year ago, we set out to rebuild the back end crawler, but we realized quickly that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt “Aardwolf” engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.

View/search *all* URLs

One major limitation of our old crawler is that you could only see pages with known issues. Click on “All Crawled Pages” in the new crawler, and you’ll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let’s say I just wanted to see all of the pages crawled for Moz.com in the “/blog” directory…

I just click the [+], select “URL,” enter “/blog,” and I’m on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.
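As a sketch of that slice-and-dice workflow (the column names here are hypothetical; check the header row of your actual export):

```python
import csv
import io

# A stand-in for the exported file; in practice: open("site_crawl_export.csv")
export = io.StringIO(
    "URL,Status Code,Total Issues,Fetch Time (ms)\n"
    "https://example.com/,200,0,120\n"
    "https://example.com/blog,301,1,80\n"
    "https://example.com/old-page,404,2,95\n"
)

rows = list(csv.DictReader(export))

# Non-200 pages, slowest fetch first
problems = sorted(
    (r for r in rows if r["Status Code"] != "200"),
    key=lambda r: int(r["Fetch Time (ms)"]),
    reverse=True,
)
for r in problems:
    print(r["URL"], r["Status Code"], r["Fetch Time (ms)"] + "ms")
```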

Recrawl your site immediately

Sometimes, you just can’t wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem, just click “Recrawl my site” from the top of any page in the Site Crawl section, and you’ll be on your way…

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you’re under tight deadlines for client reviews, we understand that waiting just isn’t an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what’s critical for one site is barely a nuisance for another. For example, let’s say I don’t care about a handful of overly dynamic URLs (for many sites, it’s a minor issue). With the new Site Crawl, I can just select those issues and then “Ignore” them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We’ll also keep tracking any new issues that pop up over time. Just because you don’t care about something today doesn’t mean you won’t need to know about it a month from now.

Fix duplicate content

Under “Content Issues,” we’ve launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the “parent” page. Here’s a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we’ve misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what’s a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we’ve attempted to balance priority and scope to target your top three issues to fix:

Moving forward, we’re going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven’t helped you do as well as we could. We’re going to do our best to change that.

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we’ve been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you’ll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and be sure to register for the upcoming webinar.



Moz Blog


5 Ways to Write More, Faster, and with Less Stress


This week on Hit Publish, host Amy Harrison is talking momentum and productivity tricks. If you’ve ever found that things get in the way of your flow when you write and wish you could get more out of your writing time, this episode is for you.

On a previous show, Amy looked at how you could create content consistently, even when life gets in the way. Well, since then, a listener wrote a Dear Amy letter and asked about how to make actual writing time more productive.

Amy has struggled more than once to stay focused when she has a content creation project to do, but she’s found some workarounds to keep her on track.

Tune in to this episode of Hit Publish to find out:

  • Why you should expect (nay, embrace) overwhelm and then use this one rule to show it who’s really boss (answer: you are)
  • How ticking things off gives your writing simple momentum and is as addictive as popping bubble wrap
  • What a “sticky thoughts” pad is, and why every writer should have one
  • How to avoid “Writer’s Eye,” which can slow you down and cause you to make mistakes

Listen to Hit Publish on iTunes, or listen on Rainmaker.FM.

About the author

Rainmaker.FM

Rainmaker.FM is the premier digital commerce and content marketing podcast network. Get on-demand digital business and marketing advice from experts, whenever and wherever you want it.

The post 5 Ways to Write More, Faster, and with Less Stress appeared first on Copyblogger.


Copyblogger


#3 Most Read Article of 2014: Index Your Content Faster With the Fetch as Google Tool

Have new content that you’d like to be discovered and found in Google’s search results more quickly? Within Google Webmaster Tools is the Fetch as Google tool, which gives users the opportunity to submit new URLs to Google’s index.

Home – SearchEngineWatch



How to Improve Your Conversion Rates with a Faster Website

Posted by Zoompf


Back in August the team at Zoompf published a joint research study with Moz analyzing How Website Speed Actually Impacts Search Ranking. In this research, a surprise result showed no clear correlation between page load time and search ranking. This confounded us, since we expected to see at least some small measure of correlation, especially after Google announced in 2010 that site speed would have a partial impact on search ranking. We did, however, observe a correlation between “Time to First Byte” and search ranking, and we delved into more detail in our follow-up post.

Readers of both articles noted that while page load time may not directly impact search ranking today, it has an obvious impact on user experience and will likely influence search ranking more in the future. In other words, page load time should still be a priority for the success of your site.

But how big a priority is it really? Of course, it depends: the slower your site is now, the further your user experience lags behind your competitors'. Additionally, the more traffic your site receives, the more benefit you'll see from performance optimization (we'll dig into that more below).

The good news is that, unlike the impact on search ranking, there is a wide body of independent research showing clear causation between improved site performance and increased conversion rates, user engagement, and customer satisfaction. It also just makes sense—we’ve all visited slow websites, and we’ve all bailed out when the page takes too long to load. On mobile we’re even less patient.

What may be surprising, though, is just how big an impact slow performance can have on your conversions. Let's look at that first.

The research


Back in 2006, Amazon presented one of the first studies linking page load time to online customer revenue, summarized in Greg Linden's presentation Make Data Useful. Through A/B testing, Greg showed that every 100-millisecond delay in page rendering time resulted in a 1% loss of sales for Amazon.

In more recent research, Intuit presented findings at Velocity 2013 from their effort to reduce page load time from 15 seconds to 2 seconds. During that effort, they observed a dramatic increase in conversions for every second shaved off the page load time, in a stair step that flattened as the pages got faster. Specifically:

  • +3% conversions for every second saved between 15 seconds and 7 seconds
  • +2% conversions for every second saved between 7 seconds and 5 seconds
  • +1% conversions for every second saved between 4 seconds and 2 seconds

So, in other words, there was tremendous value in the initial optimization, with diminishing returns as the site got faster.
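The cumulative effect of that stair step is easy to sketch with a small calculation. Note the per-second percentages below are the Intuit figures quoted above, summed as a rough approximation; the 5-second-to-4-second band was not specified in the talk, so it is left out, making this an illustrative lower bound rather than Intuit's exact model:

```javascript
// Approximate cumulative conversion lift from the Intuit stair step quoted
// above. The 5s -> 4s band is not specified in the source, so it is omitted.
const bands = [
  { from: 15, to: 7, liftPerSecond: 3 }, // +3% per second saved
  { from: 7,  to: 5, liftPerSecond: 2 }, // +2% per second saved
  { from: 4,  to: 2, liftPerSecond: 1 }, // +1% per second saved
];

function cumulativeLift(startSeconds, endSeconds) {
  let lift = 0;
  for (const band of bands) {
    // Overlap between the improvement [endSeconds, startSeconds]
    // and this band's range [band.to, band.from], in whole seconds.
    const hi = Math.min(startSeconds, band.from);
    const lo = Math.max(endSeconds, band.to);
    if (hi > lo) lift += (hi - lo) * band.liftPerSecond;
  }
  return lift;
}

console.log(cumulativeLift(15, 2)); // roughly 30 percentage points in total
```

Even as a back-of-the-envelope figure, it shows why the first seconds you shave off are worth far more than the last ones.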

In another recent report, Kyle Rush from the 2011 Obama for America campaign site showed through A/B testing that a 3-second reduction in page load time (from 5 seconds to 2 seconds) improved onsite donations by 14%, contributing to an increase of over $34 million in election contributions.

In fact, there’s a wide body of research supporting clear economic benefits of improving your site performance, and clearly the slower your site is, the more you have to gain. Additionally, the higher your traffic, the larger the impact each millisecond will yield.

How fast should I be?

Whenever we talk with people about web performance, they always want to know "How fast should I be?" Unfortunately, this is hard to answer, since it depends on your business goals. Those in the performance industry (of which, full disclosure, Zoompf is a member) may push you to hit two seconds or less, citing research such as that from Forrester showing that 47% of users expect pages to load in two seconds or less.

We prefer a more pragmatic approach: optimize to the point where the ROI continues to make sense. The higher your traffic, the more monetary difference each millisecond gained will make. If you're Amazon.com, a 200-ms improvement could mean millions of dollars. If you're just launching a new site, getting down to 4-6 seconds may be good enough. It's really a judgment call based on your current traffic levels, where your competition sits, your budget, and your strategic priorities.

The first step, though, is to measure where you stand. Fortunately, there’s a great free tool supported by Google at WebPageTest.org that can measure your page load time from various locations around the world. If you receive a lot of international traffic, don’t just select a location close to home—see how fast your site is loading from Sydney, London, Virginia, etc. The individual results may vary quite a bit! WebPageTest has a lot of bells and whistles, so check out this beginner’s guide to learn more.

Where do I start?

Improving the performance of your site can seem daunting, so it's important to start with the low-hanging fruit. Steve Souders, the Head Performance Engineer at Google, has famously stated:

“80-90% of the end-user response time is spent on the front-end. Start there.”

This has come to be called the Performance Golden Rule. In layman’s terms, this means that while optimizing your web server and database infrastructure is important, you will get a higher return on your time investment by first optimizing the front-end components loaded by your users’ browsers. This means all the images, CSS, JavaScript, Flash and other resources linked as dependencies from your base HTML page.
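To make the 80-90% figure concrete, here's a toy breakdown. Note the millisecond numbers are invented for illustration, not measured data from any real site:

```javascript
// Illustrative (made-up) timing breakdown for a single page load, in ms,
// showing why the Performance Golden Rule points at the front end first.
const timings = {
  baseHtml: 400, // back-end: generate and deliver the base HTML page
  frontEnd: { images: 1800, css: 350, js: 900, fonts: 250 }, // everything else
};

const frontEndTotal = Object.values(timings.frontEnd).reduce((a, b) => a + b, 0);
const total = timings.baseHtml + frontEndTotal;

console.log(`${Math.round((100 * frontEndTotal) / total)}% of load time is front-end`);
```

With numbers like these, even halving your back-end time saves only 200 ms, while trimming a third of the front-end weight saves over a second.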

You can see the Performance Golden Rule well illustrated in a typical waterfall chart returned by tools like WebPageTest. Note how the request for the original base page makes up only a small fraction of the overall time. Generating that base page is where all the back-end server work happens, yet all the other resources it includes (images, CSS, etc.) take the large majority of the time to load:

[Image: waterfall_frontend]

So how can you speed up your front-end performance and reap the rewards of a better user experience? There are literally hundreds of ways. In the sections below, we will focus on the high-level best practices that generally yield the most benefit for the least amount of effort.

Step 1: Reduce the size of your page

Bloated content takes a long time to download. By reducing the size of your page, you not only improve your speed, you also reduce the network bandwidth your hosting provider charges you for.

An easy optimization is enabling HTTP compression, which can often reduce the size of your text resources (HTML, CSS, and JavaScript) by 50% or more. WhatsMyIP.org has a great free tool to test whether compression is turned on for your site. When using it, don't just test the URL of your home page; also test links to your JavaScript and CSS files. We often find compression turned on for HTML files but not for JavaScript and CSS, so configuring your server properly can represent a considerable performance boost. Keep in mind, though, that you do NOT want your server to compress images, as they are already compressed; the extra server processing time will only slow things down. You can learn more in this detailed guide on what content you should compress on your website.

If you find your server is not using compression, talk to your server admin or hosting provider to turn it on. It's often a simple configuration setting; for example, see the mod_deflate module for Apache, the IIS 7 configuration docs, or this article on enabling it on WordPress sites.

In addition, images can often contribute 80% or more of your total page download size, so it's very important to optimize them as well. Follow these best practices to cut down your image size, by 50% or more in some cases:

  • Don’t use PNG images for photos. JPEG images compress photographs to significantly smaller sizes with great image quality. For example, on Windows 8 launch day, the Microsoft homepage used a 1 megabyte PNG photograph when a visually comparable JPEG would have been 140k! Think of all the wasted bandwidth on that one image alone!
  • Don’t overuse PNGs for transparency. Transparency is a great effect (and one JPEG doesn’t support), but if you don’t need it, you don’t need the extra file size of a PNG, especially for photographs. PNGs work better for logos and images with sharp contrast, like text.
  • Correctly set your JPEG image quality. Using a quality setting of 50-75% can significantly reduce the size of your image without noticeable impact on image quality. Of course, each result should be individually evaluated. In most cases your image sizes should all be less than 100k, and preferably smaller.
  • Strip out extraneous metadata from your images. Image editors leave a lot of “junk” in your image files, including thumbnails, comments, unused palette entries and more. While these are useful to the designer, they don’t need to be downloaded by your users. Instead, have your designer make a backup copy for their own use, and then run the website image versions through a free optimizer like Yahoo’s Smush.It or open source tools like pngcrush and jpegtran.

Lastly, another good way to reduce your page size is to minify your JavaScript and CSS. “Minification” is a process that strips the extra comments and whitespace from your code and shortens the names of functions and variables. This is best seen by example:

Example: Original JavaScript

 /* ALERT PLUGIN DEFINITION
  * ======================= */
  var old = $.fn.alert
  $.fn.alert = function (option) {
    return this.each(function () {
      var $this = $(this)
        , data = $this.data('alert')
      if (!data) $this.data('alert', (data = new Alert(this)))
      if (typeof option == 'string') data[option].call($this)
    })
  }
  $.fn.alert.Constructor = Alert

Minified Version (from YUI Compressor):

var old=$.fn.alert;$.fn.alert=function(a){return this.each(function(){var c=$(this),b=c.data("alert");if(!b){c.data("alert",(b=new Alert(this)))}if(typeof a=="string"){b[a].call(c)}})};

Your minified pages will still render the same, and minification can often reduce file sizes by 10-20% or more. As you can see, it also has the added benefit of obfuscating your code, making it harder for competitors to copy and modify your hard-earned work for their own purposes. JSCompress is a simple online tool for JavaScript, or you can try more powerful tools like JSMin or Yahoo’s YUI Compressor (which also works for CSS). There’s also a useful online version of YUI Compressor, which we recommend.

Step 2: Reduce the number of browser requests

The more resources your browser requests to render your page, the longer it will take to load. A great strategy to reduce your page load time is simply to cut down the number of requests your page has to make. This means fewer images, fewer JavaScript files, fewer analytics beacons, etc. There’s a reason Google’s homepage is so spartan: the clean interface has very few dependencies and thus loads super fast.

While “less is more” should be the goal, we realize this is not always possible, so here are some additional strategies you can employ:

  • Allow browser caching. If your page dependencies don’t change often, there’s no reason the browser should download them again and again. Talk to your server admin to make sure caching is turned on for your images, JS and CSS. A quick test is to plug the URL of one of your images into redbot.org and look for the header Expires or Cache-Control: max-age in the result. For example, this image off the eBay home page will be cached by your browser for 28,180,559 seconds (just over 1 year).

[Image: expires_header2]

Cache-Control is the newer way of doing things, but you’ll often also see Expires to support older browsers. If both are present, Cache-Control “wins” in newer browsers.
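That precedence rule can be expressed directly: compute the freshness lifetime from Cache-Control: max-age when present, falling back to Expires otherwise. This is a simplified sketch, not a full HTTP cache implementation, and it assumes lower-cased header names (as Node's http module provides); directives like no-cache and s-maxage are ignored:

```javascript
// Freshness lifetime in seconds, following the precedence described above:
// Cache-Control: max-age wins when present; Expires is the fallback.
// Simplified: ignores no-cache, s-maxage, and other directives.
function freshnessLifetime(headers, now = Date.now()) {
  const cc = headers['cache-control'] || '';
  const match = cc.match(/max-age=(\d+)/);
  if (match) return Number(match[1]);
  if (headers['expires']) {
    const expires = Date.parse(headers['expires']);
    return Math.max(0, Math.floor((expires - now) / 1000));
  }
  return 0; // no caching information: treat as not cacheable
}

const headers = {
  'cache-control': 'public, max-age=28180559',
  'expires': 'Thu, 01 Jan 2026 00:00:00 GMT',
};
console.log(freshnessLifetime(headers)); // 28180559 -- max-age wins
```

Running a few of your own responses through a helper like this is a quick way to verify your caching headers actually say what you think they say.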

While browser side caching will not speed up the initial page load of your site, it will make a HUGE difference on repeat views, often knocking off 70% or more of the time. You can see this clearly when looking at the “Repeat View” metrics in a WebPageTest test, for example:

[Image: browser_caching]

  • Combine related CSS and JS files. While numerous individual CSS and JS files are easier for your developers to maintain, fewer files load much faster in the browser. If your files change infrequently, a one-time concatenation is an easy win. If they change frequently, consider adding a step to your deploy process that automatically concatenates related groups of functionality prior to deployment, grouped by functional area. There are pros and cons to each approach, but there’s some great info in this StackOverflow thread.
  • Combine small images into CSS sprites. If your site has lots of small images (buttons, icons, etc.), you can realize significant performance gains by combining them all into a single image file called a “sprite.” Sprites are more challenging to implement, but can yield significant performance gains for visually rich sites. See the CSS Image Sprites article on w3schools for more information, and check out the free tool SpriteMe.

Step 3: Reduce the distance to your site

If your website is hosted in Virginia, but your users are visiting from Australia, it’s going to take them a long time to download your images, JavaScript and CSS. This can be a big problem if your site is content-heavy and you get a lot of traffic from users far away. Fortunately, there’s an easy answer: Sign up for a Content Delivery Network (CDN). There are many excellent ones out there now, including Akamai, Amazon CloudFront, CloudFlare and more.

CDNs work basically like this: you change the URL of your images, JS and CSS from something like this:

http://mysite.com/myimage.png

to something like this (as per the instructions given to you from your CDN provider):

http://d34vewdf5sdfsdfs.cloudfront.net/myimage.png

This instructs the browser to look for your image on the CDN network. The CDN provider returns the image to the browser if it already has it; otherwise it pulls the image from your site and stores it for reuse later. The magic of CDNs is that they then copy that same image (or JavaScript or CSS file) to dozens, hundreds or even thousands of “edge nodes” around the world and route each browser request to the closest available location. So if you’re in Melbourne and request an image hosted in Virginia, you may instead get a copy from Sydney. Just like magic.
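The URL swap itself is mechanical, and is often done with a small helper at template-render time. A sketch, where the CDN hostname is a made-up placeholder like the one above (your CDN provider assigns the real one), and the extension list is an assumption about which assets you want served from the CDN:

```javascript
// Rewrite static-asset URLs to the CDN hostname at render time.
// CDN_HOST is a made-up placeholder, not a real endpoint.
const CDN_HOST = 'd34vewdf5sdfsdfs.cloudfront.net';
const ASSET_EXTENSIONS = /\.(png|jpe?g|gif|css|js)$/i;

function cdnUrl(url) {
  const parsed = new URL(url);
  // Only rewrite static assets; HTML pages keep hitting the origin server.
  if (ASSET_EXTENSIONS.test(parsed.pathname)) {
    parsed.host = CDN_HOST;
  }
  return parsed.toString();
}

console.log(cdnUrl('http://mysite.com/myimage.png'));
// -> http://d34vewdf5sdfsdfs.cloudfront.net/myimage.png
```

Keeping the rewrite in one helper also makes it trivial to disable the CDN in development or switch providers later.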

To illustrate, picture a single centralized server on one side versus the same content duplicated on servers around the world on the other.

In closing

While front-end performance does not currently appear to have a direct impact on search ranking, it has a clear impact on user engagement and conversions into paying customers. Since page load time also has a direct impact on user experience, it is very likely to have a future impact on search ranking.

While there are many ways to optimize your site, we suggest three core principles to remember when optimizing your site:

  1. Reduce the size of your page
  2. Reduce the number of browser requests
  3. Reduce the distance to your site

Within each of these, there are different strategies that apply based on the makeup of your site. We at Zoompf have also introduced several free tools that can help you determine which areas will make the biggest impact, and we also support a free tool to analyze your website for over 400 common causes of slow front-end performance. You can find them here: http://zoompf.com/free.

Happy hunting!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog

