Tag Archive | "Faster"

Try This System to Manage Your Blog Comments Faster (and with Less Stress)

I think the best way to introduce the topic of this post is to remind you that my favorite word is No. At the risk of sounding no-fun, I like rules. If you’re in a position to set rules for any given situation, they can help you reach solutions to issues faster and avoid future
Read More…

The post Try This System to Manage Your Blog Comments Faster (and with Less Stress) appeared first on Copyblogger.


Copyblogger


Faster, Fresher, Better: Announcing Link Explorer, Moz’s New Link Building Tool

Posted by SarahBird

More link data. Fresher link data. Faster link data.

Today, I’m delighted to share that after eons of hard work, blood, sweat, tears, and love, Moz is taking a major step forward on our commitment to provide the best SEO tools money can buy.

We’ve rebuilt our link technology from the ground up and the data is now broadly available throughout Moz tools. It’s bigger, fresher, and much, much faster than our legacy link tech. And we’re just getting started! The best way to quickly understand the potential power of our revolutionary new link tech is to play with the beta of our Link Explorer.

Introducing Link Explorer, the newest addition to the Moz toolset!

We’ve heard your frustrations with Open Site Explorer and we know that you want more from Moz and your link building tools. OSE has done more than put in its time. Groundbreaking when it launched in 2008, it’s worked long and hard to bring link data to the masses. It deserves the honor of a graceful retirement.

OSE represents our past; the new Link Explorer is our fast, innovative, ambitious future.

Here are some of my favorite things about the Link Explorer beta:

  • It’s 20x larger and 30x fresher than OSE (RIP)
  • Despite its huge index size, the app is lightning fast! I can’t stand waiting so this might be my number-one fav improvement.
  • We’re introducing Link Tracking Lists to make managing your link building efforts a breeze. Sometimes the simple things make the biggest difference, like when they started making vans with doors on each side. You’ll never go back.
  • Link Explorer includes historic data, a painful gap in OSE. Studying your gained/lost linking domains is fast and easy.
  • The new UX surfaces competitive insights much more quickly
  • Increasing the size and freshness of the index improved the quality of Domain Authority and Spam Score. Voilà.

All this, and we’re only in beta.

Dive into your link data now!

Here’s a deeper dive into my favorites:

#1: The sheer size, quality, and speed of it all

We’re committed to data quality. Here are some ways that shows up in the Moz tools:

  • When we collect rankings, we evaluate the natural first page of results to ensure that the placement and content of featured snippets and other SERP features are captured correctly (something that can go wrong when rankings are collected in 50- or 100-result batches). This is more expensive, but we think the tradeoff is worth it.
  • We were the first to build a hybrid search volume model using clickstream data. We still believe our model is the most accurate.
  • Our SERP corpus, which powers Keywords by Site, is completely refreshed every two weeks. We actively update up to 15 million of the keywords each month to remove keywords that are no longer being searched and replace them with trending keywords and terms. This helps keep our keyword data set fresh and relevant.

The new Link Explorer index extends this commitment to data quality. OSE wasn’t cutting it and we’re thrilled to unleash this new tech.

Link Explorer is over 20x larger and 30x fresher than our legacy link index. Bonus points: the underlying technology is very cost-efficient, making it much less expensive for us to scale over time. This frees up resources to focus on feature delivery. BOOM!

One of my top pet peeves is waiting. I feel physical pain while waiting in lines and for apps to load. I can’t stand growing old waiting for a page to load (amirite?).

The new Link Explorer app is delightfully, impossibly fast. It’s like magic. That’s how link research should be. Magical.

#2: Historical data showing discovered and lost linking domains

If you’re a visual person, this report gives you an immediate idea of how your link building efforts are going. A spike you weren’t expecting could be a sign of spam network monkey business. Deep-dive effortlessly on the links you lost and gained so you can spend your valuable time doing thoughtful, human outreach.

#3: Link Tracking Lists

Folks, this is a big one. Throw out (at least one of… ha. ha.) those unwieldy spreadsheets and get on board with Link Tracking Lists, because these are the future. Have you been chasing a link from a particular site? Wondering if your outreach emails have borne fruit yet? Want to know if you’ve successfully placed a link, and how you’re linking? Link Tracking Lists cut out a huge time-suck when it comes to checking back on which of your target sites have actually linked back to you.

Why announce the beta today?

We’re sharing this now for a few reasons:

  • The new Link Explorer data and app have been available in beta to a limited audience. Even with a quiet, narrow release, the SEO community has been talking about it and asking good questions about our plans. Now that the Link Explorer beta is broadly available throughout Moz products and the wider Moz audience can play with it, we’re expecting even more curiosity and excitement.
  • If you’re relying on our legacy link technology, this is further notice to shift your applications and reporting to the new-and-improved tech. OSE will be retired soon! We’re making it easier for API customers to get the new data by providing a translation layer for the legacy API.
  • We want and need your feedback. We are committed to building the very best link building tool on the planet. You can expect us to invest heavily here. We need your help to guide our efforts and help us make the most impactful tradeoffs. This is your invitation to shape our roadmap.

Today’s release of our new Link Explorer technology is a revolution in Moz tools, not an evolution. We’ve made a major leap forward in our link index technology that delivers a ton of immediate value to Moz customers and the broader Moz Community.

Even though there are impactful improvements around the corner, this ambitious beta stands on its own two feet. OSE wasn’t cutting it and we’re proud of this new, fledgling tech.

What’s on the horizon for Link Explorer?

We’ve got even more features coming in the weeks and months ahead. Please let us know if we’re on the right track.

  • Link Building Assistant: a way to quickly identify new link acquisition opportunities
  • A more accurate and useful Link Intersect feature
  • Link Alerts to notify you when you get a link from a URL you were tracking in a list
  • Changes to how we count redirects: Currently we don’t count links to a redirect as links to the target of the redirect (that’s a lot of redirects), but we have this planned for the future.
  • Significantly scaling up our crawling to further improve freshness and size

Go forth, and explore:

Try the new Link Explorer!

Tomorrow Russ Jones will be sharing a post that discusses the importance of quality metrics when it comes to a link index, and don’t miss our pinned Q&A post answering questions about Domain Authority and Page Authority changes or our FAQ in the Help Hub.

We’ll be releasing early and often. Watch this space, and don’t hold back your feedback. Help us shape the future of Links at Moz. We’re listening!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


Accelerated Mobile Pages: Is faster better?

Google has doubled down on Accelerated Mobile Pages (AMP), its open source initiative designed to improve web page speed and performance for mobile users. More than 2 billion AMP pages have been published from over 900,000 domains, and many online publishers report significant gains in both traffic…



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing


Avoid This Rookie Marketing Habit to Get New Customers Faster

A component of my publishing philosophy is: “Wanting to write something does not guarantee that someone will want to read it.” And it comes into play when you write the first marketing materials for your business — many new marketers get excited about a type of writing that doesn’t turn out to be engaging. The
Read More…

The post Avoid This Rookie Marketing Habit to Get New Customers Faster appeared first on Copyblogger.


Copyblogger


New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news — as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn’t any. It’s bigger, better, faster, and you won’t pay an extra dime for it.

A moment of humility, though — if you’ve used our existing site crawl, you know it hasn’t always lived up to your expectations. Truth is, it hasn’t lived up to ours, either. Over a year ago, we set out to rebuild the back end crawler, but we realized quickly that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt “Aardwolf” engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.

View/search *all* URLs

One major limitation of our old crawler was that you could only see pages with known issues. Click on “All Crawled Pages” in the new crawler, and you’ll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let’s say I just wanted to see all of the pages crawled for Moz.com in the “/blog” directory…

I just click the [+], select “URL,” enter “/blog,” and I’m on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.
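If you do export the crawl, a few lines of Python make it easy to slice the file however you like. The sketch below is only a rough illustration: the file name and the column names (“URL,” “Status Code,” “Fetch Time (ms)”) are hypothetical stand-ins, so match them to the headers in your actual export.

  # Sketch: slice a Site Crawl CSV export with pandas.
  # File name and column names are hypothetical; match them to your export.
  import pandas as pd

  crawl = pd.read_csv("site_crawl_export.csv")

  # Slowest pages first (assumes the export includes a fetch-time column)
  slowest = crawl.sort_values("Fetch Time (ms)", ascending=False).head(20)
  print(slowest[["URL", "Fetch Time (ms)"]])

  # Pages under /blog that returned something other than a 200
  blog_errors = crawl[crawl["URL"].str.contains("/blog") & (crawl["Status Code"] != 200)]
  print(blog_errors[["URL", "Status Code"]])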

Recrawl your site immediately

Sometimes, you just can’t wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem, just click “Recrawl my site” from the top of any page in the Site Crawl section, and you’ll be on your way…

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you’re under tight deadlines for client reviews, we understand that waiting just isn’t an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what’s critical for one site is barely a nuisance for another. For example, let’s say I don’t care about a handful of overly dynamic URLs (for many sites, it’s a minor issue). With the new Site Crawl, I can just select those issues and then “Ignore” them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We’ll also keep tracking any new issues that pop up over time. Just because you don’t care about something today doesn’t mean you won’t need to know about it a month from now.

Fix duplicate content

Under “Content Issues,” we’ve launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the “parent” page. Here’s a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we’ve misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what’s a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we’ve attempted to balance priority and scope to target your top three issues to fix:

Moving forward, we’re going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven’t helped you do as well as we could. We’re going to do our best to change that.

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we’ve been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you’ll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and be sure to register for the upcoming webinar.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


5 Ways to Write More, Faster, and with Less Stress


This week on Hit Publish, host Amy Harrison is talking momentum and productivity tricks. If you’ve ever found that things get in the way of your flow when you write and wish you could get more out of your writing time, this episode is for you.

On a previous show, Amy looked at how you could create content consistently, even when life gets in the way. Well, since then, a listener wrote a Dear Amy letter and asked about how to make actual writing time more productive.

Amy has struggled more than once to stay focused when she has a content creation project to do, but she’s found some workarounds to keep her on track.

Tune in to this episode of Hit Publish to find out:

  • Why you should expect (nay, embrace) overwhelm and then use this one rule to show it who’s really boss (answer: you are)
  • How ticking things off gives your writing simple momentum and is as addictive as popping bubble wrap
  • What a “sticky thoughts” pad is, and why every writer should have one
  • How to avoid “Writer’s Eye,” which can slow you down and cause you to make mistakes

Click Here to Listen to Hit Publish on iTunes

Click Here to Listen on Rainmaker.FM

About the author

Rainmaker.FM

Rainmaker.FM is the premier digital commerce and content marketing podcast network. Get on-demand digital business and marketing advice from experts, whenever and wherever you want it.

The post 5 Ways to Write More, Faster, and with Less Stress appeared first on Copyblogger.


Copyblogger


#3 Most Read Article of 2014: Index Your Content Faster With the Fetch as Google Tool

Have new content that you’d like to be discovered and found in Google’s search results more quickly? Within Google Webmaster Tools is the Fetch as Google tool, which gives users the opportunity to submit new URLs to Google’s index.

Home – SearchEngineWatch



How to Improve Your Conversion Rates with a Faster Website

Posted by Zoompf


Back in August the team at Zoompf published a joint research study with Moz analyzing How Website Speed Actually Impacts Search Ranking. In this research, a surprise result showed no clear correlation between page load time and search ranking. This confounded us, since we expected to see at least some small measure of correlation, especially after Google announced in 2010 that site speed would have a partial impact on search ranking. We did, however, observe a correlation between “Time to First Byte” and search ranking, and we delved into more detail in our follow-up post.

In the discussion of those two articles, our readers noted that while page load time may not appear to directly impact search ranking, it still has an obvious impact on user experience and will likely play a growing role in search ranking in the future. In other words, page load time should still be considered a priority for the success of your site.

But how big of a priority is it really? Of course it depends: the slower your site is now, the further your user experience lags behind your competitors’. Additionally, the more traffic your site receives, the more benefit you’ll see from performance optimization (we’ll dig into that more below).

The good news is that, unlike the impact on search ranking, there is a wide body of independent research showing clear causation between improved site performance and increased conversion rates, user engagement, and customer satisfaction. It also just makes sense—we’ve all visited slow websites, and we’ve all bailed out when the page takes too long to load. On mobile we’re even less patient.

What may be surprising, though, is just how big an impact slow performance can have on your conversions. Let’s look at that first.

The research


Back in 2006, Amazon presented one of the first studies linking a clear causation between page load time and online customer revenue, summarized in Greg Linden’s presentation Make Data Useful. Through A/B testing, Greg showed that every 100-millisecond delay in page rendering time resulted in a 1% loss of sales for Amazon.

In more recent research, Intuit presented findings at Velocity 2013 from their recent effort to reduce page load time from 15 seconds to 2 seconds. During that effort, they observed a dramatic increase in conversions for every second shaved off their page load time, in a stair step that decreased with increasing speed. Specifically:

  • +3% conversions for every second reduced from 15 seconds to 7 seconds
  • +2% conversions for every second reduced from 7 seconds to 5 seconds
  • +1% conversions for every second reduced from 4 seconds to 2 seconds

So in other words there was tremendous value in the initial optimization, and diminishing value as they got faster.
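To make that stair step concrete, here is a rough back-of-the-envelope calculation using only the brackets above. Treating the per-second lifts as additive is a simplifying assumption on our part, but it illustrates the diminishing returns clearly enough.

  # Rough arithmetic from the Intuit brackets above.
  # Assumption: per-second conversion lifts simply add up within each bracket.
  brackets = [
      (15, 7, 3.0),  # +3% conversions per second saved between 15s and 7s
      (7, 5, 2.0),   # +2% conversions per second saved between 7s and 5s
      (4, 2, 1.0),   # +1% conversions per second saved between 4s and 2s
  ]

  for start, end, lift_per_second in brackets:
      seconds_saved = start - end
      print(f"{start}s -> {end}s: roughly +{seconds_saved * lift_per_second:.0f}% conversions")

The first bracket alone works out to roughly a 24% lift, while the last two add only a few points each.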

In another recent report, Kyle Rush from the 2011 Obama for America campaign site showed through A/B testing that a 3-second page time reduction (from 5 seconds to 2 seconds) improved onsite donations by 14%, resulting in an increase of over $34 million in election contributions.

In fact, there’s a wide body of research supporting clear economic benefits of improving your site performance, and clearly the slower your site is, the more you have to gain. Additionally, the higher your traffic, the larger the impact each millisecond will yield.

How fast should I be?

Whenever we talk with people about web performance, they always want to know “How fast should I be?” Unfortunately this one is hard to answer, since the answer depends on your business goals. Those in the performance industry (of which, full disclosure, Zoompf is a member) may push you to hit two seconds or less, citing research such as that from Forrester showing that 47% of users expect pages to load in two seconds or less.

We prefer a more pragmatic approach: optimize to the point where the ROI continues to make sense. The higher your traffic, the more monetary difference each millisecond gained will make. If you’re Amazon.com, a 200-ms improvement could mean millions of dollars. If you’re just launching a new site, getting down to 4-6 seconds may be good enough. It’s really a judgment call based on your current traffic levels, where your competition sits, your budget, and your strategic priorities.

The first step, though, is to measure where you stand. Fortunately, there’s a great free tool supported by Google at WebPageTest.org that can measure your page load time from various locations around the world. If you receive a lot of international traffic, don’t just select a location close to home—see how fast your site is loading from Sydney, London, Virginia, etc. The individual results may vary quite a bit! WebPageTest has a lot of bells and whistles, so check out this beginner’s guide to learn more.
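WebPageTest is the right tool for real-browser measurements, but if you want a crude first look from your own machine, a few lines of Python can approximate time to first byte and HTML download time. This is only a sketch using the requests library with a placeholder URL, and it fetches a single document rather than the full page with all of its images, CSS, and JavaScript, so treat the numbers as a rough spot check.

  # Rough sketch: approximate TTFB and HTML download time for one URL.
  # This does NOT load images/CSS/JS the way a real browser (or WebPageTest) does.
  import time
  import requests

  url = "https://www.example.com/"  # placeholder; use your own page

  start = time.perf_counter()
  response = requests.get(url, stream=True, timeout=30)
  ttfb = time.perf_counter() - start      # time until the response headers arrive

  body = response.content                 # force the body download to complete
  total = time.perf_counter() - start

  print(f"Status: {response.status_code}")
  print(f"Time to first byte: {ttfb * 1000:.0f} ms")
  print(f"HTML ({len(body)} bytes) downloaded in {total * 1000:.0f} ms")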

Where do I start?

Improving the performance of your site can seem daunting, so it’s important you start with the low hanging fruit. Steve Souders, the Head Performance Engineer at Google, has famously stated:

“80-90% of the end-user response time is spent on the front-end. Start there.”

This has come to be called the Performance Golden Rule. In layman’s terms, this means that while optimizing your web server and database infrastructure is important, you will get a higher return on your time investment by first optimizing the front-end components loaded by your users’ browsers. This means all the images, CSS, JavaScript, Flash and other resources linked as dependencies from your base HTML page.

You can see the Performance Golden Rule well illustrated in a typical waterfall chart returned by tools like WebPageTest. Note how the original page requested is a very small subset of the overall time. Generating this original base page is where all the back-end server work is done. However, all the other resources included by that page (images, CSS, etc.) are what take the large majority of the time to load:

[Screenshot: a typical WebPageTest waterfall chart, in which the base HTML request is a small slice and the linked images, CSS, and JavaScript account for most of the load time]

So how can you speed up your front-end performance and reap the rewards of a better user experience? There are literally hundreds of ways. In the sections below, we will focus on the high-level best practices that generally yield the most benefit for the least amount of effort.

Step 1: Reduce the size of your page

Bloated content takes a long time to download. By reducing the size of your page, you not only improve your speed, you also reduce the network bandwidth your hosting provider charges you for.

An easy optimization is enabling HTTP compression, which can often reduce the size of your text resources (HTML, CSS, and JavaScript) by 50% or more. WhatsMyIP.org has a great free tool to test whether compression is turned on for your site. When using it, don’t just test the URL of your home page; also test links to your JavaScript and CSS files. Often we find compression is turned on for HTML files but not for JavaScript and CSS, so configuring compression properly on your server can represent a considerable performance boost. Keep in mind, though, you do NOT want your images to be compressed by the server, as they are already compressed; the extra server processing time will only slow things down. You can learn more in this detailed guide on what content you should compress on your website.
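If you prefer the command line to a web-based checker, a short script like this one (a sketch using the Python requests library, with placeholder URLs) will show whether a given resource comes back compressed. Swap in your own page, CSS, and JavaScript URLs and look for gzip or deflate in the output.

  # Sketch: check whether resources are served with HTTP compression.
  # The URLs are placeholders; substitute your own page, CSS, and JS files.
  import requests

  urls = [
      "https://www.example.com/",
      "https://www.example.com/style.css",
      "https://www.example.com/app.js",
  ]

  for url in urls:
      # Advertise compression support, just as a browser would.
      response = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"})
      encoding = response.headers.get("Content-Encoding", "none")
      print(f"{url}: Content-Encoding = {encoding}")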

If you find your server is not using compression, talk to your server admin or hosting provider to turn it on. It’s often a simple configuration setting; for example, see the mod_deflate module for Apache, the IIS 7 configuration docs, or this article on enabling compression on WordPress sites.

In addition, images can often account for 80% or more of your total page download size, so it’s very important to optimize them as well. Follow these best practices to cut down your image size, by 50% or more in some cases:

  • Don’t use PNG images for photos. JPEG images compress photographs to significantly smaller sizes with great image quality. For example, on Windows 8 launch day, the Microsoft homepage used a 1 megabyte PNG photograph when a visually comparable JPEG would have been 140k! Think of all the wasted bandwidth on that one image alone!
  • Don’t overuse PNGs for transparency. Transparency is a great effect (and not supported by JPEG), but if you don’t need it, you don’t always need the extra space of a PNG image, especially for photographic images. PNGs work better for logos and images with sharp contrast, like text.
  • Correctly set your JPEG image quality. Using a quality setting of 50-75% can significantly reduce the size of your image without a noticeable impact on image quality (see the sketch after this list). Of course, each result should be individually evaluated. In most cases your image sizes should all be less than 100k, and preferably smaller.
  • Strip out extraneous metadata from your images. Image editors leave a lot of “junk” in your image files, including thumbnails, comments, unused palette entries and more. While these are useful to the designer, they don’t need to be downloaded by your users. Instead, have your designer keep a backup copy for their own use, and then run the website versions through a free optimizer like Yahoo’s Smush.it or open-source tools like pngcrush and jpegtran.
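As a rough illustration of the two bullets above on JPEG quality and metadata, here is a minimal sketch using the Pillow imaging library with placeholder file names. Re-saving a photo as a moderate-quality JPEG also drops EXIF and other metadata that Pillow does not explicitly carry over, which covers both tips at once.

  # Minimal sketch: re-save a photo as a quality-75 JPEG with Pillow.
  # File names are placeholders. Saving without passing the original EXIF
  # data also strips most embedded metadata.
  from PIL import Image

  original = Image.open("photo-original.png")   # e.g. an oversized PNG photo

  # Convert to RGB in case the source has an alpha channel, then save as JPEG.
  original.convert("RGB").save(
      "photo-optimized.jpg",
      "JPEG",
      quality=75,       # 50-75% is usually visually fine for photographs
      optimize=True,    # extra encoding pass for a slightly smaller file
  )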

Lastly, another good way to reduce your page size is to minify your JavaScript and CSS. “Minification” is a process that strips out the extra comments and spaces in your code, as well as shortening the names of functions and variables. This is best seen by example:

Example: Original JavaScript

 /* ALERT PLUGIN DEFINITION
  * ======================= */
  var old = $.fn.alert
  $.fn.alert = function (option) {
    return this.each(function () {
      var $this = $(this)
        , data = $this.data('alert')
      if (!data) $this.data('alert', (data = new Alert(this)))
      if (typeof option == 'string') data[option].call($this)
    })
  }
  $.fn.alert.Constructor = Alert

Minified Version (from YUI Compressor):

var old=$.fn.alert;$.fn.alert=function(a){return this.each(function(){var c=$(this),b=c.data("alert");if(!b){c.data("alert",(b=new Alert(this)))}if(typeof a=="string"){b[a].call(c)}})};

Your minified files will still work the same, and this can often reduce file sizes by 10-20% or more. As you can see, minification also has the added benefit of obfuscating your code, making it harder for your competitors to copy and modify all your hard-earned work for their own purposes. JSCompress is a basic, easy online tool for JavaScript, or you can try out more powerful tools like JSMin or Yahoo’s YUI Compressor (which also works for CSS). There’s also a useful online version of YUI Compressor, which we recommend.

Step 2: Reduce the number of browser requests

The more resources your browser requests to render your page, the longer it will take to load. A great strategy to reduce your page load time is to simply cut down the number of requests your page has to make. This means fewer images, fewer JavaScript files, fewer analytics beacons, etc. There’s a reason Google’s homepage is so spartan: the clean interface has very few dependencies and thus loads super fast.

While “less is more” should be the goal, we realize this is not always possible, so here are some additional strategies you can employ:

  • Allow browser caching. If your page dependencies don’t change often, there’s no reason the browser should download them again and again. Talk to your server admin to make sure caching is turned on for your images, JS, and CSS. A quick test is to plug the URL of one of your images into redbot.org and look for an Expires or Cache-Control: max-age header in the result (a small header-check sketch follows this list). For example, this image off the eBay home page will be cached by your browser for 28,180,559 seconds (just over 1 year).

[Screenshot: REDbot result showing the Expires and Cache-Control: max-age response headers]

Cache-Control is the newer way of doing things, but you’ll often also see Expires to support older browsers. If you see both, Cache-Control will “win” for newer browsers.

While browser-side caching will not speed up the initial page load of your site, it will make a HUGE difference on repeat views, often knocking off 70% or more of the time. You can see this clearly when looking at the “Repeat View” metrics in a WebPageTest test, for example:

[Screenshot: WebPageTest results comparing First View and Repeat View load times]

  • Combine related CSS and JS files. While numerous individual CSS and JS files are easier for your developers to maintain, a smaller number of files will load much faster in the browser. If your files change infrequently, then a one-time concatenation of files is an easy win. If they do change frequently, consider adding a step to your deploy process that automatically concatenates related groups of functionality prior to deployment, grouping by related functional area. There are pros and cons to each approach, but there’s some great info in this StackOverflow thread.
  • Combine small images into CSS sprites. If your site has lots of small images (buttons, icons, etc.), you can realize significant performance gains by combining them all into a single image file called a “sprite.” Sprites are more challenging to implement, but can yield significant performance gains for visually rich sites. See the CSS Image Sprites article on w3schools for more information, and check out the free tool SpriteMe.
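Picking up the caching bullet above, here is the promised header-check sketch, again using the Python requests library with placeholder URLs. Substitute your own image, CSS, and JS URLs and look for a Cache-Control: max-age or Expires value in the output; if neither is set, talk to your server admin about enabling caching.

  # Sketch: inspect caching headers on static assets.
  # URLs are placeholders; substitute your own image/CSS/JS URLs.
  import requests

  assets = [
      "https://www.example.com/logo.png",
      "https://www.example.com/style.css",
  ]

  for url in assets:
      headers = requests.head(url, allow_redirects=True).headers
      print(url)
      print("  Cache-Control:", headers.get("Cache-Control", "(not set)"))
      print("  Expires:      ", headers.get("Expires", "(not set)"))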

Step 3: Reduce the distance to your site

If your website is hosted in Virginia, but your users are visiting from Australia, it’s going to take them a long time to download your images, JavaScript and CSS. This can be a big problem if your site is content-heavy and you get a lot of traffic from users far away. Fortunately, there’s an easy answer: Sign up for a Content Delivery Network (CDN). There are many excellent ones out there now, including Akamai, Amazon CloudFront, CloudFlare and more.

CDNs work basically like this: you change the URL of your images, JS, and CSS from something like this:

http://mysite.com/myimage.png

to something like this (as per the instructions given to you from your CDN provider):

http://d34vewdf5sdfsdfs.cloudfront.net/myimage.png

This instructs the browser to look for your image on the CDN network. The CDN provider will return that image to the browser if it has it, or pull it from your site and store it for reuse later if it doesn’t. The magic of CDNs is that they then copy that same image (or JavaScript or CSS file) to dozens, hundreds, or even thousands of “edge nodes” around the world, routing each browser request to the closest available location. So if you’re in Melbourne and request an image hosted in Virginia, you may instead get a copy from Sydney. Just like magic.
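As a purely illustrative sketch of that URL swap, here is a tiny helper that rewrites an asset URL to point at a CDN hostname. Both hostnames are made up; in practice your CDN provider tells you exactly what the new host should be, and most platforms or plugins will do this rewriting for you.

  # Illustrative sketch: point an asset URL at a CDN hostname.
  # Both hostnames are made up; use the host your CDN provider gives you.
  from urllib.parse import urlparse, urlunparse

  CDN_HOST = "d34vewdf5sdfsdfs.cloudfront.net"

  def to_cdn_url(asset_url: str) -> str:
      parts = urlparse(asset_url)
      # Keep the scheme, path, and query; swap in the CDN host.
      return urlunparse(parts._replace(netloc=CDN_HOST))

  print(to_cdn_url("http://mysite.com/myimage.png"))
  # -> http://d34vewdf5sdfsdfs.cloudfront.net/myimage.png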

To picture it, compare a single centralized server with the same content duplicated on edge nodes around the world.

In closing

While front-end performance does not currently appear to have a direct impact on search ranking, it has a clear impact on user engagement and conversions into paying customers. Since page load time also has a direct impact on user experience, it is very likely to have a future impact on search ranking.

While there are many ways to optimize your site, we suggest three core principles to remember when optimizing your site:

  1. Reduce the size of your page
  2. Reduce the number of browser requests
  3. Reduce the distance to your site

Within each of these, there are different strategies that apply based on the makeup of your site. We at Zoompf have introduced several free tools that can help you determine which areas will make the biggest impact, including a free scan that analyzes your website for over 400 common causes of slow front-end performance. You can find them here: http://zoompf.com/free.

Happy hunting!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Moz Blog


How to Make WordPress Sites Load 72.7% Faster


You want to know the secret to a faster WordPress website?

You and everyone else.

But it’s not a secret. The problem is that there is a whole lot of misguidance out there that makes it hard for site owners like you and me to identify the solutions that really work.

Let’s cut the crap and turn down the hype.

If your house has foundation problems, you don’t treat your windows. You fix the foundation. If your car is idling rough, you don’t change your tires. You get under the hood and address the engine.

So if your WordPress website is slow, why on earth would you look to the edges first for a solution?

You don’t. Well, not if you want a real solution that can nearly double your load speed.

You look to the core

The core of your body is generally defined as the torso minus the appendages (arms, legs, and head).

Your WordPress website has a core too. Its “torso” includes hosting, theme, and plugins. This is the origin of every page your site serves. Speed and performance are determined here, at the web page’s origin, not at the edge where the page is viewed.

Sure, you might also have some appendages on your WordPress website. Your site’s “arms and legs” might include nebulous cloud solutions or dispensable content delivery networks. But edge services like these ultimately take their cues from the origin.

Think of it this way:

If you want total body strength, you see the most benefits by increasing the strength of the area in the body where almost all movement originates from: the core. Ripped biceps and pulsating calves are nice, but strength and flexibility in your abs and lower back are essential.

Your website is the same.

If you want supreme performance from your site, then look right to the core and strengthen — or, better said, optimize — the origin of your content.

Here is what an optimized core for a WordPress website includes:

  • Reliable DNS
  • Hosting configured specifically for WordPress with a smart origin caching strategy
  • A clean theme that is devoid of bloat (and preferably on a framework)
  • A plugin list trimmed of fat

Let’s break down these elements, and then I’ll show you how easy it is to increase WordPress performance by nearly 75 percent.

Don’t ignore DNS (even though you want to)

DNS is an annoying topic that makes little sense to most of us, but it can also determine whether your site sinks or swims.

You don’t need to know much about DNS (in fact, you don’t even need to know what the letters stand for), but you should know this: it is the first communication made when someone attempts to pull up one of your URLs in a web browser.

If the DNS for your site isn’t working, then the browser cannot find your site. If it can’t find your site, then it can’t find your page. The net result is that while everything else is working perfectly, your site is as good as down.

It would be like putting a letter in the mail with only an addressee name but no address. The letter isn’t getting there. Your web page isn’t either.

You may have no earthly idea whether you have a DNS problem or not. Find out. Don’t live in the dark. Get data.

Run a quick Pingdom test for your site and look at the first object in the waterfall to load, your domain name. Hover over the multi-colored bar representing its load time. The first number is for DNS, and it should be fast as a blink. Copyblogger’s is 7 milliseconds … even faster than a blink.
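If you want a second opinion without opening Pingdom, a few lines of Python can time a DNS lookup for your domain directly. This is only a rough sketch: a single lookup from one machine may be skewed by your operating system’s resolver cache, and it won’t replace what a monitoring service shows you from multiple locations.

  # Rough sketch: time a DNS lookup for a domain.
  # One lookup from one machine is a spot check, not a monitoring service,
  # and the OS resolver cache can skew the result.
  import socket
  import time

  domain = "www.copyblogger.com"

  start = time.perf_counter()
  socket.getaddrinfo(domain, 443)
  elapsed_ms = (time.perf_counter() - start) * 1000

  print(f"DNS lookup for {domain}: {elapsed_ms:.0f} ms")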

The point is this: if you’re serious about performance, get serious about DNS.

Go with a top-notch provider like Amazon’s Route 53. And if the thought of migrating your DNS alone scares you, get someone like our friends at Fantasktic to help.

Improving your WordPress efficiency by even 10 percent won’t matter if your DNS is not reliable.

Get your WordPress hosting and caching right

Once your DNS is squared away, you can turn toward your WordPress install. Start with hosting.

We’ve said it before, and I’ll say it again: serious WordPress users need hosting that means business. If you choose generic, shared hosting then you’ll get generic, slow performance.

If you are serious about your site, and if you don’t want it crashing the first time you get a nice stream of traffic, then you need to host it on servers specifically configured for the complexities of hosting WordPress.

You see, WordPress generates pages dynamically. This can mean pulling from theme files, from the database, from image folders, and from third-party sites — just to get a single page generated.

The whole process becomes quicker and more reliable when NGINX is involved and when entire pages, or elements of pages, can be saved on the server and loaded pre-generated when called. This enables web pages to be served much faster to more potential users at once.

This latter process is called origin caching. You can see why having a smart origin caching strategy is so important.

When it comes to WordPress, origin caching doesn’t get much smarter or more efficient than the team at W3 EDGE and their plugin W3 Total Cache.

The latest Pro release of W3 Total Cache includes fragment caching support for specific theme frameworks (currently only the Genesis framework, as of this article being published). This allows for even more granular caching control and, most important, faster load times like the ones I’m about to illustrate.

When it comes to theme and plugins … clean, clean, clean

Speaking of Genesis, there is a reason why more than 100,000 people use it to power their WordPress websites. Even the best NGINX configuration with the smartest origin caching plan cannot compensate for a bloated theme.

This is the digital web, where ultimately everything we do online is really just the stacking and restacking of 1s and 0s. So everything goes back to code.

Bad code equals a bad site. Period. Whether it’s in your theme or in a plugin, bad code will sink your site and there is no magic hosting, caching, “cloud,” or CDN pill that will cure it.

Bad code makes your site a ticking time bomb that could explode and crash as soon as your next traffic bump.

This is why you need to choose a clean theme. The Genesis Framework and every single child theme from the StudioPress team are exactly that.

This is also why you need to keep your plugin folder as clean as possible.

Keep only the plugins that you need for essential functionality. And of those, keep only the ones with solid, proven code behind them that are actively supported.

WordPress is a strong piece of software. It can handle 40+ plugins if they are all the right ones. But it only takes one bad apple to spoil the bunch. A single faulty plugin could make your site run 72.7 percent slower, which is the exact opposite of what we’re aiming for here.

Let’s break it on down …

The stark reality is that improving WordPress performance is more about removing crap from the core than it is about adding band-aids to the edge.

This may involve some investment, of both money and time, plus a few tough decisions.

Generic hosting, cheap themes, and running the wrong plugins will sink you. If you insist on any one of these, you aren’t serious about performance.

But for those of you who are serious, here is a real-world example of how easy it is to turbocharge a WordPress site.

How to improve WordPress performance by 72.7%

If you are a Synthesis customer, you know Julian Fernandes. Currently stationed in his native Brazil, Julian is one of the most knowledgeable and passionate members of the Synthesis support staff. I asked him to run a few tests for me in preparation for this article. He excitedly obliged.

Here is what Julian did:

He took a domain, julianfernand.es, and brought it up on Synthesis with a basic plan, which includes W3 Total Cache Pro and the latest version of WordPress by default. Then he started running Pingdom tests.

His first test was with the Twenty Thirteen theme running:

[Screenshot: Pingdom test result with the default Twenty Thirteen theme]

His second test replaced the default WordPress theme with the Genesis Framework and the Sixteen Nine theme, which is also included with every Synthesis setup by default.

[Screenshot: Pingdom test result with the Genesis Framework and Sixteen Nine theme]

Then Julian activated fragment caching from within W3 Total Cache Pro.

[Screenshot: Pingdom test result with Genesis plus W3 Total Cache Pro fragment caching]

As you can see, with all else equal, just adding the Genesis framework improved the load time from 630 ms to 172 ms. Activating fragment caching on top of that dropped the load time further to 157 ms.

That drop from 630 ms to 172 ms is a 72.7% improvement in load time for WordPress, and fragment caching pushes it to roughly 75%.

And if we had done an initial test on generic non-WordPress hosting, the difference would have been even more drastic.

(For the record, we also tested a random theme from ThemeForest as well as the Woo framework. Each performed well, 334 ms and 347 ms respectively, but not as well as Genesis.)

Realize that these tests were not run on a bare bones install. There were actually five posts on the home page, each with featured images. There were even a few widgets, including Simple Social Icons. Yet the speeds were still that fast.

This is what an optimized core can do.

How to optimize your WordPress core today

Now you know what is possible with an optimized core.

Good DNS and hosting alone can get you to sub-second page load speeds. Optimize the core even further with a premium framework, efficient caching, and a no-bloat mindset, and you’re down to eye-blink speeds like 157 milliseconds.

That’s why inward, to your site’s core, is the first place you should look if you want to improve performance.

More often than not, you will find that you don’t need the fancy frills you are so often being sold. They are not the magical performance elixirs they are purported to be. There is a time and place for CDNs and the like, but get your origin optimized first and then let data drive your decisions.

If you want further guidance on that issue in particular, check out this whitepaper we wrote with the W3 EDGE team: “The Truth About WordPress Performance: Why You May Not Need What You’re Being Sold”

And if you want to go right ahead and get moving, then get started at Synthesis, which gives you literally everything outside of DNS that you need for an optimized core:

  • Hosting configurations designed for WordPress
  • The Genesis framework with Sixteen Nine included
  • W3 Total Cache Pro with support for fragment caching (if you run Genesis)

Plus it’s all ready to go out of the box, with friendly experts like Julian ready to answer questions if you have them.

To start optimizing your core today, sign up with Synthesis.

(And don’t forget to consider our new data center in Amsterdam to further optimize your core for your European traffic.)

About the author

Jerod Morris

Jerod Morris is the Director of Content for Copyblogger Media and a founding member of the Synthesis Managed WordPress Hosting team. Get more from Jerod on Twitter.


Copyblogger


5 Writing Links That Will Help You Get Better … Stronger … Faster

The Lede | copyblogger.com

This week on The Lede

  • How a 40,000 Word PDF Earned a Massive Audience
  • The Overwhelming Force of “Gradual”
  • How to Write a Good Blog Post, Fast
  • On Finding Real Pleasure in Our Work
  • Twitter’s First Bona Fide Star

Want to grab even more useful links, beyond those that make The Lede, plus additional obscure references to The Six Million Dollar Man, adjusted for inflation?

You only have to follow @copyblogger on Twitter.


How a 40,000 Word Guide Earned 361,494 Site Visitors and 8421 Email Opt-ins

Mr. Patel is a master of taking action on real data. Through this dedication, he discovers, then delivers, what his audience wants. In this short case study he lightly analyzes the release of a recent PDF guide that ended up earning him the attention (and permission) of a wide new audience. But it wasn’t easy, and he didn’t get it right the first time …


The Overwhelming Force of “Gradual”

The home run. The Hail Mary. The half-court, last-second, game-winning bucket. The winning lottery ticket. We praise and immortalize these events; in my culture they are what you’d call scenes from The American Dream. The Big Win. But nature doesn’t work that way. The natural order of things, generally, is a very slow, step-by-step process of growth and movement. If you saw a tree jump from sapling to 50 feet tall in one day, you’d wonder what the hell was going on in the world. So why are we obsessed with seeking the unnatural, sugar-coated growth of our work? Is there another way?


How to Write a Good Blog Post (Fast)

Ms. Duistermaat introduces us to her “Breadmaker Technique” of content creation, a process that cuts much of the angst out of getting started and getting to a workable draft of your article. I find this, along with a little advice from Mr. Clark, to be a very useful tool for getting good content done.


On Finding Real Pleasure in Our Work

It’s not often that a philosopher has the chops to not only teach, but make you laugh. Maybe it’s merely a 21st-century sensibility, but Mr. de Botton delivers the goods in this smart, useful, and funny talk about the nature of work and success. Makes me think how much further Mr. Nietzsche or Mr. Kant could have spread their messages, had they more finely tuned their respective senses of humor.


Twitter’s First Bona Fide Star

It’s well known that Copyblogger thinks digital sharecropping is a dumb move, but that’s never meant that we’re against the contextually smart use of social networking sites. This is one such case. Though Ms. Oxford’s story will certainly not be replicated (often, at least), the results of her producing the right content, in the right place, at the right time can’t be denied. You may never end up on Jimmy Kimmel Live, but do you really need to end up there in order to achieve your business goals?

Miss anything on Copyblogger recently?

About the Author: Robert Bruce is VP of Marketing for Copyblogger Media. In his off hours, he files unusually short stories to the Internet.


Copyblogger

