Tag Archive | "Page"

Low mobile page speed scores may be killing your traffic

If your score is low, there are a few things you can do without having to redesign your website.



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

Landing Page Optimization: Original MarketingSherpa Landing Page Handbook now available for free download

The MarketingSherpa Landing Page Handbook is one of the most popular resources we have offered in 20 years of publishing, and we are now offering this handbook free to you, the MarketingSherpa reader.
MarketingSherpa Blog

Posted in IM News | Comments Off

Google search bug has search suggestions crawling off the page

Google, that just looks ugly. Can you fix it?



Please visit Search Engine Land for the full article.


Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Posted in IM News | Comments Off

12 Steps to Lightning Page Speed

Posted by WallStreetOasis.com

At Wall Street Oasis, we’ve noticed that every time we focus on improving our page speed, Google sends us more organic traffic. In 2018, over 80 percent of our website’s traffic came from organic search. That’s 24.5 million visits. Needless to say, we are very tuned in to how we can continue to improve our user experience and keep Google happy.

We thought this article would be a great way to highlight the specific steps we take to keep our page speed lightning fast and our organic traffic healthy. While this article is somewhat technical (page speed is an important and complex subject), we hope it provides website owners and developers with a framework for how to try and improve their page speed.

Quick technical background: Our website is built on top of the Drupal CMS and we are running on a server with a LAMP stack (plus Varnish and memcache). Even if you are not using MySQL, however, the steps and principles in this article are still relevant for other databases and reverse proxies.

Ready? Let’s dig in.

5 Steps to speed up the backend

Before we jump into specific steps that can help you speed up your backend, it might help to review what we mean by “backend.” You can think of the backend as everything that goes into storing data, including the database itself and the servers — basically anything that helps make the website function that you don’t visually interact with. For more information on the difference between the backend and the frontend, you can read this article.

Step 1: Make sure you have a Reverse Proxy configured

This is an important first step. For Wall Street Oasis (WSO), we use a reverse proxy called Varnish. It is by far the most critical and fastest layer of cache and serves the majority of the anonymous traffic (visitors logged out). Varnish caches the whole page in memory, so returning it to the visitor is lightning fast.

https://en.wikipedia.org/wiki/Reverse_proxy
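If you are evaluating Varnish, the core idea can be sketched in a few lines of VCL. This is a simplified illustration rather than our production config; the backend address and session-cookie pattern (Drupal uses SESS-prefixed cookies) are placeholders:

```vcl
vcl 4.0;

# Placeholder backend; point this at your actual web server
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Pass logged-in users (session cookie present) straight to the backend;
    # strip cookies for everyone else so anonymous pages are cacheable
    if (req.http.Cookie ~ "SESS") {
        return (pass);
    }
    unset req.http.Cookie;
}
```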

Step 2: Extend the TTL of that cache

If you have a large database of content (specifically in the 10,000+ URL range) that doesn’t change very frequently, you can extend the time to live (TTL is basically how long before an object is flushed out of the cache) to drive the hit rate higher on the Varnish caching layer.

For WSO, we went all the way up to two weeks (since we have over 300,000 discussions). At any given time, only a few thousand of those forum URLs are active, so it makes sense to heavily cache the other pages. The downside is that when you make any sitewide, template, or design change, you have to wait up to two weeks for it to roll out across all URLs.
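In Varnish, the TTL is typically set when the object comes back from the backend. A minimal sketch (your own conditions for what is safe to cache this long will differ):

```vcl
sub vcl_backend_response {
    # Keep cacheable objects for two weeks to drive the hit rate up
    if (beresp.ttl > 0s) {
        set beresp.ttl = 14d;
    }
}
```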

Step 3: Warm up the cache

In order to keep our cache “warm,” we have a specific process that hits all the URLs in our sitemap. This increases the likelihood of a page being in the cache when a user or Google bot visits those same pages (i.e. our hit rate improves). It also keeps Varnish full of more objects, ready to be accessed quickly.
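Our warming process is internal, but the idea is simple: read the sitemap, then request every URL so the cache already holds a fresh copy. A minimal sketch in JavaScript (the regex-based sitemap parsing and the fetch loop are illustrative, not our actual tooling):

```javascript
// Pull every <loc> URL out of a sitemap document.
function extractSitemapUrls(xml) {
  const urls = [];
  const re = /<loc>\s*([^<\s]+)\s*<\/loc>/g;
  let match;
  while ((match = re.exec(xml)) !== null) {
    urls.push(match[1]);
  }
  return urls;
}

// Warm the cache by requesting each URL (Node 18+ has fetch built in).
async function warmCache(sitemapXml) {
  for (const url of extractSitemapUrls(sitemapXml)) {
    await fetch(url); // the response body is discarded; we only want it cached
  }
}
```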

As you can see from the chart below, the ratio of “cache hits” (green) to total hits (blue+green) is over 93 percent.

Step 4: Tune your database and focus on the slowest queries

On WSO, we use a MySQL database. Make sure you enable the slow query log and check it at least every quarter. Examine the slowest queries using EXPLAIN, add indexes where needed, and rewrite queries that can be optimized.

To tune MySQL itself, you can use the following scripts: https://github.com/major/MySQLTuner-perl and https://github.com/mattiabasone/tuning-primer
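As an illustration of the workflow (the table and column names here are hypothetical, not WSO’s schema):

```sql
-- Enable the slow query log (also settable in my.cnf)
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;  -- log anything slower than 1 second

-- Inspect the plan of a slow query pulled from the log
EXPLAIN SELECT title FROM node WHERE created > 1546300800 ORDER BY created DESC LIMIT 25;

-- If EXPLAIN shows a full table scan, add an index on the filtered column
CREATE INDEX idx_node_created ON node (created);
```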

Step 5: HTTP headers

Use HTTP/2 server push to send resources to the browser before they are requested. Just make sure you test which ones should be pushed first. JavaScript was a good option for us. You can read more about it here.

Here is an example of server push from our Investment Banking Interview Questions URL:

</files/advagg_js/js__rh8tGyQUC6fPazMoP4YI4X0Fze99Pspus1iL4Am3Nr4__k2v047sfief4SoufV5rlyaT9V0CevRW-VsgHZa2KUGc__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,</files/advagg_js/js__TLh0q7OGWS6tv88FccFskwgFrZI9p53uJYwc6wv-a3o__kueGth7dEBcGqUVEib_yvaCzx99rTtEVqb1UaLaylA4__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,</files/advagg_js/js__sMVR1us69-sSXhuhQWNXRyjueOEy4FQRK7nr6zzAswY__O9Dxl50YCBWD3WksvdK42k5GXABvKifJooNDTlCQgDw__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,

Be sure you’re using the correct format. If it is a script: <url>; rel=preload; as=script,

If it is a CSS file: <url>; rel=preload; as=style,
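How you emit these headers depends on your server. As one hedged example with Apache (the file path is a placeholder): adding a preload Link header via mod_headers is enough for mod_http2 to push the resource.

```apache
# Requires mod_headers (and mod_http2 for the actual push)
Header add Link "</js/app.min.js>; rel=preload; as=script"
```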

7 Steps to speed up the frontend

The following steps are to help speed up your frontend application. The front-end is the part of a website or application that the user directly interacts with. For example, this includes fonts, drop-down menus, buttons, transitions, sliders, forms, etc.

Step 1: Modify the placement of your JavaScript

Modifying the placement of your JavaScript is probably one of the hardest changes because you will need to continually test to make sure it doesn’t break the functionality of your site. 

I’ve noticed that every time I remove JavaScript, I see page speed improve. I suggest removing as much JavaScript as you can and minifying what you do need. You can also combine your JavaScript files, but use multiple bundles.

Always try to move JavaScript to the bottom of the page or inline it. You can also use the defer or async attributes where possible to make sure it is not render-blocking. You can read more about moving JavaScript here.
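For reference, the three ways of loading a script (file names are placeholders):

```html
<!-- Render-blocking: the parser stops to fetch and execute this -->
<script src="/js/app.js"></script>

<!-- defer: fetched in parallel, executed in order after HTML parsing finishes -->
<script src="/js/app.js" defer></script>

<!-- async: fetched in parallel, executed as soon as it arrives (order not guaranteed) -->
<script src="/js/analytics.js" async></script>
```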

Step 2: Optimize your images

Use WebP for images when possible (Cloudflare, a CDN, does this for you automatically — I’ll touch more on Cloudflare below). It’s an image format that supports both lossy and lossless compression.

    Always use images at the correct size. For example, if you have an image that is displayed in a 2” x 2” square on your site, don’t use a large 10” x 10” image. If you have an image that is bigger than needed, you are transferring more data through the network and the browser has to resize the image for you.

    Use lazy load to avoid/delay downloading images that are further down the page and not on the visible part of the screen.
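Both ideas can be combined in markup like this (file names and dimensions are illustrative): the browser takes the WebP source when it supports it, falls back to the JPEG otherwise, and native lazy loading defers offscreen images.

```html
<picture>
  <source srcset="/images/chart.webp" type="image/webp">
  <img src="/images/chart.jpg" width="600" height="400"
       loading="lazy" alt="Monthly organic traffic chart">
</picture>
```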

    Step 3: Optimize your CSS

    You want to make sure your critical CSS is inlined. Online tools like this one can help you find the critical CSS to be inlined, which will solve the render blocking. Bonus: you’ll keep the cache benefit of having separate files for the rest of your CSS.

    Make sure to minify your CSS files (we use AdVagg since we are on the Drupal CMS, but there are many options for this depending on your site).  

    Try using less CSS. For instance, if you have certain CSS classes that are only used on your homepage, don’t include them on other pages. 

    Always combine the CSS files but use multiple bundles. You can read more about this step here.

    Move your media queries to specific files so the browser doesn’t have to load them before rendering the page. For example: <link href=”frontpage-sm.css” rel=”stylesheet” media=”(min-width: 767px)”>

    If you’d like more info on how to optimize your CSS, check out Patrick Sexton’s interesting post.

    Step 4: Lighten your web fonts (they can be HEAVY)

    This is where your developers may get in an argument with your designers if you’re not careful. Everyone wants to look at a beautifully designed website, but if you’re not careful about how you bring this design live, it can cause major unintended speed issues. Here are some tips on how to put your fonts on a diet:

    • Use inline SVG for icon fonts (like Font Awesome). This way you’ll reduce the critical chain path and will avoid empty content when the page is first loaded.
    • Use Fontello to generate the font files. This way, you can include only the glyphs you actually use, which leads to smaller files and faster page speed.
    • If you are going to use web fonts, check if you need all the glyphs defined in the font file. If you don’t need Japanese or Arabic characters, for example, see if there is a version with only the characters you need.
    • Use Unicode range to select the glyphs you need.
    • Use woff2 when possible as it is already compressed.
    • This article is a great resource on web font optimization.
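Putting a few of these tips together, a subsetted @font-face rule might look like this (the font name and path are placeholders):

```css
/* Serve only the Latin glyphs, in compressed woff2 */
@font-face {
  font-family: "BodyFont";
  src: url("/fonts/body-latin.woff2") format("woff2");
  unicode-range: U+0000-00FF; /* basic Latin + Latin-1 supplement */
  font-display: swap; /* show fallback text instead of empty content */
}
```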

    Here is the difference we measured when using optimized fonts:

    After reducing our font files from 131kb to 41kb and removing one external resource (useproof), the fully loaded time on our test page dropped all the way from 5.1 to 2.8 seconds. That’s a 44 percent improvement and is sure to make Google smile (see below).

    Here’s the 44 percent improvement.

    Step 5: Move external resources

    When possible, move external resources to your server so you can control expires headers (this will instruct browsers to cache the resource for longer). For example, we moved our Facebook Pixel to our server and cached it for 14 days. This means you’ll be responsible for checking for updates from time to time, but it can improve your page speed score.

    For example, on our Private Equity Interview Questions page you can see that the fbevents.js file is being loaded from our server and the cache-control HTTP header is set to 14 days (1209600 seconds):

    cache-control: public, max-age=1209600

    Step 6: Use a content delivery network (CDN)

    What’s a CDN? Click here to learn more.

    I recommend using Cloudflare as it makes a lot of tasks much easier and faster than if you were to try and do them on your own server. Here is what we specifically did on Cloudflare’s configuration:

    Speed

    • Auto Minify: check all
    • Enable Brotli
    • Enable Mirage
    • Under Polish: choose Lossy and check WebP

    Network

    • Enable HTTP/2 – You can read more about this topic here
    • No browsers currently support HTTP/2 over an unencrypted connection. For practical purposes, this means that your website must be served over HTTPS to take advantage of HTTP/2. Cloudflare has a free and easy way to enable HTTPS. Check it out here.

    Crypto

    • Under SSL
      • Choose Flexible
    • Under TLS 1.3
      • Choose Enable+0RTT – More about this topic here.

    Step 7: Use service workers

    Service workers give the site owner and developers some interesting options (like push notifications), but in terms of performance, we’re most excited about how these workers can help us build a smarter caching system.

    To learn how to get service workers up and running on your site, visit this page.

    With resources (images, CSS, javascript, fonts, etc) being cached by a service worker, returning visitors will often be served much faster than if there was no worker at all.
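As a sketch of the caching idea (browser-only code; the asset paths and cache name are placeholders, and a production worker also needs a cache-invalidation strategy):

```javascript
// sw.js — a minimal cache-first strategy for static assets
const CACHE = 'static-v1';

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE).then((cache) =>
      cache.addAll(['/css/main.css', '/js/app.js'])
    )
  );
});

self.addEventListener('fetch', (event) => {
  // Serve from the cache when we can, fall back to the network otherwise
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```

The page registers it with `navigator.serviceWorker.register('/sw.js')`.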

    Testing, tools, and takeaways

    For each change you make to try and improve speed, you can use page speed testing tools (such as Lighthouse or PageSpeed Insights) to monitor the impact of the change and make sure you are on the right path.

    We know there is a lot to digest and a lot of resources linked above, but if you are tight on time, you can just start with Step 1 from both the backend and frontend sections. These two steps alone can make a major difference on their own.

    Good luck and let me know if you have any questions in the comments. I’ll make sure João Guilherme, my Head of Technology, is on to answer any questions for the community at least once a day for the first week this is published.

    Happy Tuning!

      Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


      Moz Blog

      Posted in IM News | Comments Off

      Page Speed Optimization: Metrics, Tools, and How to Improve

      Posted by BritneyMuller

      Page speed is an important consideration for your SEO work, but it’s a complex subject that tends to be very technical. What are the most crucial things to understand about your site’s page speed, and how can you begin to improve? In this week’s edition of Whiteboard Friday, Britney Muller goes over what you need to know to get started.

      Click on the whiteboard image above to open a high resolution version in a new tab!

      Video Transcription

      Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we’re going over all things page speed and really getting to the bottom of why it’s so important for you to be thinking about and working on as you do your work.

      At the very fundamental level I’m going to briefly explain just how a web page is loaded. That way we can sort of wrap our heads around why all this matters.

      How a webpage is loaded

      A user goes to a browser, puts in your website, and there is a DNS request. This points at your domain name provider, so maybe GoDaddy, and this points to your server where your files are located, and this is where it gets interesting. So the DOM starts to load all of your HTML, your CSS, and your JavaScript. But very rarely does this one pull all of the needed scripts or needed code to render or load a web page.

      Typically the DOM will need to request additional resources from your server to make everything happen, and this is where things start to really slow down your site. Having that sort of background knowledge I hope will help in us being able to triage some of these issues.

      Issues that could be slowing down your site

      What are some of the most common culprits?

      1. First and foremost is images. Large images are the biggest culprit of slow loading web pages.
      2. Hosting can cause issues.
      3. Plugins, apps, and widgets, basically any third-party script as well can slow down load time.
      4. Your theme and any large files beyond that can really slow things down as well.
      5. Redirects, the number of hops needed to get to a web page will slow things down.
      6. Then JavaScript, which we’ll get into in a second.

      But all of these things can be a culprit. So we’re going to go over some resources, some of the metrics and what they mean, and then what are some of the ways that you can improve your page speed today.

      Page speed tools and resources

      The primary resources I have listed here are Google tools and Google suggested insights. I think what’s really interesting about these is we get to see what their concerns are as far as page speed goes and really start to see the shift towards the user. We should be thinking about that anyway. But first and foremost, how is this affecting people that come to your site, and then secondly, how can we also get the dual benefit of Google perceiving it as higher quality?

      We know that Google suggests a website to load anywhere between two to three seconds. The faster the better, obviously. But that’s sort of where the range is. I also highly suggest you take a competitive view of that. Put your competitors into some of these tools and benchmark your speed goals against what’s competitive in your industry. I think that’s a cool way to kind of go into this.

      Chrome User Experience Report

      This is Chrome real user metrics. Unfortunately, it’s only available for larger, popular websites, but you get some really good data out of it. It’s housed on BigQuery, so some basic SQL knowledge is needed.

      Lighthouse

      Lighthouse, one of my favorites, is available right in Chrome Dev Tools. If you are on a web page and you click Inspect Element and you open up Chrome Dev Tools, to the far right tab where it says Audit, you can run a Lighthouse report right in your browser.

      What I love about it is it gives you very specific examples and fixes that you can do. A fun fact to know is it will automatically be on the simulated fast 3G, and notice they’re focused on mobile users on 3G. I like to switch that to applied fast 3G, because it has Lighthouse do an actual run of that load. It takes a little bit longer, but it seems to be a little bit more accurate. Good to know.

      Page Speed Insights

      Page Speed Insights is really interesting. They’ve now incorporated Chrome User Experience Report. But if you’re not one of those large sites, it’s not even going to measure your actual page speed. It’s going to look at how your site is configured and provide feedback according to that and score it. Just something good to be aware of. It still provides good value.

      Test your mobile website speed and performance

      I don’t know what the title of this is. If you do, please comment down below. But it’s located on testmysite.thinkwithgoogle.com. This one is really cool because it tests the mobile speed of your site. If you scroll down, it directly ties it into ROI for your business or your website. We see Google leveraging real-world metrics, tying it back to what’s the percentage of people you’re losing because your site is this slow. It’s a brilliant way to sort of get us all on board and fighting for some of these improvements.

      Pingdom and GTmetrix are non-Google products or non-Google tools, but super helpful as well.

      Site speed metrics

      So what are some of the metrics?

      First paint

      We’re going to go over first paint, which is basically just the first non-blank paint on a screen. It could be just the first pixel change. That initial change is first paint.

      First contentful paint

      First contentful paint is when the first content appears. This might be part of the nav or the search bar or whatever it might be. That’s the first contentful paint.

      First meaningful paint

      First meaningful paint is when primary content is visible. When you sort of get that reaction of, “Oh, yeah, this is what I came to this page for,” that’s first meaningful paint.

      Time to interactive

      Time to interactive is when it’s visually usable and engage-able. So we’ve all gone to a web page and it looks like it’s done, but we can’t quite use it yet. That’s where this metric comes in. So when is it usable for the user? Again, notice how user-centric even these metrics are. Really, really neat.

      DOM content loaded

      The DOM content loaded, this is when the HTML is completely loaded and parsed. So some really good ones to keep an eye on and just to be aware of in general.
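You can watch the paint metrics above fire on your own pages from the browser console (browser-only code, so run it in DevTools rather than Node):

```javascript
// Logs first-paint and first-contentful-paint with their timestamps
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(entry.name, Math.round(entry.startTime) + ' ms');
  }
}).observe({ type: 'paint', buffered: true });
```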

      Ways to improve your page speed

      HTTP/2

      HTTP/2 can definitely speed things up. As to what extent, you have to sort of research that and test.

      Preconnect, prefetch, preload

      Preconnect, prefetch, and preload are really interesting and important in speeding up a site. We see Google doing this on their SERPs. If you inspect an element, you can see Google prefetching some of the URLs so that it has them faster for you if you were to click on some of those results. You can similarly do this on your site. It helps to load and speed up that process.
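All three hints are just `<link>` tags in the head (the URLs here are illustrative):

```html
<!-- preconnect: open the connection to a third-party origin early -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

<!-- prefetch: fetch a likely next-navigation resource at idle priority -->
<link rel="prefetch" href="/next-page.html">

<!-- preload: fetch a critical resource for this page at high priority -->
<link rel="preload" href="/fonts/body.woff2" as="font" type="font/woff2" crossorigin>
```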

      Enable caching & use a content delivery network (CDN)

      Caching is so, so important. Definitely do your research and make sure that’s set up properly. Same with CDNs, so valuable in speeding up a site, but you want to make sure that your CDN is set up properly.

      Compress images

      The easiest and probably quickest way for you to speed up your site today is really just to compress those images. It’s such an easy thing to do. There are all sorts of free tools available for you to compress them. Optimizilla is one. You can even use free tools on your computer, Save for Web, and compress properly.

      Minify resources

      You can also minify resources. So it’s really good to be aware of what minification, bundling, and compression do so you can have some of these more technical conversations with developers or with anyone else working on the site.

      So this is sort of a high-level overview of page speed. There’s a ton more to cover, but I would love to hear your input and your questions and comments down below in the comment section.

      I really appreciate you checking out this edition of Whiteboard Friday, and I will see you all again soon. Thanks so much. See you.

      Video transcription by Speechpad.com



      Moz Blog

      Posted in IM News | Comments Off

      SearchCap: EU search results preview, Google URL inspection tool update & Bing Ads page feeds

      Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



      Please visit Search Engine Land for the full article.


      Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

      Posted in IM News | Comments Off

      Bing Ads launches page feeds for easier Dynamic Search Ads management

      Use page feeds to manage URL groupings for auto targets in DSA campaigns.



      Please visit Search Engine Land for the full article.


      Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

      Posted in IM News | Comments Off

      12 Methods to Get from Blank Page to First Draft

      If you’re like me, after taking some time off from writing, you’re refreshed and champing at the bit to translate…

      The post 12 Methods to Get from Blank Page to First Draft appeared first on Copyblogger.


      Copyblogger

      Posted in IM News | Comments Off

      Using a New Correlation Model to Predict Future Rankings with Page Authority

      Posted by rjonesx.

      Correlation studies have been a staple of the search engine optimization community for many years. Each time a new study is released, a chorus of naysayers seem to come magically out of the woodwork to remind us of the one thing they remember from high school statistics — that “correlation doesn’t mean causation.” They are, of course, right in their protestations and, to their credit, an unfortunate number of times it seems that those conducting the correlation studies have forgotten this simple aphorism.

      We collect a search result. We then order the results based on different metrics like the number of links. Finally, we compare the orders of the original search results with those produced by the different metrics. The closer they are, the higher the correlation between the two.

      That being said, correlation studies are not altogether fruitless simply because they don’t necessarily uncover causal relationships (ie: actual ranking factors). What correlation studies discover or confirm are correlates.

      Correlates are simply measurements that share some relationship with the independent variable (in this case, the order of search results on a page). For example, we know that backlink counts are correlates of rank order. We also know that social shares are correlates of rank order.

      Correlation studies also provide us with direction of the relationship. For example, ice cream sales are positive correlates with temperature and winter jackets are negative correlates with temperature — that is to say, when the temperature goes up, ice cream sales go up but winter jacket sales go down.

      Finally, correlation studies can help us rule out proposed ranking factors. This is often overlooked, but it is an incredibly important part of correlation studies. Research that provides a negative result is often just as valuable as research that yields a positive result. We’ve been able to rule out many types of potential factors — like keyword density and the meta keywords tag — using correlation studies.

      Unfortunately, the value of correlation studies tends to end there. In particular, we still want to know whether a correlate causes the rankings or is spurious. Spurious is just a fancy sounding word for “false” or “fake.” A good example of a spurious relationship would be that ice cream sales cause an increase in drownings. In reality, the heat of the summer increases both ice cream sales and people who go for a swim. More swimming means more drownings. So while ice cream sales is a correlate of drowning, it is spurious. It does not cause the drowning.

      How might we go about teasing out the difference between causal and spurious relationships? One thing we know is that a cause happens before its effect, which means that a causal variable should predict a future change. This is the foundation upon which I built the following model.

      An alternative model for correlation studies

      I propose an alternate methodology for conducting correlation studies. Rather than measure the correlation between a factor (like links or shares) and a SERP, we can measure the correlation between a factor and changes in the SERP over time.

      The process works like this:

      1. Collect a SERP on day 1
      2. Collect the link counts for each of the URLs in that SERP
      3. Look for any URL pairs that are out of order with respect to links; for example, if position 2 has fewer links than position 3
      4. Record that anomaly
      5. Collect the same SERP 14 days later
      6. Record if the anomaly has been corrected (ie: position 3 now out-ranks position 2)
      7. Repeat across ten thousand keywords and test a variety of factors (backlinks, social shares, etc.)
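The steps above can be sketched in code. This toy version (the data structures and function names are my own illustration, not Moz’s pipeline) finds out-of-order adjacent pairs on day 1 and checks whether the day-14 SERP corrected them:

```javascript
// Find adjacent result pairs where the metric predicts the opposite order
// (e.g. position 2 has fewer links than position 3).
function findAnomalies(serp, metric) {
  const pairs = [];
  for (let i = 0; i < serp.length - 1; i++) {
    if (metric(serp[i]) < metric(serp[i + 1])) {
      pairs.push([serp[i].url, serp[i + 1].url]);
    }
  }
  return pairs;
}

// Fraction of anomalous pairs that the later SERP flipped into the
// order the metric predicted.
function percentCorrected(anomalies, laterOrder) {
  if (anomalies.length === 0) return 0;
  let corrected = 0;
  for (const [upper, lower] of anomalies) {
    if (laterOrder.indexOf(lower) < laterOrder.indexOf(upper)) corrected++;
  }
  return corrected / anomalies.length;
}
```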

      So what are the benefits of this methodology? By looking at change over time, we can see whether the ranking factor (correlate) is a leading or lagging feature. A lagging feature can automatically be ruled out as causal since it happens after the rankings change. A leading factor has the potential to be a causal factor although could still be spurious for other reasons.

      We collect a search result. We record where the search result differs from the expected predictions of a particular variable (like links or social shares). We then collect the same search result 2 weeks later to see if the search engine has corrected the out-of-order results.

      Following this methodology, we tested 3 different common correlates produced by ranking factor studies: Facebook shares, number of root linking domains, and Page Authority. The first step involved collecting 10,000 SERPs from randomly selected keywords in our Keyword Explorer corpus. We then recorded Facebook Shares, Root Linking Domains, and Page Authority for every URL. We noted every example where 2 adjacent URLs (like positions 2 and 3 or 7 and 8) were flipped with respect to the expected order predicted by the correlating factor. For example, if the #2 position had 30 shares while the #3 position had 50 shares, we noted that pair. You would expect the page with more shares to outrank the one with fewer. Finally, 2 weeks later, we captured the same SERPs and identified the percent of times that Google rearranged the pair of URLs to match the expected correlation. We also randomly selected pairs of URLs to get a baseline percent likelihood that any 2 adjacent URLs would switch positions. Here were the results…

      The outcome

      It’s important to note that it is incredibly rare to expect a leading factor to show up strongly in an analysis like this. While the experimental method is sound, it’s not as simple as a factor predicting future rankings: it assumes that in some cases we will know about a factor before Google does. The underlying assumption is that in some cases we have seen a ranking factor (like an increase in links or social shares) before Googlebot has, and that in the 2-week period, Google will catch up and correct the incorrectly ordered results. As you can expect, this is a rare occasion, as Google crawls the web faster than anyone else. However, with a sufficient number of observations, we should be able to see a statistically significant difference between lagging and leading results. Nevertheless, the methodology only detects a factor when it is both leading and discovered by Moz Link Explorer before Google.

      Factor                                Percent Corrected    P-Value    95% Min     95% Max
      Control                               18.93%               0          –           –
      Facebook Shares (controlled for PA)   18.31%               0.00001    -0.6849     -0.5551
      Root Linking Domains                  20.58%               0.00001    0.016268    0.016732
      Page Authority                        20.98%               0.00001    0.026202    0.026398

      Control:

      In order to create a control, we randomly selected adjacent URL pairs in the first SERP collection and determined the likelihood that the second will outrank the first in the final SERP collection. Approximately 18.93% of the time the worse ranking URL would overtake the better ranking URL. By setting this control, we can determine if any of the potential correlates are leading factors – that is to say that they are potential causes of improved rankings because they better predict future changes than a random selection.
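For readers who want to reproduce the significance check, the confidence intervals above can be approximated with a standard two-proportion normal approximation. This is a generic statistical sketch, not Moz’s exact code:

```javascript
// 95% CI for the difference of two proportions (factor vs. control),
// given corrected counts x and pair counts n for each group.
function diffProportionCI(x1, n1, x2, n2, z = 1.96) {
  const p1 = x1 / n1;
  const p2 = x2 / n2;
  const diff = p1 - p2;
  const se = Math.sqrt((p1 * (1 - p1)) / n1 + (p2 * (1 - p2)) / n2);
  return { diff, lower: diff - z * se, upper: diff + z * se };
}
```

If the whole interval sits above zero, the factor predicted future flips better than randomly selected pairs did.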

      Facebook Shares:

      Facebook Shares performed the worst of the three tested variables. Facebook Shares actually performed worse than random (18.31% vs 18.93%), meaning that randomly selected pairs would be more likely to switch than those where shares of the second were higher than the first. This is not altogether surprising, as it is the general industry consensus that social signals are lagging factors — that is to say, the traffic from higher rankings drives higher social shares, not that social shares drive higher rankings. Subsequently, we would expect to see the ranking change first before we would see the increase in social shares.

      RLDs

      Raw root linking domain counts performed substantially better than shares and the control at ~20.5%. As I indicated before, this type of analysis is incredibly subtle because it only detects when a factor is both leading and Moz Link Explorer discovered the relevant factor before Google. Nevertheless, this result was statistically significant with a P value <0.0001 and a 95% confidence interval that RLDs will predict future ranking changes around 1.5% greater than random.

      Page Authority

      By far, the highest performing factor was Page Authority. At 21.5%, PA correctly predicted changes in SERPs 2.6% better than random. This is a strong indication of a leading factor, greatly outperforming social shares and outperforming the best predictive raw metric, root linking domains. This is not surprising: Page Authority is built to predict rankings, so we should expect it to outperform raw metrics in identifying when a shift in rankings might occur. Now, this is not to say that Google uses Moz Page Authority to rank sites, but rather that Moz Page Authority is a relatively good approximation of whatever link metrics Google is using to rank sites.

      Concluding thoughts

      There are so many different experimental designs we can use to help improve our research industry-wide, and this is just one of the methods that can help us tease out the differences between causal ranking factors and lagging correlates. Experimental design does not need to be elaborate and the statistics to determine reliability do not need to be cutting edge. While machine learning offers much promise for improving our predictive models, simple statistics can do the trick when we’re establishing the fundamentals.

      Now, get out there and do some great research!



      Moz Blog

      Posted in IM News | Comments Off

      Are You Making These 7 Mistakes with Your About Page?

      Good old Google. They do like to keep life interesting for web publishers. You may have heard rumblings about a recent update that wreaked havoc on a lot of “your money or your life” sites — the ones that talk about health, fitness, finances, or happiness. That update appeared to look at the credibility of
      Read More…

      The post Are You Making These 7 Mistakes with Your About Page? appeared first on Copyblogger.


      Copyblogger

      Posted in IM News | Comments Off
