Tag Archive | "Speed"

SearchCap: Google update still rolling out, Bing adds hotel booking, page speed performance & more

Here’s our recap of what happened in online marketing today, as reported on Marketing Land and other places across the web.



Search Engine Land

SearchCap: Google mobile speed update, Lenovo debuts Google Assistant, Google deadlines & more

Below is what happened in search today, as reported on Search Engine Land and other places across the web.



Search Engine Land

It’s Live: Google Speed Update Now Rolling Out

This morning, July 9th, Google began rolling out the Google Speed Update that it first announced in January 2018. Google updated its blog post this morning, as I wrote at Search Engine Land: “Update July 9…


Search Engine Roundtable


Efficient Link Reclamation: How to Speed Up & Scale Your Efforts

Posted by DarrenKingman

Link reclamation: Tools, tools everywhere

Every link builder, over time, starts to narrow down their favorite tactics and techniques. Link reclamation is pretty much my numero-uno. In my experience, it’s one of the best ROI activities we can use for gaining links, particularly to the homepage, simply because the hard work — the “mention” (in whatever form that is) — is already there. That mention could be of your brand, an influencer who works there, or a tagline from a piece of content you’ve produced, whether it’s an image asset, video, etc. That’s the hard part. But with it done, and after a little hunting and vetting of the right mentions, you’re just left with the outreach.

Aside from the effort-to-return ratio, there are various other benefits to link reclamation:

  1. It’s something you can start right away without assets
  2. It’s a low risk/low investment form of link building
  3. Nearly all brands have unlinked mentions, but big brands tend to have the most and therefore see the biggest routine returns
  4. If you’re doing this for clients, they get to see an instant return on their investment

Link reclamation isn’t a new tactic, but it is becoming more complex and tool providers are out there helping us to optimize our efforts. In this post, I’m going to talk a little about those tools and how to apply them to speed up and scale your link reclamation.

Finding mentions

Firstly, we want to find mentions. No point getting too fancy at this stage, so we just head over to trusty Google and search for the range of mentions we’re working on.

As I described earlier, these mentions can come in a variety of shapes and sizes, so I would generally treat each type of mention that I’m looking for as a separate project. For example, if Moz were the site I was working on, I would look for mentions of the brand and create that as one “project,” then look for mentions of Followerwonk and treat that as another, and so on. The reasons why will become clear later on!

So, we head to the almighty Google and start our searches.

To help speed things up, it’s best to expand your search results so you gather as many URLs as you can in as few clicks as possible. Using Google’s Search Settings, you can quickly max out your SERPs to one hundred results, or you can install a plugin like GInfinity, which allows you to infinitely scroll through the results and grab as many as you can before your hand cramps up.
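If you prefer to script this step, a few lines of Python will generate the search URLs for a whole batch of queries. This is a minimal sketch; the queries are illustrative placeholders, and the num parameter mirrors the hundred-results setting described above:

```python
from urllib.parse import quote_plus

# Illustrative queries only; swap in the brand, product, and people you're chasing.
queries = [
    '"moz" -site:moz.com',                    # brand mentions off-domain
    '"followerwonk" -site:followerwonk.com',  # product mentions
]

for q in queries:
    # num=100 requests one hundred results per page, like the Search Settings change.
    print(f"https://www.google.com/search?q={quote_plus(q)}&num=100")
```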

Now we want to start copying as many of these results as possible into an Excel sheet, or wherever it is you’ll be working from. Clicking each one and copying/pasting is hell, so another tool to quickly install for Chrome is Linkclump. With this one, you’ll be able to right click, drag, and copy as many URLs as you want.

Linkclump Pro Tip: To ensure you don’t copy the page titles and cache data from a SERP, head over to your Linkclump settings by right-clicking the extension icon and selecting “options.” Then, edit your actions to include “URLs only” and “copied to clipboard.” This will make the next part of the process much easier!

Filtering your URL list

Now that we’ve got a bunch of URLs, we want to do a little filtering, so we know a) the DA of these domains, as a proxy metric to qualify mentions, and b) whether or not they already link to us.

How you do this bit will depend on which platforms you have access to. I would recommend BuzzStream, as it combines a few of the later steps in one place, but URL Profiler can also be used before transferring your list over to some alternative tools.
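If you have access to neither platform, the second check (do they already link to us?) is simple enough to script yourself. Here's a minimal sketch using requests and BeautifulSoup; the input file and target domain are placeholders, and DA would still need to come from the Moz API or whichever metric you use:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

TARGET = "moz.com"  # placeholder: the domain you're reclaiming links for

def already_links(url: str) -> bool:
    """True if the page has an anchor whose href points at the target domain."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return any(TARGET in (a.get("href") or "") for a in soup.find_all("a"))

urls = open("mention_urls.txt").read().split()  # one URL per line, via Linkclump
unlinked = [u for u in urls if not already_links(u)]
print(f"{len(unlinked)} of {len(urls)} mentions are unlinked")
```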

Using BuzzStream

If you’re going down this road, BuzzStream can pretty much handle the filtering for you once you’ve uploaded your list of URLs. The system will crawl through the URLs and use the Moz API to display Domain Authority, as well as tell you whether the page already links to you.

The first thing you’ll want to do is create a “project” for each type of mention you’re sourcing. As I mentioned earlier this could be “brand mentions,” “creative content,” “founder mentions,” etc.

When adding your “New Project,” be sure to include the domain URL for the site you’re building links to. BuzzStream will then go through and crawl your list of URLs and flag any that are already linking to you, so you can filter them out.

Next, we need to get your list of URLs imported. In the Websites view, use Add Websites and select “Add from List of URLs.”

The next steps are really easy: Upload your list of URLs, then ensure you select “Websites and Links” because we want BuzzStream to retrieve the link data for us.

Once you’ve added them, BuzzStream will work through the list and start displaying all the relevant data for you to filter through in the Link Monitoring tab. You can then sort by link status (after adding your URL and hitting “Check Backlinks”), DA, and relationship stage to see whether you or a colleague has ever been in touch with the writer (especially useful if your team uses BuzzStream for outreach, like we do at Builtvisible).

Using URL Profiler

If you’re using URL Profiler, first make sure you’ve set it up to work with your Moz API credentials. You don’t need a paid Moz account to do this, but having one will give you more than 500 checks per day for the URLs you and the team are pushing through.

Then, take the list of URLs you’ve copied from the SERPs using Linkclump (I’ve just copied the top 10 from the news vertical for “moz.com” as my search) and paste them into the list. You’ll need to select “Moz” in the Domain Level Data section and also fill out the “Domain to Check” field with your preferred URL string (I’ve put “Moz.com” to capture links to the secure, non-secure, and alternative subdomains, as well as deeper-level URLs).

Once you’ve set URL Profiler running, you’ll get a pretty intimidating spreadsheet, which can simply be cut right down to three columns: URL, Target URL, and Mozscape Domain Authority. Filter out any rows that returned a value in the Target URL column (essentially filtering out any pages where an HREF link to your domain was found), and any remaining rows with a DA lower than your benchmark for links (if you work with one).

And there’s my list of URLs that we now know:

1) don’t have any links to our target domain,

2) have a reference to the domain we’re working on, and

3) boast a DA above 40.
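If you'd rather apply those three filters in code than in Excel, the same cut-down takes a few lines of pandas. A sketch, assuming your export uses these column names (they vary by URL Profiler version) and a DA benchmark of 40:

```python
import pandas as pd

df = pd.read_csv("url_profiler_export.csv")  # placeholder filename

# Keep rows where no link to the target was found (empty "Target URL")
# and the domain clears the DA benchmark.
keep = df["Target URL"].isna() & (df["Mozscape Domain Authority"] >= 40)
df.loc[keep, ["URL", "Mozscape Domain Authority"]].to_csv(
    "outreach_candidates.csv", index=False
)
```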

Qualify your list

Now that you’ve got a list of URLs that fit your criteria, we need to do a little manual qualification. But we’re going to use some trusty tools to make it easy for us!

The key thing we’re looking for during qualification is whether the mention is in a natural linking element of the page. It’s important to avoid contacting sites where the mention is only in the title, as they’ll never place the link. We particularly want placements in the body copy, as these are natural link locations and so increase the likelihood of your efforts leading somewhere.

So, from my list of URLs, I’ll copy the list and head over to URLopener.com (now bought by 10bestseo.com, presumably because it’s such an awesome tool) and paste in my list before asking it to open all the URLs for me.
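If URLopener ever disappears on you, Python's standard library can stand in. This tiny sketch opens each URL from your filtered list in a new browser tab (be kind to your machine with long lists):

```python
import webbrowser

# outreach_candidates.txt is a placeholder: one URL per line.
for url in open("outreach_candidates.txt").read().split():
    webbrowser.open_new_tab(url)
```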

Now, one by one, I can quickly scan the URLs and look for mentions in the right places (i.e. is the mention in the copy, is it in the headline, or is it used anywhere else where a link might not look natural?).
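You can also pre-sort the list before eyeballing it. Here's a rough sketch that guesses where a mention sits on the page; the brand string and the title/headline/body heuristics are assumptions you would tune per project:

```python
import requests
from bs4 import BeautifulSoup

BRAND = "moz"  # lowercase mention string to look for

def mention_location(url: str) -> str:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text() if soup.title else ""
    headline = soup.h1.get_text() if soup.h1 else ""
    body = " ".join(p.get_text() for p in soup.find_all("p"))
    if BRAND in body.lower():
        return "body copy: natural link spot, keep for outreach"
    if BRAND in (title + " " + headline).lower():
        return "title/headline only: a link would look odd, probably drop"
    return "not found in obvious elements: check manually"
```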

When the mention sits naturally in the body copy, we add the URL to our final outreach list.

However, when the mention sits in a prominent and unusual part of the page, such as the headline, we strip the URL out of our list, as there’s very little chance the author/webmaster will add a link there.

The idea is to finish up with a list of unlinked mentions in spots where a link would fit naturally for the publisher. We don’t want to get in touch about mentions all over the page, as it can harm your future relationships. Link building needs to make sense, and not just for Google. If you’re working in a niche that regularly mentions your client, you likely want not only to get a link but also to build a relationship with the writer — it could lead to 5 links further down the line.

Getting email addresses

Now that you’ve got a list of URLs that all feature your brand/client, and you’ve qualified this list to ensure they are all unlinked and have mentions in places that make sense for a link, we need to do the most time-consuming part: finding email addresses.

To continue expanding our spreadsheet, we’re going to need to know the contact details of the writer or webmaster to request our link from. To continue our theme of efficiency, we just want to get the two most important details: email address and first name.

Getting the first name is usually pretty straightforward and there’s not really a need to automate this. Finding email addresses, however, could be an entirely separate article in itself, so I’ll be brief and get to the point. Here’s a summary of places to look and the tools I use:

  • Author page
  • Author’s personal website
  • Author’s Twitter profile
  • Rapportive & Email Permutator
  • Allmytweets
  • Journalisted.com
  • Mail Tester

More recently, we’ve also been using Skrapp.io. It’s a LinkedIn extension (like Hunter.io) that adds a “Find Email” button to LinkedIn profiles, along with an estimated accuracy percentage. It can often be combined with Mail Tester to check whether the suggested email address is working or not.

It will likely take a combination of these tools to track down a contact’s email address. Once we have it, we need to get in touch — at scale!

Pro Tip: When using Allmytweets, if you’re finding that searches for “email” or “contact” aren’t working, try “dot.” Usually journalists don’t put their full email address on public profiles in a scrapeable format, so they use “me@gmail [dot] com” to get around it.
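Both the permutation trick and the “dot” workaround are easy to script. A rough sketch below; the pattern list covers only a handful of common formats, so treat anything it produces as a guess and verify it with Mail Tester before sending:

```python
import re

def permutations(first, last, domain):
    """Generate common email-address formats for a first/last name."""
    f, l = first.lower(), last.lower()
    patterns = [f, l, f + l, f + "." + l, f[0] + l, f[0] + "." + l, f + "_" + l]
    return [p + "@" + domain for p in patterns]

def deobfuscate(text):
    """Normalize 'me@gmail [dot] com' style obfuscation to a plain address."""
    text = re.sub(r"\s*[\[\(]?\s*\bat\b\s*[\]\)]?\s*", "@", text, flags=re.I)
    text = re.sub(r"\s*[\[\(]?\s*\bdot\b\s*[\]\)]?\s*", ".", text, flags=re.I)
    return text

print(permutations("Darren", "Kingman", "example.com")[:3])
print(deobfuscate("darren [at] example [dot] com"))  # darren@example.com
```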

Making contact

So, because this is all about making the process efficient, I’m not going to repeat or try to build on the other already useful articles that provide templates for outreach (there is one below, but that’s just as an example!). However, I am going to show you how to scale your outreach and follow-ups.

Mail merges

If you and your team aren’t set in your ways with a particular paid tool, your best bet for scaling is a mail merge. There are a number of them out there, and honestly, they are all fairly similar: some give you a quota of free emails per day before you have to pay, while others charge from the get-go. However, for the costs we’re talking about and the time it saves, building a business case to either convince yourself (freelancers) or your finance department (everyone else!) will be a walk in the park.

I’ve been a fan of Contact Monkey for some time, mainly for tracking open rates, but their mail merge product is also part of the $10-a-month package. It’s a great deal. However, if you’re after something a bit more specific, YAMM is free to a point (for personal Gmail accounts) and can send up to 50 emails a day.

You’ll likely need to work through the process with whatever tool you pick but, using your spreadsheet, you’ll be able to specify which fields you want the mail merge to select from, and it’ll insert each element into the email.

For link reclamation, this is really as personable as you need to get — no lengthy paragraphs on how much you loved their latest piece or how long you’ve been following them on Twitter, just a good old to-the-point email:

Hi [first name],

I recently found a mention of a company I work with in one of your articles.

Here’s the article:

Where you’ve mentioned our company, Moz, would you be able to provide a link back to the domain Moz.com, in case users would like to know more about us?

Many thanks,
Darren.
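And if you'd rather script the merge than pay for a tool, the same template can be driven from a CSV with Python's standard library. A minimal sketch; the SMTP host, credentials, and CSV columns (email, first_name, article_url) are all placeholders:

```python
import csv
import smtplib
from email.message import EmailMessage

TEMPLATE = """Hi {first_name},

I recently found a mention of a company I work with in one of your articles.

Here's the article: {article_url}

Where you've mentioned our company, {brand}, would you be able to provide a link
back to the domain {target}, in case users would like to know more about us?

Many thanks,
Darren.
"""

with smtplib.SMTP("smtp.example.com", 587) as smtp:  # placeholder SMTP host
    smtp.starttls()
    smtp.login("you@example.com", "app-password")    # placeholder credentials
    with open("outreach_list.csv", newline="") as f:
        for row in csv.DictReader(f):
            msg = EmailMessage()
            msg["From"] = "you@example.com"
            msg["To"] = row["email"]
            msg["Subject"] = "Quick question about one of your articles"
            msg.set_content(TEMPLATE.format(brand="Moz", target="Moz.com", **row))
            smtp.send_message(msg)
```

What you give up by rolling your own is the open tracking, bounce handling, and reply detection that tools like Contact Monkey, YAMM, and BuzzStream provide.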

If using BuzzStream

Although BuzzStream’s mail merge options are pretty similar to the process above, the best “above and beyond” feature BuzzStream has is that you can schedule follow-up emails as well. So, if you didn’t hear back the first time, after a week or so their software will automatically send a little follow-up, which, in my experience, often leads to the best results.

When you’re ready to start sending emails, select the project you’ve set up. In the “Websites” section, select “Outreach.” Here, you can set up a sequence, which will send your initial email as well as customized follow-ups.

Using the same extremely brief template as above, I’ve inserted my dynamic fields to pull in from my data set and set up two follow-up emails due to send if I don’t hear back within the next four days (BuzzStream hooks up with my email through Outlook and can monitor whether I receive an email from this person or not).

Each project can now use templates set up for the type of mention you’re following up on. By using pre-set templates, you can create one each for brand mentions, influencers, or creative projects to further save you time. Good times.

I really hope this has been useful for beginners and seasoned link reclamation pros alike. If you have any other tools you use that people may find useful or have any questions, please do let us know below.

Thanks everyone!

Moz Blog

Google Confirms Chrome Usage Data Used to Measure Site Speed

Posted by Tom-Anthony

During a discussion with Google’s John Mueller at SMX Munich in March, he told me an interesting bit of data about how Google evaluates site speed nowadays. It got a bit of interest when I mentioned it at SearchLove San Diego the week after, so I followed up with John to clarify my understanding.

The short version is that Google is now using performance data aggregated from Chrome users who have opted in as a datapoint in the evaluation of site speed (and as a signal with regards to rankings). This is a positive move (IMHO) as it means we don’t need to treat optimizing site speed for Google as a separate task from optimizing for users.

Previously, it was not clear how Google evaluated site speed; it was generally believed to be measured by Googlebot during its visits — a belief reinforced by the presence of speed charts in Search Console. However, the onset of JavaScript-enabled crawling made it less clear what Google was doing — they obviously want the most realistic data possible, but it’s a hard problem to solve. Googlebot is not built to replicate how actual visitors experience a site, and so as the task of crawling became more complex, it makes sense that Googlebot may not be the best mechanism for this (if it ever was the mechanism).

In this post, I want to recap the pertinent data around this news quickly and try to understand what this may mean for users.

Google Search Console

Firstly, we should clarify our understanding of what the “time spent downloading a page” metric in Google Search Console is telling us. Most of us will recognize the graph this metric produces in Search Console.

Until recently, I was unclear about exactly what this graph was telling me. But handily, John Mueller comes to the rescue again with a detailed answer [login required] (hat tip to James Baddiley from Chillisauce.com for bringing this to my attention).

John clarified what this graph is showing:

It’s technically not “downloading the page” but rather “receiving data in response to requesting a URL” – it’s not based on rendering the page, it includes all requests made.

And that it is:

this is the average over all requests for that day

Because Google may be fetching a very different set of resources every day when it’s crawling your site, and because this graph does not account for anything to do with page rendering, it is not useful as a measure of the real performance of your site.

For that reason, John points out that:

Focusing blindly on that number doesn’t make sense.

With which I quite agree. The graph can be useful for identifying certain classes of backend issues, but there are also probably better ways for you to do that (e.g. WebPageTest.org, of which I’m a big fan).

Okay, so now we understand that graph and what it represents, let’s look at the next option: the Google WRS.

Googlebot & the Web Rendering Service

Google’s WRS is their headless browser mechanism based on Chrome 41, which is used for things like “Fetch as Googlebot” in Search Console, and is increasingly what Googlebot is using when it crawls pages.

However, we know that this isn’t how Google evaluates page speed, because of a Twitter conversation between Aymen Loukil and Google’s Gary Illyes. Aymen wrote up a blog post detailing it at the time, but the important takeaway was that Gary confirmed the WRS is not responsible for evaluating site speed:

[Image: Twitter conversation with Gary Illyes]

At the time, Gary was unable to clarify what was being used to evaluate site performance (perhaps because the Chrome User Experience Report hadn’t been announced yet). It seems as though things have progressed since then, however. Google is now able to tell us a little more, which takes us on to the Chrome User Experience Report.

Chrome User Experience Report

Introduced in October 2017, the Chrome User Experience Report “is a public dataset of key user experience metrics for top origins on the web,” whereby “performance data included in the report is from real-world conditions, aggregated from Chrome users who have opted-in to syncing their browsing history and have usage statistic reporting enabled.”

Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.

In March I was at SMX Munich (amazing conference!), where along with a small group of SEOs I had a chat with John Mueller. I asked John about how Google evaluates site speed, given that Gary had clarified it was not the WRS. John was kind enough to shed some light on the situation, but at that point, nothing was published anywhere.

However, since then, John has confirmed this information in a Google Webmaster Central Hangout [15m30s, in German], where he explains they’re using this data along with some other data sources (he doesn’t say which, though notes that it is in part because the data set does not cover all domains).

At SMX, John also pointed out that Google’s PageSpeed Insights tool now includes data from the Chrome User Experience Report.
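You can pull that same field data yourself. A minimal sketch against the PageSpeed Insights API, assuming the current v5 endpoint; the loadingExperience section is only populated when Google has enough Chrome User Experience Report data for the origin you query:

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
data = requests.get(API, params={"url": "https://moz.com"}, timeout=60).json()

# "loadingExperience" holds the field data sourced from the Chrome User
# Experience Report, as opposed to the lab data under "lighthouseResult".
fcp = data["loadingExperience"]["metrics"]["FIRST_CONTENTFUL_PAINT_MS"]
print(fcp["category"], fcp["percentile"])
```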

The public dataset of performance data for the top million domains is also available in a public BigQuery project, if you’re into that sort of thing!
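For instance, pulling the first contentful paint distribution for a single origin looks roughly like this. A sketch, assuming the public chrome-ux-report project's table naming (chrome-ux-report.all.YYYYMM) and a Google Cloud project with BigQuery enabled:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # authenticates against your Google Cloud project

query = """
    SELECT bin.start AS ms_bucket, SUM(bin.density) AS share_of_loads
    FROM `chrome-ux-report.all.201805`,
         UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE origin = 'https://moz.com'
    GROUP BY ms_bucket
    ORDER BY ms_bucket
"""

for row in client.query(query):  # iterating waits for the job and yields rows
    print(row.ms_bucket, row.share_of_loads)
```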

We can’t be sure which other factors Google is using, but we now know they are certainly using this data. As I mentioned above, I also imagine they are using data on more sites than are provided in the public dataset, but this is not confirmed.

Pay attention to users

Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.

The same is true if you were to use service workers for advanced caching behaviors — Googlebot wouldn’t be aware, but users would. There are certainly other such examples.

Essentially, this means that there’s no longer a reason to worry about pagespeed for Googlebot, and you should instead just focus on improving things for your users. You still need to pay attention to Googlebot for crawling purposes, which is a separate task.

If you are unsure where to look for site speed advice, the tools mentioned above, WebPageTest.org and Google’s PageSpeed Insights, are good starting points.

That’s all for now! If you have questions, please comment here and I’ll do my best! Thanks!

Moz Blog

SearchCap: EU domains at risk, mobile page speed & search pictures

Below is what happened in search today, as reported on Search Engine Land and other places across the web.

Search Engine Land

SearchCap: Google EU rivals, Bing Ads tracking fix & Google speed index update

Below is what happened in search today, as reported on Search Engine Land and other places across the web.

Search Engine Land

FAQs on new Google Speed Update: AMP pages, Search Console notifications & desktop only pages

A page with AMP but a slow canonical URL will not be impacted by this update, assuming the AMP URL is not slow, Google told us.

Search Engine Land

6 SEO Friendly Tips to Improve Site Speed on WordPress Blogs

"If a page takes more than a couple of seconds to load, users will instantly hit the back button and move on." – Loren Baker

In the world of SEO, user experience on websites has always been a factor, as has the time it takes for a site to load.

However, with the use of mobile devices surpassing desktop use (in most consumer-facing industries) and the wide adoption of broadband, people expect sites to load instantly.

Long gone are the days of waiting 10 seconds for a site to load.

If a page takes more than a couple of seconds to load, users will instantly hit the back button and move on to the next result.

Accordingly, Google officially started paying attention to site speed and declared its importance as a factor in rankings.

In order to keep up with Google’s site-ranking measures, WordPress blog users need to know exactly what they can do to improve their own site speed.

Remember when Google rolled out AMP (Accelerated Mobile Pages)?

AMP now serves publisher content in a simplified, Google-hosted experience that renders super fast. I like AMP from a user perspective because I know that AMP content will load incredibly fast on my mobile device, but as a publisher:

I’d rather speed up my blog and attract traffic directly to my site than have users stay on Google.

If you use StudioPress Sites or the Rainmaker Platform, your site will already load quickly. However, adding ad scripts, featured images, tracking codes, 301 redirects, etc. will slow down the loading of a site and increase demand on your server/hosting company.

Here are six simple tips I recommend; we used them to dramatically speed up Search Engine Journal’s (SEJ) load time — it’s now at 1.8 seconds!

1. Use a content delivery network

A content delivery network (CDN) is a group of geographically distributed servers that deliver web pages and other content based on the user’s location and the page’s origin server.

It can handle heavy traffic and speeds up the delivery of content to different users.

For WordPress blogs looking to improve site speed, Cloudflare is a great tool to consider. Cloudflare offers a free content delivery network that speeds up the performance of your site and optimizes it for efficiency on any device.

It also offers security services that help protect websites from crawlers, bots, and other attackers.

2. Compress your images

Another effective way to reduce page-load time and increase site speed is by compressing your images. A CDN will help with this, but it doesn’t take care of 100 percent of the job.

There are several different plugins available that compress all the images on your website — and even compress new images as you upload them.

ShortPixel is a WordPress plugin that allows you to compress both new and old images on your blog. We use it on SEJ and various other sites, and absolutely love it.

It allows you to quickly compress images in batches for greater convenience, reduces the time it takes to do backups, and ensures all your processed files are kept safe and secure. The best part about it is that your image quality stays the same, regardless of the size of the image.

Other image-compression plugins also maintain the quality of your pictures and improve site speed.
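If you want to batch-compress outside WordPress, or sanity-check what a plugin is doing, a few lines of Python with Pillow cover the basics. A sketch only: the uploads path and quality setting are assumptions, and you should run it against a backup copy of your media folder first:

```python
from pathlib import Path
from PIL import Image  # pip install Pillow

uploads = Path("wp-content/uploads")  # placeholder path; point at a backup copy

for path in uploads.rglob("*.jpg"):
    before = path.stat().st_size
    img = Image.open(path)
    img.load()  # read pixel data before overwriting the same file
    img.save(path, optimize=True, quality=80)  # re-encode the JPEG
    print(f"{path}: {before} -> {path.stat().st_size} bytes")
```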

3. Prevent ad scripts and pop-ups from slowing down the user experience

Many web pages today contain some form of third-party script that either runs ads for revenue or uses pop-ups to promote conversion. You want to build your audience and get more customers, of course, but balance is key here.

Although it’s difficult to completely get rid of them to improve your site speed, you can tame their performance impact while keeping them on your website to provide their intended benefits.

The trick is to first identify the third-party scripts that run on your site, where they come from, and how they impact your blog.

You can use different real-time monitoring tools that track and identify which scripts delay your site-loading time and affect your site metrics.

One of my favorite tools to do this is Pingdom’s Website Speed Test, because it breaks down each file and script, and tells you which takes the most time to load.

The same rule applies for pop-up plugins that you add on to your site.

Knowing which ones work best to improve conversions and bring in email signups allows you to gauge which plugins to keep and which ones to uninstall.

One of the fastest pop-up plugins on the market is OptinMonster (a StudioPress partner). Its founder, Syed Balkhi, is a WordPress expert who stays on top of factors like site speed and overall user experience.

4. Install a caching plugin

Another effective way to reduce site-loading time is by installing caching plugins to your WordPress blog.

Caching plugins work by creating a static version of your WordPress blog and delivering it to your visitors, instead of rebuilding each page from PHP and database queries on every request, which is why they can cut your page-loading time so dramatically.

Several caching plugins work well with WordPress, such as WP Super Cache and W3 Total Cache.

These plugins are easy to install and can be disabled anytime. They allow you to select certain pages on your blog (or all of them) to cache, and offer many other content compression settings that you can turn on or off.

WordPress supports many other plugins that allow you to optimize your blog to get rid of any latency in page-load time. It is important to test out these plugins to find the one that works best for you.

5. Disable plugins you don’t use

Having tons of WordPress plugins installed can also make your site super slow, especially ones you don’t need.

It is important to review the plugins you have installed in the past and disable those that offer no significant value.

Many WordPress users install different plugins when they first create their blogs to enhance how they look, but realize over time that great-looking blogs don’t always attract traffic, especially if your page-loading time is slow.

Also, I would highly recommend making sure your plugins are updated. This may help improve page-load speed, but more importantly, it makes your site more secure.

6. Add one more layer of media optimization

One thing we realized at SEJ when speeding up the site was that even after optimizing images, ad scripts, and caching, there were still multiple forms of media that slowed down load time.

The internal fixes we implemented did not help with third-party media load times, such as embedded Twitter, YouTube, and Instagram content, or infographics from other sites.

One solution we found to assist with that is BJ Lazy Load. Essentially, this lazy-load plugin renders all written content first, then as the user scrolls down the page, images and other forms of media load. This way, the user doesn’t have to wait for tons of media to load before reading the main content.

What I really like about BJ Lazy Load is that in addition to images, it also lazy loads all embeds, iFrames, and YouTube videos. For a WordPress blog that uses a lot of embeds, it was ideal for us.

Bonus tip: ask your web host for help

If you run a WordPress blog or WordPress-powered site, then you should work with a hosting company that specializes in WordPress, such as WP Engine, Presslabs, or Rainmaker’s own Synthesis.

I’ve worked with all three, and one thing I can absolutely tell you is that if you contact them and ask how your site can be sped up, they will help you, because the faster your site is, the lighter the load on their servers.

As more and more people turn to mobile devices to access the internet, it is essential to optimize your blogs for mobile use and find ways to minimize page-loading time.

Remember, bounce rates increase when your page-load time is slow, which impacts whether or not your content gets read or skipped for other sites that load pages faster.

Copyblogger

Tired Of The Slow Start, Why Not Buy A Blog To Speed Up The Process

Tired of the slow start, why not speed up the process and buy a blog? Many, many years ago I was on Skype talking to a friend in the USA. My friend had a blog, one that was similar in audience size to my own, about small business branding. We…

Entrepreneurs-Journey.com by Yaro Starak
