Tag Archive | "Steps"

7 Steps to Grow a Blog Post

Sometimes it seems like writers are magicians, because we have the power to create something out of nothing. An important…

The post 7 Steps to Grow a Blog Post appeared first on Copyblogger.


Posted in IM News | Comments Off

4 Steps to Finish a Practical Project Instead of Fantasizing about a Lofty Idea

If I set out to practice yoga for an hour each day, I would practice zero hours of yoga each…

The post 4 Steps to Finish a Practical Project Instead of Fantasizing about a Lofty Idea appeared first on Copyblogger.



10 Steps to Becoming a Better Writer [Free Poster]

Back in the sweltering summer of 2007, I got a bit crazy. I wanted to get across to people that…

The post 10 Steps to Becoming a Better Writer [Free Poster] appeared first on Copyblogger.



12 Steps to Lightning Page Speed

Posted by WallStreetOasis.com

At Wall Street Oasis, we’ve noticed that every time we focus on improving our page speed, Google sends us more organic traffic. In 2018, over 80 percent of our website’s traffic came from organic search. That’s 24.5 million visits. Needless to say, we are very tuned in to how we can continue to improve our user experience and keep Google happy.

We thought this article would be a great way to highlight the specific steps we take to keep our page speed lightning fast and our organic traffic healthy. While this article is somewhat technical (page speed is an important and complex subject), we hope it provides website owners and developers with a framework for improving their page speed.

Quick technical background: Our website is built on the Drupal CMS and runs on a server with a LAMP stack (plus Varnish and memcache). Even if you are running a different stack, the steps and principles in this article are still relevant for other CMSs, databases, and reverse proxies.

Ready? Let’s dig in.

5 Steps to speed up the backend

Before we jump into specific steps that can help you speed up your backend, it might help to review what we mean by “backend.” You can think of the backend as everything that goes into storing data, including the database itself and the servers: basically, anything that helps the website function but that you don’t visually interact with. For more information on the difference between the backend and the frontend, you can read this article.

Step 1: Make sure you have a Reverse Proxy configured

This is an important first step. For Wall Street Oasis (WSO), we use a reverse proxy called Varnish. It is by far the most critical and fastest caching layer, and it serves the majority of our anonymous traffic (logged-out visitors). Varnish caches the whole page in memory, so returning it to the visitor is lightning fast.


Step 2: Extend the TTL of that cache

If you have a large database of content (specifically in the 10,000+ URL range) that doesn’t change very frequently, you can extend the time to live (TTL), which controls how long an object stays in the cache before it is flushed, to drive the hit rate higher on the Varnish caching layer.

For WSO, we went all the way up to two weeks (since we have over 300,000 discussions). At any given time, only a few thousand of those forum URLs are active, so it makes sense to heavily cache the other pages. The downside is that when you make any sitewide, template, or design changes, you have to wait up to two weeks for them to propagate across all URLs.
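In Varnish’s configuration language (VCL), extending the TTL can be a small change. Here is a minimal sketch for VCL 4.x; the `/forum/` path is a hypothetical example, not our actual configuration:

```
sub vcl_backend_response {
    # Cache forum pages for two weeks; other responses keep their defaults
    if (bereq.url ~ "^/forum/") {
        set beresp.ttl = 2w;
    }
}
```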

Step 3: Warm up the cache

In order to keep our cache “warm,” we have a specific process that hits all the URLs in our sitemap. This increases the likelihood of a page being in the cache when a user or Google bot visits those same pages (i.e. our hit rate improves). It also keeps Varnish full of more objects, ready to be accessed quickly.
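A cache-warming job can be as simple as a script that downloads the sitemap, pulls out each URL, and requests it so Varnish fills up with fresh objects. This Node.js sketch (18+, for the built-in fetch) illustrates the idea; the sitemap URL is a placeholder, not our production setup:

```javascript
// Extract every <loc> entry from a sitemap XML string.
function extractSitemapUrls(xml) {
  return [...xml.matchAll(/<loc>\s*(.*?)\s*<\/loc>/g)].map((m) => m[1]);
}

// Fetch the sitemap, then request each URL so the cache gets warmed.
async function warmCache(sitemapUrl) {
  const xml = await (await fetch(sitemapUrl)).text();
  for (const url of extractSitemapUrls(xml)) {
    await fetch(url); // response body is discarded; we only want the cache fill
  }
}

// Example (run from cron, e.g. nightly):
// warmCache("https://example.com/sitemap.xml");
```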

As you can see from the chart below, the ratio of “cache hits” (green) to total hits (blue+green) is over 93 percent.

Step 4: Tune your database and focus on the slowest queries

On WSO, we use a MySQL database. Make sure you enable the slow queries report and check it at least every quarter. Check the slowest queries using EXPLAIN. Add indexes where needed and rewrite queries that can be optimized.

To tune MySQL itself, you can use the following scripts: https://github.com/major/MySQLTuner-perl and https://github.com/mattiabasone/tuning-primer
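As an illustration, this workflow usually looks something like the following in MySQL; the table, column, and threshold here are hypothetical:

```sql
-- Log any query slower than one second
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;

-- Inspect how a slow query is executed
EXPLAIN SELECT * FROM forum_posts WHERE topic_id = 12345;

-- If EXPLAIN shows a full table scan, add an index on the filtered column
ALTER TABLE forum_posts ADD INDEX idx_topic_id (topic_id);
```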

Step 5: HTTP headers

Use HTTP/2 server push to send resources to the browser before they are requested. Just make sure you test which resources should be pushed first. JavaScript was a good option for us. You can read more about it here.

Here is an example of server push from our Investment Banking Interview Questions URL:

</files/advagg_js/js__rh8tGyQUC6fPazMoP4YI4X0Fze99Pspus1iL4Am3Nr4__k2v047sfief4SoufV5rlyaT9V0CevRW-VsgHZa2KUGc__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,</files/advagg_js/js__TLh0q7OGWS6tv88FccFskwgFrZI9p53uJYwc6wv-a3o__kueGth7dEBcGqUVEib_yvaCzx99rTtEVqb1UaLaylA4__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,</files/advagg_js/js__sMVR1us69-sSXhuhQWNXRyjueOEy4FQRK7nr6zzAswY__O9Dxl50YCBWD3WksvdK42k5GXABvKifJooNDTlCQgDw__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,

Be sure you’re using the correct format. If it is a script: <url>; rel=preload; as=script,

If it is a CSS file: <url>; rel=preload; as=style,
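On a LAMP stack like ours, one way to emit such a header is with Apache’s mod_headers; with mod_http2 enabled, a Link preload header triggers a server push. The script path below is a hypothetical example:

```
# Requires mod_headers; with mod_http2, a Link preload header triggers a push
Header add Link "</js/app.min.js>; rel=preload; as=script"
```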

7 Steps to speed up the frontend

The following steps are to help speed up your frontend application. The front-end is the part of a website or application that the user directly interacts with. For example, this includes fonts, drop-down menus, buttons, transitions, sliders, forms, etc.

Step 1: Modify the placement of your JavaScript

Modifying the placement of your JavaScript is probably one of the hardest changes because you will need to continually test to make sure it doesn’t break the functionality of your site. 

I’ve noticed that every time I remove JavaScript, I see page speed improve. I suggest removing as much JavaScript as you can, and minifying what you do need. You can also combine your JavaScript files, but use multiple bundles.

Always try to move JavaScript to the bottom of the page or inline it. You can also use the defer or async attributes where possible to ensure your scripts are not render-blocking. You can read more about moving JavaScript here.
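To make the difference concrete, here is how the three placements compare (the file names are hypothetical):

```html
<!-- Render-blocking: parsing stops until this script downloads and executes -->
<script src="/js/app.js"></script>

<!-- defer: downloads in parallel, executes in order after the HTML is parsed -->
<script src="/js/app.js" defer></script>

<!-- async: downloads in parallel, executes as soon as it arrives (order not guaranteed) -->
<script src="/js/analytics.js" async></script>
```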

Step 2: Optimize your images

Use WebP for images when possible (Cloudflare, a CDN, does this for you automatically; I’ll touch more on Cloudflare below). WebP is an image format that supports both lossy and lossless compression.

    Always use images at the correct size. For example, if an image is displayed in a 2″ x 2″ square on your site, don’t use a large 10″ x 10″ image. If an image is bigger than needed, you are transferring more data through the network and forcing the browser to resize it.

    Use lazy load to avoid/delay downloading images that are further down the page and not on the visible part of the screen.
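Both tips can be applied in plain HTML: explicit dimensions avoid resizing work, and the native loading="lazy" attribute (supported by modern browsers) defers offscreen images. The path and sizes below are placeholders:

```html
<!-- Correct intrinsic size plus native lazy loading -->
<img src="/images/chart-600x400.webp"
     width="600" height="400"
     loading="lazy"
     alt="Organic traffic chart">
```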

    Step 3: Optimize your CSS

    You want to make sure your critical CSS is inlined. Online tools like this one can help you find the critical CSS to inline, which solves the render blocking. Bonus: you’ll keep the caching benefit of having separate files.

    Make sure to minify your CSS files (we use AdVagg since we are on the Drupal CMS, but there are many options for this depending on your site).  

    Try using less CSS. For instance, if you have certain CSS classes that are only used on your homepage, don’t include them on other pages. 

    Always combine the CSS files but use multiple bundles. You can read more about this step here.

    Move your media queries to specific files so the browser doesn’t have to load them before rendering the page. For example: <link href="frontpage-sm.css" rel="stylesheet" media="(min-width: 767px)">

    If you’d like more info on how to optimize your CSS, check out Patrick Sexton’s interesting post.
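One common pattern that combines inlined critical CSS with a cacheable full stylesheet, sketched with hypothetical file names:

```html
<head>
  <!-- Critical, above-the-fold rules inlined so the first paint isn't blocked -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .header { background: #1a1a2e; color: #fff; }
  </style>

  <!-- Full stylesheet loaded without blocking render, but still cached as a file -->
  <link rel="preload" href="/css/site.min.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/site.min.css"></noscript>
</head>
```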

    Step 4: Lighten your web fonts (they can be HEAVY)

    This is where your developers may get in an argument with your designers if you’re not careful. Everyone wants to look at a beautifully designed website, but if you’re not careful about how you bring this design live, it can cause major unintended speed issues. Here are some tips on how to put your fonts on a diet:

    • Use inline svg for icon fonts (like font awesome). This way you’ll reduce the critical chain path and will avoid empty content when the page is first loaded.
    • Use fontello to generate the font files. This way, you can include only the glyphs you actually use which leads to smaller files and faster page speed.
    • If you are going to use web fonts, check if you need all the glyphs defined in the font file. If you don’t need Japanese or Arabic characters, for example, see if there is a version with only the characters you need.
    • Use Unicode range to select the glyphs you need.
    • Use woff2 when possible as it is already compressed.
    • This article is a great resource on web font optimization.
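Several of these tips can come together in a single @font-face rule; this sketch uses a hypothetical font name and file:

```css
@font-face {
  font-family: "BodyFont";                              /* hypothetical font */
  src: url("/fonts/body-latin.woff2") format("woff2");  /* woff2 is pre-compressed */
  unicode-range: U+0000-00FF;                           /* Latin glyphs only */
  font-display: swap;                                   /* avoid invisible text while loading */
}
```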

    Here is the difference we measured when using optimized fonts:

    After reducing our font files from 131kb to 41kb and removing one external resource (useproof), the fully loaded time on our test page dropped all the way from 5.1 to 2.8 seconds. That’s a 44 percent improvement and is sure to make Google smile (see below).

    Here’s the 44 percent improvement.

    Step 5: Move external resources

    When possible, move external resources to your server so you can control the expires headers (this instructs browsers to cache the resource for longer). For example, we moved our Facebook Pixel to our server and cached it for 14 days. This means you’ll be responsible for checking for updates from time to time, but it can improve your page speed score.

    For example, on our Private Equity Interview Questions page you can see that the fbevents.js file is loaded from our server and the cache-control HTTP header is set to 14 days (1,209,600 seconds):

    cache-control: public, max-age=1209600
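With Apache’s mod_headers, that cache lifetime for a self-hosted copy of the script might be set like this:

```
<FilesMatch "fbevents\.js$">
    Header set Cache-Control "public, max-age=1209600"
</FilesMatch>
```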

    Step 6: Use a content delivery network (CDN)

    What’s a CDN? Click here to learn more.

    I recommend using Cloudflare as it makes a lot of tasks much easier and faster than if you were to try and do them on your own server. Here is what we specifically did on Cloudflare’s configuration:


    • Auto-minify: check all
    • Enable Brotli
    • Under Polish:
      • Enable Mirage
      • Choose Lossy
      • Check WebP


    • Enable HTTP/2 – You can read more about this topic here
    • No browsers currently support HTTP/2 over an unencrypted connection. For practical purposes, this means that your website must be served over HTTPS to take advantage of HTTP/2. Cloudflare has a free and easy way to enable HTTPS. Check it out here.


    • Under SSL
      • Choose Flexible
    • Under TLS 1.3
      • Choose Enable+0RTT – More about this topic here.

    Step 7: Use service workers

    Service workers give the site owner and developers some interesting options (like push notifications), but in terms of performance, we’re most excited about how these workers can help us build a smarter caching system.

    To learn how to get service workers up and running on your site, visit this page.

    With resources (images, CSS, JavaScript, fonts, etc.) cached by a service worker, returning visitors will often be served much faster than if there were no worker at all.
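As a sketch of the idea, a “cache-first” service worker for static assets might look like the following. This is browser-only code, and the cache name and asset paths are hypothetical:

```javascript
// sw.js — serve static assets from the Cache API, falling back to the network
const CACHE = "static-v1";

self.addEventListener("install", (event) => {
  event.waitUntil(
    caches.open(CACHE).then((cache) =>
      cache.addAll(["/css/site.min.css", "/js/app.min.js", "/fonts/body-latin.woff2"])
    )
  );
});

self.addEventListener("fetch", (event) => {
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```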

    Testing, tools, and takeaways

    For each change you make to try and improve speed, you can use speed-testing tools to monitor the impact of the change and make sure you are on the right path.

    We know there is a lot to digest and a lot of resources linked above, but if you are tight on time, you can just start with Step 1 from both the backend and frontend sections. These two steps alone can make a major difference on their own.

    Good luck and let me know if you have any questions in the comments. I’ll make sure João Guilherme, my Head of Technology, is on to answer any questions for the community at least once a day for the first week this is published.

    Happy Tuning!

      Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

      Moz Blog


      7 Steps To Launch A Home-Based Business Selling Services Online

      Welcome to a complete overview of the steps to launch what I call a Services Arbitrage business. If you’re not sure what this is and you’ve never heard the story behind how I launched an online editing company, make sure you read Part 1 and 2 first. Here are the links: How To Start An […]

      The post 7 Steps To Launch A Home-Based Business Selling Services Online appeared first on Yaro.Blog.

      Entrepreneurs-Journey.com by Yaro Starak


      How to Diagnose and Solve JavaScript SEO Issues in 6 Steps

      Posted by tomek_rudzki

      It’s rather common for companies to build their websites using modern JavaScript frameworks and libraries like React, Angular, or Vue. It’s obvious by now that the web has moved away from plain HTML and has entered the era of JS.

      While there is nothing unusual about a business wanting to take advantage of the latest technologies, we need to address the stark reality of this trend: most migrations to JavaScript frameworks aren’t planned with users or organic traffic in mind.

      Let’s call it the JavaScript Paradox:

      1. The big brands jump on the JavaScript hype train after hearing all the buzz about JavaScript frameworks creating amazing UXs.
      2. Reality reveals that JavaScript frameworks are really complex.
      3. The big brands completely butcher the migrations to JavaScript. They lose organic traffic and often have to cut corners rather than creating this amazing UX journey for their users (I will mention some examples in this article).

      Since there’s no turning back, SEOs need to learn how to deal with JavaScript websites.

      But that’s easier said than done because making JavaScript websites successful in search engines is a real challenge both for developers and SEOs.

      This article is meant to be a follow-up to my comprehensive Ultimate Guide to JavaScript SEO, and it’s intended to be as easy to follow as possible. So, grab yourself a cup of coffee and let’s have some fun — here are six steps to help you diagnose and solve JavaScript SEO issues.

      Step 1: Use the URL inspection tool to see if Google can render your content

      The URL inspection tool (formerly Google Fetch and Render) is a great free tool that allows you to check if Google can properly render your pages.

      The URL inspection tool requires you to have your website connected to Google Search Console. If you don’t have an account yet, check Google’s Help pages.

      Open Google Search Console, then click on the URL inspection button.

      In the URL form field, type the full URL of a page you want to audit.

      Then click on TEST LIVE URL.

      Once the test is done, click on VIEW TESTED PAGE.

      And finally, click on the Screenshot tab to view the rendered page.

      Scroll down the screenshot to make sure your web page is rendered properly. Ask yourself the following questions:

      • Is the main content visible?
      • Can Google see the user-generated comments?
      • Can Google access areas like similar articles and products?
      • Can Google see other crucial elements of your page?

      Why does the screenshot look different than what I see in my browser? Here are some possible reasons:

      • Google encountered timeouts while rendering.
      • Some errors occurred while rendering. You probably used some features that are not supported by Google Web Rendering Service (Google uses the four-year-old Chrome 41 for web rendering, which doesn’t support many modern features).
      • You blocked crucial JavaScript files for Googlebot.

      Step 2: Make sure you didn’t block JavaScript files by mistake

      If Google cannot render your page properly, you should make sure you didn’t block important JavaScript files for Googlebot in robots.txt.

      TL;DR: What is robots.txt?

      It’s a plain text file that tells Googlebot (or any other search engine bot) whether it is allowed to request a page or resource.
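For example, a robots.txt like this one (with a hypothetical path) would hide an entire script directory from Googlebot’s renderer:

```
# Blocking this directory hides every script in it from Googlebot
User-agent: *
Disallow: /assets/js/
```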

      Fortunately, the URL Inspection tool points out all the resources of a rendered page that are blocked by robots.txt.

      But how can you tell if a blocked resource is important from the rendering point of view?

      You have two options: Basic and Advanced.


      The basic option: in most cases, it may be a good idea to simply ask your developers about it. They created your website, so they should know it well.

      Obviously, if a script is named content.js or productListing.js, it’s probably relevant and shouldn’t be blocked.

      Unfortunately, as of now, the URL Inspection tool doesn’t tell you about the severity of a blocked JS file. The previous Google Fetch and Render had such an option:


      Now, for the advanced option, we can use Chrome Developer Tools.

      For educational purposes, we will be checking the following URL: http://botbenchmarking.com/youshallnotpass.html

      Open the page in the most recent version of Chrome and go to Chrome Developer Tools. Then move to the Network tab and refresh the page.

      Finally, select the desired resource (in our case it’s YouShallNotPass.js), right-click, and choose Block request URL.

      Refresh the page and see if any important content disappeared. If so, then you should think about deleting the corresponding rule from your robots.txt file.

      Step 3: Use the URL Inspection tool for fixing JavaScript errors

      If you see that the URL Inspection tool isn’t rendering your page properly, it may be due to JavaScript errors that occurred while rendering.

      To diagnose it, in the URL Inspection tool click on the More info tab.

      Then, show these errors to your developers so they can fix them.

      Just ONE error in the JavaScript code can stop rendering for Google, which in turn makes your website not indexable.

      Your website may work fine in most recent browsers, but if it crashes in older browsers (Google Web Rendering Service is based on Chrome 41), your Google rankings may drop.

      Need some examples?

      • A single error in the official Angular documentation caused Google to be unable to render our test Angular website.
      • Once upon a time, Google deindexed some pages of Angular.io, an official website of Angular 2+.

      If you want to know why it happened, read my Ultimate Guide to JavaScript SEO.

      Side note: If for some reason you don’t want to use the URL Inspection tool for debugging JavaScript errors, you can use Chrome 41 instead.

      Personally, I prefer using Chrome 41 for debugging purposes, because it’s more universal and offers more flexibility. However, the URL Inspection tool is more accurate in simulating the Google Web Rendering Service, which is why I recommend that for people who are new to JavaScript SEO.

      Step 4: Check if your content has been indexed in Google

      It’s not enough to just see if Google can render your website properly. You have to make sure Google has properly indexed your content. The best option for this is to use the site: command.

      It’s a very simple and very powerful tool. Its syntax is pretty straightforward: site:[URL of a website] “[fragment to be searched]”. Just take care not to put a space between site: and the URL.

      Let’s assume you want to check if Google indexed the following text “Develop across all platforms” which is featured on the homepage of Angular.io.

      Type the following command in Google: site:angular.io “DEVELOP ACROSS ALL PLATFORMS”

      As you can see, Google indexed that content, which is what you want, but that’s not always the case.


      • Use the site: command whenever possible.
      • Check different page templates to make sure your entire website works fine. Don’t stop at one page!

      If you’re fine, go to the next step. If that’s not the case, there may be a couple of reasons why this is happening:

      • Google still hasn’t rendered your content. Rendering can happen up to a few days or weeks after Google visits the URL. If the characteristics of your website require your content to be indexed as fast as possible, implement SSR (server-side rendering).
      • Google encountered timeouts while rendering a page. Are your scripts fast? Do they remain responsive when the server load is high?
      • Google is still requesting old JS files. Well, Google tries to cache a lot to save their computing power. So, CSS and JS files may be cached aggressively. If you can see that you fixed all the JavaScript errors and Google still cannot render your website properly, it may be because Google uses old, cached JS and CSS files. To work around it, you can embed a version number in the filename, for example, name it bundle3424323.js. You can read more in Google Guides on HTTP Caching.
      • While indexing, Google may not fetch some resources if it decides that they don’t contribute to the essential page content.

      Step 5: Make sure Google can discover your internal links

      There are a few simple rules you should follow:

      1. Google needs proper <a href> links to discover the URLs on your website.
      2. If your links are added to the DOM only when somebody clicks on a button, Google won’t see them.

      As simple as that is, plenty of big companies make these mistakes.

      Proper link structure

      In order to crawl a website, Googlebot needs traditional “href” links. If they’re not provided, many of your webpages will simply be unreachable for Googlebot!

      I think it was explained well by Tom Greenway (a Google representative) during the Google I/O conference:

      Please note: if you have proper <a href> links with some additional attributes, like onClick, data-url, or ng-href, that’s still fine for Google.
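To make rules 1 and 2 concrete (the URLs and handler names are made up):

```html
<!-- Googlebot can discover this URL: a real <a href> link -->
<a href="/category/mobile-phones?page=2" onclick="trackClick()">Next page</a>

<!-- Googlebot cannot: the URL only exists after JavaScript runs -->
<span onclick="goToPage(2)">Next page</span>
```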

      A common mistake made by developers: Googlebot can’t access the second and subsequent pages of pagination

      Not letting Googlebot discover pages from the second page of pagination and beyond is a common mistake that developers make.

      When you open the mobile versions of Gearbest, AliExpress, and IKEA, you will quickly notice that they don’t let Googlebot see the pagination links, which is really weird. When Google enables mobile-first indexing for these websites, they will suffer.

      How do you check it on your own?

      If you haven’t already downloaded Chrome 41, get it from Ele.ph/chrome41.

      Then navigate to any page. For the sake of the tutorial, I’m using the mobile version of AliExpress.com. For educational purposes, it’s good if you follow the same example.

      Open the mobile version of the Mobile Phones category of Aliexpress.

      Then, right-click on View More and select the inspect button to see how it’s implemented.

      As you can see, there are no <a href>, nor <link rel> links pointing to the second page of pagination.

      There are over 2,000 products in the mobile phone category on Aliexpress.com. Since mobile Googlebot is able to access only 20 of them, that’s just 1 percent!

      That means 99 percent of the products from that category are invisible for mobile Googlebot! That’s crazy!

      These errors are caused by the wrong implementation of lazy loading. There are many other websites that make similar mistakes. You can read more in my article “Popular Websites that May Fail in Mobile First Indexing”.

      TL;DR: using <link rel="next"> alone is too weak a signal for Google

      Note: it’s common to use <link rel="next"> to indicate a pagination series. However, the discoveries from Kyle Blanchette seem to show that <link rel="next"> alone is too weak a signal for Google and should be strengthened by traditional <a href> links.

      John Mueller discussed this more:

      “We can understand which pages belong together with rel=next, rel=previous, but if there are no links on the page at all, then it’s really hard for us to crawl from page to page. (…) So using rel=next and rel=previous in the head of a page is a great idea to tell us how these pages are connected, but you really need to have on-page, normal HTML links.”

      Don’t get me wrong — there is nothing wrong with using <link rel=”next”>. On the contrary, they are beneficial, but it’s good to combine these tags with traditional <a href> links.

      Checking if Google can see menu links

      Another important step in auditing a JavaScript website is to make sure Google can see your menu links. To check this, use Chrome 41.

      For the purpose of the tutorial, we will use the case of Target.com:

      To start, open any browser and pick some links from the menu:

      Next, open Chrome 41. In Chrome Developer Tools (press Ctrl + Shift + J), navigate to the Elements tab.

      The results? Fortunately enough, Google can pick up the menu links of Target.com.

      Now, check if Google can pick up the menu links on your website and see if you’re on target too.

      Step 6: Check if Google can discover content hidden under tabs

      I have often observed that for many e-commerce stores, Google cannot discover and index content that is hidden under tabs (product descriptions, reviews, related products, etc.). I know it’s weird, but it’s so common.

      It’s a crucial part of every SEO audit to make sure Google can see content hidden under tabs.

      Open Chrome 41 and navigate to any product on Boohoo.com; for instance, Muscle Fit Vest.

      Click on Details & Care to see the product description:

      “94% Cotton 6% Elastane. Muscle Fit Vest. Model is 6’1″ and Wears UK Size M.”

      Now, it’s time to check if it’s in the DOM. To do so, go to Chrome Developer Tools (Ctrl + Shift + J) and click on the Network tab.

      Make sure the disable cache option is enabled.

      Press F5 to refresh the page. Once refreshed, navigate to the Elements tab and search for the product description:

      As you can see, in the case of boohoo.com, Google is able to see the product description.

      Perfect! Now take the time and check if your website is fine.

      Wrapping up

      Obviously, JavaScript SEO is a pretty complex subject, but I hope this tutorial was helpful.

      If you are still struggling with Google ranking, you might want to think about implementing dynamic rendering or hybrid rendering. And, of course, feel free to reach out to me on Twitter about this or other SEO needs.


      Moz Blog


      5 Steps to Protect Your Attention and Become a Better Thinker

      “I didn’t have enough time.” More and more, when I hear this common explanation for why something didn’t get done,…

      The post 5 Steps to Protect Your Attention and Become a Better Thinker appeared first on Copyblogger.



      7 Steps to Becoming a Better Thinker

      When was the last time you stopped to think? I mean really stopped. To think. I hope you’re in the…

      The post 7 Steps to Becoming a Better Thinker appeared first on Copyblogger.



      3 Game-Changing Steps You Might Skip When You Publish

      Processes are a part of any type of job — whether it involves data, gardening, or construction. Yet, the fun…

      The post 3 Game-Changing Steps You Might Skip When You Publish appeared first on Copyblogger.



      The Root of Impostor Syndrome in Creative Business (and Two Steps to Temper It)

      Philosophy of art has been an interest of mine for quite some time. But philosophy of professional art is a…

      The post The Root of Impostor Syndrome in Creative Business (and Two Steps to Temper It) appeared first on Copyblogger.

