1. Identify pages with excessive RTRs.
Minimizing RTRs begins with awareness. Many pages generate more than 100 RTRs when you should strive for fewer than 50. Often, developers or SEO professionals who review all of the requests on a page will find that some of the requested items are missing (404) and no one noticed. Even though the assets are missing, the browser still spends time requesting them and coming back empty-handed, so these broken references should be removed or fixed immediately.
You should also fix assets that have moved and now return a 301 or 302 response. These responses mean the browser had to go to one location, only to be handed another set of instructions pointing to a second location where the asset actually lives, and each extra hop takes more time. It’s like going to a grocery store, only to be told that they don’t carry what you need, but you can get it from another store down the road. Even worse, if there is a chain of redirects, it’s like going to the second grocery store to discover that they also stopped carrying the item, and now you must try a third.
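As a hedged sketch (paths hypothetical), fixing a redirected reference is usually as simple as pointing the HTML at the asset’s final location:

```html
<!-- Before: this URL now returns a 301 redirect to /assets/css/site.css,
     costing an extra round trip before the file can even be fetched -->
<link rel="stylesheet" href="/css/site.css">

<!-- After: reference the asset's final location directly -->
<link rel="stylesheet" href="/assets/css/site.css">
```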
2. Combine files where possible.
The next step for an optimized website is to combine files (like JavaScript and CSS) where you can. For example, if a page uses 10 style sheets (CSS files) that could all be combined into 1, you should do that. Similarly, if you are using a variety of JavaScript files that could be combined into 1, do that too. Think in terms of site-wide and template-specific files: one site-wide JavaScript file for the JavaScript that appears on every page of the site, and one template-specific JavaScript file for each page template on the site. The same can be done with CSS. Referencing these files with the same file name and location everywhere will also help ensure they are cached properly.
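A sketch of what this consolidation might look like in a page’s head, assuming hypothetical file names:

```html
<!-- Before: 5 separate stylesheets, each requiring its own round trip -->
<link rel="stylesheet" href="/css/reset.css">
<link rel="stylesheet" href="/css/layout.css">
<link rel="stylesheet" href="/css/nav.css">
<link rel="stylesheet" href="/css/forms.css">
<link rel="stylesheet" href="/css/product.css">

<!-- After: one site-wide file plus one file for this page's template -->
<link rel="stylesheet" href="/css/sitewide.css">
<link rel="stylesheet" href="/css/template-product.css">
```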
3. Optimize the order of rendering.
After you eliminate and consolidate to decrease the total number of RTRs, the next thing you can do is adjust the order in which assets are requested so the page renders faster. Just as some steps in a recipe must be completed before others can begin, there is a certain order that a mobile browser must follow when building a page. Continuing the grocery example, if your recipe calls for something to marinate for 2 days, it is important to get the items needed for the marinade first, so that it can begin while you are acquiring the rest of the ingredients, rather than after.
Translating this analogy to the web world, the marinade is like a render blocker, because nothing else in the cooking process can happen until it is done; all you can do while things marinate is gather ingredients for the next steps. This is like critical-path rendering, in which you prioritize the critical items (the head tag, the page content, and the basic layout of the page) and delay the items that are less critical, especially if they would hold up the loading of more critical items. This can drastically improve the loading experience for users, give them confidence that the rest of the page will load quickly, and give them something to begin evaluating while the loading process continues.
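A minimal sketch of critical-path rendering, with hypothetical file names: inline the small amount of CSS needed to draw the visible layout, and keep everything else from blocking it. The preload pattern shown is one common technique for loading non-critical CSS without blocking the first paint.

```html
<head>
  <!-- Critical: a small amount of inline CSS so the basic layout
       renders without waiting on any external request -->
  <style>
    body { margin: 0; font-family: sans-serif; }
    .header { height: 60px; background: #ffffff; }
  </style>

  <!-- Less critical: defer keeps this script from blocking rendering -->
  <script src="/js/sitewide.js" defer></script>

  <!-- Less critical: load below-the-fold CSS without blocking first paint -->
  <link rel="preload" href="/css/below-fold.css" as="style"
        onload="this.rel='stylesheet'">
</head>
```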
4. Create a page load strategy.
Since search engines emulate human users, they experience pages in much the same way you do. The best thing you can do to order requests for speed is to know what is critical to the user and bot experience in the first seconds on the page. This is generally the information in the head tag and the text and images on the page. People generally take a few seconds before they engage with anything interactive on the site, so most JavaScript does not actually need to be loaded at first. Instead, visual representations of the JavaScript, like plus boxes or expanders, can load as placeholders before the JavaScript is needed. JavaScript is especially slow and cumbersome to load. To ensure that JavaScript loading does not leave users and bots staring at a blank page for a long time, it is ideal to load something for them to look at first, and then load the JavaScript in the background.
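In practice, the defer and async script attributes are the standard way to load JavaScript in the background while the page renders (file names hypothetical):

```html
<!-- Blocking: parsing stops until this script downloads and executes -->
<script src="/js/expander.js"></script>

<!-- defer: downloads in the background and runs after the HTML is
     parsed, in document order -->
<script src="/js/expander.js" defer></script>

<!-- async: downloads in the background and runs as soon as it arrives,
     in no guaranteed order (fine for independent scripts) -->
<script src="/js/analytics.js" async></script>
```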
After JavaScript, videos and images are the next slowest things to load. Videos and images that sit low on the page should not hinder the experience at the top of the page, where users and bots generally start. This concept of delaying the loading of content that is not yet visible is called “lazy-loading.” Lazy-loading can be accomplished in a variety of ways: the defer and async attributes for scripts, and, for images, the browser-native loading attribute that Google introduced in Chrome.
The best tool for finding lazy-loading opportunities for images is Google PageSpeed Insights. To optimize images, you can also use the loading="lazy" attribute, but so far this is only respected in mobile and desktop Chrome browsers (though support could expand in the future). With either method, you should use the URL Inspection Tool in Google Search Console to verify that lazy-loaded items are visible in the tool’s page rendering and in the rendered HTML.
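Assuming the attribute in question is the browser-native loading attribute, a minimal sketch of lazy-loading images looks like this (file names hypothetical):

```html
<!-- Above the fold: loads immediately -->
<img src="/img/hero.jpg" alt="Hero" width="800" height="400">

<!-- Low on the page: the browser defers this request until the user
     scrolls near the image -->
<img src="/img/footer-banner.jpg" alt="Footer banner"
     width="800" height="200" loading="lazy">
```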
5. Compress everything that you can.
After you minimize and prioritize the RTRs for each page template, you should compress what you can. Compression increases page speed by saving bandwidth. Gzip compression is one option, and it can be set up on most servers, but others are available. You can also run most code through a minification process, which strips whitespace, comments, and other unnecessary characters to reduce the size of the final transfer file. Images, however, are difficult to compress through these methods. The Performance Review page on WebPageTest.org includes a Full Optimization Checklist, which shows all of the assets on the page, whether they are gzipped, and their level of compression.
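For illustration, a gzip-compressed exchange looks something like this at the HTTP level (URL hypothetical): the browser advertises the encodings it can handle in Accept-Encoding, and the server answers with Content-Encoding identifying the compression it applied.

```http
GET /css/sitewide.css HTTP/1.1
Host: www.example.com
Accept-Encoding: gzip, deflate, br

HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: gzip
```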
For image compression, instead of gzipping and minifying, it is important that the designer hands images off to the developer in the most compressed format possible, making the end file size as small as possible without compromising the appearance of the image. Photos should generally be saved as JPG files, and icons and illustrations should be saved as GIF files. To make large images look great on full-screen computers without bogging down tiny mobile screens, consider using the responsive images protocol, or an image server like the one from Fastly that dynamically sends a pre-scaled version of the image to smaller screens.
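The responsive images protocol is the srcset and sizes syntax, sketched below with hypothetical file names and widths; the browser picks the smallest candidate that can fill the slot on the current screen.

```html
<img src="/img/hero-800.jpg"
     srcset="/img/hero-400.jpg 400w,
             /img/hero-800.jpg 800w,
             /img/hero-1600.jpg 1600w"
     sizes="100vw"
     alt="Hero image">
<!-- A small phone can download the 400px version while a full-screen
     desktop gets the 1600px version -->
```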
6. Cache the right pages at the right time.
The next step is to help browsers and bots identify what can be reused and what must be fetched anew every time. This is done with browser caching, which can be configured in different ways, including via a CDN (content delivery network). Most elements on a site, especially a site that doesn’t change more than once a week or share real-time information like news, weather, or sports scores, can be cached for up to a year. Caching means that when users visit a page, the browser first looks in its local memory to see if it already has a file needed to build the page; this saves round trip requests and improves load time. Search engine bots always view a page as if they have never seen it before, so they don’t use active caching, but they can detect when caching is in place and may use it to estimate the load time for a page.
The important thing to understand about caching is that it is based on file names and their location on the server. So if you use 1 file on many pages, like a logo, always reference it with the same file name and URL, even if it exists in multiple places on your server. This can be used to your advantage, because getting a new version of an item cached is as simple as updating its file name and the references to it in the HTML. For instance, if you make a tweak to your logo, simply changing the file name from ‘logo’ to ‘logov2’ will cause the new version to be cached and the old version to be forgotten. In this way, when you allow something to cache for a year, you are not saying that you won’t change the item for a year, but that if you do, you will reference the updated or new item with a new file name.
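A sketch of that renaming trick, using the logo example and hypothetical paths:

```html
<!-- Before: browsers may hold this cached copy for up to a year -->
<img src="/img/logo.png" alt="Company logo">

<!-- After a design tweak: the new name forces a fresh download, and
     the old cached copy is simply ignored -->
<img src="/img/logov2.png" alt="Company logo">
```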
Cache settings can be a bit complicated, so the best thing to do is to think of them like food that can expire. Some food expires more quickly than others, so you need to let the browser know which items get old and need to be thrown out quickly, and which can sit safely on the shelf for a while without replacement. You can use a tool like WebPageTest.org to help you understand where the opportunities are. The important thing to understand here is that if no expiration or maximum age is set for an item, the browser will assume it needs to be fetched fresh each time; browser caching cannot happen unless these details are specified.
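For illustration, these details are set with the Cache-Control response header (values hypothetical): a long-lived asset like a versioned logo can be marked fresh for a year (31,536,000 seconds), while a frequently updated HTML page might be marked fresh for only five minutes.

```http
HTTP/1.1 200 OK
Content-Type: image/png
Cache-Control: public, max-age=31536000

HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: max-age=300
```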
Since cache settings tell a browser when an item is too old to be used, they are sometimes referred to as a “caching lifetime” or a “freshness lifetime.” If something is beyond its freshness or caching lifetime, the round trip to the server must be completed to get a fresh version. If items are listed as “expired,” it means they are beyond their freshness lifetime and, in both the grocery and browser analogies, must be replaced the next time they are needed.
In the same way that it is not useful to have shelves full of expired groceries, it is not useful to have a cache full of expired web content. Conversely, it is also inefficient to throw away groceries (files) that are still perfectly usable. When you assign caching lifetimes to your files, keep this balance in mind, along with the consequences of being generous with a caching lifetime. If you make a minor change to your logo, it will not be the end of the world if the old logo is still shown for a while, so you can keep that caching lifetime long, especially if you plan to be diligent about updating file names when you make changes. With different file names, there is no risk of showing the old logo.
7. Build Accelerated Mobile Pages (AMPs).
All of this can seem complicated, and that’s why Google created a simpler solution for page speed optimization. Accelerated Mobile Pages, or AMP, is a subset of HTML that follows much more stringent guidelines about what can be included. The goal is to get most pages to a 1-second load time on mobile, which it generally achieves. AMP lets Google take over most of the hard work, caching pages and configuring the loading process and most of the elements described above so that everything is as optimized as possible.
In some cases, businesses create new AMP pages and link to them from existing pages on the site in the head tag, as shown below. This tells Google to serve the AMP page when the person requesting the page is on a mobile device or a slow connection. In other cases, businesses replace their existing pages (or their existing mobile pages, if they have separate sites) with AMP pages. This is called “Canonical AMP,” or “AMP canonical.” Google prefers this method, because AMP pages are exceedingly easy for Google to crawl, render, index, and rank. Businesses may not prefer it, though, because the limitations and requirements of AMP can make pages a bit stark. AMP can also complicate or limit some of the tracking and testing that you can do on desktop pages, which is another reason some people avoid it.
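To illustrate the head-tag linking described above, the two pages reference each other with a pair of link elements (URLs hypothetical): the regular page points to its AMP version, and the AMP page points back to the canonical original.

```html
<!-- In the head of the regular page -->
<link rel="amphtml" href="https://www.example.com/amp/article.html">

<!-- In the head of the AMP page -->
<link rel="canonical" href="https://www.example.com/article.html">
```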
Sites that are AMP valid follow all the AMP rules and guidelines, and in turn, Google shows a little grey lightning bolt next to them in mobile results and is more likely to include them in the special carousels of results that tend to rank at the top of a page. This is great, but if being AMP valid would cause too many problems for your website, it is OK to use AMP code without being AMP valid: use AMP HTML where you can, without worrying about following all the rules. You will still benefit from the speed of AMP HTML and AMP JavaScript, which is not a bad deal considering that it’s free. Remember, speed has a direct impact on engagement and conversion, and likely also helps with crawl-depth and other aspects of crawl-efficiency.
Mobile browsers work differently than desktop browsers. They also have slower processors and less reliable connections. These factors make it important to review the load time for your site's pages from a worst-case mobile perspective to understand where there are real opportunities for improvement.