
Google Asks Websites to Embrace Caching for Faster Crawling

In a notable update, Google has revised its crawler documentation to emphasize the importance of HTTP caching. 

The newly added section explains how Google’s crawlers manage cache control headers, urging website owners to adopt caching best practices. 

Alongside this, Gary Illyes wrote a Google blog post humorously pleading with website owners to allow caching, opening with the line, “Allow us to cache, pretty please.”

(Screenshot: Google updates crawler documentation)

The Problem: Decline in Cacheable Content

According to Illyes, the number of pages allowing cacheable requests has dropped significantly. 

Ten years ago, about 0.026% of Google’s fetches were cacheable—a modest figure even then. Today, it’s down to just 0.017%. 

This decline makes Google’s crawling process less efficient and increases the workload for both its bots and website servers.

By allowing caching, websites can reduce the number of times Googlebot needs to fully re-fetch unchanged content.
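The revalidation cycle behind this is simple: the server labels each response with a validator, and the crawler sends that validator back on the next visit; if nothing changed, the server answers 304 Not Modified and skips the body. The sketch below illustrates the idea with hypothetical helper names (`etag_for`, `respond`) rather than any real server's implementation:

```python
import hashlib

def etag_for(body: bytes) -> str:
    """Derive a strong ETag from the response body (one common approach)."""
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def respond(body: bytes, if_none_match=None):
    """Return (status, payload, etag): 304 with no body if the client's
    cached copy is still current, otherwise 200 with the full content."""
    etag = etag_for(body)
    if if_none_match == etag:
        return 304, b"", etag   # unchanged: the crawler reuses its copy
    return 200, body, etag      # new or changed: send the full body

# First fetch: the crawler has no cached copy yet.
status, payload, etag = respond(b"<html>...</html>")
# Later revisit: the crawler echoes the ETag via If-None-Match.
status2, payload2, _ = respond(b"<html>...</html>", if_none_match=etag)
```

On the revisit, `status2` is 304 and no payload is transferred, which is exactly the saving the documentation is asking sites to enable.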

What’s New in Google’s Documentation?

The updated documentation lays out Google’s approach to HTTP caching in clear terms, focusing on two key technologies:

  • ETag Headers: These small identifiers help Googlebot determine if the content has changed. Google recommends ETag because it avoids issues associated with date formatting.
  • Last-Modified Headers: While supported, these rely on accurate timestamps, which can be prone to errors.

Google’s guidance specifically suggests prioritizing ETag to ensure compatibility with its crawlers. 
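To see why dates are the fragile part, compare how the two validators are produced. A content-derived ETag stays correct even when file timestamps are unreliable, while Last-Modified must be a well-formed HTTP-date. This is a minimal sketch (the `cache_headers` helper is illustrative, not from Google's documentation), using Python's standard library to format the date correctly:

```python
import hashlib
from email.utils import formatdate

def cache_headers(body: bytes, mtime: float) -> dict:
    """Build both validator headers for a response. The ETag is derived
    from the content itself; Last-Modified is formatted as an HTTP-date
    from a Unix timestamp (seconds since the epoch)."""
    return {
        "ETag": '"' + hashlib.md5(body).hexdigest() + '"',
        # usegmt=True emits the GMT-suffixed form HTTP requires.
        "Last-Modified": formatdate(mtime, usegmt=True),
    }

headers = cache_headers(b"page content", 1700000000.0)
# headers["Last-Modified"] -> "Tue, 14 Nov 2023 22:13:20 GMT"
```

A hand-rolled date string with the wrong weekday, time zone, or separator can silently break revalidation, which is the class of error ETag sidesteps entirely.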

(Screenshot: HTTP Caching documentation)

Additionally, the documentation addresses how Google handles HTTP protocols.

  • HTTP/1.1 and HTTP/2: Googlebot can use both protocols, switching between them based on what’s most efficient. While HTTP/2 reduces computational costs for servers and Googlebot, it doesn’t influence rankings on Google Search.
  • Opting Out of HTTP/2 Crawling: Website owners can prevent crawling over HTTP/2 by configuring their servers to send an HTTP 421 status code. However, this option is temporary and not generally recommended unless absolutely necessary.
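For orientation, 421 is the standard “Misdirected Request” status, and Python's standard library already names it. The handler below is a purely illustrative sketch (real HTTP/2 opt-out happens at the protocol layer of your web server, and `http.server` itself only speaks HTTP/1.1):

```python
from http import HTTPStatus
from http.server import BaseHTTPRequestHandler

class OptOutHandler(BaseHTTPRequestHandler):
    """Hypothetical handler: answering 421 tells the client this
    connection should not be used for the request, prompting a retry
    elsewhere (for Googlebot, over HTTP/1.1 instead of HTTP/2)."""
    def do_GET(self):
        self.send_response(HTTPStatus.MISDIRECTED_REQUEST)  # 421
        self.end_headers()
```

In practice you would configure this in your server software (e.g. a rule that returns 421 for HTTP/2 requests) rather than write a handler by hand.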

(Screenshot: Supported transfer protocols)

Why Caching Matters

Caching plays a major role in making the Internet faster and more resource-efficient. When a website enables proper caching, it signals to Googlebot when content has changed and when it hasn’t. This avoids unnecessary re-fetching of data that hasn’t been updated, saving time and resources.

Gary Illyes’s post stressed that better caching practices benefit everyone. For Google, it means fewer redundant fetches. For website owners, it means less strain on servers. The result is a smoother, faster user experience and a more sustainable system overall.

A Decade-Long Issue

Google and Bing have supported ETag headers for years—at least since 2018. However, despite the advantages, many websites still fail to implement caching properly. The reasons vary, from lack of technical knowledge to developers focusing on other priorities.

This renewed push from Google signals a growing urgency. As the volume of online content grows, the inefficiency of repetitive crawling becomes harder to ignore. Improving caching isn’t just a technical improvement; it’s a necessity for scaling the modern web.

Implications for Website Owners

This update isn’t just for developers—it’s for everyone managing a website. Here’s what you should take away:

  • Use ETag: It’s Google’s preferred method for handling caching and ensures seamless communication with its crawlers.
  • Check Your Cache Headers: Audit your website to see whether caching directives like ETag are properly implemented.
  • Understand HTTP Protocols: HTTP/2 can help save server resources, but it’s not mandatory. Ensure your server supports what’s best for your needs.
  • Don’t Block Caching: Unless you have specific concerns, allow caching to reduce unnecessary fetches.
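A cache-header audit can start very small: fetch a few representative URLs and check which caching headers the responses carry. The helper below is an illustrative sketch (the function name and findings wording are my own); it takes a plain dict of response headers, as most HTTP clients expose:

```python
def audit_cache_headers(headers: dict) -> list:
    """Report missing caching-related response headers.
    `headers` maps header names to values; matching is
    done case-insensitively, as HTTP header names are."""
    present = {name.lower() for name in headers}
    findings = []
    if "etag" not in present and "last-modified" not in present:
        findings.append("no validator: add an ETag (preferred) or Last-Modified")
    if "cache-control" not in present:
        findings.append("no Cache-Control: freshness is left for clients to guess")
    return findings

# A response with only Last-Modified still has a validator,
# but carries no explicit freshness policy.
issues = audit_cache_headers({"Last-Modified": "Tue, 14 Nov 2023 22:13:20 GMT"})
```

An empty result means both a validator and an explicit caching policy are present; anything else is a concrete item for your to-do list.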

What to Expect in the Future

This update is likely part of Google’s larger strategy to encourage better web practices. As more websites adopt efficient caching systems, search engines will be able to allocate their resources more effectively. This could lead to faster discovery of new content and less strain on both servers and bots.

Website owners can expect Google to continue providing detailed guidance on technical topics like caching. Over time, this may even evolve into more automated tools or features to help developers implement best practices easily.

Key Takeaways

  • Google’s crawler documentation now includes a dedicated section on HTTP caching, urging websites to improve their setups.
  • The use of ETag headers is strongly recommended, as it helps Googlebot efficiently determine whether content has changed.
  • HTTP/2 can reduce server workload but doesn’t affect search rankings directly.
  • Allowing caching lowers server strain, improves crawler efficiency, and benefits the broader web ecosystem.
  • Website owners should audit their caching settings and align with Google’s recommendations for better results.

Dileep Thekkethil

Dileep Thekkethil is the Director of Marketing at Stan Ventures and an SEMRush certified SEO expert. With over a decade of experience in digital marketing, Dileep has played a pivotal role in helping global brands and agencies enhance their online visibility. His work has been featured in leading industry platforms such as MarketingProfs, Search Engine Roundtable, and CMSWire, and his expert insights have been cited in Google Videos. Known for turning complex SEO strategies into actionable solutions, Dileep continues to be a trusted authority in the SEO community, sharing knowledge that drives meaningful results.
