Google Asks Websites to Embrace Caching for Faster Crawling
By: Zulekha Nishad | Updated On: December 11, 2024
In a notable update, Google has revised its crawler documentation to emphasize the importance of HTTP caching.
The newly added section explains how Google’s crawlers manage cache control headers, urging website owners to adopt caching best practices.
Alongside this, Gary Illyes wrote a Google blog post humorously pleading with website owners to allow caching, starting with the line, “Allow us to cache, pretty please.”
The Problem: Decline in Cacheable Content
According to Illyes, the number of pages allowing cacheable requests has dropped significantly.
Ten years ago, about 0.026% of Google’s fetches were cacheable—a modest figure even then. Today, it’s down to just 0.017%.
This decline makes Google’s crawling process less efficient and increases the workload for both its bots and website servers.
By allowing caching, websites can reduce the number of times Googlebot needs to fully re-fetch unchanged content.
What’s New in Google’s Documentation?
The updated documentation lays out Google’s approach to HTTP caching in clear terms, focusing on two key technologies:
- ETag Headers: These small identifiers help Googlebot determine if the content has changed. Google recommends ETag because it avoids issues associated with date formatting.
- Last-Modified Headers: While supported, these rely on accurate timestamps, which can be prone to errors.
Google’s guidance suggests prioritizing ETag, particularly if a site can only support one of the two mechanisms; a minimal example of how a server might surface these headers follows below.
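To make the mechanics concrete, here is a minimal Python sketch (standard library only) of a server that emits both validators and answers a matching conditional request with 304 Not Modified. The page body, hash-derived ETag, and timestamp are placeholder values for illustration, not anything taken from Google’s documentation.

```python
import hashlib
from email.utils import formatdate
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"<html><body>Hello, crawler</body></html>"
# Strong validator derived from the body; any stable fingerprint works.
ETAG = '"%s"' % hashlib.sha256(PAGE).hexdigest()[:16]
# Fixed example timestamp in the HTTP date format Last-Modified expects.
LAST_MODIFIED = formatdate(1700000000, usegmt=True)

class CachingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # If the client sends back the ETag it saw last time, confirm the
        # content is unchanged and skip re-sending the body.
        if self.headers.get("If-None-Match") == ETAG:
            self.send_response(304)
            self.send_header("ETag", ETAG)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("ETag", ETAG)
        self.send_header("Last-Modified", LAST_MODIFIED)
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CachingHandler).serve_forever()
```

Fetching the page twice with curl, and passing the ETag back via -H 'If-None-Match: "<value>"' on the second request, should produce a 200 followed by a 304.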
Additionally, the documentation addresses how Google handles HTTP protocols.
- HTTP/1.1 and HTTP/2: Googlebot can use both protocols, switching between them based on what’s most efficient. While HTTP/2 reduces computational costs for servers and Googlebot, it doesn’t influence rankings on Google Search.
- Opting Out of HTTP/2 Crawling: Website owners can prevent crawling over HTTP/2 by configuring their servers to respond with an HTTP 421 (Misdirected Request) status code. However, this is intended as a temporary measure and isn’t generally recommended unless absolutely necessary; a rough sketch of the opt-out follows below.
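For illustration only, here is a hypothetical application-level version of that opt-out: an ASGI app that refuses HTTP/2 requests with 421 Misdirected Request while serving HTTP/1.1 normally. In practice this would usually be configured in the web server or load balancer rather than in application code, and the Hypercorn command shown is just one example of an HTTP/2-capable ASGI server.

```python
# Hypothetical sketch of the HTTP/2 opt-out: refuse HTTP/2 requests with
# 421 Misdirected Request, serve HTTP/1.1 requests normally.
async def app(scope, receive, send):
    if scope["type"] != "http":
        return
    if scope.get("http_version") == "2":
        # Signal that this origin doesn't want to be crawled over HTTP/2.
        await send({"type": "http.response.start", "status": 421, "headers": []})
        await send({"type": "http.response.body", "body": b""})
        return
    body = b"Served over HTTP/1.1\n"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain"),
                    (b"content-length", str(len(body)).encode())],
    })
    await send({"type": "http.response.body", "body": body})

# Example run with an HTTP/2-capable ASGI server such as Hypercorn:
#   hypercorn my_module:app --certfile cert.pem --keyfile key.pem
```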
Why Caching Matters
Caching plays a major role in making the Internet faster and more resource-efficient. When a website enables proper caching, it signals to Googlebot when content has changed and when it hasn’t. This avoids unnecessary re-fetching of data that hasn’t been updated, saving time and resources.
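Seen from the fetching side, that exchange looks roughly like the sketch below, which uses the third-party requests library and a placeholder URL. A site with working validators should return the full body once and a body-less 304 on the revalidation.

```python
# Sketch of the revalidation round trip a crawler can make once a site sends
# an ETag. The URL is a placeholder; substitute a page you control.
import requests

url = "https://example.com/some-page"

# First fetch: full download; remember the validator the server returned.
first = requests.get(url, timeout=10)
etag = first.headers.get("ETag")
print(first.status_code, len(first.content), "bytes, ETag:", etag)

# Revalidation: ask whether the cached copy is still current.
if etag:
    second = requests.get(url, headers={"If-None-Match": etag}, timeout=10)
    # 304 means "unchanged": no body is re-sent, saving work on both sides.
    print(second.status_code, len(second.content), "bytes")
```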
Gary Illyes’s post stressed that better caching practices benefit everyone. For Google, it means fewer redundant fetches. For website owners, it means less strain on servers. The result is a smoother, faster user experience and a more sustainable system overall.
A Decade-Long Issue
Google and Bing have supported ETag headers for years, at least since 2018. Despite the advantages, many websites still fail to implement caching properly. The reasons vary, from a lack of technical knowledge to developers focusing on other priorities.
This renewed push from Google signals a growing urgency. As the volume of online content grows, the inefficiency of repetitive crawling becomes harder to ignore. Improving caching isn’t just a technical improvement; it’s a necessity for scaling the modern web.
Implications for Website Owners
This update isn’t just for developers—it’s for everyone managing a website. Here’s what you should take away:
- Use ETag: It’s Google’s preferred method for handling caching and ensures seamless communication with its crawlers.
- Check Your Cache Headers: Perform an audit of your website to see if caching directives like ETag are properly implemented (a quick starting-point sketch follows this list).
- Understand HTTP Protocols: HTTP/2 can help save server resources, but it’s not mandatory. Ensure your server supports what’s best for your needs.
- Don’t Block Caching: Unless you have specific concerns, allow caching to reduce unnecessary fetches.
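As a starting point for that audit, a rough sketch like the following simply reports whether each page exposes ETag, Last-Modified, and Cache-Control headers. The URLs are placeholders, and the requests library is assumed to be available.

```python
# Rough audit sketch: report whether a handful of URLs expose the caching
# headers discussed above.
import requests

urls = [
    "https://example.com/",
    "https://example.com/blog/",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    for header in ("ETag", "Last-Modified", "Cache-Control"):
        value = resp.headers.get(header)
        print(f"{url}  {header}: {value or 'missing'}")
```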
What to Expect in the Future
This update is likely part of Google’s larger strategy to encourage better web practices. As more websites adopt efficient caching systems, search engines will be able to allocate their resources more effectively. This could lead to faster discovery of new content and less strain on both servers and bots.
Website owners can expect Google to continue providing detailed guidance on technical topics like caching. Over time, this may even evolve into more automated tools or features to help developers implement best practices easily.
Key Takeaways
- Google’s crawler documentation now includes a dedicated section on HTTP caching, urging websites to improve their setups.
- The use of ETag headers is strongly recommended, as it helps Googlebot efficiently determine whether content has changed.
- HTTP/2 can reduce server workload but doesn’t affect search rankings directly.
- Allowing caching lowers server strain, improves crawler efficiency, and benefits the broader web ecosystem.
- Website owners should audit their caching settings and align with Google’s recommendations for better results.