Google has added a new section on “read more” deep links to its official snippet documentation.
Most SEOs scrolled past it.
They should not have.
This is a small but precise update that directly affects how your pages appear in search results — and whether Google can link users straight to the most relevant section of your content.
Here is everything you need to know.
Here is what Google recommends to increase the odds that a “Read more” deep link is shown within the snippet.
Google has updated its meta description documentation, adding a section with 3 keys to increase the likelihood that… pic.twitter.com/PW3m0MPHSb
— Juan González Villa (@seostratega) April 23, 2026
What Is a “Read More” Deep Link?
You have probably seen them in search results without realizing what they are.
When a snippet appears in Google Search, it sometimes includes a “Read more” link at the end.
When you click it, it either expands the snippet or takes you directly to the part of the source page where the content is found.
That jump link is connected to a specific section inside the page, not just the page itself.
It is Google anchoring a user directly to the most relevant paragraph, heading, or block of text — without them having to scroll, search, or hunt for it.
This is a documentation clarification, not a new SERP feature. Read more deep links have appeared in search for some time. What is new is the written guidance on how to increase the likelihood of one appearing on your pages.
That guidance arrived on April 20, 2026, and it contains three specific things Google says you need to get right.
Rule One: Your Content Must Be Immediately Visible
This sounds obvious but it catches a huge number of sites.
Content must be immediately visible to a human on page load. Content hidden behind expandable sections or tabbed interfaces can reduce the likelihood of a deep link appearing, per Google’s guidance.
Think about how many sites use accordion FAQs, expandable sections, tabbed product information, or collapsible content blocks.
All of that content stays hidden until a user clicks, and hidden content is less likely to earn a deep link.
If you want Google to link directly to a section of your page, that section needs to exist in the rendered page from the first moment someone lands on it.
Not behind a toggle. Not under a tab. Not collapsed by default.
If the content is hidden, Google cannot point to it.
This is particularly important for FAQ sections, step-by-step guides, comparison tables, and any other content type that commonly gets folded into accordion UI patterns for aesthetic reasons.
The tradeoff between a cleaner design and a deep-linkable page just became a more important SEO conversation to have with your clients.
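As a quick illustration, a content audit can flag the usual collapsed-by-default patterns before worrying about anything else. The sketch below is a hypothetical helper that works on plain element descriptors; the checks mirror common HTML patterns (the `hidden` attribute, a closed `<details>`, inline `display:none`) and are illustrative assumptions, not an official Google checklist.

```javascript
// Hypothetical audit helper: flags elements that match common
// collapsed-by-default patterns. Illustrative only, not an official check.
function isCollapsedByDefault(el) {
  if (el.hidden) return true;                          // <div hidden>
  if (el.tag === 'details' && !el.open) return true;   // <details> without open
  const style = (el.style || '').replace(/\s/g, '');
  if (style.includes('display:none')) return true;     // inline-styled away
  return false;
}

// Example: a closed accordion panel vs. content visible on load.
isCollapsedByDefault({ tag: 'details', open: false });            // true
isCollapsedByDefault({ tag: 'section', style: 'display: none' }); // true
isCollapsedByDefault({ tag: 'p' });                               // false
```

In a real audit you would feed this from a crawl of the rendered DOM; the point is simply that anything matching these patterns is content Google cannot anchor a “Read more” link to.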
Rule Two: Stop Forcing Users Back to the Top
Avoid using JavaScript to control the user’s scroll position on page load. One example Google gives is forcing the user’s scroll to the top of the page.
This is a technical issue that affects more sites than most teams realize.
The way deep linking works is that Google appends a hash fragment to your URL, for example yourdomain.com/article#section-three, and the browser uses that fragment to scroll the user directly to that section on load.
If your site runs JavaScript on page load that overrides the browser’s scroll behavior and pushes the user back to the top of the page, the deep link breaks the moment the page loads.
The user lands at the top. The anchor is ignored. The experience Google tried to create disappears entirely.
This is a common pattern in single-page applications, heavily customized themes, and sites that use scroll animation libraries that reset scroll position.
It is worth a technical audit specifically looking at what happens to your scroll position when a URL includes a hash fragment.
If your site ignores it, your pages are silently disqualifying themselves from deep link eligibility.
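One low-risk fix, assuming your load script currently resets scroll unconditionally, is to guard the reset behind a hash check so a deep-linked visit is left alone. This is a sketch with a hypothetical helper name, not a browser API:

```javascript
// Sketch: only reset scroll on load when the URL carries no fragment.
function shouldResetScroll(hash) {
  // A non-empty hash means the browser is trying to honor a deep link
  // (an #anchor or a #:~:text= fragment), so leave the scroll alone.
  return hash === '' || hash === '#';
}

// In the page's load handler (browser-only; shown for context):
function onPageLoad() {
  if (shouldResetScroll(window.location.hash)) {
    window.scrollTo(0, 0); // safe: no incoming deep link to break
  }
}
```

The same guard applies to scroll-animation libraries: initialize them only after confirming there is no fragment to honor.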
Rule Three: Do Not Modify the URL Hash on Load
If you make history API calls or window.location.hash modifications on page load, make sure you do not remove the hash fragment from the URL, as this breaks deep linking behavior.
This is the most technical of the three points but it is also the one that most commonly affects modern JavaScript-heavy websites.
Here is what happens.
Google creates a deep link that includes a hash fragment pointing to a specific section of text, something like #:~:text=your%20specific%20content.
When the user clicks the link, the browser loads your page and the hash fragment is in the URL.
But if your site uses history API calls or window.location.hash modifications on initial load, it may strip or rewrite that fragment before the browser has a chance to use it.
The result is a page that loads cleanly but never scrolls to the section the link pointed at.
Google’s deep link effectively does not work.
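The fragment format above is just percent-encoded text after a `#:~:text=` prefix. A minimal encoder makes the anatomy concrete (`textFragment` is an illustrative name, not a standard API):

```javascript
// Build a text fragment of the kind used for "Read more" deep links.
// encodeURIComponent handles the percent-encoding (space -> %20).
function textFragment(text) {
  return '#:~:text=' + encodeURIComponent(text);
}

textFragment('your specific content'); // '#:~:text=your%20specific%20content'
```

If your load-time scripts strip or rewrite anything after the `#`, this is exactly the part of the URL that gets lost.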
For teams building on React, Vue, Angular, or any SPA framework that manages routing client-side, this is a configuration issue worth reviewing immediately.
The fix is not complex — it is a matter of ensuring that your page load sequence respects incoming hash fragments rather than overwriting them — but it requires a developer who understands how your routing and history management is set up.
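A minimal sketch of that fix, assuming the load-time rewrite is something like stripping tracking parameters: rebuild the new URL from the incoming one so the fragment rides along. `cleanUrlKeepHash` is a hypothetical helper, and the `history.replaceState` call is shown only as the typical consumer.

```javascript
// Rewrite a URL on load (here: dropping query params, as an example)
// while keeping the incoming hash fragment intact.
function cleanUrlKeepHash(href) {
  const url = new URL(href);
  url.search = '';                // the illustrative rewrite
  return url.pathname + url.hash; // the fragment survives the rewrite
}

cleanUrlKeepHash('https://example.com/article?utm_source=x#section-three');
// -> '/article#section-three'

// Browser-only usage (shown for context):
// history.replaceState(null, '', cleanUrlKeepHash(window.location.href));
```

The design point is that the fragment is read from the original URL and re-attached explicitly, rather than trusting the router or cleanup script to leave it alone.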
Why This Matters More Than It Looks
Deep links change how users interact with your content in search results.
Instead of clicking through to a page and then finding what they need, users land directly at the relevant section.
That is a fundamentally better experience, and better experiences tend to produce stronger engagement signals.
The CTR implications are real, too: a snippet that links directly to the answer a user is looking for is more compelling than one that asks the user to trust that the page will have what they need.
Developers working on JavaScript-heavy sites should test how their pages handle scroll position and hash fragments on initial load.
For agencies managing clients at scale, this update is worth a structured audit pass across high-value pages — particularly long-form content, pillar pages, FAQ pages, and any page that uses heavy JavaScript for UI interactions.
The three issues Google has flagged are not obscure edge cases.
They are patterns that exist across a significant portion of modern websites, and until now there was no official documentation making the cost of those patterns explicit.
Now there is.
Understanding technical SEO fundamentals and how page architecture directly affects search visibility is exactly what separates agencies that react to updates from agencies that get ahead of them.
The documentation changed on April 20.
Your competitors’ pages likely have the same issues yours do.
The question is which team audits first.
Deepan Paul
Author
Deepan Paul is an SEO Lead with four years of experience helping brands recover, scale, and sustain organic growth across global B2B, B2C, and D2C markets. He is recognized as a ranking revival expert, specializing in diagnosing traffic drops, fixing indexing and technical issues, and restoring lost search visibility. He has managed international clients and led cross-functional teams, aligning SEO strategies with core business goals. His expertise spans technical SEO, content strategy, indexing optimization, and building scalable growth systems that adapt to constant algorithm changes. Beyond execution, Deepan is also an SEO trainer and guest speaker, mentoring professionals and contributing insights to leading digital marketing publications. His approach is focused on sustainable, system-driven SEO that delivers long-term results rather than short-term gains.