In late May 2024, an anonymous source leaked over 2,500 pages of internal Google API documentation, providing an unprecedented look under the hood at how the search engine’s ranking algorithms actually work. The leak has sent shockwaves through the SEO industry, as many of the revelations directly contradict Google’s public statements and guidance over the years.
While Google has confirmed the leaked documents are authentic, the company has cautioned against “making inaccurate assumptions” based on “outdated or incomplete information.” Even so, the leak provides some of the most revealing insights into Google’s ranking systems that SEOs have ever seen.
So what are the major takeaways and implications for how SEOs should be approaching their work going forward? Let’s dive into some of the key learnings from the API leak.
User Engagement Signals Are Paramount
One of the most striking revelations is just how heavily Google’s ranking algorithms lean on user engagement signals like click-through rates, dwell time, bounces, and pogo-sticking behavior.[1] This contradicts years of public statements in which Google downplayed the importance of such signals.
The leaked docs reference systems like NavBoost and BrowseRank that analyze user interactions and clicks to score pages and domains on their ability to satisfy search intent.[2] Pages and sites that get more clicks from search results and keep users engaged for longer are prioritized over those with higher bounce rates.
For SEOs, this means that optimizing for engagement metrics like click-through rate and dwell time needs to be a top priority alongside more traditional content optimization. It also underscores the importance of brand authority, as users are more likely to click on and trust prominent, authoritative brands in search results.
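One practical way to act on this is to monitor how your click-through rates compare against what your ranking positions would normally earn. Below is a minimal sketch, assuming a Google Search Console performance export saved as a CSV; the file name, column names, and the benchmark CTR figures are all illustrative assumptions, not Google-published values.

```python
# Flag queries whose click-through rate lags behind what their average
# position would typically earn -- candidates for title/snippet rework.
# Assumes a Search Console performance export "search_console_queries.csv"
# with columns: query, clicks, impressions, ctr, position (assumed names).
import pandas as pd

# Rough, illustrative CTR benchmarks by average ranking position (assumptions).
EXPECTED_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.02}

def expected_ctr(position: float) -> float:
    """Look up a benchmark CTR for the rounded ranking position."""
    return EXPECTED_CTR.get(min(round(position), 10), 0.01)

df = pd.read_csv("search_console_queries.csv")
if df["ctr"].dtype == object:  # handle "5.2%"-style values from the export
    df["ctr"] = df["ctr"].str.rstrip("%").astype(float) / 100

df["expected_ctr"] = df["position"].apply(expected_ctr)
df["ctr_gap"] = df["ctr"] - df["expected_ctr"]

# Queries with plenty of impressions but a CTR well below the benchmark.
underperformers = df[(df["impressions"] > 1000) & (df["ctr_gap"] < -0.02)]
print(underperformers.sort_values("ctr_gap")[["query", "position", "ctr", "expected_ctr"]])
```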
Domain Age and Link Velocity Matter
Contrary to Google’s public stance, the API docs show that domain age is considered a ranking factor, with older domains having an advantage over newer ones.[1] There are also references to a “Google Sandbox” where new domains are heavily filtered until they can be assessed for quality and trustworthiness over time.[3]
Additionally, the docs reveal that Google looks closely at the velocity and patterns around how a site acquires new backlinks, penalizing unnatural link growth.[2] This means SEOs need to focus on acquiring backlinks steadily over time rather than through aggressive link building campaigns.
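If you want to sanity-check your own link profile, a simple month-over-month view of new backlinks makes spikes easy to spot. Here is a small sketch assuming a backlink export from a third-party tool; the file name and column names are assumptions.

```python
# Chart monthly new-backlink counts to spot unnatural spikes in link velocity.
# Assumes a backlink export saved as "backlinks.csv" with columns:
# source_url, first_seen (ISO dates) -- names are assumptions.
import pandas as pd

df = pd.read_csv("backlinks.csv", parse_dates=["first_seen"])
monthly = df.set_index("first_seen").resample("MS").size().rename("new_links")

# Flag any month whose link growth is more than 3x the trailing 6-month median.
baseline = monthly.rolling(6, min_periods=3).median().shift(1)
spikes = monthly[monthly > 3 * baseline.fillna(monthly.median())]

print(monthly.tail(12))
print("Possible unnatural spikes:")
print(spikes)
```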
Subdomains Treated as Separate Entities
Another revelation that contradicts Google’s public statements is that subdomains are processed as completely separate entities from root domains, rather than as simple extensions of the main site.[1] This means any subdomain properties need to build up their own authority signals independently.
For companies with separate subdomains for different sections like blogs or product areas, this is a crucial consideration. The authority of the root domain does not necessarily extend to subdomains, so each one needs its own optimization efforts.
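When reporting, it helps to segment URLs by subdomain so each property’s metrics are tracked on their own. The sketch below uses the tldextract library for host parsing; the input file name and column name are assumptions.

```python
# Group URLs by subdomain so each property's metrics are tracked separately,
# since the leak suggests subdomains may accrue authority independently.
# Assumes a crawl or log export "urls.csv" with a "url" column (assumed name).
import pandas as pd
import tldextract  # pip install tldextract

def host_key(url: str) -> str:
    """Return a 'blog.example.com'-style host, treating each subdomain as its own entity."""
    parts = tldextract.extract(url)
    sub = f"{parts.subdomain}." if parts.subdomain else ""
    return f"{sub}{parts.domain}.{parts.suffix}"

df = pd.read_csv("urls.csv")
df["property"] = df["url"].apply(host_key)
print(df.groupby("property").size().sort_values(ascending=False))
```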
Content Quality Signals Are Opaque
While the API docs go into extensive detail on many of Google’s ranking systems, there is surprisingly little insight provided into how the search engine evaluates content quality and topical relevance.[4] This has been a core focus area for SEOs optimizing content based on Google’s public guidance.
The lack of clarity around on-page content scoring is likely because these systems are based on more advanced machine learning models that are opaque even within Google. It suggests SEOs should be cautious about following overly prescriptive content guidelines and instead focus holistically on creating genuinely high-quality content that keeps users engaged.
E-E-A-T May Be Overemphasized
Google’s E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) guidelines for content have been a core focus for many SEOs, especially in highly regulated industries like finance and health. However, the API docs make little to no mention of E-E-A-T as a distinct ranking factor.[5]
While creating trustworthy, high-quality content from experts is still important, E-E-A-T may not be as formalized and weighted within Google’s algorithms as previously thought. This suggests SEOs should be careful about potentially over-optimizing for E-E-A-T at the expense of other areas like engagement.
Technical SEO Fundamentals Remain Critical
One area that does get extensive coverage in the API docs is the importance of technical SEO best practices related to site architecture, content structure, internal linking, and other core web fundamentals.[2] This aligns with Google’s public guidance around following technical SEO standards.
The leak reinforces that while Google’s ranking systems have evolved significantly, getting the basic technical foundations of a website right is still a gating factor for achieving visibility. SEOs should continue prioritizing audits and optimizations in areas like site speed, mobile-friendliness, structured data implementation, and Core Web Vitals.
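For ongoing monitoring, Google’s public PageSpeed Insights API (v5) exposes both lab scores and Core Web Vitals field data. The sketch below shows one way to query it; the page URL is a placeholder, field data only appears when enough Chrome UX Report samples exist, and an API key may be needed for higher quotas.

```python
# Pull lab and field performance data for a URL via the public
# PageSpeed Insights API (v5). "https://example.com/" is a placeholder.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_core_web_vitals(page_url: str, strategy: str = "mobile") -> dict:
    """Return the Lighthouse performance score plus CrUX field percentiles, if available."""
    resp = requests.get(PSI_ENDPOINT, params={"url": page_url, "strategy": strategy})
    resp.raise_for_status()
    data = resp.json()
    field = data.get("loadingExperience", {}).get("metrics", {})
    lab_score = data["lighthouseResult"]["categories"]["performance"]["score"]
    return {
        "lab_performance_score": lab_score,
        "lcp_ms": field.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
        "cls": field.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile"),
        "inp_ms": field.get("INTERACTION_TO_NEXT_PAINT", {}).get("percentile"),
    }

print(fetch_core_web_vitals("https://example.com/"))
```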
Key Takeaways for SEOs
So what are the key takeaways SEOs should focus on in light of this leak? Here are some of the major areas of emphasis:
- Optimize for engagement signals like click-through rates and dwell time, not just traditional on-page factors.
- Build a strong brand identity and authority over time to increase likelihood of clicks and engagement.
- Acquire backlinks steadily through sustainable white hat tactics, avoiding any spikes or unnatural patterns.
- Treat any subdomain properties as separate entities that need to build their own authority.
- Focus on creating high-quality, engaging content, but don’t overly optimize for E-E-A-T.
- Continue prioritizing technical SEO fundamentals as a baseline requirement.
While this leak provides some of the most revealing insights into Google’s ranking systems to date, it’s important to remember that algorithms are constantly evolving. SEOs should view this as guidance rather than a static playbook.
The core principles of creating high-quality content that satisfies user intent while following technical best practices remain paramount. But this leak also underscores the need to prioritize brand building, engagement metrics, and a more holistic approach beyond traditional on-page optimization.
Citations:
[1] https://www.blueswitch.com/blog/the-google-api-leak-what-happened-and-whats-next-for-seo
[2] https://www.phocuswire.com/google-algorithm-leak-what-it-means-for-travel-SEO
[3] https://sparktoro.com/blog/an-anonymous-source-shared-thousands-of-leaked-google-search-api-documents-with-me-everyone-in-seo-should-see-them/
[4] https://sparktoro.com/blog/11-min-video-the-google-api-leak-should-change-how-marketers-and-publishers-do-seo/
[5] https://www.receptional.com/articles/googles-api-leak/