
Google's Trust Signals Are Broken — And SEO Pays the Price

When Google's trust signals misfire, structured data and GitHub stars won't save you — proactive authority-building and technical hygiene are your only real defences.

A fractured search engine results page with a fake website ranking above a verified original, representing broken trust signals in SEO
Illustrated by Mikael Venne

From impostor sites outranking originals to plugin vulnerabilities affecting up to 60,000 installations, Google's authority signals are under stress. Here's what SEO teams must do now.

Google has spent a decade telling us its algorithms are getting smarter at identifying genuine authority. This week offered three separate reminders that the system still has significant blind spots — and that SEO teams who trust the machine to sort it out are taking on unacceptable risk.

When Structured Data and 18K GitHub Stars Aren’t Enough

The NanoClaw case is the kind of story that should make every SEO strategist uncomfortable. The creator of NanoClaw — a software project with 18,000 GitHub stars and legitimate press coverage — found his official site outranked by an impostor. The fake domain, with no genuine product behind it, was winning on the SERP despite the real site's correctly implemented structured data. Search Engine Journal reported the creator's frustration publicly, and it's warranted.

The strategic implication here goes beyond one developer’s bad week. It reveals that Google’s entity resolution — its ability to connect a brand, its assets, and its digital presence into a coherent authority signal — still fails under adversarial conditions. For SEA brands operating across multiple country domains, multilingual properties, and fragmented social footprints, this is not an edge case. It’s a live risk. If a well-documented open-source project can be eclipsed by a shell site, a regional brand with inconsistent NAP data and split-language content is genuinely exposed.

The fix isn’t just structured data. It’s a coordinated entity-building strategy: consistent brand mentions across authoritative third-party publications, verified profiles on platforms Google trusts (Google Business Profile, Wikidata, Crunchbase), and cross-linked owned properties that reinforce the canonical source of truth.
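One concrete piece of that strategy is making the canonical source of truth machine-readable. As a minimal sketch — using a hypothetical brand and placeholder profile URLs, not any real organisation — this is the kind of Organization markup with `sameAs` links that cross-references your verified profiles for entity resolution:

```python
import json

# Hypothetical brand details -- substitute your own canonical values.
entity = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "ExampleBrand",
    "url": "https://www.examplebrand.com",
    "sameAs": [
        # Verified third-party profiles that help search engines
        # connect the brand to one coherent entity.
        "https://www.wikidata.org/wiki/Q000000",
        "https://www.crunchbase.com/organization/examplebrand",
        "https://github.com/examplebrand",
    ],
}

# Emit the JSON-LD payload you would embed in a <script> tag on the site.
print(json.dumps(entity, indent=2))
```

The `sameAs` array is doing the entity-building work here: each entry points Google at a profile you control on a platform it already trusts.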

Plugin Vulnerabilities Are an SEO Problem, Not Just a Security Problem

Two newly disclosed vulnerabilities in the Seraphinite Accelerator WordPress plugin affect up to 60,000 installations, according to Search Engine Journal’s reporting. The plugin is widely used for performance optimisation — the kind of tool SEO teams often push for precisely because Core Web Vitals scores affect rankings.

Here’s the part that gets under-discussed in security write-ups: a compromised site doesn’t just lose data. It loses search equity. Injected spam links, hidden redirects, and cloaked content are all known attack vectors once a plugin vulnerability is exploited. Google’s manual actions team does eventually catch these, but the recovery timeline — typically weeks to months — is brutal for any brand where organic is a primary acquisition channel.

For SEO directors at mid-to-large brands in SEA, the governance question is straightforward: who owns plugin update cycles, and is that person incentivised to move fast? In many regional setups, the answer is a web team that sits outside the marketing function entirely. That organisational gap is a ranking risk.

A practical control: maintain a live inventory of all third-party plugins across your web properties, tiered by security exposure and traffic impact. Any plugin touching performance, caching, or rendering should be on a 72-hour patching SLA when critical vulnerabilities drop.
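That control is simple enough to automate. Below is a hedged sketch of the SLA check — the inventory rows, tier names, and dates are all hypothetical; in practice you would pull them from your CMS or a shared tracking sheet:

```python
from datetime import datetime, timedelta

# Hypothetical inventory rows -- in practice, pull from your CMS or audit sheet.
inventory = [
    {"plugin": "perf-cache-tool", "tier": "critical", "cve_disclosed": "2026-02-10"},
    {"plugin": "social-share-widget", "tier": "low", "cve_disclosed": None},
]

# Tiered SLAs: anything touching performance, caching, or rendering
# gets the 72-hour window; low-exposure plugins get two weeks.
SLA_HOURS = {"critical": 72, "low": 336}

def overdue(rows, now):
    """Return plugins whose disclosed vulnerability has exceeded its patch SLA."""
    flagged = []
    for row in rows:
        if row["cve_disclosed"] is None:
            continue  # no open disclosure, nothing to patch
        disclosed = datetime.fromisoformat(row["cve_disclosed"])
        deadline = disclosed + timedelta(hours=SLA_HOURS[row["tier"]])
        if now > deadline:
            flagged.append(row["plugin"])
    return flagged

print(overdue(inventory, datetime(2026, 2, 15)))  # -> ['perf-cache-tool']
```

Running this on a schedule and routing the output to the marketing team, not just IT, is one way to close the organisational gap described above.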


The Disavow File Is Back — Sort Of

Google’s John Mueller clarified this week that while most sites don’t need to submit a disavow file, doing so is a reasonable call if you’re genuinely uncertain about your link profile. This is a notably more permissive framing than Google’s historical guidance, which often discouraged disavow use unless sites had been explicitly hit by a manual action.

Read between the lines: Google is acknowledging that algorithmic link evaluation isn’t perfect, and that human judgment about link quality still has a place in the system. For brands that went through aggressive link-building campaigns between 2021 and 2024 — or inherited link profiles through acquisitions, which is common in SEA’s consolidating e-commerce sector — this is quiet permission to clean house.

The practical calculus is straightforward. If you have a meaningful cluster of links from domains that look like they were built for manipulation (thin content, no real traffic, irrelevant geography), and your organic performance has been soft relative to your content investment, a targeted disavow file is worth the effort. The risk of over-disavowing quality links is real but manageable if you’re working from a crawl-and-audit baseline rather than gut instinct.
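The crawl-and-audit baseline can feed the disavow file directly. As a minimal sketch — the audit rows and domain names are invented, and the filter criteria are deliberately conservative, requiring multiple manipulation signals to coincide — this shows the standard `domain:` line format Google's disavow tool expects:

```python
# Hypothetical audit output -- flags from your crawl-and-audit baseline.
audit = [
    {"domain": "thin-directory.example", "traffic": 0, "relevant_geo": False},
    {"domain": "trusted-news.example", "traffic": 12000, "relevant_geo": True},
]

def build_disavow(rows):
    """Emit Google disavow-file lines for domains that fail the audit criteria."""
    lines = ["# Generated from link audit -- review manually before uploading."]
    for row in rows:
        # Only disavow when multiple manipulation signals coincide,
        # to manage the risk of over-disavowing quality links.
        if row["traffic"] == 0 and not row["relevant_geo"]:
            lines.append(f"domain:{row['domain']}")
    return "\n".join(lines)

print(build_disavow(audit))
```

The point of the explicit criteria is auditability: every disavowed domain can be traced back to the signals that condemned it, rather than to gut instinct.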

What this guidance also signals, more broadly, is that Google’s trust infrastructure still relies partly on site owners doing their own policing. In an era where AI-generated link spam can be produced at industrial scale, that’s a structural vulnerability — and one that places a premium on earning links from sources Google demonstrably trusts rather than accumulating volume.

SEO Authority in 2026 Is a Systems Problem

Take these three stories together and a pattern emerges. Google’s ability to correctly attribute authority — to a brand, a domain, a piece of content — is under more stress than the company’s public communications typically acknowledge. Impostor sites rank. Vulnerabilities compromise equity. Link signals remain gameable enough that Mueller is still fielding disavow questions in 2026.

For SEA brands, the compounding factor is operational complexity. You’re managing multi-market domains, platform-native content on Shopee and Lazada that doesn’t feed back into your owned SEO, and audiences who often discover brands through LINE or TikTok before ever hitting a search bar. Your authority signals are fragmented by design.

The response isn’t panic — it’s precision. Audit your entity footprint. Lock down your plugin governance. And treat your link profile as a living asset that requires periodic maintenance, not a set-and-forget output of past campaigns.

Key Takeaways

  • Build entity authority across structured third-party sources — Google Business Profile, Wikidata, and relevant industry directories — not just on-site structured data, to reduce impostor vulnerability.
  • Treat high-impact WordPress plugins as a ranking risk, not just a security risk; establish patching SLAs that sit within the marketing team’s accountability, not just IT’s.
  • Use Google’s renewed disavow guidance as a prompt to audit inherited or legacy link profiles, particularly if your brand has grown through acquisition or aggressive outreach campaigns in the past three years.

The broader question worth sitting with: as AI-generated content and adversarial SEO tactics scale faster than Google’s detection systems, how much of your search visibility strategy is built on signals the algorithm genuinely validates — and how much is borrowed time?


Written by

Cosmic Grizzly

Mapping the evolving cosmos of search — from traditional SERP dominance to answer engine optimisation and AI-cited authority. Obsessed with how machines decide what the world deserves to read.
