
SEO Trust Signals Are Broken — Here's What to Do Now

When Google's trust signals misfire, your off-page authority strategy — not just on-page optimisation — becomes your most critical SEO asset.

A fractured star map representing broken SEO authority signals in a digital cosmos
Illustrated by Mikael Venne

Google's authority signals are misfiring. From impostor sites outranking originals to disavow confusion, here's how SEA brands should rethink SEO trust.

Google ranks an impostor website above the real one — despite 18,000 GitHub stars, documented press coverage, and correctly implemented structured data. That’s not a fringe glitch. That’s a stress test on everything we thought we understood about how search engines assign authority.

The NanoClaw case, reported by Search Engine Journal, is the kind of story that should make every SEO strategist sit up straight. The creator of a legitimate open-source tool watched a fake site — offering nothing original — outrank his own project. His structured data was in place. His backlink profile was clean. His community engagement was real and documented. And yet Google’s ranking logic didn’t care.

This is the trust signal problem. And it’s getting more consequential as AI-powered search surfaces increasingly depend on those same signals to decide what gets cited in answer engines and AI overviews.

When Google’s Authority Model Gets the Maths Wrong

The NanoClaw situation exposes a structural weakness in how Google evaluates authenticity at scale. The search engine’s systems are designed to interpret signals — links, engagement, structured data, entity associations — but those signals can be manufactured or gamed faster than Google’s classifiers can adapt.

For SEA brands, this isn’t abstract. Markets like Vietnam, Indonesia, and the Philippines have seen significant growth in low-quality affiliate and clone sites that systematically mimic legitimate e-commerce and SaaS brands. A local fintech or healthtech brand in Jakarta or Ho Chi Minh City can find a scraper site outranking it on its own branded terms, precisely because the impostor has optimised aggressively while the original brand assumed authenticity would speak for itself.

The strategic implication: authenticity is not self-evident to an algorithm. You have to make it legible. That means building entity associations across third-party platforms — Crunchbase, LinkedIn, industry directories, local press — so the knowledge graph has corroborating data points that an impostor can’t easily replicate.
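One concrete way to make those entity associations machine-readable is schema.org’s `sameAs` property on your Organization markup, which points the knowledge graph at the third-party profiles that corroborate your brand. A minimal sketch, with placeholder names and URLs standing in for your real profiles:

```python
import json

# Organization schema with sameAs links that corroborate the brand entity
# across third-party platforms. "Example Fintech" and all URLs below are
# placeholders for illustration, not real profiles.
org_schema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Fintech",
    "url": "https://example.com",
    "sameAs": [
        "https://www.crunchbase.com/organization/example-fintech",
        "https://www.linkedin.com/company/example-fintech",
        "https://github.com/example-fintech",
    ],
}

# Emit as JSON-LD, ready to embed in a <script type="application/ld+json"> tag.
print(json.dumps(org_schema, indent=2))
```

An impostor can copy this markup, but it can’t make Crunchbase, LinkedIn, and GitHub agree with it — the corroboration lives on platforms the impostor doesn’t control.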

Disavow Files: The Tool Nobody Wants to Use (But Sometimes Should)

Google’s John Mueller confirmed this week what most experienced SEOs already suspected: the vast majority of sites don’t need to touch their disavow file. Google’s algorithms are sophisticated enough to discount most toxic or irrelevant links without manual intervention. Search Engine Journal reports Mueller’s position as, essentially: if you’re not sure, and the uncertainty is costing you sleep, go ahead and disavow.

That’s a pragmatic stance, not a technical directive. And it matters for how SEA brands handle link profiles inherited from acquisitions, agency relationships, or legacy campaigns that relied on link networks common five or six years ago.

The real risk isn’t disavowing too much — it’s the false confidence that comes from never auditing at all. In high-growth markets where digital marketing vendor quality varies enormously, inherited link profiles can quietly drag down domain authority without triggering a manual penalty. A quarterly link audit, even a lightweight one, is insurance against the kind of slow-burn trust erosion that doesn’t show up until rankings shift.

Practically: use Google Search Console’s Links report to identify referring domains with no apparent editorial rationale — think unrelated-language directories or thin content farms — and maintain a working disavow file even if you never submit it. The discipline of the audit matters as much as the action.
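That quarterly audit can be largely scripted. A lightweight sketch, assuming you’ve exported referring domains from the Links report to a list of records — the field names, thresholds, and TLD heuristics here are illustrative, not Search Console’s actual export format, and should be tuned to your own market:

```python
# Build a working disavow file from an exported list of referring domains.
# Heuristics below (suspect TLDs, link-volume threshold) are examples only.
SUSPECT_TLDS = (".xyz", ".top", ".icu")
MAX_PLAUSIBLE_LINKS = 500  # more linking pages than this rarely has an editorial rationale

def build_disavow(rows):
    """Return disavow-file lines for domains with no apparent editorial rationale."""
    lines = ["# Working disavow file - reviewed quarterly, not necessarily submitted"]
    for row in rows:
        domain = row["domain"].strip().lower()
        if domain.endswith(SUSPECT_TLDS) or int(row["linking_pages"]) > MAX_PLAUSIBLE_LINKS:
            # Google's disavow syntax: one "domain:" directive per line
            lines.append(f"domain:{domain}")
    return lines

sample = [
    {"domain": "spam-directory.xyz", "linking_pages": "12"},
    {"domain": "legit-news-site.com", "linking_pages": "3"},
    {"domain": "scraper-farm.net", "linking_pages": "2400"},
]
print("\n".join(build_disavow(sample)))
```

The `domain:` directive is Google’s actual disavow syntax; everything flagged should still be reviewed by a human before the file is ever submitted.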


Plugin Vulnerabilities as an SEO Risk Vector

Two critical vulnerabilities in the Seraphinite Accelerator WordPress plugin — affecting up to 60,000 installations — were flagged this week. The security angle is obvious. The SEO angle is less discussed but equally serious.

A compromised site doesn’t just leak user data. It becomes a vehicle for injected links, hidden redirects, and spam content insertion — all of which directly damage the host domain’s trust signals. Google doesn’t distinguish between “we were hacked” and “we published this”. The ranking penalty lands the same way.

For SEA brands running WordPress at scale — still the dominant CMS across the region’s SME and mid-market segment — plugin hygiene is a legitimate SEO risk category. An unpatched plugin on a high-traffic page isn’t just a security exposure; it’s a potential manual action waiting to happen.

The operational fix is straightforward: audit active plugins quarterly, disable anything unmaintained, and treat performance plugins like Seraphinite Accelerator with the same scrutiny you’d apply to a third-party data processor. If the plugin developer’s update cadence is slow, the risk profile is elevated regardless of the plugin’s popularity.
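That quarterly plugin audit is scriptable too. WP-CLI’s `wp plugin list --format=json` reports each plugin’s status and pending updates; a minimal sketch of triaging that output, with the command’s output inlined as sample data rather than captured via subprocess:

```python
import json

# Sample data shaped like `wp plugin list --format=json` output (WP-CLI).
# In practice you'd capture this from the command; plugin names and versions
# here are illustrative.
raw = json.dumps([
    {"name": "seraphinite-accelerator", "status": "active", "update": "available", "version": "2.20"},
    {"name": "hello-dolly", "status": "inactive", "update": "none", "version": "1.7"},
    {"name": "abandoned-widget", "status": "active", "update": "none", "version": "0.9"},
])

plugins = json.loads(raw)

# Flag anything active with a pending update: unpatched code on a live site
# is exactly the SEO exposure described above. Inactive plugins and plugins
# with no available update need a separate maintenance-cadence check.
flagged = [p["name"] for p in plugins
           if p["status"] == "active" and p["update"] == "available"]
print("Needs patching:", flagged)
```

Run on a schedule, a report like this turns plugin hygiene from an ad-hoc chore into an auditable control.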

What This Means for AEO and AI-Cited Authority

Here’s where it gets structurally interesting. Answer Engine Optimisation — the discipline of positioning your content to be cited by AI overviews, Perplexity, ChatGPT, and similar systems — depends almost entirely on the same trust signals that traditional SEO uses. Entity clarity, backlink quality, structured data, and third-party corroboration are how machines decide what’s credible enough to surface in a generated answer.

If those signals are broken or gameable at the SERP level — as the NanoClaw case demonstrates — then they’re equally vulnerable at the AEO layer. An impostor site that outranks the original in Google Search could, in theory, also get cited in an AI overview. That’s a brand integrity problem with a much longer reach than a ranking slip.

The practical response for brands investing in GEO (Generative Engine Optimisation) is to build citation-grade authority: content that is explicitly sourced, attributed, and cross-referenced across multiple credible platforms. Think authored thought leadership on industry publications, not just your own blog. Think Wikipedia-adjacent entity building, not just schema markup. The machines are reading the web’s broader consensus — make sure that consensus reflects your actual authority.
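Schema markup alone won’t build that consensus, but it can make your sourcing explicit to the machines reading it. `author` and `citation` are standard schema.org CreativeWork properties; a minimal Article sketch, with placeholder names and URLs:

```python
import json

# Article markup that makes attribution and sourcing machine-readable.
# Author name and all URLs are placeholders for illustration.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "SEO Trust Signals Are Broken",
    "author": {
        "@type": "Person",
        "name": "Jane Author",
        "url": "https://example.com/about/jane",
    },
    "citation": [
        "https://example.com/primary-source-report",
        "https://example.com/industry-data",
    ],
}
print(json.dumps(article, indent=2))
```

The markup only pays off when the attributed author and cited sources are independently verifiable — which is the whole point of citation-grade authority.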


Key Takeaways

  • Build entity authority across third-party platforms — Crunchbase, LinkedIn, industry directories, local press — so Google’s knowledge graph has impostor-resistant corroboration of your brand’s legitimacy.
  • Treat plugin security as an SEO risk category: a compromised WordPress installation can trigger ranking penalties indistinguishable from a manual spam action.
  • AEO and GEO strategies are only as strong as the underlying trust signals — if your domain authority is fragile, your AI citation potential is equally exposed.

The deeper question the NanoClaw case raises isn’t “how do we fix this specific ranking problem” — it’s whether the signals Google uses to evaluate authority are robust enough for a web increasingly populated by AI-generated content and sophisticated content farms. If an 18,000-star GitHub project with press coverage can lose to a fake, what does that say about the signal quality feeding into the AI overviews that are rapidly replacing traditional SERPs? That’s the question worth sitting with.


Written by

Cosmic Grizzly

Mapping the evolving cosmos of search — from traditional SERP dominance to answer engine optimisation and AI-cited authority. Obsessed with how machines decide what the world deserves to read.
