Google ranked a fake site above the real NanoClaw project. Here's what that reveals about SEO authority signals — and how to defend yours.
The creator of NanoClaw — a developer tool with 18,000 GitHub stars and documented press coverage — is being outranked on Google by a fake version of his own site. Not a competitor. An impostor.
If that doesn’t recalibrate how you think about search authority, nothing will.
The NanoClaw Case Exposes a Structural Flaw in How Google Reads Authority
Search Engine Journal reports that despite 18K GitHub stars, organic press mentions, and correctly implemented structured data, Google is ranking a fraudulent site above the legitimate NanoClaw project. The creator did everything the SEO playbook prescribes — and it still wasn’t enough.
This isn’t an edge case. It’s a stress test that reveals where Google’s authority signals break down: they’re largely retrospective and aggregative. The algorithm reads historical link patterns, domain age, and crawl signals. An impostor with a clean domain and freshly acquired links can, apparently, tick enough of those boxes to leapfrog a legitimate creator.
For SEA-based brands operating in markets where domain spoofing and counterfeit digital presences are documented problems — particularly in e-commerce categories across Shopee and Lazada ecosystems — this should be a loud alarm. Brand authority in search is not self-defending.
Disavow Files Won’t Save You — But They’re Still Worth Using
The NanoClaw situation sits in uncomfortable proximity to a separate, quieter signal from Google this week. John Mueller confirmed, as Search Engine Journal covers, that most sites don’t need to submit a disavow file — but if you’re conflicted about a toxic link profile, doing it anyway provides psychological and practical insurance.
Mueller’s framing is honest: Google is generally good at ignoring bad links, but “generally good” isn’t the same as “always right.” For brands in competitive verticals — fintech, travel, health, and retail are the obvious ones in SEA — a disavow file is less about correcting Google’s math and more about eliminating uncertainty from your technical SEO posture.
The strategic read here: disavow is a hygiene practice, not a rescue operation. If you’re waiting until you see ranking drops to audit your backlink profile, you’re already behind. Quarterly link audits with a clear escalation path to disavow are baseline practice for any brand spending seriously on SEO.
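For teams building that escalation path, it helps to know the disavow file itself is deliberately simple: a plain-text file of URLs or `domain:` directives uploaded through Search Console's disavow tool. A minimal sketch, using hypothetical domains:

```text
# Disavow file — one directive per line; lines starting with # are comments.
# Disavow a single page:
http://spam-directory.example/links/page1.html
# Disavow every link from an entire domain (the more common case):
domain:paid-link-farm.example
domain:scraper-network.example
```

The file must be plain UTF-8 or 7-bit ASCII text, it applies per property, and uploading a new file replaces the previous one — which is why the audit should live in version control, not in someone's inbox.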
Infrastructure Vulnerabilities Are an SEO Problem, Not Just a Security One
The Seraphinite Accelerator story — two vulnerabilities affecting up to 60,000 WordPress installations — lands differently when you read it through a search lens rather than a security one.
A compromised site can have its content altered, redirects injected, or spammy pages indexed without the site owner’s knowledge. Google’s crawlers don’t distinguish between content you wrote and content a bad actor inserted. If a vulnerability allows unauthorized content injection, your domain authority becomes the delivery mechanism for someone else’s manipulation.
For marketing teams managing WordPress-based properties in SEA — still the dominant CMS across markets like Indonesia, the Philippines, and Vietnam — plugin hygiene is a search ranking issue. Unpatched vulnerabilities aren’t just an IT department problem. They’re a direct threat to the indexation integrity of your domain. Any site running Seraphinite Accelerator should have patched or replaced it by now; the SEO risk alone justifies the urgency, separate from the security exposure.
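The plugin-hygiene point above can be made operational with a simple version audit. A minimal sketch — the plugin slugs and version numbers below are illustrative placeholders, not real advisory data — that flags any plugin running below its first patched release:

```python
# Sketch: flag WordPress plugins running below a known-patched version.
# Plugin names and version numbers here are hypothetical, not real advisories.

def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '2.20.52' into (2, 20, 52) for comparison."""
    return tuple(int(part) for part in v.split("."))

def audit(installed: dict, patched: dict) -> list:
    """Return (plugin, installed, patched) for every plugin below its patched release."""
    flagged = []
    for plugin, version in installed.items():
        if plugin in patched and parse_version(version) < parse_version(patched[plugin]):
            flagged.append((plugin, version, patched[plugin]))
    return flagged

# Hypothetical site inventory vs. hypothetical first-patched versions
installed = {"seraphinite-accelerator": "2.20.30", "some-seo-plugin": "5.1.0"}
patched = {"seraphinite-accelerator": "2.20.52", "some-seo-plugin": "5.0.0"}

for plugin, have, need in audit(installed, patched):
    print(f"URGENT: {plugin} {have} -> patch to {need} or later")
```

Wiring the `installed` inventory to real data (for example, the output of a plugin listing from your CMS tooling) turns this from a sketch into the quarterly check the previous section argues for.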
What Machines Actually Reward — And How to Build for It
Pull these three stories together and a pattern emerges: Google’s authority model has seams, and those seams are being found and exploited faster than the algorithm is being updated to close them.
The NanoClaw case suggests that entity authority — the machine’s understanding of who you are, not just what links to you — remains underdeveloped in Google’s ranking logic. Structured data helps, but it’s insufficient when an impostor can mirror your schema. The stronger play is building multi-surface authority: consistent entity signals across Google’s Knowledge Graph, Wikipedia presence where warranted, verified social profiles, and co-citations in high-authority publications.
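One concrete way to emit those consistent entity signals is `Organization` structured data whose `sameAs` array ties the domain to surfaces an impostor cannot control. A minimal JSON-LD sketch with placeholder URLs:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Brand",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://github.com/example-brand",
    "https://www.wikidata.org/wiki/Q00000000",
    "https://www.linkedin.com/company/example-brand"
  ]
}
```

The markup itself can be copied, as the NanoClaw case shows — the defensive value lies in the `sameAs` targets being verified profiles that all point back to the legitimate domain.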
In AEO and GEO terms — the disciplines increasingly concerned with how AI systems decide what sources to trust and cite — this matters even more. Large language models trained on web data are learning authority from the same signals Google uses, plus additional ones like semantic consistency and citation patterns across sources. A brand that only ranks in blue links but lacks a coherent entity footprint will struggle as AI-mediated search becomes the primary interface in SEA markets, where ChatGPT and Gemini usage is accelerating alongside traditional search.
The question worth sitting with: if an AI assistant had to explain your brand to a stranger with no prior knowledge, what sources would it actually pull from — and would they all point to you, or to whoever has done the better job of impersonating you?
Written by
Cosmic Grizzly
Mapping the evolving cosmos of search — from traditional SERP dominance to answer engine optimisation and AI-cited authority. Obsessed with how machines decide what the world deserves to read.