
When Google Gets Local SEO Wrong: Lessons from NanoClaw

Build off-platform authority signals — citations, structured data, and real-world mentions — because Google's trust algorithm can be fooled by impersonators.

An impostor website shadow looming over a legitimate business listing in local search results
Illustrated by Mikael Venne

Google ranked a fake site above the real NanoClaw. Here's what this SEO authority failure means for local and hyperlocal search strategies in 2026.

A project with 18,000 GitHub stars, verified press coverage, and correctly implemented structured data — outranked by a fake. That’s not a hypothetical; that’s what happened to the creator of NanoClaw, and it should rattle anyone who thinks technical SEO hygiene alone is enough to establish authority in 2026.

Google’s Authority Model Has a Trust Problem

The NanoClaw case, reported by Search Engine Journal, exposes a quiet but serious flaw in how Google evaluates entity authority. The original creator did everything by the book: structured data markup, GitHub credibility signals, earned media. The impostor site — with none of that provenance — still ranked above the legitimate source.

What this reveals isn’t a bug in the algorithm so much as a gap in how Google resolves entity conflicts when off-domain signals are ambiguous. Google’s systems are trained to recognise patterns of authority, not to authenticate origin. An impostor that mimics those patterns closely enough can, apparently, inherit their ranking benefit.

For local and hyperlocal search, this is acutely relevant. A competitor — or bad actor — setting up a Google Business Profile using your brand name, category, and service area isn’t a fringe scenario in SEA markets. In Indonesia and Vietnam especially, where brand name squatting in digital channels is documented, this is an operational risk that most local SEO strategies don’t account for.

Structured Data Is Necessary, Not Sufficient

The NanoClaw creator had structured data in place. It didn’t save them. That’s the uncomfortable truth this case forces on practitioners who’ve been treating schema markup as a silver bullet for entity disambiguation.

Structured data tells Google what you are. It doesn’t tell Google that you are who you say you are. That second layer of verification depends on corroborating signals across independent, authoritative third-party sources — the kind that an impostor can’t easily replicate.
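To make that distinction concrete, here is a minimal sketch of the kind of LocalBusiness JSON-LD markup the article is talking about. All business details below are hypothetical, used purely for illustration; note that nothing in the markup itself proves ownership, which is exactly the gap the NanoClaw case exposes.

```python
import json

# Hypothetical business details -- illustrative only, not from the article.
business = {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Warung Contoh",
    "url": "https://warung-contoh.example",
    "telephone": "+60-3-0000-0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Jalan Contoh",
        "addressLocality": "Kuala Lumpur",
        "addressCountry": "MY",
    },
    # "sameAs" links to independently controlled profiles are the closest
    # structured data gets to corroboration -- but an impostor can copy
    # this entire block verbatim onto their own domain.
    "sameAs": [
        "https://www.facebook.com/warungcontoh",
        "https://maps.example/warungcontoh",
    ],
}

# Emit the JSON-LD payload that would sit in a <script> tag on the page.
print(json.dumps(business, indent=2))
```

The point of the `sameAs` comment is the whole argument in miniature: every field here is self-declared, so verification has to come from signals the page owner does not control.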

For local businesses, the practical implication is a shift in priority. Citation consistency across platforms like Foursquare, Yelp, and — critically in SEA — Grab Maps, Shopee Food, and Zomato, combined with verified press mentions and authentic review velocity, creates a signal cluster that’s much harder to spoof than a well-tagged webpage. Google’s John Mueller has consistently noted that the disavow tool is rarely necessary for most sites, but the underlying point — that Google’s link and entity evaluation systems are imperfect and sometimes need human correction — applies here too.


Proximity Is a Signal, But Entity Trust Is the Foundation

Local SEO practitioners have long known that proximity to the searcher is one of Google’s three local ranking factors, alongside relevance and prominence. The NanoClaw situation underscores that prominence — the digital equivalent of brand equity — is the most vulnerable of the three to manipulation.

Proximity can’t be faked. Relevance is hard to fake at scale. But prominence, which Google measures through links, mentions, and third-party recognition, can be gamed by a determined impostor who’s willing to build a thin but plausible-looking digital footprint.

The strategic response for local businesses is to treat entity authority as an infrastructure investment, not a campaign tactic. This means: maintaining a standing citation-audit cadence (quarterly at minimum), claiming and actively managing profiles on every platform where customers might search — not just Google — and generating a consistent stream of authentic, geographically anchored content that third parties will reference. A Kuala Lumpur restaurant that gets cited by Time Out KL, featured in a Grab editorial, and reviewed on HungryGoWhere has a corroboration cluster that a clone profile simply cannot replicate overnight.

What AEO and GEO Mean for Local Search in 2026

Answer Engine Optimisation and Generative Engine Optimisation are often discussed in the context of broad informational queries — the kind that AI Overviews and ChatGPT handle. But the same principles apply at the local level, and the NanoClaw case is a preview of what happens when entity signals are weak.

When a user asks Google’s AI-powered search “best co-working space in Bonifacio Global City” or Grab’s in-app search surfaces a neighbourhood food option, the systems drawing those answers are doing entity resolution: matching a query to a known, trusted entity. If your entity signals are thin or contested — because an impostor has muddied the waters — you lose that match.

The fix is the same whether you’re optimising for traditional local pack results or AI-mediated answers: build an unambiguous, multi-platform entity footprint. That means consistent NAP (name, address, phone) data, yes, but also consistent brand voice, imagery, and category signals across every touchpoint where your business appears. The more corroborating sources agree on who you are, the harder it becomes for an algorithm — or an impostor — to displace you.
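A citation audit of the kind described above can be partially automated. The sketch below — with entirely hypothetical platform names and listing data — normalises NAP records to a canonical form and flags pairs of platforms that disagree, which is the inconsistency an impersonator exploits first.

```python
import re
from itertools import combinations

def normalize_nap(name: str, address: str, phone: str) -> tuple:
    """Reduce a NAP record to a comparable canonical form."""
    norm_name = re.sub(r"[^a-z0-9]", "", name.lower())
    norm_addr = re.sub(r"[^a-z0-9]", "", address.lower())
    norm_phone = re.sub(r"\D", "", phone)  # keep digits only
    return (norm_name, norm_addr, norm_phone)

# Hypothetical listings pulled from different platforms.
listings = {
    "google": ("Kopi Corner", "12 Jalan Besar, KL", "+60 3-1234 5678"),
    "grab":   ("Kopi Corner", "12, Jalan Besar, Kuala Lumpur", "0312345678"),
    "yelp":   ("Kopi Korner", "12 Jalan Besar, KL", "+60 3 1234 5678"),
}

def nap_mismatches(listings: dict) -> list:
    """Return (platform_a, platform_b, differing_fields) for every
    pair of platforms whose canonical NAP fields disagree."""
    canon = {p: normalize_nap(*rec) for p, rec in listings.items()}
    fields = ("name", "address", "phone")
    issues = []
    for a, b in combinations(canon, 2):
        diffs = [f for f, x, y in zip(fields, canon[a], canon[b]) if x != y]
        if diffs:
            issues.append((a, b, diffs))
    return issues

for a, b, diffs in nap_mismatches(listings):
    print(f"{a} vs {b}: differs on {', '.join(diffs)}")
```

Note that even this toy normaliser catches the two classic failure modes: a misspelled name ("Korner") and a phone number listed with and without the country code. Real audits would add address abbreviation expansion (KL → Kuala Lumpur) on top of this.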

Key Takeaways

  • Audit your brand’s entity signals quarterly across all major platforms in your market — citation inconsistency is the gap impersonators exploit first.
  • Treat third-party mentions on locally relevant platforms (Grab, Zomato, regional media) as a core link-building strategy, not a nice-to-have.
  • If a competitor or bad actor is contesting your entity in Google’s index, document your authority trail — press coverage, GitHub-equivalent credibility markers, verified profiles — before filing a reconsideration request or spam report.

The NanoClaw incident is an edge case today, but the underlying vulnerability — Google’s inability to reliably authenticate entity origin — will only become more consequential as AI-mediated search surfaces become the primary interface for local discovery. The question worth sitting with: if a well-documented open-source project with 18,000 community signals can be outranked by a fake, what does that mean for your neighbourhood retailer whose entire digital footprint is a Google Business Profile and a Facebook page?


Written by

Dusty Grizzly

Deep in the weeds of Google Business Profiles, local pack mechanics, and neighbourhood-level search intent. Believes proximity is a strategy, not a coincidence.
