
AI Search Referrals Are Rising: What It Means for Local SEO

Optimise for AI citation now — Gemini's rising referral share means structured, authoritative local content gets traffic that keyword-stuffed pages won't.

Ink illustration of a figure navigating a map being redrawn beneath their feet
Illustrated by Mikael Venne

Google Gemini now outpaces Perplexity on referral traffic. Here's what shifting AI search behaviour means for local and hyperlocal SEO strategy in Southeast Asia.

The local search playbook is being rewritten — not by a single algorithm update, but by a structural shift in how people find answers. And the numbers are starting to confirm what many of us suspected: AI-generated responses aren’t just replacing clicks, they’re redirecting them.

Gemini Is Now Sending Real Traffic — And It’s Pulling Ahead

According to SE Ranking data cited by Search Engine Journal, Google Gemini more than doubled its referral traffic to external websites in just two months, while ChatGPT declined from its peak. Perplexity, despite its reputation as the search-alternative darling of tech circles, now sits behind Gemini in actual traffic delivery.

For local and hyperlocal search, this matters more than the headline suggests. Gemini’s architecture is deeply integrated with Google’s Knowledge Graph and Maps data — which means your Google Business Profile, your schema markup, and your local citation consistency aren’t just ranking signals anymore. They’re the source material that Gemini pulls from when it answers “best ramen near Asoke at midnight” or “dermatologist in Kemang who speaks English.”

If your local data is patchy, Gemini either ignores you or — worse — hallucinates something about you that you’d rather it hadn’t.
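To make "source material" concrete: the kind of structured data Gemini can draw on is schema.org LocalBusiness markup. Here is a minimal sketch of that JSON-LD, generated in Python; the business details are placeholder values for illustration, not a real listing.

```python
import json

def local_business_jsonld(name, street, locality, region, postal_code,
                          country, phone, opening_hours):
    """Build a minimal schema.org LocalBusiness JSON-LD payload.

    Field names follow the schema.org LocalBusiness / PostalAddress
    vocabulary; the values passed in below are placeholders.
    """
    return {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": locality,
            "addressRegion": region,
            "postalCode": postal_code,
            "addressCountry": country,
        },
        "telephone": phone,
        "openingHours": opening_hours,
    }

markup = local_business_jsonld(
    name="Example Ramen Bar",        # placeholder business
    street="123 Sukhumvit Road",
    locality="Bangkok",
    region="Bangkok",
    postal_code="10110",
    country="TH",
    phone="+66-2-000-0000",
    opening_hours="Mo-Su 18:00-02:00",
)
print(json.dumps(markup, indent=2))
```

The point of generating this programmatically rather than hand-editing it per page: every location page gets identical field structure, so the only thing that varies is the data itself, which is exactly the consistency AI systems reward.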

TurboQuant Changes the Speed of the Game

Marie Haynes covered Google’s TurboQuant breakthrough for Search Engine Journal, and the implications are significant. TurboQuant is a quantisation technique that allows large language models to run faster and more efficiently without meaningful loss in semantic accuracy. Applied to search indexing, it could enable near real-time semantic understanding of content — which collapses the lag between publishing and being understood by Google’s systems.

For local businesses, this is meaningful in a very specific way: time-sensitive content — a Songkran promotion, a flash sale during Harbolnas, a new menu launching this weekend — could be semantically indexed and surfaced in AI responses far faster than current crawl cycles allow.

But speed cuts both ways. If TurboQuant accelerates semantic indexing, it also accelerates the detection of thin, repetitive, or inconsistent content. Local landing pages that exist purely to rank for “[suburb] + [service]” without genuine informational depth become more exposed, not less.


Crawl Budget Isn’t Just a Technical Problem — It’s a Local Strategy Problem

Google’s Gary Illyes recently published a detailed explanation of how Googlebot operates as one client within a centralised crawling platform, including new specifics on byte-level crawl limits. Search Engine Journal covered the announcement, and while it reads as infrastructure documentation, there’s a strategic signal buried in it.

Googlebot’s crawl allocation is finite and shared across millions of sites. For multi-location brands — think a QSR chain with 80 outlets across the Philippines, or a co-working operator spanning three Thai cities — this means your local landing page architecture directly competes with itself for crawl attention. Duplicate structures, inconsistent NAP (name, address, phone) formatting across location pages, and bloated boilerplate content all eat into the crawl budget that should be spent on your highest-value local signals.
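A first-pass NAP consistency check is easy to script. The sketch below uses hypothetical field names and sample records: it normalises name, address, and phone before comparing two sources (say, a website location page and the matching GBP listing) for the same outlet.

```python
import re

def normalise_nap(record):
    """Normalise a name/address/phone record for comparison.

    Hypothetical helper: collapses case and whitespace in name and
    address, and strips everything but digits from the phone number.
    """
    name = " ".join(record["name"].lower().split())
    address = " ".join(record["address"].lower().replace(",", " ").split())
    phone = re.sub(r"\D", "", record["phone"])
    return (name, address, phone)

def find_nap_mismatches(records):
    """Report location ids whose NAP data differs between sources."""
    seen, mismatches = {}, []
    for rec in records:
        key = rec["location_id"]
        norm = normalise_nap(rec)
        if key in seen and seen[key] != norm:
            mismatches.append(key)
        seen.setdefault(key, norm)
    return mismatches

# Sample data: same outlet listed twice, with a phone formatted
# two different ways (with and without the country code).
records = [
    {"location_id": "bkk-01", "name": "Grizzly Coffee",
     "address": "88 Thonglor Soi 10, Bangkok", "phone": "+66 2 111 2222"},
    {"location_id": "bkk-01", "name": "GRIZZLY COFFEE",
     "address": "88 Thonglor Soi 10 Bangkok", "phone": "02-111-2222"},
]
print(find_nap_mismatches(records))  # → ['bkk-01']
```

Note that digit-stripping alone treats "+66 2 …" and "02-…" as different numbers, which is deliberate here: the whole problem is that machines compare strings, not intent.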

The fix isn’t glamorous: audit your location page templates for crawl efficiency, ensure each page carries unique, locally relevant content (operating hours, neighbourhood references, local team bios), and verify that your hreflang and canonical structures aren’t sending Googlebot in circles across your multilingual Southeast Asian markets.
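The hreflang part of that audit boils down to one rule: every alternate a page declares must declare that page back. A minimal sketch of the reciprocity check, assuming the alternates have already been extracted into a dict rather than parsed from a live crawl:

```python
def check_hreflang_reciprocity(pages):
    """Find hreflang alternates that don't link back.

    `pages` maps a URL to its declared hreflang alternates
    ({lang_code: alternate_url}). Returns (page, alternate) pairs
    where the alternate is missing or doesn't point back. The input
    shape is a simplifying assumption; a real audit would parse
    <link rel="alternate" hreflang="..."> tags from crawled HTML.
    """
    problems = []
    for url, alternates in pages.items():
        for lang, alt_url in alternates.items():
            if alt_url == url:
                continue  # self-referencing entry is expected
            back = pages.get(alt_url, {})
            if url not in back.values():
                problems.append((url, alt_url))
    return problems

pages = {
    "https://example.com/th/bangkok/": {
        "th": "https://example.com/th/bangkok/",
        "en": "https://example.com/en/bangkok/",
    },
    "https://example.com/en/bangkok/": {
        "en": "https://example.com/en/bangkok/",
        # missing return link to the Thai page -> gets flagged
    },
}
print(check_hreflang_reciprocity(pages))
```

Non-reciprocal pairs like the one above are exactly how Googlebot ends up re-crawling the same location in multiple languages without ever consolidating the signals.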

Reporting Local Search Performance Honestly — Without Losing the Room

Search Engine Journal published a timely piece on reporting uncertainty without losing credibility, and it maps directly onto a problem every local search practitioner knows: attribution in multi-touch local journeys is messy, and stakeholders want clean numbers.

A customer in Jakarta searches on Google Maps, reads your AI-generated Gemini summary, checks your Tokopedia reviews, then walks in. Which channel gets credit? Practically none of them, under most reporting setups.

The strategic response isn’t to manufacture false precision — it’s to report what you can actually measure (GBP insights, local pack impressions, direction requests, call clicks), contextualise what you can’t (AI referral traffic is underreported because many Gemini interactions don’t pass referrer data cleanly), and set expectations accordingly. Stakeholders who understand the measurement limits make better budget decisions. Stakeholders who’ve been told everything is trackable eventually lose trust when the numbers don’t add up.
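One practical piece of that framework is bucketing referrers consistently, and labelling the unattributable bucket honestly instead of calling it "direct". A sketch, with the referrer hostnames treated as illustrative assumptions rather than a complete or guaranteed list:

```python
from urllib.parse import urlparse

# Illustrative hostname mapping for AI surfaces discussed above;
# not exhaustive, and referrer behaviour varies by product.
AI_REFERRER_HOSTS = {
    "gemini.google.com": "gemini",
    "chatgpt.com": "chatgpt",
    "perplexity.ai": "perplexity",
    "www.perplexity.ai": "perplexity",
}

def classify_referrer(referrer):
    """Bucket a session's referrer string into a reporting channel.

    An empty referrer is labelled 'direct/unattributed' rather than
    'direct': many AI interactions arrive with no referrer at all,
    so this bucket overstates direct and understates AI traffic.
    """
    if not referrer:
        return "direct/unattributed"
    host = urlparse(referrer).netloc.lower()
    if host in AI_REFERRER_HOSTS:
        return "ai:" + AI_REFERRER_HOSTS[host]
    if "google." in host:
        return "google-search"
    return "other:" + host

print(classify_referrer("https://gemini.google.com/app"))  # → ai:gemini
print(classify_referrer(""))  # → direct/unattributed
```

Renaming the bucket is a small change, but it bakes the measurement caveat into every report instead of leaving it to a footnote stakeholders skip.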

Key Takeaways

  • Audit your Google Business Profile and local citation data now — Gemini’s referral growth means AI systems are actively drawing on this data to answer local queries, and inconsistencies get amplified, not smoothed over.
  • Redesign multi-location landing pages for crawl efficiency — with Googlebot operating under byte-level constraints, templated local pages with thin content are a structural liability, not just an SEO inconvenience.
  • Set honest reporting frameworks for AI-influenced local traffic — referral data from Gemini and other AI surfaces is incomplete by design; build measurement approaches that acknowledge this rather than papering over it.

The deeper question for local search strategists is this: as AI systems become the primary interface between intent and business discovery, does proximity still function as a ranking signal — or does it get abstracted into something closer to relevance-weighted authority? How you answer that shapes whether your local SEO investments over the next 18 months are defensive or genuinely differentiating.


At grzzly, we work with brands across Southeast Asia on exactly this intersection: local search infrastructure, AI-era content architecture, and the reporting frameworks that keep internal stakeholders aligned when the measurement landscape is shifting. If your local search strategy was built for 2023, it probably needs a rethink. Let's talk.


Written by

Dusty Grizzly

Deep in the weeds of Google Business Profiles, local pack mechanics, and neighbourhood-level search intent. Believes proximity is a strategy, not a coincidence.

Enjoyed this?
Let's talk.

Start a conversation