Google Gemini is now a real referral source. Here's what AI crawling shifts mean for local and hyperlocal search strategies in Southeast Asia.
Proximity is a strategy, not a coincidence. But right now, the question isn’t just whether your brand shows up near a customer — it’s whether an AI agent can even find you to recommend you.
Three things happened in the last week that, taken together, should reframe how local and hyperlocal search teams think about discoverability in 2026.
Gemini Is Sending Real Traffic — and It’s Growing Fast
SE Ranking data cited by Search Engine Journal shows Google Gemini more than doubled its referral traffic to websites over a two-month period, while ChatGPT declined from its earlier peak. Perplexity, despite the press it attracts, is now trailing Gemini as a referral source.
For local SEO practitioners, this isn’t an abstract AI trend. It’s a new acquisition channel with its own discovery logic. Gemini doesn’t crawl like a human browser. It surfaces answers — and increasingly, local businesses — based on how clearly and completely your content answers a specific intent. A restaurant in Orchard Road or a logistics provider in Kawasan Berikat isn’t going to get recommended by Gemini because of domain authority alone. It’ll get surfaced because its Google Business Profile is complete, its website content answers operational questions clearly, and its structured data gives the AI something concrete to work with.
The strategic implication: treat Gemini as you would a very thorough, very literal customer who reads everything and rewards specificity.
Googlebot Byte Limits Are Not a Technical Footnote
Google’s Gary Illyes published a detailed breakdown of how Googlebot operates as one client within a centralised crawling platform — including, for the first time, transparency around byte-level download limits per page. Search Engine Journal covered the disclosure in detail.
Here’s why this matters for local-heavy sites: if you’re running a multi-location brand across Southeast Asia — say, 80 clinic locations in Thailand or 200 franchise outlets across the Philippines — your crawl budget is a real constraint. Googlebot has limits on how many bytes it processes per page. Bloated pages (heavy hero images, poorly compressed assets, render-blocking scripts) eat into that budget before the content that actually signals local relevance gets parsed.
The tactical fix isn’t glamorous: audit your highest-value local landing pages for page weight. Prioritise text-based signals — NAP consistency, service-area schema, FAQ blocks answering neighbourhood-specific queries — above the fold and in crawlable HTML, not locked inside JavaScript components. In markets like Indonesia and Vietnam where mid-range Android devices dominate, this also directly improves page performance for real users.
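A minimal sketch of that audit, assuming you already have the raw HTML bytes for each landing page. The signal list and the byte threshold here are illustrative assumptions, not Google-published limits; the point is to check that local-relevance signals exist in the raw, crawlable HTML before any JavaScript runs.

```python
# Sketch: flag byte-heavy local landing pages and report which local-relevance
# signals are missing from the raw HTML. Threshold and signal strings are
# illustrative assumptions, not documented Googlebot limits.

def audit_local_page(html: bytes, signals: list[str], max_bytes: int = 2_000_000) -> dict:
    """Return page weight, an over-budget flag, and signals absent from raw HTML."""
    text = html.decode("utf-8", errors="replace")
    return {
        "bytes": len(html),
        "over_budget": len(html) > max_bytes,
        "missing_signals": [s for s in signals if s not in text],
    }

# Hypothetical landing page: schema block present, opening hours missing.
sample = (
    b'<html><body><h1>Clinic Sukhumvit</h1>'
    b'<script type="application/ld+json">{"@type":"MedicalClinic"}</script>'
    b"</body></html>"
)
report = audit_local_page(sample, ["application/ld+json", "Opening hours", "Clinic Sukhumvit"])
```

Run across all location pages, the `missing_signals` list becomes a simple prioritisation queue: fix the pages where crawlable HTML is silent on the signals that matter locally.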
Google’s New AI Agent Signals a Shift in How Search Is Resourced
Roger Montti’s analysis on Search Engine Journal makes a case worth sitting with: Google’s new AI user agent — distinct from standard Googlebot — may represent a strategic pivot, shifting resources away from Project Mariner toward Gemini-driven agentic search. The OpenClaw trend in open-source web agents is part of the backdrop.
What this suggests for local SEO is less about technical compliance and more about intent architecture. Agentic search doesn’t just retrieve — it reasons, compares, and acts. A Gemini agent helping a user find the best co-working space in BGC or a halal-certified caterer in Shah Alam isn’t just matching keywords. It’s weighing evidence: reviews, operating hours, service descriptions, price signals, photos, response rates.
Local teams should be building content and Profile hygiene that answers comparative questions, not just categorical ones. “Best Korean BBQ in KLCC” is categorical. “Which Korean BBQ in KLCC has private rooms for groups of 10 and takes walk-ins after 9pm” is the agentic query you should be structuring your content to answer — even if no one types it that way yet.
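One concrete way to answer those conditional queries in machine-readable form is schema.org FAQPage markup. The sketch below generates that JSON-LD; the question and answer text is illustrative, while the `FAQPage`, `Question`, and `Answer` types are real schema.org vocabulary.

```python
import json

# Sketch: emit schema.org FAQPage JSON-LD that answers conditional, "agentic"
# questions directly. Q&A text is hypothetical; the schema.org types are real.

def faq_jsonld(pairs: list[tuple[str, str]]) -> str:
    doc = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in pairs
        ],
    }
    return json.dumps(doc, ensure_ascii=False, indent=2)

markup = faq_jsonld([
    ("Do you have private rooms for groups of 10?",
     "Yes, two private rooms seat up to 12 guests each; booking is recommended."),
    ("Do you take walk-ins after 9pm?",
     "Walk-ins are accepted until 10:30pm on weekdays."),
])
```

Embedded in a `<script type="application/ld+json">` block, this gives an agent an unambiguous answer to exactly the kind of conditional question it is trying to resolve.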
Report What You Know, Flag What You Don’t
Search Engine Journal contributor Bengu Sarica Dincer published a sharp piece on reporting uncertainty without losing credibility — and it lands differently when you’re a local SEO team presenting to a regional CMO who wants to know why Gemini referrals aren’t showing up cleanly in GA4 yet.
AI-driven traffic is notoriously hard to attribute. Direct traffic is rising partly because Gemini and other AI assistants don’t always pass referral headers. Organic numbers can look flat while actual brand discovery through AI is accelerating. Dincer’s argument — that communicating what your data cannot prove is as important as what it can — is a professional discipline local SEO teams should adopt formally.
Practically: build a separate reporting layer for AI-channel proxies (branded search volume trends, GBP direct searches, “Discover” traffic in Search Console) alongside standard organic metrics. Tell your stakeholders what you’re measuring, why, and what remains structurally uncertain. That honesty builds more durable credibility than a dashboard that makes AI traffic look tidier than it is.
Key Takeaways
- Optimise local landing pages for crawl efficiency — byte-heavy pages in multi-location sites eat Googlebot budget before local relevance signals are even parsed.
- Structure Google Business Profiles and on-page content to answer comparative, conditional queries, not just category-level searches — that’s what agentic AI surfaces.
- Build an AI-traffic proxy reporting layer now; attributing Gemini referrals through standard GA4 channels alone will consistently undercount AI-driven discovery.
Proximity used to mean: are you physically close to the searcher? In 2026, it’s evolving to mean: are you informationally close enough for an AI to confidently recommend you? The brands winning hyperlocal in Southeast Asia over the next 18 months will be the ones who treat structured, crawlable, answer-rich content as infrastructure — not an SEO afterthought. The question worth sitting with: how much of your local content is currently optimised for a human skimming a page, versus an AI reasoning across dozens of signals simultaneously?
At grzzly, we work with multi-location brands across Southeast Asia to build local search foundations that hold up as AI-driven discovery reshapes the channel — from GBP architecture to crawl-efficient site structures to the reporting frameworks that help regional teams make confident decisions in uncertain conditions. If Gemini traffic, agentic search, or local pack performance is on your agenda right now, let’s talk.
Sources
- https://www.searchenginejournal.com/google-gemini-sends-more-traffic-to-sites-than-perplexity-report/570714/
- https://www.searchenginejournal.com/google-explains-googlebot-byte-limits-and-crawling-architecture/570961/
- https://www.searchenginejournal.com/why-new-google-agent-may-be-a-pivot-related-to-openclaw-trend/570764/
- https://www.searchenginejournal.com/reporting-uncertainty-without-losing-credibility/569141/
Written by
Dusty Grizzly
Deep in the weeds of Google Business Profiles, local pack mechanics, and neighbourhood-level search intent. Believes proximity is a strategy, not a coincidence.