Why GA4 under-reports ChatGPT traffic

Here's why Google Analytics hides most ChatGPT-driven visits.

Paul · Co-founder

You might have noticed growing traffic from ChatGPT in your Google Analytics.

You can usually see it in GA by filtering Source to chatgpt.com and Medium / Channel Grouping to Referral.

However, that number is likely much smaller than the real traffic figure.

Why Google Analytics under‑reports ChatGPT traffic

Reason 1: ChatGPT link clicks don't always show as Referral

Even though in Google Analytics the Source will still be chatgpt.com, the Medium / Channel Grouping might be Unassigned or not set. This usually happens when the ChatGPT link has no UTM parameters—common for linked mentions within ChatGPT answers.

In your report, focus on Source and don't filter on Medium or Channel Grouping.
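To see the full picture, aggregate sessions by Source alone. Here is a minimal sketch assuming you have exported source/medium/session rows from a GA4 report (the field names and numbers are illustrative, not a real export):

```python
# Hypothetical rows from a GA4 sessions report export
# (source / medium / sessions); values are made up.
rows = [
    {"source": "chatgpt.com", "medium": "referral", "sessions": 120},
    {"source": "chatgpt.com", "medium": "(not set)", "sessions": 80},
    {"source": "google", "medium": "organic", "sessions": 5000},
]

# Group by source only, ignoring medium, so that Unassigned
# ChatGPT sessions are counted alongside Referral ones.
chatgpt_sessions = sum(
    r["sessions"] for r in rows if r["source"] == "chatgpt.com"
)
print(chatgpt_sessions)  # 200
```

Filtering on medium == "referral" in the same data would show only 120 sessions and silently drop the rest.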

Reason 2: Most ChatGPT visits come through Organic Search or Direct

First, if the user clicks a link in a ChatGPT answer from the mobile app, it opens in the phone's browser (Safari, Chrome, etc.), which doesn't always send a referrer header. GA4 then records the Source as (direct) and the Medium / Channel Grouping as Direct.

Second—and even more important—most ChatGPT answers list brand names but omit the hyperlink. If the user wants to visit that brand's website but there is no link, they will either:

  • Type the brand URL directly into the browser — GA4 records it as Direct.

  • Google the brand and click the first result — GA4 records it as Organic Search.

All these paths hide the ChatGPT source that started the journey, so the visible "AI share" of traffic is smaller than reality.
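A toy model of referrer-based attribution makes the loss obvious. This is a simplified sketch, not GA4's real channel-grouping algorithm:

```python
def source_medium(referrer: str) -> tuple[str, str]:
    """Simplified sketch of referrer-based source/medium assignment.

    Not GA4's actual logic; just enough to show where the
    ChatGPT origin disappears.
    """
    if not referrer:
        # Mobile-app opens and typed-in URLs arrive with no referrer.
        return ("(direct)", "(none)")
    host = referrer.split("/")[2]
    if "google." in host:
        # A brand search triggered by a ChatGPT mention looks like SEO.
        return ("google", "organic")
    return (host, "referral")

# The three journeys described above:
print(source_medium("https://chatgpt.com/c/abc"))     # ('chatgpt.com', 'referral')
print(source_medium(""))                              # ('(direct)', '(none)')
print(source_medium("https://www.google.com/search")) # ('google', 'organic')
```

Only the first journey, a direct link click with the referrer intact, ever surfaces chatgpt.com in a report.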

Reason 3: The real number is likely 5–10× what your dashboard shows

The gap between what analytics captures and what actually happens is larger than most teams assume. At Airefs, ChatGPT accounts for roughly 2% of traffic in GA4. But when we ask new sign‑ups how they found us, over 20% say ChatGPT or another AI. That's a 10× discrepancy—and it's not a data quality problem, it's a structural attribution problem.
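The arithmetic behind that multiplier, using the shares quoted above:

```python
ga4_share = 0.02     # ChatGPT share of traffic that GA4 attributes
survey_share = 0.20  # share of new sign-ups naming ChatGPT in the survey

multiplier = survey_share / ga4_share
print(multiplier)  # 10.0
```

Your own multiplier depends on your survey design and traffic mix, but anything above 1.0 means the dashboard is an undercount.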

The implication is uncomfortable: if you're deprioritizing AI search because the traffic looks small, you're optimizing for the metric you can measure, not the one that matters.

More traffic, more sign‑ups

On top of all that hidden volume, engagement and conversions from ChatGPT traffic are higher [1]:

  • Average session duration: 10.4 minutes vs. 8.1 minutes for Google traffic.

  • Average pages viewed: 12.4 pages vs. 11.8 pages for Google traffic.

  • Higher conversions than most channels.

This means your ChatGPT share of total sign‑ups is much higher than its share of traffic. Companies like Tally have reported that 25% of all new sign‑ups come from ChatGPT, an impressive jump for a channel that didn't exist three years ago.

A simple way to measure this is to ask users at sign‑up "How did you hear about us?" with a dropdown menu. You'll see that the ChatGPT share of sign‑ups is likely far higher than its share of GA traffic.
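Tallying those dropdown answers and comparing them against GA4 takes only a few lines. A sketch with made-up responses:

```python
from collections import Counter

# Hypothetical "How did you hear about us?" dropdown responses.
responses = ["ChatGPT", "Google", "ChatGPT", "Friend", "Google",
             "ChatGPT", "Twitter", "Google", "ChatGPT", "Google"]

counts = Counter(responses)
chatgpt_signup_share = counts["ChatGPT"] / len(responses)

ga4_traffic_share = 0.02  # the ChatGPT share GA4 reports, for comparison
print(f"sign-up share {chatgpt_signup_share:.0%} "
      f"vs GA4 traffic share {ga4_traffic_share:.0%}")
```

In this toy sample the sign-up share is 40% against a 2% traffic share; real numbers will differ, but the direction of the gap is what matters.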

Frequently asked questions

My analytics show <1% of traffic from ChatGPT. Should I care?

Yes, and that number is almost certainly wrong. The structural reasons above mean most ChatGPT-influenced visits never register as chatgpt.com in GA4. A <1% figure in your dashboard likely corresponds to 5–10% of your traffic actually being AI-influenced. Use a sign‑up survey as a reality check: ask "How did you hear about us?" and compare the ChatGPT share of responses to its share of GA traffic. The gap usually surprises people.

How do I get a more accurate count of ChatGPT-driven visits?

Three complementary methods work well together. First, run a post-sign‑up survey with "How did you hear about us?" as a free-text or dropdown field—this captures intent even when referrer data is lost. Second, look at your GA4 Source dimension without filtering on Medium; you'll catch chatgpt.com sessions that land as Unassigned. Third, track branded search volume in Google Search Console—a spike in brand queries often correlates with a ChatGPT mention wave.

Does this attribution problem affect other AI tools too?

Yes. The same mechanics apply to Perplexity, Gemini, Copilot, and any AI that mentions brands without always linking them. ChatGPT is the largest source, but the under-reporting pattern is consistent across AI tools. If you see traffic from perplexity.ai in GA4, assume the actual influenced visits are a multiple of that figure.

Will this get better as browsers and analytics tools improve?

Partially. Browser vendors and analytics platforms are slowly improving referrer preservation, and some AI tools are starting to add UTM parameters to their links. But the most fundamental reason—AI answers naming brands without hyperlinking them—won't change. That's a product decision by AI companies, not a technical limitation. Attribution for unlinked mentions will always rely on survey data or modelling rather than referrer headers.

What can I do to increase ChatGPT-driven traffic that I can measure?

Focus on getting linked mentions, not just named ones. Structured content (clear product descriptions, comparison pages, how‑to guides) makes it easier for AI to link rather than just name you. UTM parameters on any links you control in AI contexts help too. But the bigger lever is ensuring you're mentioned at all—the difference between 0% and 2% of visible traffic is much larger in real terms than the dashboard suggests.


Citations

[1] https://www.growth-memo.com/p/the-state-of-ai-chatbots-and-seo

Published Jul 14, 2025

Updated Apr 28, 2026