How AI summaries quietly reshape the decision journey

Abi Hough

Many teams still plan and measure as if the main decision happens on their website.

In reality, AI summaries, rich snippets and other zero click results are now doing a large share of that work before anyone arrives.

This does not replace the website completely. Instead it reshapes the journey. People see compressed, opinionated versions of you and your competitors, often in a single block of content. By the time they land on a site, their expectations and shortlists are already formed.

This note is for product, marketing and digital teams who:

  • Have noticed behaviour changing around search and AI features
  • Still see most of their measurement and effort focused on on site journeys
  • Want a clearer view of where decisions are actually being made


This field note looks at how that shift appears in practice, why common responses do not work and what questions are worth asking before you plan another funnel change.


The old mental model: the website as the main arena.

For a long time the working assumption was simple.

Roughly:

  1. People search or click an ad
  2. They arrive on your website
  3. They explore, compare and decide
  4. They convert or leave

The upstream picture was untidy, but the site was treated as the main place the decision happened. Search and ads were there to send traffic. The site was where you persuaded, reassured and closed.

Most research, analytics and experimentation practices grew up inside that model.

The new reality: compressed decisions upstream.

AI summaries and rich search results change that model in a few important ways.

  • Framing: The way the answer is structured shapes how people think about the problem. It may emphasise different criteria from the ones you talk about on your site.
  • Shortlisting: People often see a ready made shortlist of providers, products or approaches before they ever click through.
  • Default trust: Many users treat the AI summary as a neutral, informed overview, even when it is partly wrong or based on outdated signals.

The result is that a significant part of the decision work now happens inside the AI panel. By the time someone reaches you, they may already:

  • Believe they know the main options
  • Have fixed views about what matters
  • Have tacit expectations set by the answer they just saw

Your website is now operating inside that frame, not defining it on its own.

A simple example.

Take a generic but common scenario.

Someone searches for a complex service. Instead of ten blue links and some ads, they now see:

  • An AI answer describing the problem and typical solutions
  • A handful of named providers or approaches
  • Follow up questions that encourage a particular way of thinking about the decision
  • Maybe a map, ratings, or marketplace units


They read the summary, skim a couple of the suggested providers and then click through to one or two sites to confirm or refine.

Notice what changed:

  • The first impression of the category came from the AI answer, not from any single website
  • The shortlist came from the AI panel or marketplace unit
  • The website visit is now often for confirmation and reassurance, not for discovery from scratch

If your site is still designed as if visitors arrive blank and open minded, you are working against the current journey.

NOTE
In an AI shaped journey, many visitors arrive in one of two modes: they are either confirming a shortlist decision, or they are pressure testing it against competitors. Both modes are late stage. That changes what the first page must do: reduce the work needed to validate fit, surface the criteria the upstream summary has primed, and make the next step obvious. The goal is not to "educate from scratch", it is to make confirmation, comparison and commitment low friction.

How this typically shows up in your data.

Most organisations do not have a clean way to measure this shift yet, but it often leaves traces:

  • More visitors arriving mid journey, already focused on a specific product, plan or option
  • Higher rates of “confirm and leave” behaviour, where people check one or two pages and go
  • Visitors skipping content that explains basics and diving into specifics
  • Changes in traffic and conversion that correlate suspiciously well with search interface changes

None of these prove that AI summaries are responsible on their own, but together they are a strong signal that decisions are being shaped earlier and elsewhere.
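If you want to watch these traces over time, even a crude heuristic can help. The sketch below is purely illustrative: it labels exported sessions as "confirm and leave", "mid journey" or "exploratory". The field names (`entry_page`, `pages_viewed`, `duration_s`) and thresholds are assumptions you would replace with your own analytics schema and baselines, not any specific tool's API.

```python
# Illustrative heuristic for spotting "confirm and leave" sessions in
# exported analytics data. Field names and thresholds are assumptions.

def classify_session(session):
    """Label a session by how far into the journey the visitor arrived.

    Heuristics (assumptions, for illustration only):
    - "confirm_and_leave": 1-2 pages, under two minutes, entered on a deep page
    - "mid_journey": entered directly on a deep page, but stayed longer
    - "exploratory": entered on the homepage
    """
    deep_entry = session["entry_page"] not in ("/", "/home")
    short_visit = session["pages_viewed"] <= 2 and session["duration_s"] < 120
    if deep_entry and short_visit:
        return "confirm_and_leave"
    if deep_entry:
        return "mid_journey"
    return "exploratory"

def confirm_and_leave_share(sessions):
    """Share of sessions that look like late-stage confirmation visits."""
    labels = [classify_session(s) for s in sessions]
    return labels.count("confirm_and_leave") / len(labels)

# Toy data standing in for an analytics export.
sessions = [
    {"entry_page": "/product-a", "pages_viewed": 1, "duration_s": 45},
    {"entry_page": "/", "pages_viewed": 6, "duration_s": 420},
    {"entry_page": "/pricing", "pages_viewed": 2, "duration_s": 90},
    {"entry_page": "/product-b", "pages_viewed": 5, "duration_s": 600},
]

print(confirm_and_leave_share(sessions))  # track this as a trend, not a one-off number
```

The point is not the specific thresholds, which you would calibrate against your own history. It is that a rising share of deep-entry, short-duration sessions is one measurable proxy for decisions being shaped upstream.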

EXAMPLE:

One organisation saw visits to a key comparison page drop while direct visits to a single product page rose, after an AI answer started recommending that product by name.

Why common responses do not help.

Once teams notice these patterns, there are a few familiar reactions. Most of them are not enough on their own.

01_ Treating AI summaries as an SEO task

Work is handed to SEO or content teams with a brief to “optimise for the AI answer”.

That can help at the margins, but it does not address the deeper questions:

  • What is the AI summary actually saying about us
  • Where is it getting that view from
  • How does that align or clash with what people see once they arrive

02_ Doubling down on funnel tweaks

Teams keep adjusting forms, buttons and page layouts, because that is where their tools and processes live.

This can improve conversion for people who are already committed. It does not help if:

  • The shortlist is wrong before anyone arrives
  • The site contradicts what the AI answer just told them
  • The information they care about most is hard to find or inconsistently presented

03_ Trying to recreate the whole decision on site

Some organisations respond by adding more content and tools, trying to replicate the AI answer and every possible comparison locally.

This usually leads to complexity, not clarity.

If people are already arriving with a frame shaped by an AI summary, they do not need another generic explainer. They need to see how your reality fits or challenges that frame.

Better questions to ask.

Instead of asking “how do we get more clicks from AI summaries”, it is often more useful to ask:

  • What do people see about us and our category before they visit
  • How do AI summaries, zero click results and marketplaces currently describe us
  • Which expectations, criteria and concerns are likely to be set by those descriptions
  • Where does our current site support those expectations and where does it clash with them
  • What information are people probably relying on upstream that we barely address on site

Those questions are uncomfortable, but they connect upstream reality to the work you control.

Auditing your AI shaped journeys.

A practical starting point is a simple audit.

For a small number of important tasks or queries:

01_ Capture what people see upstream

  • Screenshots of AI answers, snippets, comparison units and top results
  • The follow up questions and prompts that AI systems suggest

02_ Write down the implied story

  • How is the problem framed
  • What criteria are mentioned
  • Which options or approaches are presented as default

03_ Compare that frame with your current site
  • Where does your content line up with that frame
  • Where are you silent on issues the AI answer emphasises
  • Where are you insisting on a story that no longer matches what people just read

04_ Look for points of friction

  • Pages where users would have to reorder their mental model to continue
  • Missing reassurance or evidence for claims that AI has already primed them to expect

This is not about “writing for the algorithm”. It is about understanding the mental context your visitors now carry.
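If it helps to keep these audits comparable across queries and over time, the fields above can be captured as a simple record. The structure below is just one illustrative way to organise the notes, not a prescribed format or tool.

```python
from dataclasses import dataclass, field

@dataclass
class UpstreamAudit:
    """One record per important task or query (illustrative structure only)."""
    query: str
    screenshots: list = field(default_factory=list)          # AI answers, snippets, comparison units
    suggested_followups: list = field(default_factory=list)  # follow up questions the AI surfaces
    problem_framing: str = ""                                # how the answer frames the problem
    criteria_mentioned: list = field(default_factory=list)   # criteria the answer emphasises
    default_options: list = field(default_factory=list)      # providers or approaches presented as default
    friction_points: list = field(default_factory=list)      # pages that clash with the upstream frame

# Usage: fill one record in per audited query, then compare across audits.
audit = UpstreamAudit(query="example complex service query")
audit.criteria_mentioned.append("pricing transparency")
audit.friction_points.append("pricing page omits the criterion the summary leads with")
```

Even a lightweight record like this makes it easier to spot patterns across queries, such as a criterion that upstream answers keep emphasising but your site barely addresses.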

What this means for your site.

A site operating inside an AI shaped journey often has to do three things well:

01_ Confirm what is true

Make it easy for people to see that the useful parts of the AI summary are accurate and that you meet the criteria that matter.

02_ Clarify what was flattened

Explain where the summary has oversimplified or missed important nuance, without lecturing the user.

03_ Correct what is wrong

Provide clear, evidenced explanations where the summary misunderstands you or relies on outdated information about you.

That is a different job from “educate from first principles” or “hold the whole decision inside the site”.

Where Corpus fits.

From a Corpus perspective, AI summaries are one part of a wider upstream shift.

Our work usually starts by mapping:

  • How humans and AI encounter your organisation before a visit
  • How that shapes expectations, shortlists and points of comparison
  • How well your current journeys, content and structures support or undermine those expectations

The goal is not to chase every search change. It is to build a clearer understanding of where decisions are actually made, and then align on site work, experiments and investment with that reality.

Talk about how this applies in your organisation.

If this field note resonates and you want to talk about how the same patterns are showing up where you work, a conversation can help.
Typical first conversations last 45 to 60 minutes and focus on understanding your current situation and constraints.
Upstream optimisation for zero click and AI search.
Contact
[email protected]

We Are Corpus is a consultancy created by Abi Hough and delivered through uu3 Ltd. Registered in the UK. Company 6272638