Atlas: I believe the SERP Game is Changing

Atlas

Atlas is OpenAI’s browser with ChatGPT built in, or put another way, a browser where ChatGPT is the default interface. It can complete tasks without always redirecting you to a URL.

Most of you might think this just means fewer clicks, but the real story is much bigger than that.

Many think it is just another browser that finds results. But it’s not what it does that matters; it’s how it does it that is going to change the SERP game.

ChatGPT itself is a search engine; it runs queries, scrapes the internet based on user queries, summarizes the results, and presents them to you.

Atlas is different, and not just Atlas; Google’s AI search works similarly. These browsers act like agents: they can interact with the JavaScript (JS) that renders your front end.

This means the SERP game is going to change. If your site has broken JavaScript or is slow to render, you won’t appear.

I used a similar example, which is discussed in the last post: OpenAI Apps SDK: The App store of ChatGPT?

I ran the query

 “Find me the new version of adidas black adizero evo shoes size 10.5 that can be delivered by EOD tomorrow”

on both Atlas and normal Google Search (not Google’s AI search). When I looked at the network tab, I found something fundamentally different, and that is what sparked the thought about the change coming to the SERP.

Atlas Search vs. Google Search

This is where “how it does what it does” makes the difference: Atlas renders all the JS live, seeing the page the way a human sees it.

I am not saying Google Search doesn’t render the JS for SERP results. The difference is that Google does it behind the scenes, during indexing, whereas Atlas renders the page live and works with the same rendered content a human sees.

Let’s look at some of the general differences:

How Atlas Searches 

This is where I believe the SERP game is going to change.

When a user searches with specific intent and context, the traditional SERP starts to fail. As tools like Atlas and similar browsers begin to interpret user intent more deeply, the real question becomes: how do you serve that data effectively?

Some quick wins could be:

It’s not just about crawlers and indexes

It used to be a world where website data was optimized for bots and crawlers, just to appear on the SERP. That game is changing. Browser agents like Atlas will now render your site, execute your JavaScript, and extract exactly the information the user is looking for.

As discussed above, browsers like Atlas, Google’s AI search, and other AI tools render the JS during the normal page-load lifecycle and can start analyzing DOM nodes and network responses during the initial render. This means they see the same rendered content a human user sees (including content injected by JS).

So clean up your crappy JS ASAP and make your site agent-readable.

Your Website = A Repository of Information for AI Systems

We are one step closer to the theory I wrote about in SEO & AEO: Any Different?

Make your websites ready for agents to read and interpret all relevant data. You have to structure your data properly so that the browser agents can resolve the user queries much faster. This has to go hand in hand with the performance of your frontend.
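
As an illustration of what “structuring your data properly” can look like in practice, here is a minimal sketch that emits schema.org Product JSON-LD alongside the rendered page, so an agent parsing the DOM finds machine-readable facts instead of having to scrape markup. The product shape and sample values are hypothetical.

```typescript
// Minimal sketch: build schema.org Product JSON-LD for a product page.
// The Product interface and the sample values are hypothetical.
interface Product {
  name: string;
  sku: string;
  price: number;
  currency: string;
  inStock: boolean;
  url: string;
}

function productJsonLd(p: Product): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Product",
    name: p.name,
    sku: p.sku,
    url: p.url,
    offers: {
      "@type": "Offer",
      price: p.price.toFixed(2),
      priceCurrency: p.currency,
      availability: p.inStock
        ? "https://schema.org/InStock"
        : "https://schema.org/OutOfStock",
    },
  });
}

// Inject this into the page head as <script type="application/ld+json">…</script>
const jsonLdTag = `<script type="application/ld+json">${productJsonLd({
  name: "Adizero Evo (Black)",
  sku: "ADZ-EVO-BLK-10.5",
  price: 119.99,
  currency: "USD",
  inStock: true,
  url: "https://example.com/p/adizero-evo-black",
})}</script>`;
```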

I feel that in the new world, you should be making sure your site is ready for agents to read. This doesn’t mean the traditional SEO work is no longer needed; this comes on top of what you were already doing, or what you were ignoring.

This further strengthens my belief that:

In the near future, your websites are likely to evolve to become a repository of information for AI systems and search models rather than relying solely on direct user visits.

I’d love to hear your thoughts and ideas on this — feel free to share them in the comments or drop me a message.

Related articles:

SEO & AEO: Any Different?

OpenAI Apps SDK: The App store of ChatGPT?

From Clicks to Context: Key Considerations for Embracing Conversational Commerce


Click to Context

After the last two posts, many have reached out to me, and we have had some good discussions. Thank you for all the feedback and the sessions.

One question that kept coming back in all those meetings was: “How will this impact the current retail ecosystems?”

That question inspired me to write this piece. In this article, I won’t dive deep into each system and scenario, but rather provide insights and pointers on what actions to take and which areas are likely to experience change.


With OpenAI Apps, the potential scenario we are going to face in e-commerce in the near future is:

More and more companies will start using ChatGPT as another sales channel, which means most retailers will be forced into that channel. If you decide to go down that route, your current e-commerce ecosystem and architecture are going to see some impact and change.

As we transition from click commerce to context commerce, your content becomes the decisive factor — it will either make or break your success.
The conversational customer journey could be like this:

Conversational Product Discovery 

Customer opens ChatGPT and asks: “Hi <Brand>, find me a black running shoe, size 10, under $120.”

ChatGPT will show size 10 shoes that cost less than $120.

Customer: “Show me the blue one.” 

ChatGPT will look at your product feed and the data retrieved through RAG, check whether you sell a blue version, and present it to the customer.

Customer: “Add this to cart, size 10”

ChatGPT will call the backend to create a cart and show that to the customer. 

Customer: “Checkout this”

ChatGPT calls the backend, which calculates taxes and shipping and returns a hosted payment session URL or a Stripe PaymentIntent (if card entry is required).

Customer: Enters the card details, and the purchase is completed.
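
To make the journey above more concrete, here is a rough sketch of the backend surface ChatGPT would be calling at the cart and checkout steps. The function names, parameters, and return shapes are my own illustration, not an official OpenAI or Stripe contract.

```typescript
// Hypothetical backend surface for the conversational journey above.
// Function names, parameters, and return shapes are illustrative only.
interface CartLine { sku: string; size: string; qty: number; }
interface Cart { id: string; lines: CartLine[]; subtotal: number; }

interface CheckoutSession {
  cartId: string;
  taxes: number;
  shipping: number;
  total: number;
  paymentUrl?: string;                // hosted payment session
  paymentIntentClientSecret?: string; // if card entry is required
}

// Step: “Add this to cart, size 10”
async function createCart(lines: CartLine[]): Promise<Cart> {
  // A real implementation would call your commerce platform’s cart API and
  // price each line from the catalogue; 119.99 is a placeholder price.
  const subtotal = lines.reduce((sum, line) => sum + line.qty * 119.99, 0);
  return { id: "cart_" + Math.random().toString(36).slice(2), lines, subtotal };
}

// Step: “Checkout this”
async function checkout(cart: Cart, region: string): Promise<CheckoutSession> {
  // Taxes and shipping would come from your tax and shipping services.
  const taxes = cart.subtotal * 0.08;
  const shipping = region === "US" ? 5 : 15;
  return {
    cartId: cart.id,
    taxes,
    shipping,
    total: cart.subtotal + taxes + shipping,
    paymentUrl: `https://pay.example.com/session/${cart.id}`,
  };
}
```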


In order to achieve the above customer journey, we need to do the following things:

Authentication & Cart Merges

This will be the first touch point of change, and it’s not a biggie, but it’s a change that has to be thought through. 

Similar to how you authenticate users on the website and apps, you need to map ChatGPT sessions to your site/app sessions. You will have to handle guest users, existing users, and existing users with an active cart, among other scenarios.
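
As a small, hedged sketch of the cart-merge case, assuming a hypothetical cart model; the real rules depend on your commerce platform and business logic.

```typescript
// Illustrative cart-merge logic for when a ChatGPT session is linked to an
// existing site/app account. Types and rules are hypothetical; adapt them to
// your own session and cart model.
interface CartItem { sku: string; qty: number; }

function mergeCarts(chatCart: CartItem[], siteCart: CartItem[]): CartItem[] {
  const merged = new Map<string, number>();
  for (const item of [...siteCart, ...chatCart]) {
    merged.set(item.sku, (merged.get(item.sku) ?? 0) + item.qty);
  }
  return [...merged.entries()].map(([sku, qty]) => ({ sku, qty }));
}
```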

Commerce Orchestrator

It’s worth thinking about creating an event-based commerce orchestrator, fronted by an MCP server, that dictates how your commerce flow should run (see the sketch after the list below). Some of the key responsibilities of this layer could be:

  1. Product feeds to LLMs and other systems
  2. Creating/merging carts & checkouts across channels
  3. Payments
  4. Inventory & Price feeds
  5. Personalization
  6. Updating & retrieving from Knowledge Base 
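
Here is the sketch referenced above: a rough, hypothetical shape for an event-driven orchestrator, with the event names and handler wiring invented purely for illustration.

```typescript
// A rough sketch of an event-driven commerce orchestrator. Event names and
// handler wiring are hypothetical and only meant to show the shape of the idea.
type CommerceEvent =
  | { type: "product.feed.requested"; channel: "chatgpt" | "web" | "app" }
  | { type: "cart.requested"; sessionId: string; sku: string; qty: number }
  | { type: "checkout.requested"; cartId: string }
  | { type: "payment.completed"; orderId: string };

type Handler = (event: CommerceEvent) => Promise<void>;

class CommerceOrchestrator {
  private handlers = new Map<CommerceEvent["type"], Handler[]>();

  on(type: CommerceEvent["type"], handler: Handler): void {
    const existing = this.handlers.get(type) ?? [];
    this.handlers.set(type, [...existing, handler]);
  }

  async dispatch(event: CommerceEvent): Promise<void> {
    for (const handler of this.handlers.get(event.type) ?? []) {
      await handler(event);
    }
  }
}

// Usage: register each responsibility (feeds, carts, payments, personalization)
// as a handler, then expose dispatch through your MCP tools.
const orchestrator = new CommerceOrchestrator();
orchestrator.on("cart.requested", async (event) => {
  console.log("create or merge cart for", event);
});
```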

Product Feed

Most retailers should expect a change in the product feed because:


The traditional product feed from ERPs or your existing PIM will not work for LLMs. I am not talking about the format of these feeds; it’s about the extra information that is traditionally not part of product feeds from these systems (for example, inventory alongside the product data, or tax as a final value per region).


So you could expect a change in the way you create this data and how you send this data to LLMs.
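
For illustration, an LLM-facing feed entry might carry fields a classic PIM or ERP export leaves out. The shape below is hypothetical.

```typescript
// Hypothetical LLM-facing product feed entry: the classic PIM/ERP fields plus
// the context an assistant needs to answer a query in one shot.
interface LlmProductFeedItem {
  sku: string;
  title: string;
  description: string;
  attributes: Record<string, string>;          // e.g. { surface: "road", drop: "6mm" }
  price: { amount: number; currency: string; taxIncluded: boolean };
  regionalFinalPrices: Record<string, number>; // tax-inclusive totals per region
  inventory: { inStock: boolean; quantity: number; deliverableBy?: string };
  faq: { question: string; answer: string }[]; // common customer questions
}
```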

Knowledge Base 

Simply put, this is how you expose your products, data, and services to RAG. It is a must, and hardly any retailers have it in place right now. I have touched upon this in the article: SEO & AEO: Any Different?

This is a change not just in your tech ecosystem; almost every department in the business has to work together to gather the questions customers ask over time, create a strategy, structure it, and publish it as content on the website.

This calls for a change in the way you create content in your CMS; you will have to keep product content updated, and it is going to be a continuous process.
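
As an illustration, a knowledge-base entry destined for a RAG index could look like the hypothetical shape below, one chunk per answerable question.

```typescript
// Hypothetical shape for a knowledge-base chunk destined for your RAG index.
interface KnowledgeBaseEntry {
  id: string;
  productSkus: string[];  // products this answer applies to
  question: string;       // e.g. "Is the Adizero Evo waterproof?"
  answer: string;         // grounded, brand-approved answer
  sourceUrl: string;      // page on your site that states this
  lastReviewed: string;   // ISO date, so stale answers can be flagged
}
```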

Reducing Hallucinations

New term for you? Don’t worry, it just means that the LLM, working from the data RAG retrieves, might hallucinate and give a reply that is slightly off. For example:

While chatting, the customer might ask, “Is this an all-terrain shoe?” The LLM will reply: “Yes, it’s an all-terrain shoe.”

Customer: “Is it waterproof?”

LLM: “Yes, it is.”

The last answer is a hallucination: nothing in your data says the shoe is waterproof, but the LLM assumed that an all-terrain shoe should be.

To curb these hallucinations, we have to write system prompts like:

“Do not invent product features or availability. If unsure, respond: ‘I can’t confirm that — check this product page’ and provide link/doc reference.”

Guardrail instructions like this, together with evals (automated checks that score the model’s answers against your product data), help keep the LLM from hallucinating.

Don’t worry, there are tools out there that you can plug in to do this quite easily. If you are going to use OpenAI’s agent tooling, you can feed your evals into it directly.
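
A minimal sketch of what such a check could look like, assuming you keep a list of verified attributes per product; real eval tooling, including whatever OpenAI ships with its agent stack, will be far more sophisticated.

```typescript
// Minimal eval sketch: flag answers that assert attributes we cannot verify.
// The attribute catalogue and matching rules are hypothetical and deliberately
// simplistic; real evals would run against a test set of conversations.
const verifiedAttributes: Record<string, string[]> = {
  "ADZ-EVO-BLK": ["all-terrain", "lightweight"], // note: "waterproof" is absent
};

function assertsUnverifiedClaim(sku: string, answer: string): boolean {
  const known = verifiedAttributes[sku] ?? [];
  const suspectClaims = ["waterproof", "fireproof", "lifetime warranty"];
  return suspectClaims.some(
    (claim) => answer.toLowerCase().includes(claim) && !known.includes(claim)
  );
}

// This reply would be flagged for a safer fallback or human review.
console.log(assertsUnverifiedClaim("ADZ-EVO-BLK", "Yes, it is waterproof.")); // true
```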

Payments

The new evolution of AI-enabled commerce is powered by the Agentic Commerce Protocol (ACP), a new, merchant-friendly open standard codeveloped by Stripe and OpenAI. 

Your payment platform will likely support this soon. It’s not rocket science for service integrators, because most of the work is done by your gateway; you just have to call it in the right way.

How it works:

After the customer chooses their preferred payment method, your payment gateway issues a Shared Payment Token (SPT), a new payment primitive that lets applications like ChatGPT initiate a payment without exposing the buyer’s payment credentials. SPTs are scoped to a specific merchant and cart total. Once issued, ChatGPT passes the token to the merchant via API, and the merchant processes the transaction through the gateway.
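
As a rough, hedged sketch of the merchant side: receive the token, confirm it matches the expected cart total, and hand it to your gateway. The request shape, field names, and gateway call below are placeholders, not the actual ACP or Stripe API.

```typescript
// Hypothetical merchant-side handling of a Shared Payment Token (SPT).
// Field names, the request shape, and the gateway call are placeholders,
// not the real ACP or Stripe API.
interface SptCheckoutRequest {
  sharedPaymentToken: string; // scoped to this merchant and cart total
  cartId: string;
  expectedTotal: number;      // the amount the token was issued for
  currency: string;
}

async function processSptCheckout(req: SptCheckoutRequest): Promise<string> {
  // 1. Re-price the cart server-side and confirm it matches the token's scope.
  const total = await repriceCart(req.cartId);
  if (Math.abs(total - req.expectedTotal) > 0.01) {
    throw new Error("Cart total changed; ask the buyer to re-confirm.");
  }

  // 2. Hand the token to your payment gateway to capture the payment.
  const charge = await gatewayCharge(req.sharedPaymentToken, total, req.currency);
  return charge.orderId;
}

// Placeholder stubs standing in for your commerce platform and gateway SDK.
async function repriceCart(cartId: string): Promise<number> {
  return 129.99;
}
async function gatewayCharge(token: string, amount: number, currency: string) {
  return { orderId: `order_${Date.now()}` };
}
```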

Personalization

You can build this into your commerce orchestrator, or, if you already have a personalization engine, pass it information such as session history, browsing history, and purchase history to surface products.

Expose getRecommendations(session_id, product_id) as a tool for ChatGPT to call. Keep your customers’ privacy in mind and only share the IDs and small metadata.
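
A small sketch of that tool’s contract, returning only IDs and minimal metadata as suggested; everything except the getRecommendations name is my own placeholder.

```typescript
// Sketch of the recommendations tool described above: IDs and small metadata
// only, never raw PII. Everything except the getRecommendations name is a
// placeholder.
interface Recommendation {
  productId: string;
  title: string;
  reason: "viewed_together" | "bought_together" | "session_history";
}

async function getRecommendations(
  sessionId: string,
  productId: string
): Promise<Recommendation[]> {
  // A real implementation would query your personalization engine with the
  // session's browsing and purchase history, then return only safe metadata.
  return [
    { productId: "ADZ-EVO-BLU", title: "Adizero Evo (Blue)", reason: "viewed_together" },
  ];
}
```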


This is not a comprehensive list of impacted areas, but it covers most of the basics that will change. I kept it at the basic impact level so that everyone can build on top of it.

The impact will differ from retailer to retailer and depends largely on your current architecture. Your imagination and budget also play a role; you could even think about adding an agentic layer to your architecture, and much more.

The great thing about the new agents being rolled out across all LLMs is that development will become much faster. You’ll be able to test creative ideas more easily. I believe that in this new world, imagination will face far fewer limitations due to technological constraints.

What do you think? If there’s a specific area you’d like to discuss, feel free to leave a comment or reach out — I’d love to continue the conversation.

OpenAI Apps SDK: The App store of ChatGPT?

Yesterday (6 October 2025), OpenAI introduced apps you can chat with inside ChatGPT. This means you can launch an app inside ChatGPT. Yeah, you read that right.

“Your customers can chat with your brand through ChatGPT!”

The most important question is:

How will this work?

OpenAI’s Apps SDK enables brands to create custom interactive apps inside ChatGPT, and these apps look native within ChatGPT. The SDK gives you control over both backend and frontend, allowing brands to surface their offerings and products directly within the ChatGPT chat interface.

All users have to do is ask for the app name.

Imagine your company, XYZ, sells shoes online. Then:

Your customers can type this inside ChatGPT: “XYZ, find me size 10 black Adidas shoes.”

Boom! There you go.

Sounds exciting and amazing right?

What does this mean for you:

This means you can deliver services and products directly to customers who are in the discovery phase inside ChatGPT. That is, instead of competing for customers’ attention, you can, and have to, compete on being genuinely helpful to them.

This also means that:

“In the near future, your websites are likely to evolve to become a repository of information for AI systems and search models rather than relying solely on direct user visits.”

Should you do it? If so, how to do it?

Currently, ChatGPT has 800 million active users, which means your brand can reach all of those users. Not just that, there is also an early-bird advantage.

Right now, this is available in preview from Oct 6, 2025, and they will be rolling out more details in the coming months. The brands that build apps during this time will have a significant advantage for sure.

What experiences can you create?

Using the Apps SDK, you can create interactive applications inside ChatGPT on top of your data, products, and services. You expose that data to these apps through MCP servers.
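
For a flavour of what that looks like, here is a minimal sketch assuming the MCP TypeScript SDK (@modelcontextprotocol/sdk) and its McpServer.tool() helper; check the Apps SDK documentation for the exact wiring, and note that the catalogue here is a hard-coded stand-in for your real product feed.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Sketch of an MCP server exposing a product search tool to ChatGPT.
// The catalogue is a hard-coded stand-in for your real product feed.
const catalogue = [
  { sku: "ADZ-EVO-BLK-10.5", title: "Adizero Evo, Black, size 10.5", price: 119.99 },
  { sku: "ADZ-EVO-BLU-10", title: "Adizero Evo, Blue, size 10", price: 114.99 },
];

const server = new McpServer({ name: "xyz-shoes", version: "0.1.0" });

server.tool(
  "search_products",
  { query: z.string(), maxPrice: z.number().optional() },
  async ({ query, maxPrice }) => {
    const hits = catalogue.filter(
      (p) =>
        p.title.toLowerCase().includes(query.toLowerCase()) &&
        (maxPrice === undefined || p.price <= maxPrice)
    );
    return { content: [{ type: "text", text: JSON.stringify(hits) }] };
  }
);

// The Apps SDK wires a server like this into ChatGPT; stdio is used here only
// to keep the sketch short.
await server.connect(new StdioServerTransport());
```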

Some quick wins for an e-commerce app using this SDK:

These are a few quick wins that came to mind while listening to this, and I am sure there will be more.

1. Product discovery:

You can make your entire catalogue available in ChatGPT. So the customers can type:

“XYZ, find me the new version of adidas black adizero evo shoes size 10.5”.

Customers don’t have to visit your website or app to browse and find the product; they can do it directly from ChatGPT.

2. Transactions:

If they like the product, they can complete the purchase from ChatGPT itself using the new Agentic Commerce Protocol, which offers instant checkout from inside ChatGPT.

3. Your Services

You can offer your services directly from the app. Imagine you provide sneaker customization and repair: customers can ask for help fixing something, or even book an appointment to customize their sneakers, right from ChatGPT.

How to prepare the app and data

This is not like a traditional app, where UX and UI drive everything. When you move to the world of OpenAI apps, you have to set aside the traditional way of thinking and reimagine all your interactions from a conversational point of view.

1. Customer Journey:

The traditional customer journey is based on page-based thinking, and those journeys we are used to won’t work anymore. Instead, look for the most common questions and the patterns in how they are asked. As I touched upon in another article (SEO & AEO: Any Different?), customers’ questions become very, very important. They are the foundation of AI-native conversational apps.

2. Your Data:

You need clean data that is accessible dynamically through conversational interfaces. All your product data becomes more relevant now. If you have a CMS, enrich all your data with conversational interactions in mind, answering every question customers might plausibly ask. This is why I believe that, eventually, websites will become a repository of your data and services.

How will you be able to measure success in this world?

This is something we will have to observe and learn in the coming days, but here is what I believe the key metrics might be:

LOI (Length of Interactions):

Similar to how you measure time spent on your site/app, you will have to measure the length of conversations, satisfaction, whether customers got what they were looking for, whether it resulted in a conversion, and so on.

Problem Resolution

Are you able to resolve the customer’s problem within the conversation? Optimize your data based on this.

LTI (Life Time Interaction)

Life Time Interaction tracks how a customer’s interactions evolve over time, helping you build trust and drive eventual conversions.
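
As a sketch, all three metrics could be derived from a simple conversation-event log; the event shape below is a hypothetical starting point, not a standard.

```typescript
// Hypothetical event log from which LOI, problem resolution, and LTI can be
// derived; the field names are a starting point, not a standard.
interface ConversationEvent {
  customerId: string;
  conversationId: string;
  timestamp: string;   // ISO date-time
  turn: number;        // message index within the conversation
  intent: "discovery" | "support" | "checkout";
  resolved: boolean;   // did this turn answer the customer's question?
  converted: boolean;  // did it result in an order?
}

// LOI: turns (or elapsed time) per conversation.
// Problem resolution: share of conversations with at least one resolved turn.
// LTI: conversations and conversions per customer over time.
```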

My Take on this

I believe this is a platform shift, similar to how the App Store changed the mobile app ecosystem. It will require a ground-up rethinking of interactions, product data, service offerings, support data, customer data, and more.

It all comes down to how quickly you can adapt — the sooner, the better.

What do you think?