ChatGPT-based search

The search engine market has heated up.

The cockroach of the tech world, aka Microsoft, whose one-trick pony of Microsoft Office has yielded rich dividends, has upended the search market by "integrating" ChatGPT into its "flagship" search engine. I don't want to comment on their efforts; it isn't required. Google has fired back by announcing "Bard". Baidu in China has its own transformer-based engine.

I'll quickly quote from the three links here:

Will ChatGPT Make Baidu Relevant Again?

It’s true that Baidu is glomming on to the ChatGPT-induced frenzy to boost its relevance, just like Google with its $300 million investment into Anthropic (an OpenAI competitor) and every startup with cash left to pivot in Silicon Valley. However, Baidu has been an active participant in this so-called “AI arms race” for at least a decade. Its ChatGPT-like bot, named Ernie (or 文心 in Chinese) is an application derived from PaddlePaddle – a homegrown deep learning framework that began its development as early as 2013 and open sourced on GitHub in 2016. (PaddlePaddle is analogous to Google’s Tensorflow and Facebook’s PyTorch.)

This is the same logic that underpins Microsoft Azure’s symbiotic relationship with OpenAI – whether you use ChatGPT directly or use OpenAI APIs to make your own chatbot, Azure makes money either way. That’s why last month, riding on the buzz of ChatGPT, Microsoft announced that all of OpenAI’s latest and greatest APIs and frameworks – GPT-3.5, Codex, DALLE 2 – will be generally available for Azure customers. Baidu is fully aware of this playbook and has been executing along the same direction – various AI models labeled Ernie (or 文心) have been available as APIs on Baidu AI Cloud for quite some time.

So Microsoft invests in an external service and then markets its cloud service as if it were the natural choice for OpenAI. You pay me to pay you back. That's the closest explanation. Why does it "escape" the scrutiny of the media?

A little more context:

Search wars reignited by artificial intelligence breakthroughs | Financial Times

On their own, systems such as ChatGPT, based on so-called large language models that can “understand” complex queries and generate text responses, do not represent a direct alternative to search. The information used to train ChatGPT is at least a year old and the answers it gives are limited to information already in its “memory”, rather than more targeted material pulled from the web in response to specific queries.

That has led to a race to develop a new hybrid of AI and traditional search. Known as retrieval augmented generation, the technique involves first applying search tools to identify the pages with the most relevant material, then using natural language processing to “read” them. The results are injected into a large language model such as OpenAI’s GPT-3, which then spits out a more precise answer.
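To make the retrieval augmented generation idea concrete, here is a minimal sketch of the pipeline the FT describes: retrieve relevant pages, inject their text into a language model's prompt, and generate an answer. The search_web() and call_llm() helpers below are hypothetical stand-ins for a real search API and a real LLM API (such as OpenAI's GPT-3); they are not part of any specific library.

```python
# Minimal sketch of retrieval augmented generation (RAG).
# search_web() and call_llm() are hypothetical placeholders, not real APIs.

def search_web(query: str, top_k: int = 3) -> list[str]:
    """Hypothetical: return the text of the top_k most relevant pages."""
    return [f"(contents of search result {i} for '{query}')" for i in range(top_k)]

def call_llm(prompt: str) -> str:
    """Hypothetical: send the prompt to a large language model, return its answer."""
    return "(model-generated answer grounded in the retrieved passages)"

def answer_with_rag(query: str) -> str:
    # 1. Retrieval: use search tools to find the most relevant material.
    passages = search_web(query)
    # 2. Augmentation: inject the retrieved text into the model's prompt.
    context = "\n\n".join(passages)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )
    # 3. Generation: the language model produces a more targeted answer.
    return call_llm(prompt)

if __name__ == "__main__":
    print(answer_with_rag("Who announced Bard?"))
```

This is only an illustration of the hybrid approach under stated assumptions; the companies mentioned have not published the details of their pipelines.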

This one is an interesting observation:

None of the companies experimenting with AI in search, however, has yet tested whether consumers will be willing to pay for something they currently receive for free. Google’s founders themselves once wrote that relying on advertising might be the wrong business model for a search engine, said Srinivas. At this stage, he added, it is better to leave all options on the table.

Who will pay for this? Unless the objective is generative data, I don't foresee much action on the payments front. There is a possibility that general search queries could be refined into enterprise offerings that command a premium.

Finally, the blog post from Google:

Google AI updates: Bard and new AI features in Search

It’s a really exciting time to be working on these technologies as we translate deep research and breakthroughs into products that truly help people. That’s the journey we’ve been on with large language models. Two years ago we unveiled next-generation language and conversation capabilities powered by our Language Model for Dialogue Applications (or LaMDA for short).

We’ve been working on an experimental conversational AI service, powered by LaMDA, that we’re calling Bard. And today, we’re taking another step forward by opening it up to trusted testers ahead of making it more widely available to the public in the coming weeks.

The future is personalised search. Either way, these are interesting developments. Let's see how they are taken forward.
