Generative AI isn’t just one company, but OpenAI has been “lucky” enough to capture the lion’s share of the funding. The interests seem aligned for the long term: Microsoft may finally have found something more promising than its two most profitable divisions, Office 365 and Xbox gaming. Almost everything else it ships is a colossal failure.
Consider this. Windows became “dominant” only because of restrictive licensing and extensive FUD campaigns against Linux adoption. It is a useless operating system that gets worse with every release. Bing is an also-ran in search, though several other companies “rent” its API (e.g. DuckDuckGo and, I think, Kagi) and add more nuanced layers and context on top of it. I personally stopped using DuckDuckGo because I found myself defaulting to Google more often than not. These days I use a combination of Neeva (which uses the Bing index) and Google, with Kagi for certain specific searches. Neeva also searches through Reddit, which gives some sense of the conversations happening around a search query.
Therefore, GPT for search makes no sense to me personally. However, I am keen to see how the GPT integration with Office 365, which Microsoft calls “Copilot”, shapes up.
Here’s an interesting blog post about an alternative to GPT. I haven’t used it, but it mentions the chips used to train these specific models.
While Cerebras isn’t as capable a model at performing tasks as LLaMA, ChatGPT, or GPT-4, it has one important quality that sets it apart: it’s been released under the Apache 2.0 licence, a fully permissive Open Source license, and the weights are available for anybody to download and try out.
This is different from other models like LLaMA: while their weights are freely available, their license restricts LLaMA’s usage to “non-commercial” use cases such as academic research or personal tinkering.
The part I found most interesting is this:
These new chips are impressive because they use a silicon architecture that hasn’t been deployed in production for AI training before: instead of networking together a bunch of computers that each have a handful of NVIDIA GPUs, Cerebras has “networked” together the chips at the die level.
Let’s see what they come up with. It is good to have some “diversity” in the competition!