I have seen a flurry of news (and blog posts) around this “transformational AI”, and generative AI is having its moment in the sun. Large Language Models are not new; what is new is the hype.
This is exactly my reaction:

ChatGPT – The Revolutionary Bullshit Parrot
ChatGPT has been attributed supernatural abilities and proclaimed an almost-AGI, or at least the tool that will take our jobs. And what is disturbing: this AGI wouldn’t make our lives easier by taking over the physical, repetitive tasks in factories, mines, and warehouses, which until now was almost always the goal of successive “industrial revolutions”. In all of this absurdity, people are trying to push it into the creative and liberal professions: doctors, teachers, programmers, artists, etc.
An earlier attempt at this LLM concept:
Word2Vec, with its famous (but probably cherry-picked) word arithmetic, King – Man + Woman = Queen, took this domain out of the hands of linguistic feature engineers and made it a deep learning problem. However, even then, the vector arithmetic (without explicit disentanglement) was an overreach, especially given that rare words (of which highly inflected languages have a great many) tended instead to form clusters of their own [FRAGE].
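For readers unfamiliar with how such “word arithmetic” is actually evaluated, here is a minimal sketch. The vectors below are tiny, hand-made toys (not a trained Word2Vec model), so the result is illustrative only; real embeddings have hundreds of dimensions, and real evaluations exclude the query words from the candidates — which is part of why the famous examples are arguably cherry-picked.

```python
import numpy as np

# Toy, hand-crafted 4-dimensional "embeddings" -- purely illustrative,
# not real Word2Vec vectors. Dimensions loosely encode
# (royalty, maleness, femaleness, commonness).
vocab = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "man":   np.array([0.1, 0.9, 0.1, 0.8]),
    "woman": np.array([0.1, 0.1, 0.9, 0.8]),
    "apple": np.array([0.0, 0.1, 0.1, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Return the vocabulary word closest (by cosine) to vec(a) - vec(b) + vec(c),
    excluding the three query words themselves -- the standard evaluation trick,
    without which the nearest neighbour is often just one of the inputs."""
    target = vocab[a] - vocab[b] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("king", "man", "woman"))  # -> queen
```

Note how much the outcome depends on the setup: with a five-word toy vocabulary the “right” answer is nearly guaranteed, whereas in a full vocabulary the rare-word clustering effect mentioned above can pull the nearest neighbour somewhere unexpected.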
The practical considerations:
From the scientific point of view, ChatGPT fits into another dangerous trend: a race among the largest players over who can train the bigger model. I remember times when there was at least some consensus that overly large models do not generalize well but rather tend to memorize their training input. With OpenAI’s legendary “openness” (neither open source nor transparent), it is very hard to say whether the model was fed such a large amount of training data that it effectively learned to memorize the “whole internet”. And it learned to use this memory quite nicely.
Associated costs:
It’s estimated that one search with ChatGPT costs X times more than the “usual” googling. This is probably why Bing provides information in condensed form with references to more than one site – if only the references actually led to the claimed information.
No definitive numbers are available.
Interestingly, I posed the question to ChatGPT itself, which refuses to give specifics about its own operation.

ChatGPT’s answer in detail:
As an AI language model, I am not aware of the specific server infrastructure that is used to run me, nor do I have access to financial or operational data related to my maintenance and operation. However, I can provide you with a general idea of the costs associated with running a large language model such as myself, as well as some factors that can affect the carbon footprint of such operations.
The cost of running a language model such as ChatGPT can vary widely depending on factors such as the size of the model, the amount of data it processes, and the computational resources required to train and maintain it. Some estimates suggest that running a model like GPT-3 can cost millions of dollars per year in server and cloud computing costs alone.
In terms of server costs, a large language model like ChatGPT requires significant computational resources to run. This can include high-performance CPUs and GPUs, as well as large amounts of memory and storage capacity. Additionally, language models like ChatGPT require significant amounts of electricity to run, which can contribute to their carbon footprint.
The carbon footprint of running a language model such as ChatGPT can depend on a variety of factors, including the type of energy used to power the servers, the efficiency of the servers themselves, and the location of the data centers where the servers are housed. Some estimates suggest that the carbon footprint of running a single AI model can be equivalent to the emissions of several cars per year, although the actual impact can vary widely depending on the specific circumstances.
In conclusion, while I cannot provide exact figures for the cost and carbon footprint of running ChatGPT, it is clear that running a large language model requires significant computational resources and can contribute to carbon emissions. As awareness of the environmental impact of AI continues to grow, efforts are being made to develop more energy-efficient models and to explore ways to reduce the carbon footprint of AI operations overall.
It’s a scary situation nonetheless.