GPT-3 primer by the Financial Times:
GPT-3, which stands for Generative Pre-trained Transformer, version three, is, in essence, a super-sophisticated auto-complete function, which sounds less than exciting.
But what makes GPT-3 remarkable is its scale and flexibility and the possibilities for future development.
Drawing on hundreds of billions of words ingested from the internet and using neural network technology similar to that used by Google DeepMind’s AlphaGo, GPT-3 was trained to spot and then replicate sophisticated patterns. GPT-3 contains 175bn language parameters, more than 10 times the next biggest equivalent model. (“The astonishingly good but predictably bad AI program” | Financial Times)
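To make the "super-sophisticated auto-complete" framing concrete, here is a deliberately toy sketch of next-word prediction: a bigram model that counts which word follows which in a tiny corpus and suggests the most frequent successor. This is an illustrative assumption of the general idea only; GPT-3 learns such patterns with a 175bn-parameter transformer over hundreds of billions of words, not raw counts.

```python
from collections import Counter, defaultdict

# Toy "auto-complete": count which word follows which in a tiny corpus,
# then predict the most frequent successor of a given word.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def autocomplete(word):
    """Return the most frequently observed next word after `word`, or None."""
    counts = successors.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(autocomplete("on"))  # → "the"
```

The leap from this sketch to GPT-3 is one of scale and representation, not of task: both are trained to continue text by replicating patterns seen during training.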
The possibilities are endless. One way GPT-3 can find its utility is in “patient support”, although it would require a lot of training to understand the context. The author also cautions about the “dark uses”, such as mounting disinformation campaigns at scale; ethical objections alone, however, miss the point completely.
It is relatively easy to set up websites at scale and to automate the process, with tiny variations in text (and context). It may be easier to dismiss this “advance” as “silly”, but how can you stop the model from iterating further?