This is an interesting offshoot. I’ll explain why this is a problem a little later. First off, from a Substack newsletter:
But then, as I kept flipping the pages, things started to get weird. Midway through the book, the narrative seamlessly shifted from talking about NFTs to discussing “NTF”, apparently an acronym for “Net Price Calculator” (?!) and a part of a supply-chain entity called the Strategic Development Group. From the book:
There’s a distinct possibility that these books are churned out by AI (using GPT-3):
With GPT-3, we now have an infinitely-scalable technology that is years away from being able to enrich our lives, but is already more than capable of drowning out all remnants of authentic content on the internet. And because you can leverage this to earn money or sway opinions, that outcome is probably hard to avoid.
And then, I had an epiphany: I was probably looking at the output of an ML-based language model, such as GPT-3. The models have a remarkably good command of a variety of niche topics, but lack higher-order critical thought. They are prone to vivid confabulation, occasionally spew out self-contradictory paragraphs, and often drift off-topic – especially when tasked with generating longer runs of text.
I can well imagine academic papers being produced through the same process! It won’t be long before we start finding them.