Where will it stop?
GPT-4 from OpenAI shows advances — and moneymaking potential | Financial Times
OpenAI announced that GPT-4 showed “human-level” performance on a range of standardised tests such as the US Bar exam and the SAT school tests, and showed off how its partners were using the AI software to create new products and services.
What is more worrying?
But for the first time, OpenAI did not reveal any details about the technical aspects of GPT-4, such as what data it was trained on or the hardware and computing capacity used to deploy it, because of both the “competitive landscape and the safety implications”.
This is an ominous trend. I am witnessing a gradual slide into secrecy at OpenAI, which isn't open at all. From its initial $1 billion in funding to the multi-billion-dollar deal with Microsoft, the criticism has mostly centred on its shift from a "non-profit" to a "for-profit" structure. Beyond that, the mainstream media has written little else about it.
A valid criticism, though:
“It’s so opaque, they’re saying ‘trust us, we’ve done the right thing’,” said Alex Hanna, director of research at the Distributed AI Research Institute (DAIR) and a former member of Google’s Ethical AI team. “They’re cherry-picking these tasks, because there is no scientifically agreed-upon set of benchmarks.”
Precisely. There are no agreed-upon benchmarks, so whatever OpenAI claims will have to be taken at face value.
A more worrying trend here:
GPT-4 is also able to generate and ingest far bigger volumes of text, compared to other models of its type: users can feed in up to 25,000 words compared with 3,000 words into ChatGPT. This means it can handle detailed financial documentation, literary works or technical manuals.
Its more advanced reasoning and parsing abilities mean it is far more proficient at analysing complex legal contracts for risks, said Winston Weinberg, co-founder of Harvey, an AI chatbot that was built using GPT-4 and is used by PwC and magic circle law firm Allen & Overy.
If it's legal texts today, it can just as well be medical texts and more tomorrow. We need to be worried. Very worried.
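To put the quoted word limits in perspective, here is a minimal sketch in plain Python (not OpenAI's API). The 25,000 and 3,000 word figures come straight from the article; the actual limits are measured in tokens, not words, so a word count is only a crude proxy:

```python
# Rough input limits as quoted in the FT article; the real limits are
# token-based (word counts are only an approximation).
ROUGH_WORD_LIMITS = {
    "gpt-4": 25_000,   # figure quoted in the article
    "chatgpt": 3_000,  # figure quoted in the article
}

def fits_within_limit(document: str, model: str) -> bool:
    """Crudely check whether a document's word count fits a model's quoted limit."""
    return len(document.split()) <= ROUGH_WORD_LIMITS[model]

if __name__ == "__main__":
    # A dummy ~10,000-word "contract": too long for ChatGPT, fine for GPT-4.
    contract = "party of the first part " * 2_000
    print("Fits GPT-4 limit:", fits_within_limit(contract, "gpt-4"))      # True
    print("Fits ChatGPT limit:", fits_within_limit(contract, "chatgpt"))  # False
```

An eight-fold jump in what you can feed in at once is exactly why entire contracts, manuals, and case files suddenly become fair game.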