There have been several attempts to make AI think like a “human”. A brilliant essay in The Economist attests to this fact. It makes many assumptions and offers a unique historical perspective, but one point that clearly emerges from the piece is that AI was funded and nurtured by defence think-tanks to provide an “alternative reality”.
At RAND, Simon and two colleagues—Allen Newell, a young mathematician, and J. Clifford Shaw, a former insurance actuary—tried to model human problem-solving in terms that a computer could put into operation. To do so, Simon borrowed elements from the framework that he had developed in “Administrative Behaviour”. To make a computer “think” like a human, Simon made it think like a corporation.
The product of the trio’s labour was a virtual machine called the Logic Theorist, heralded as the first working prototype of artificial intelligence. Printouts of the Theorist in operation turned heads at the 1956 Dartmouth Summer Research Project on Artificial Intelligence, which gave the field its name and initial membership. In notes from the Dartmouth conference, one participant wrote that the Theorist helped to solve the dreaded “demo to sponsor” problem. This was essential, because the foundation funding AI was sceptical that the research area was worthwhile.
The rest of the article debates the essay’s social-sciences construct and the fact that we haven’t learned to “tame the corporations”. However, even a casual commentator would understand that power structures and corporations are mutually beneficial to each other. Neither operates in a silo or a power vacuum. The “direction” of AI shouldn’t be decided by corporations, but by public collectives or robust debate in people’s forums. End users should clearly understand the harms and effects of emerging technologies.
As such, we need to demystify AI, stripping away the neon-inspired robots and buttons, and explain that, like any mathematical theorem, it is only a system of dos and don’ts in a flow chart. Only then will it be empowering.