Or what we need to talk about when we talk about AI
Ay Yai Yai! Everyone is talking about AI! Everyone: from college students pulling all-nighters, to artists raging against its theft, to artists lauding its potential, to children and their grandparents, to entrepreneurs and engineers, to doctors, academics and government officials. Everyone, for a variety of reasons and from a motley assortment of perspectives. There’s talk about the good, the bad and the ugly, the real and the imagined, the people behind it and those at the forefront; the bias that feeds it and the ethics that might rein it in; the existential risk it poses to humanity, alongside the medical miracles it foretells. There are investigations into who’s doing what, where and to what end; and feverish assessments of the race and who’s winning it. There are conversations concerning its regulation, weaponization and hallucination. There is, in other words, something for everyone.
But you know what there isn’t? A clear, universally accepted definition of AI. That’s nowhere. I looked. Everywhere. Several organizations refer to it as a kind of system. For example: it’s a machine-based system according to the US Department of State and the OECD, a computerized system according to Congress, an artificial system per Cornell Law School, and highly autonomous systems per OpenAI. Nuance. And then there’s Amazon, which terms AI “the field of computer science dedicated to solving cognitive problems commonly associated with human intelligence, such as learning, problem solving, and pattern recognition,” while IBM agrees it’s a field, but one which combines computer science and robust datasets. While you’ll have to read carefully to find IBM’s definition, Microsoft tells you straight up: it’s a capability. Google believes it’s a set of technologies. The EU? A family of technologies. More nuance. And let’s not forget China, where it’s a strategic technology. Meta? The company formerly known as Facebook has used various definitions over the years, enough to confuse ChatGPT into providing a definition and then retracting it. Here’s a telling snippet of our exchange: