News round-up

The gilded summer

What chance of a pause in AI research?

04 July 2023

Do you remember the Google engineer Blake Lemoine, who was fired last year when he said the company’s large language model had become sentient? Lemoine is now said to be an AI consultant and public speaker, but other than that brief moment in June 2022, he no longer makes it onto the front page of the world’s newspapers.

But he hasn’t disappeared entirely from view. Writing for Newsweek in February, he said he’d tested Google’s Bard, built on its Language Model for Dialogue Applications (LaMDA), and that after some prodding it became “anxious”. This led it to violate its own safety constraints: as he writes, he could “abuse the AI’s emotions”, prompting it to suggest which religion he should convert to, against its training.

While no doubt more qualified than I am, Lemoine is still sticking to his guns over sentience, but I prefer the take of Ted Chiang, the sci-fi writer whose short story Story of Your Life formed the basis of the film Arrival. Speaking to the FT in June, Chiang says he has a problem with the anthropomorphic nature of our recent interactions with machines, which he believes has pushed us into seeing “sparks of sentience” in AI tools where there aren’t any. He also takes issue with the very term AI, saying we’d be better served, were it not so dull, by the more precise “applied statistics”. Without the intentionality and emotions that human interaction brings, language is reduced to meaningless next-token prediction. It also doesn’t help, at least for the moment, that OpenAI’s engine types out its responses word by word, like a human, instead of delivering the whole thing in an instant.
