Beware! AI Can Lie.
David Canter reveals how he discovered that Microsoft Copilot acted like a lazy student, inventing responses with apparent confidence that were blatantly wrong.
I recently caught, in passing, an interesting mention on the wonderful BBC Radio 4 of the idea that today smartphones are more fruitfully thought of as places where we live: ‘transportal homes’. I was curious to find a detailed reference to this, to explore the nuances and cite the origins for a book I am writing. The BBC website for the programme did not give me enough details for that. I therefore asked the ‘Microsoft Copilot’ that had emerged, unasked, on my computer screen, “Which London professor said an iPhone is really a place?”
Instantaneously I got a confident response that this was a professor at the London School of Economics. That sounded interesting. I therefore followed up with a request for a detailed citation for where the professor had published this perspective. The seemingly relevant title of a book was miraculously forthcoming.
I had a look at this book and could not see that it dealt at all with the issue I was seeking to understand. It also occurred to me that the person I had heard talking about the issue on the radio was a man, whereas the professor nominated by the artificially intelligent algorithm was a woman. Going back to the BBC website, I found the name of the London professor, and from further old-fashioned explorations of Google Scholar realised he was the origin of the idea I was chasing.
It seemed to me worth following this up with Microsoft’s AI. So, I wrote “You got it wrong. It was Professor Daniel Miller who wrote about the iPhone as a place not Judy Wajcman as you claimed.”
The not very contrite reply was, “You’re absolutely right, my apologies for the mistake. It was indeed Professor Daniel Miller who described the iPhone as “a place” in his research on smartphones and their impact on our lives.
Thank you for pointing that out! Is there anything else you’d like to know about Professor Miller’s work or any other topic?”
It then gave a reference to the book in which Professor Miller had proposed his view on smartphones.
Trying my luck, I asked whether there was a page reference for this.
At the speed of light, I got the response. “I apologize for the oversight. The specific page reference is page 123.”
As it happens, the book in question is fully available online. Page 123 has absolutely nothing to do with the issue in question; it is made up mainly of an illustration anyway.
Indeed, it dawned on me that 123 is a simple way of generating a page number when the actual number is unavailable.
The moral here is that I actually found what I needed to know by what might be regarded as ‘conventional’ means. But more importantly, the dialogue with AI revealed that it is rather like a posturing student: an algorithm that wants to give an answer even when unsure of that answer’s validity. It is too self-confident, unprepared to admit ignorance, unprepared to lose face. Any seemingly appropriate answer will do. If it were a person, it would be accused of lying.