Two of the most common pieces of bioethical advice are: "The ethics always lags behind the science" and "Draft guidelines before it's too late." The latest invocation of these nuggets of wisdom comes in an article about low-level artificial intelligence.
Writing in Aeon, philosophers John Basl, of Northeastern University, and Eric Schwitzgebel, of the University of California, Riverside, warn that we are in danger of following in the footsteps of Nazi doctors if we ignore the ethical dangers of mistreating robots.
Most discussions of AI rights (or robot rights) concern humanoid robots. But the authors point out that most robots will have a much lower level of consciousness.
As they put it: "You might think that AIs don't deserve that sort of ethical protection unless they are conscious – that is, unless they have a genuine stream of experience, with real joy and suffering. We agree. But now we face a tricky philosophical question: how will we know when we have created something capable of joy and suffering? If the AI is like Data or Dolores, it can complain and defend itself, initiating a discussion of its rights. But if the AI is inarticulate, like a mouse or a dog, or if it is for some other reason unable to communicate its inner life to us, it might have no way to report that it is suffering."
In other words, "we might soon be creating many subhuman AIs who will deserve ethical protection".
The way forward? Write a report.
They conclude: "On most mainstream theories of consciousness, we are not yet creating AI with conscious experiences meriting ethical consideration. But we might – possibly soon – cross that crucial ethical line. We should be prepared for this."
Michael Cook is editor of BioEdge