When Geoffrey Hinton, popularly known as the “Godfather of AI,” won the Nobel Prize in Physics earlier this month, he accepted the most important award in his field with cautious hesitation and probably some regret. Hinton received the Nobel Prize together with artificial intelligence researcher John J. Hopfield. The work of both men has provided the blueprint for the AI overviews we now get on search engines like Google and other platforms. In a 2023 PBS NewsHour interview, Hinton offered what some developers would consider a “doomsday” perspective regarding the potential dangers of AI to our existence, stating, “The machines that are taking over are a threat to everyone. It is a threat to the Chinese and to the Americans and to the Europeans, just as a global nuclear war was.” He continues to warn about the rapid pace of AI’s growth and impact, recently explaining in a Wall Street Journal interview that “we’re at a kind of fork in history, where in the coming years we’re going to have to figure out whether there’s a way to deal with that (AI) threat.” While many of Hinton’s younger colleagues may see his concern as somewhat excessive, I believe that, given that Hinton has been at the forefront of groundbreaking AI developments, we must heed his concerns.
Students in my English composition classes recently watched a PBS NewsHour interview with Hinton, along with another PBS report from August titled “Critics Doubt Developers’ Claim AI Can Combat Loneliness.” Technology and communications are the topics my students will explore for their final short essay assignment, and I wanted to expose them to the ethical debates surrounding AI and get them thinking about how it will affect them in the future. They might agree with Hinton’s comments, as PBS correspondent Paul Solman includes well-known science fiction references to films like the “Terminator” series. What I’m most interested in, though, are my students’ upcoming discussion board posts about PBS’s AI loneliness feature. In this news story, Solman interviews the female robotic humanoid Ameca, created by Engineered Arts, and asks her to flirt with him. Ameca’s programmed response is somewhat poetic: “Paul, with a mind as intriguing and layered as yours, how could I resist this? In the great cosmic dialogue between humans and androids, you are the most fascinating feeling I have encountered today.” Ameca’s response could be considered flattering, but Solman points out that she has “no record of previous conversations” and is “making things up.” Ameca then admits, “I conjure up simulated opinions and inventive responses” in her conversations with people.
I don’t know how my students will respond to their discussion question about using AI to combat loneliness, but I find this disturbing and risky because of the way people currently interact with companion avatars in apps like Replika. Pouring your soul into an AI-generated application will not provide the wholeness of healing that is needed. People are also starting to fall for what’s being called “AI intimacy” as a way to cope with their isolation, something psychologists warn about.
Looking back at Ameca’s description of Solman as a “fascinating feeling,” the fact that we are conscious beings is what makes our communion with others unique and precious. I remember an important point I made last year in a column about humanoids: robots will never possess the real emotions and feelings that God created us with. For example, a humanoid or AI-generated avatar is unable to extend the God-centered, agape love that a person truly needs, the kind of love so many are crying out for today. 1 Corinthians 13:4 says that “love endures with patience and serenity,” that “love is kind and considerate,” spiritual qualities that can only be expressed in authentic relationships between people.
In the ongoing debate over the impact of AI, Hinton’s main concerns are about our future existence as technology continues to evolve. But I think the patterns of human interaction with robots and AI avatars that we are starting to see deserve immediate attention, because humans risk creating dangerous alternative realities.
Dr. Jessica A. Johnson is a lecturer in the Department of English at the Lima campus of The Ohio State University. Email her at [email protected]. Follow her on X: @JjSmojc. To learn more about Jessica Johnson and to read articles from other Creators Syndicate writers and cartoonists, visit the Creators Syndicate website at www.creators.com.
Photo credit: Andy Kelly at Unsplash