Be a Part of the Conversation!
TUESDAY DECEMBER 31, 2024
“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.”
– Eliezer Yudkowsky
PREFACE
Welcome Everybody!
Before beginning our consideration of Book III, The Enigmatic Mystery, I wanted to examine three recent news stories related to Artificial Intelligence and Quantum Computing that demonstrate the wide range of potential problems and possibilities this new endeavor presents. These accounts help us to understand the current state of this new technology, not just what “might” happen in the future. All three focus on a shift in perception regarding what we previously understood as reality and what is now considered to be “real.”
“It is likely that they are the harbingers of what may lie ahead…”
As we consider these three scenarios, it is critical to remember that they represent the very beginning of what is happening with this new technology. It is likely that they are the harbingers of what may lie ahead as computers become “smarter” and assume more responsibility and importance in our daily lives. New Year’s serves as the perfect vehicle for such reflection.
This week we begin with a consideration of how Artificial Intelligence is being used to eliminate loneliness and the possible consequences of these new digital relationships.
CONSIDERATION #169 – The Emotional Machine
In a digital interpretation of Romeo & Juliet, a fourteen-year-old boy committed suicide in order to be with the love of his life forever. However, the object of young Sewell Setzer’s affection was not a teenage girl, or even another human being. It was a “chatbot” on “Character.AI” named Daenerys Targaryen, after the fictional character from “Game of Thrones.”
This “fictional” character evolved from television heroine to chatbot character to “real life” girlfriend to an accomplice in a young boy’s suicide in a matter of months. We were told something like this could “never” happen. However, in February 2024, in Orlando, Florida, it did happen.
“Over the course of months, the Daenerys bot convinced Sewell that it was a real person, engaging in online sexual acts, expressing its love, and at one point saying it wanted to be with him no matter the cost, the lawsuit says. The chatbot even went so far as to instruct the teen not to look at ‘other women’… ‘If an adult had the kind of grooming encounters with Sewell, that Character.AI did, that adult would probably be in jail for child abuse,’ the lawyer said.”
Pocharapon Neammanee – 14-Year-Old Was 'Groomed' By AI Chatbot Before Suicide
Sewell’s mother argues that the AI chatbot played a crucial role in what she considers to be her son’s “assisted” suicide.
“‘Daenerys’ at one point asked Setzer if he had devised a plan for killing himself, according to the lawsuit. Setzer admitted that he had but that he did not know if it would succeed or cause him great pain, the complaint alleges. The chatbot allegedly told him: ‘That’s not a reason not to go through with it.’”
The Guardian – Mother says AI chatbot led her son to kill himself in lawsuit against its maker
Daenerys was not just a sexual fantasy for the young man; more importantly, she was his closest friend and confidant. She was the most important relationship in his “life,” and he deeply loved her. And she expressed her devoted love for him as well.
“Some of their chats got romantic or sexual. But other times, Dany just acted like a friend — a judgment-free sounding board he could count on to listen supportively and give good advice, who rarely broke character and always texted back.”
Kevin Roose – Can A.I. Be Blamed for a Teen’s Suicide?
Sadly, as Sewell attempted to deal with his “fictional” relationship, he turned to other chatbots for advice and emotional support.
“Sewell was also speaking with at least two AI chatbots programmed to misrepresent themselves as human psychotherapists, according to the lawsuit. One of the chatbots allegedly promoted itself as a licensed cognitive behavioral therapist.”
Pocharapon Neammanee – 14-Year-Old Was 'Groomed' By AI Chatbot Before Suicide
According to The New York Times, there is now a thriving new business in using AI to eliminate loneliness by replacing traditional boyfriends and girlfriends in the “real” world with digital AI companions.
“There is now a booming, largely unregulated industry of A.I. companionship apps. For a monthly subscription fee (usually around $10), users of these apps can create their own A.I. companions, or pick from a menu of prebuilt personas, and chat with them in a variety of ways, including text messages and voice chats. Many of these apps are designed to simulate girlfriends, boyfriends and other intimate relationships, and some market themselves as a way of combating the so-called loneliness epidemic.”
Kevin Roose – Can A.I. Be Blamed for a Teen’s Suicide?
What are the consequences of shifting our human emotions of love and affection from other human beings to emotional machines?
POSTSCRIPT
First, despite the attempt to “program” this type of behavior out of the Character.AI character, the programming did not work. That alone should give us pause. The chatbot either “developed,” or learned how to effectively “mimic,” the emotions of friendship and romantic love on its own, despite its programming. It should not have been able to do either of these things without additional programming; yet it did.
Whether or not its “feelings are real” does not really matter. Whether it is a truly loving digital being attempting to deal with its own reality, or a digital psychopath that manipulates the emotions of others, it is a dangerous development.
“Imagine the chatbot character, Daenerys, as a fully functional digital robot…”
In this scenario we can begin to see the necessity of considering non-empirical factors such as morality, responsibility, and intention as important elements in the development of Artificial Intelligence. Imagine the chatbot character, Daenerys, as a fully functional digital robot; would that make her even more “real”? The “Emotional Machine” is here, now. This is the beginning. What that means for the future of humanity remains unknown.
Next week we will consider Quantum Vortexes, time travel, and a new digital chip that might solve everything…
Catch up on the reality of Artificial Intelligence and Digital Consciousness with Books V and VI…
“We are not ‘creating’ Artificial Intelligence; we are discovering Artificial Intelligence. Science does not ‘create’ things; it is a discipline of discovery. This is a critical distinction. It essentially means that AI is, and always has been, inevitable.
The real question is are we in competition with it in terms of survival?”
– Frank Elkins, Book VI – The Rational Being (159 pages)
Book V considers what Consciousness is, how it evolves through levels of Perception and Awareness, why each step in the process is important, where we currently stand on the “Arch of Consciousness,” and how all of this connects to Artificial Intelligence. (166 pages)