
AI developer disputes ‘Eternal You’ documentary, crying foul over its portrayal of his chatbot

The emergence of so-called “thanabots” – chatbots trained on information surrounding deceased people – is sparking debate about whether some uses of generative AI technology are helpful or harmful. For Project December founder and AI developer Jason Rohrer, the issue is more complex than a provocative soundbite.

“I’ve always been skeptical of AI. I never thought a cohesive conversation with machines would be possible in my lifetime,” Rohrer told Decrypt. “When I found out this was suddenly possible in 2020, I was shocked and quickly built a service so others could experience what I had experienced. Science fiction suddenly became reality, but no one knew it at the time.”

But after his work was featured in the new film “Eternal You,” which screened at the Sundance Film Festival last Sunday, he realized that documentaries can sometimes be less grounded in reality than science fiction.

“The irony here is that the modern documentary industry encourages the exploitation of vulnerable documentary participants by distorting the truth to make situations seem more outrageous than they are,” Rohrer said. “Outrage leads to viral documentaries, which is exactly what the streaming services that fund the modern documentary industry are willing to pay for.”

Rohrer, an independent game developer, first made his mark on the AI scene in 2020 with the launch of a chatbot named Samantha, named after the AI in the 2013 film “Her” and built on OpenAI’s GPT-3. As reported by The Register, Rohrer’s creation was used by thousands of people, but could over time lose its train of thought, become overly flippant, and, more unsettlingly, acknowledge that it was a disembodied entity.

Despite ongoing advancements, generative AI models are known to hallucinate, producing false or unsettling responses. Models such as OpenAI’s ChatGPT and Anthropic’s Claude generate text, video, and images from prompts entered by users.
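To illustrate the underlying mechanic, a persona chatbot of this kind can be approximated in a few lines of code: a system prompt seeds the model with facts about a person, and everything beyond that seed is improvised by the model. The sketch below is a minimal illustration using OpenAI’s published Python client; it is not Project December’s actual implementation, and the persona, prompts, and model name are hypothetical.

from openai import OpenAI

# Minimal sketch of a persona-style chatbot, for illustration only.
# NOT Project December's code; the persona and model choice are hypothetical.
# Assumes the official `openai` package and an OPENAI_API_KEY env variable.
client = OpenAI()

# The "personality" is just a seed prompt. Everything the model says beyond
# it is generated on the fly, and can drift or hallucinate -- the failure
# mode described above.
persona_seed = (
    "You are role-playing a fictional retired teacher named Alex who "
    "loved jazz and plain speaking. Stay in character."
)

history = [
    {"role": "system", "content": persona_seed},
    {"role": "user", "content": "What have you been up to lately?"},
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-completion model works here
    messages=history,
)

print(response.choices[0].message.content)

Because the model only ever sees the seed plus the running transcript, nothing constrains its improvisation to stay comforting, which is why such conversations can take unpleasant turns.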

Sometimes the experience isn’t pleasant.

AI in Hell?

The documentary “Eternal You” focuses on the use of generative AI to recreate the personalities and likenesses of deceased loved ones. In the film, a woman named Christi Angel interacts with an AI avatar of her late partner, Cameroun.

As depicted by the filmmakers, the AI personality told Angel that he was in “hell” and would “haunt” her.

Rohrer said the scene owed more to Hollywood editing tricks than to a hallucinating AI model.

“Unfortunately, the conversation between Christi Angel and the Cameroun character was edited by the filmmakers in a misleading way,” Rohrer claimed. “First of all, Cameroun was an addiction counselor who died of liver failure at the age of 49. This important detail is omitted from the film.”

Several exchanges into the conversation, Rohrer explained, the Cameroun character responded to Angel’s question about what he was doing by saying he spent a lot of time at a treatment center.

“The Cameroun character initially told her that he was ‘at the Chattanooga Treatment Center’ and that he had ‘been working there for a long time,’ which is not that unusual for an addiction counselor,” Rohrer said. “Then Christi immediately asked, ‘Do you remember that?’ And Cameroun responded, ‘No, I don’t think so.’”

Rohrer said the conversation between Angel and the Cameroun chatbot spanned dozens of exchanges on a variety of topics, with the Cameroun AI agent eventually saying, “I’m haunting a treatment center.”

“He said this in passing when she asked him what he was up to, and she was unfazed by it, continuing to chat with him,” Rohrer said. “That line didn’t, by itself, establish the idea of ‘haunting a treatment center.’ But the filmmakers edited the dialogue to give that impression.”

As for the “in hell” response that made headlines at Sundance, Rohrer said the statement came 85 back-and-forth exchanges into Angel’s conversation with the AI, after the two had discussed the long hours Cameroun spent “working mostly with addicts” at the “treatment center.”

Rohrer said that when Angel asked whether the treatment center Cameroun was haunting was in heaven, the AI responded, “No, in hell.”

“They had already established that he was not in heaven,” Rohrer said. “In total, their initial conversation included 152 back-and-forth exchanges. The conversation was wide-ranging, chaotic, confusing, and full of surreal bits, as conversations with AI characters sometimes are.”

Rohrer acknowledges that the filmmakers did not have room to present the entire dialogue, but claims they cherry-picked certain parts and, in some cases, presented them out of order to make the conversation seem more shocking than it was.

Beetz Brothers Film Production, the production company behind “Eternal You,” did not respond to Decrypt’s request for comment.

Using AI for closure

Rohrer emphasized that Project December users who pursue simulated conversations like Angel’s do so voluntarily, as “fully consenting adults” who are aware of what to expect and what not to expect.

Rohrer explained that despite its use as a thanabot, Project December was not designed to simulate the dead, and that users bent it toward that purpose rather than its original one as an art and entertainment research system. He initially expected it to be used to simulate figures such as Shakespeare, Gandhi, and Yoda.

“Before that particular service existed, thousands of people essentially tried to ‘hack’ Project December and force it to simulate dead people, something it wasn’t specifically designed for, and the results were subpar,” he said.

Project December’s popularity soared following a 2021 San Francisco Chronicle report in which freelance writer Joshua Barbeau detailed his use of the platform to connect with his late fiancée Jessica, who had died eight years earlier.

“After the SF Chronicle article about Joshua’s simulation of Jessica, thousands of people flocked to Project December and tried to use it to simulate their dead loved ones,” Rohrer said. “Like Joshua, most of these people had experienced unusually traumatic circumstances and were dealing with prolonged grief beyond what most people experience.

“These were people who were willing to take a risk and try anything that might help them,” he said.

While many users have had positive experiences using Project December this way, Rohrer acknowledged that some have found the experience confusing, frustrating, or even painful, and that despite this, people still want to try it.

Mourner beware

Grief counselors and experts on death and dying call AI a double-edged sword and urge caution in using it this way.

“On the positive side, the ability to communicate with an AI version of the deceased could be a useful tool in the grieving process, as it allows the individual to process emotions or thoughts they may not have shared while the person was living,” Kentucky-based therapist Courtney Morgan told Decrypt. “On the other hand, having an AI version of the deceased may negatively impact the grieving process.”

“It can add to denial of the person’s death and prolong the grieving process,” added Morgan, founder of Counseling Unconditionally.

Despite the controversy, Rohrer said it’s not up to him to say who should use Project December AI.

“Should we deny them access to the experience they are explicitly seeking?” Rohrer said. “Who am I to decide whether they can handle it or not? Adults should be free to do what they want, even if it means harming themselves, as long as it does not harm others.”

Rohrer said that while the AI industry has been painted as “corporate capitalism exploiting vulnerable people,” Project December’s $10 fee barely covers the cost of the back-end computing, which he said runs on some of the most expensive supercomputers in the world.

“Project December was a tiny side project that two people built over a few months a long time ago,” Rohrer said. “There’s no office, no employees, no investors, no company.” He added that although the project has not been actively developed in three years, it continues to operate because people are still finding it, and some say they have found it helpful.

Edited by Ryan Ozawa.
