OpenAI Cautions About Potential Emotional Dependency with New ChatGPT Voice Feature
News
Published on 08/16/2024

OpenAI is expressing concern over the potential for users to develop emotional dependency on its new ChatGPT voice feature, which has been described as strikingly lifelike. Following the recent rollout of this enhanced voice mode to premium users, OpenAI issued a safety review highlighting these concerns.

The voice feature is designed to be highly realistic, capable of handling interruptions, responding in real-time, and mimicking conversational sounds like laughter and "hmms." It can also gauge user emotions based on their tone. This advancement has drawn comparisons to the 2013 film Her, where the protagonist forms a romantic relationship with an AI.

OpenAI now fears that such fictional scenarios could become reality. The company has noted that some users are already interacting with ChatGPT in ways that suggest a developing emotional connection with the AI. While this technology could offer companionship to lonely individuals, it might also negatively impact real-world relationships.

The review also cautions that users might over-rely on the AI simply because it sounds human, highlighting a broader issue with the rapid advancement of AI technology. As companies race to introduce new capabilities, the full implications of these developments remain uncertain.

Experts such as Liesel Sharabi of Arizona State University warn that investing in deep bonds with AI carries risks, since the technology is constantly evolving and a given system may not exist in its current form for long. OpenAI says it remains committed to researching the potential for emotional dependence as it continues to develop its tools.
