Talking to the ChatGPT AI chatbot in its new voice mode, in which the chatbot's voice is practically indistinguishable from a human one, may lead users to become emotionally dependent on interacting with the AI, CNN reports, citing an OpenAI report.

Image source: Andrew Neel/unsplash.com

On Thursday, OpenAI published a safety report on the use of Advanced Voice Mode (AVM) in ChatGPT, a feature that has so far been rolled out to a small number of ChatGPT Plus subscribers. According to the report, the chatbot's voice in AVM sounds very realistic: it responds in real time, can adapt to interruptions, and reproduces the sounds people make during conversation, such as chuckles or grunts. It can also judge the emotional state of the person it is talking to from the tone of their voice.

After OpenAI announced this feature for its multimodal generative AI model GPT-4o, it quickly drew comparisons to the AI digital assistant from the 2013 film “Her,” with whom the protagonist falls in love.

OpenAI appears concerned that this fictional story has come close to reality: the company observed users speaking to ChatGPT in voice mode in language that “expresses shared bonds” with the tool.

As a result, “users can form social relationships with AI, reducing their need for human interaction—potentially benefiting lonely people, but possibly affecting healthy relationships,” OpenAI noted. The report also found that hearing information delivered in a human-sounding voice may lead users to trust the bot more than they should, given AI’s propensity for errors.

OpenAI also said that the way users interact with ChatGPT in voice mode may, over time, influence what is considered normal in social interactions. “Our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions,” the company said in the report.

At the same time, OpenAI emphasized that it strives to build AI “safely” and plans to continue studying the potential for users’ “emotional reliance” on its AI offerings.
