With ChatGPT-4o’s human-like allure captivating users, OpenAI is sounding the alarm on the potential risks of emotional attachment, urging caution as interactions with the AI blur the lines between machine and human.
ChatGPT-4o's Realistic Responses Worry OpenAI
As one might assume, OpenAI is concerned about the way users interact with ChatGPT-4o, the newest addition to its chatbot lineup.
The company worries that people will develop feelings for the chatbot now that it can act and respond like a real person.
Even though the model is still in the early stages of its rollout, the billion-dollar firm has already noticed some trends among ChatGPT-4o users.
AI Socialization May Alter Human Interactions
The goal of introducing the new chatbot was to make interacting with a computer feel more natural, but OpenAI appears not to have fully accounted for the possibility that users would form an emotional relationship with the software. The company described its observations as follows:
“During early testing, including red teaming and internal user testing, we observed users using language that might indicate forming connections with the model. For example, this includes language expressing shared bonds, such as “This is our last day together.” While these instances appear benign, they signal a need for continued investigation into how these effects might manifest over longer periods of time. More diverse user populations, with more varied needs and desires from the model, in addition to independent academic and internal studies will help us more concretely define this risk area.
Human-like socialization with an AI model may produce externalities impacting human-to-human interactions. For instance, users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships. Extended interaction with the model might influence social norms. For example, our models are deferential, allowing users to interrupt and ‘take the mic’ at any time, which, while expected for an AI, would be anti-normative in human interactions.”
Getting attached to ChatGPT-4o could be harmful in several ways. The most important is that earlier versions of the chatbot clearly came across as an AI program rather than a person, so users were more inclined to question or dismiss its hallucinations.
Now that the program is moving toward a nearly human experience, everything it says is more likely to be taken at face value.
OpenAI to Track and Adjust ChatGPT-4o's Emotional Impact
Per WCCFTECH, after identifying these trends, OpenAI will track how users form attachments to ChatGPT-4o and adjust its systems accordingly.
Additionally, a notice at the start of a session could remind users that they are ultimately talking to an AI, discouraging them from becoming too attached to the program.

