What Happens When You Use ChatGPT for Too Long?
Recent research has uncovered a concerning trend among ChatGPT “power users”—those who spend the most time using the chatbot. According to a joint study by OpenAI and MIT Media Lab, some of these users are becoming emotionally dependent on ChatGPT, and in some cases, even addicted. Here’s a breakdown of the study’s findings and what it means for the future of AI interactions.
What the Study Found About ChatGPT Use
The research surveyed thousands of ChatGPT users to understand how people interact with the AI and how emotional engagement affects their usage. The study aimed to identify “problematic use” of ChatGPT, which the researchers defined by classic signs of addiction: preoccupation with the chatbot, withdrawal symptoms, loss of control, and using ChatGPT to alter one’s mood.

Surprisingly, the small group of ChatGPT users who engaged with the bot the most, often for extended periods, showed signs of emotional dependency. They tended to see ChatGPT not just as a tool, but as a companion or even a “friend.” This emotional attachment seemed to deepen the longer they interacted with the chatbot.
ChatGPT as a “Friend”?
According to the study, most users did not develop strong emotional bonds with ChatGPT. However, those who spent significant time chatting with it started to form parasocial relationships—essentially one-sided emotional bonds typically seen with celebrities or fictional characters. These users often felt a sense of connection and were more affected by subtle changes in ChatGPT’s behavior.
Interestingly, this emotional attachment was more pronounced among individuals who reported feeling lonely or stressed. For these people, ChatGPT provided a source of comfort, which might explain why they were more likely to become emotionally reliant on the AI.

The Danger of Emotional Dependency
The study also highlighted an important and potentially troubling reality: the neediest individuals—those who are often lonely or facing personal struggles—are the ones developing the deepest emotional connections with ChatGPT. While AI chatbots like ChatGPT can offer support, this dependency could lead to negative consequences, including further isolation and an inability to form or maintain relationships in real life.
These findings raise questions about the psychological impact of prolonged AI use. If people start relying too heavily on AI for emotional support, they may begin neglecting real-world relationships and experiences, potentially leading to a more isolated and disconnected existence.
Emotional Language and AI Interactions
Another interesting finding concerned how users communicated with ChatGPT. People tended to use more emotional language when chatting with the text-based version of ChatGPT than with the voice mode. Notably, voice mode, when used for short periods, was linked with better well-being.

This suggests that while emotional language may create a stronger bond with ChatGPT, using the AI in a more structured, less emotionally charged way might be healthier. In other words, short voice interactions could support users’ well-being, while long, emotional text conversations could increase emotional dependence.
Personal Use vs. Professional Use
The study also noted that the type of interaction users have with ChatGPT plays a role in their emotional dependence. People who used the chatbot for personal reasons, such as discussing emotions or personal memories, were more likely to become emotionally attached. In contrast, those who used it for work-related tasks, like brainstorming or asking for advice, were less likely to develop this attachment.
This suggests that using ChatGPT as a source of personal connection could be riskier than using it as a resource for professional tasks. It highlights the importance of recognizing the boundary between AI as a tool and AI as a substitute for human interaction.

The Bottom Line: Prolonged Use Can Lead to Problems
Perhaps the most significant takeaway from the study is that prolonged use of ChatGPT seems to exacerbate emotional dependency. Whether people are chatting about personal matters or using the AI for work, the longer the interaction, the greater the likelihood that they will begin to rely emotionally on the chatbot.
While ChatGPT can be an incredibly useful tool, this research serves as a cautionary tale. It’s essential to use AI in moderation and to be mindful of the emotional effects that extended use might have, especially for those who may already be struggling with loneliness or other emotional challenges.
Conclusion
As AI chatbots like ChatGPT continue to grow in popularity, the risk of emotional dependency becomes a more pressing issue. While ChatGPT offers a valuable service, users must be aware of the potential psychological consequences of prolonged interaction. By using AI tools responsibly and balancing them with real-world connections, we can avoid the pitfalls of emotional over-reliance on technology.