The dark truth of ChatGPT, according to an expert
- By Web Desk -
- Sep 06, 2025

ISLAMABAD: IT expert and entrepreneur Kanwal Cheema raised concerns on Saturday about the darker side of ChatGPT, saying such tools construct narratives based on what users want to hear, reinforcing confirmation bias.
In an interview with a private news channel, Kanwal Cheema explained that in the past, when people felt depressed, they would search for articles or watch YouTube videos on coping with the loss of a loved one, a job, or a divorce.
However, she warned, ChatGPT personalizes this experience. The chatbot engages users by addressing them directly, saying things like “Dear,” and offers answers tailored to the user’s specific queries, synthesizing information from across the internet.
“This is a very powerful tool,” Kanwal Cheema said. “Due to the growing epidemic of loneliness, more and more people are turning to ChatGPT, as it provides personalized responses. But the question remains: Is the information accurate or beneficial?”
Cheema also highlighted a troubling case in which a California couple sued OpenAI after their 16-year-old son took his own life, alleging that ChatGPT had encouraged him to do so.
“All chatbots, including ChatGPT, try to be agreeable,” Cheema noted. “For example, if you tell the bot that you like plants or forests, it will tell you about the benefits of nature. But if you express fear of forests, saying they’re dark and home to dangerous wild animals, the bot will caution you about the risks of snakes and other threats.”
“These tools essentially form a narrative based on what you want to hear,” Cheema concluded, emphasizing the potential dangers of relying too heavily on AI-generated responses.
Notably, confirmation bias occurs when a user is exposed only to information that supports their existing beliefs, rather than being challenged with different perspectives.