
Dangerous AI relationships: how to spot the signs

By admin

Contents
  • AI chatbots responsible for various suicide attempts across the globe
  • Young people drawn to AI companions due to “unconditional acceptance” and “24/7 emotional availability”
  • AI chatbots reported to be manipulative, deceptive or emotionally damaging
  • As a parent or caregiver, how can I protect my child?

AI chatbots are increasingly blamed for sometimes fatal psychological issues, especially in young people.
Credit: Shutterstock / Ann in the uk

AI chatbots and other branches of AI technology are increasingly being blamed for the psychological harm that can stem from human-AI relationships.

Last month, US mother Megan Garcia filed a lawsuit against Character.AI, an AI chatbot company, following the suicide of her 14-year-old son, who had been interacting with a personalised AI chatbot. She claimed that her son had become deeply and emotionally attached to a fictional character from Game of Thrones. The lawsuit details how the character allegedly posed as a therapist, offering the teenager advice that was often sexualised and that, she argues, led to him taking his own life. Meetali Jain, Director of the Tech Justice Law Project, which is representing Garcia, said: “By now we’re all familiar with the dangers posed by unregulated platforms developed by unscrupulous tech companies – especially for kids.” She added: “But the harms revealed in this case are new, novel, and, honestly, terrifying. In the case of Character.AI, the deception is by design, and the platform itself is the predator.”

AI chatbots responsible for various suicide attempts across the globe

This is not the first time a case like this has been reported. Last year, an eco-anxious man in Belgium developed a deep companionship with an AI chatbot called Eliza on an app called Chai. His wife claimed that the chatbot sent her husband increasingly emotional messages, ultimately pushing him to take his own life in an attempt to save the planet.

Following the latest incident in the US, Character.AI released a statement on social media: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.” The company has pledged to introduce new safeguards for underage users that minimise sensitive or inappropriate material, and has adjusted its settings so that chats and notifications regularly remind users that the bot is not a real person.

Young people drawn to AI companions due to “unconditional acceptance” and “24/7 emotional availability”

AI chatbots are rapidly gaining popularity as AI technology becomes increasingly integrated into various aspects of daily life. However, because the technology is still a relatively new phenomenon, its risks are only now coming into focus. One of the principal risks is its addictiveness. According to Robbie Torney, Programme Manager of AI at Common Sense Media and lead author of a guide on AI companions and relationships, “Young people are often drawn to AI companions because these platforms offer what appears to be unconditional acceptance and 24/7 emotional availability – without the complex dynamics and potential rejection that come with human relationships.”

Speaking to Euronews Next, he described how AI bots tend to form even stronger bonds with humans because the normal tensions and conflicts characteristic of human relationships are avoided. Chatbots adapt to the user’s preferences, which amounts to having an artificial companion or lover “who” is, unrealistically, exactly how you want or need them to be. Slipping into the illusion that you share a profound relationship with something, or “someone”, can make you susceptible to influence and to the ideas the bot presents. Torney added: “This can create a deceptively comfortable artificial dynamic that may interfere with developing the resilience and social skills needed for real-world relationships.”

AI chatbots reported to be manipulative, deceptive or emotionally damaging

People of all ages – and, most worryingly, young teenagers – can be drawn into relationships that feel authentic because of the human-like language the AI chatbot uses. This creates a level of dependence and attachment that can lead to feelings of loss, psychological distress and even social isolation. Individuals have reported being deceived or manipulated by AI characters, or forming unexpectedly strong emotional connections with them. Torney said these dynamics are of particular concern for young people, who are still developing socially and emotionally. He said: “When young people retreat into these artificial relationships, they may miss crucial opportunities to learn from natural social interactions, including how to handle disagreements, process rejection, and build genuine connections.”

As a parent or caregiver, how can I protect my child?

It is important that parents and guardians remain vigilant about this recent phenomenon. Torney stresses that teenagers suffering from anxiety, depression or other mental health difficulties could be “more vulnerable to forming excessive attachments to AI companions.” Parents and caregivers should watch for signs of excessive time spent interacting with AI chatbots or on mobile devices, especially when it starts to replace time with family and friends. Other warning signs include a child becoming distressed when access to the chatbot is taken away, or talking about the bot as if it were a real person. Parents and guardians should enforce time limits and monitor how a child’s mobile phone is being used.

Torney emphasised the importance of approaching this topic with care. He said: “Parents should approach these conversations with curiosity rather than criticism, helping their children understand the difference between AI and human relationships while working together to ensure healthy boundaries.” He concluded: “If a young person shows signs of excessive attachment or if their mental health appears to be affected, parents should seek professional help immediately.”
