Why you shouldn’t ask ChatGPT for relationship advice — it’ll just tell you you’re right and ‘may worsen rather than resolve conflict’

  • A new study found that AI chatbots are far more likely than humans to validate users during personal conflicts
  • That tendency can become dangerous when people use chatbots for advice about fights
  • AI can easily make people feel overly justified in making bad decisions

Bringing interpersonal drama to an AI chatbot isn’t exactly why developers built the software, but that isn’t stopping people in the middle of fighting with friends and family from seeking (and getting) validation from digital supporters.

AI chatbots are always available, endlessly patient, and very good at mimicking the right emotions. Too good, really: according to a new study published in Science, they often default to agreeing with users, which can turn a small disagreement into a much bigger problem.





Eric Hal Schwartz
