    ChatGPT is getting better at knowing when you need real human support – and I think it's about time


    If you watched the recent launch of ChatGPT-5 from OpenAI, you'd be forgiven for thinking that it was purely a coding tool. While Sam Altman and his staff did interview one person who used ChatGPT to help decipher the medical jargon her doctors used, the majority of the presentation was concerned with how good ChatGPT-5 is at writing code.

    Out in the real world, however, people use AI, and ChatGPT specifically, a bit differently. As the outcry over the removal of the older ChatGPT-4o model after the launch of ChatGPT-5 shows, a lot of people use ChatGPT for their mental health, and changing its personality affects them directly. For them, it acts as a mix of life coach, therapist, and friend.

    OpenAI seems to be slowly waking up to this fact and the responsibility it bears, and recently posted an announcement in which it says, “We sometimes encounter people in serious mental and emotional distress. We wrote about this a few weeks ago and had planned to share more after our next major update. However, recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us, and we believe it’s important to share more now.”

    Strengthening safeguards
