    Think you can trust ChatGPT and Gemini to give you the news? Here’s why you might want to think again



    • Nearly half of all AI assistant answers about the news contain significant errors, a major international study has found
    • The factual, sourcing, or contextual issues were apparent across 14 languages and 18 countries
    • Gemini fared the worst, with double the rate of significant problems compared to its competitors

    When you ask an AI assistant about news and current events you might expect a detached, authoritative answer. But according to a sweeping international study led by the BBC and coordinated by the European Broadcasting Union (EBU), nearly half the time, those answers are wrong, misleading, or just plain made up (anyone who’s dealt with the nonsense of Apple’s AI-written headlines can relate).

    The report dug into how ChatGPT, Microsoft Copilot, Google Gemini, and Perplexity handle news queries across 14 languages in 18 countries, analyzing over 3,000 individual responses from the AI tools. Professional journalists from 22 public media outlets evaluated each answer for accuracy, sourcing, and how well it distinguished news from opinion.

    The results were bleak for those relying on AI for their news. The report found that 45% of all answers had a significant issue, 31% had sourcing problems, and 20% were simply inaccurate. This isn’t just a matter of one or two embarrassing mistakes, like confusing the Prime Minister of Belgium with the frontman of a Belgian pop group. The research found deep, structural issues with how these assistants process and deliver news, regardless of language, country, or platform.

    News Integrity in AI Assistants: An international PSM study

    (Image credit: BBC/EBU)

    In some languages, the assistants outright hallucinated details. In others, they attributed quotes to outlets that hadn’t published anything even close to what was being cited. Context was often missing, with the assistants sometimes giving simplistic or misleading overviews instead of crucial nuance. In the worst cases, that could change the meaning of an entire news story.
