    AI voice scams are on the rise – here’s how to stay safe, according to security experts



    • AI voice-clone scams are on the rise, according to security experts
    • Voice-enabled AI models can be used to imitate loved ones
    • Experts recommend agreeing a safe phrase with friends and family

    The next spam call you receive might not come from a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking victims by imitating real human callers, including family members.

    What are AI voice scams?

    Scam calls aren’t new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities and celebrities, but also friends and family.

    The arrival of AI models trained on human voices has unlocked a new realm of risk for phone scams. Tools such as OpenAI’s voice API support real-time conversation between a human and an AI model. With a small amount of code, these models can be programmed to run phone scams automatically, coaxing victims into disclosing sensitive information.
