
    Google warns criminals are building and selling illicit AI tools – and the market is growing



    • AI tools are being purpose-built for criminals, new GTIG report finds
    • These tools sidestep AI guardrails designed for safety
    • ‘Just-in-time’ AI malware shows how criminals are evolving their techniques

    Google’s Threat Intelligence Group (GTIG) has identified a worrying shift in AI trends: AI is no longer just being used to make criminals more productive, but is now being purpose-built for active operations.

    Its research found that Large Language Models (LLMs) are being used in malware in particular. One example is the ‘just-in-time’ AI malware PROMPTFLUX, which is written in VBScript and calls Gemini’s API to request ‘specific VBScript obfuscation and evasion techniques to facilitate “just-in-time” self-modification, likely to evade static signature-based detection’.
