- NordVPN’s new Chrome feature spots audio deepfakes
- The real-time checker provides immediate scam alerts
- AI Voice Detector also keeps your data protected
Are you feeling a little uneasy about the AI-generated audio deepfakes that are increasingly polluting your online experience and potentially tricking you into a scam?
If the answer is yes and you're worried you might be talking to an online deepfake that's tempting you to reveal information only your mum knows, you'll be glad to know that NordVPN has you covered.
The best VPN on the market has just integrated a new AI-powered voice detector into its Chrome extension. This means that users can now identify audio deepfakes in real time whenever they browse the web.
NordVPN said that the firm currently has no plans to extend the feature to different apps. “But we would consider it if there is a need,” a spokesperson told TechRadar.
With the move, the VPN provider continues to expand its security offerings in its relentless fight against cybercrime, proving anti-scam capabilities are now an essential part of any top VPN service.
How does it work?
AI-generated voice technologies are becoming increasingly sophisticated and difficult to recognise, even to the best-trained human ear.
While it may be shocking to discover that your favourite influencer is actually a synthetic voice, the dangers associated with AI-generated audio go far beyond mere fake content for entertainment purposes.
Scammers are now able to clone voices using just a few seconds of audio recording, using them for identity theft scams, fraudulent calls and spreading misinformation.
“AI-generated voices have become one of the most convincing tools in a scammer’s arsenal, and most people have no reliable way to tell the difference,” stressed Domininkas Virbickas, product director at NordVPN.
Developed by NordVPN and a team of cybersecurity experts at NordLabs, AI Voice Detector captures the audio stream from an active browser tab and processes it with a neural network model that runs locally on the device.
While the audio continues to play normally for the user, the tool compares the sound to thousands of real and AI-generated audio samples, verifying in real time whether the user is listening to a human or synthetic voice.
If the content is generated by AI, the result is displayed immediately in the form of a red pop-up or a small notification on the web page. A yellow pop-up warns that the content may be a deepfake, whilst a green pop-up indicates that the audio poses a minimal risk.
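The traffic-light alerts described above amount to mapping the model's confidence onto three risk bands. Here is a minimal sketch of that idea; the score semantics and the threshold values are purely illustrative assumptions, not NordVPN's actual implementation.

```javascript
// Hypothetical sketch: map a synthetic-voice probability score
// (0 = certainly human, 1 = certainly AI-generated) to the three
// alert levels the extension shows. Thresholds are assumed values.
function riskLevel(syntheticScore) {
  if (syntheticScore >= 0.8) return "red";    // content is likely AI-generated
  if (syntheticScore >= 0.4) return "yellow"; // content may be a deepfake
  return "green";                             // audio poses a minimal risk
}
```

In a real extension, a function like this would run on each classification result and drive which pop-up is rendered on the page.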
Installation is a breeze: users can simply enable the feature in the NordVPN Chrome extension to start audio detection.
Protecting your privacy
AI Voice Detector also analyses audio data directly within the browser. This should ensure that users’ data is not compromised during the process.
Because of that, the tool never records your identity, browser history, cookies, or account details, ensuring that your data is never exposed. Furthermore, as soon as the recording stops or the tab is closed, all audio buffers are immediately cleared.
But above all, the tool does not actually listen to what is being said. It solely examines acoustic patterns, meaning it is technically incapable of understanding, recording, or interpreting the content of any conversation.
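To make the "acoustic patterns, not content" distinction concrete, here is a hypothetical sketch (not NordVPN's code) of the kind of summary statistics such analysis relies on: numbers like signal energy and zero-crossing rate describe how audio sounds, while discarding everything needed to reconstruct or transcribe the words.

```javascript
// Hypothetical sketch: reduce a buffer of audio samples to two
// content-free acoustic features. The original speech cannot be
// recovered or understood from these numbers.
function acousticFeatures(samples) {
  let energy = 0;
  let crossings = 0;
  for (let i = 0; i < samples.length; i++) {
    energy += samples[i] * samples[i]; // accumulate signal power
    // count sign flips between neighbouring samples (a rough pitch/noisiness cue)
    if (i > 0 && Math.sign(samples[i]) !== Math.sign(samples[i - 1])) crossings++;
  }
  return {
    rmsEnergy: Math.sqrt(energy / samples.length),
    zeroCrossingRate: crossings / samples.length,
  };
}
```

A classifier fed only with features of this kind can judge whether a voice sounds synthetic without ever "hearing" what was said.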
It seems that NordVPN is determined to make 2026 its year of anti-scam innovation. It has so far launched a free tool to detect online scams, rolled out its scam-blocking tool for Android globally, and upgraded its dark web monitoring, all while scoring top marks in independent phishing tests.
With so many new features being released, it might well seem that we can't trust anyone or anything. For the time being, at least, your conversations are safe: the next time you're on a Chrome call with your bank manager discussing your account, or with a friend asking for money, you can continue to sip your drink and enjoy spring as it comes.