
Business is competitive, and sometimes requires risk to achieve success – but new research has revealed some employees may be taking on more risk than they realize.
Shadow IT, or software and services that employees use without authorization from the IT department, can threaten business and customer data and even increase the risk of a cyberattack.
It’s no secret that employees are desperate for time-saving and productivity-boosting services – but if the business doesn’t provide them, employees will find their own.
The struggle against Shadow IT
Artificial intelligence, for better or for worse, has become a part of many workers’ day-to-day lives. It can help turn an angry email into something more corporate-friendly, or help organize tasks to improve time management and productivity.
But whether these AI tools are approved by the business is not a question that crosses many minds; the primary thought is usually saving time, especially if a service can make work easier. Policy and governance rarely enter the thought process.
In fact, 1Password’s Annual Report 2025 found over half (52%) of employees have at one point downloaded apps without approval from IT, and while nearly three-quarters (73%) of employees have been encouraged to use AI tools, as many as 33% don’t follow the business’ AI policy.
Many AI tools and services take user data and inputs to use as training material, which is particularly dangerous for company and customer data. For example, if an AI tool is fed a table of names, addresses, emails, and other data, it could end up spitting that data back out in response to a carefully crafted prompt.
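To give a sense of why that matters, here is a minimal, hypothetical sketch of the kind of pre-submission scrub a security team might want in place before text is pasted into a third-party AI tool. The patterns and helper below are illustrative assumptions, not anything described in 1Password’s report, and real data-loss-prevention controls are far more involved:

```python
import re

# Hypothetical illustration only: strip obvious identifiers (emails and
# phone-like numbers) from text before it is sent to any third-party AI tool.
# The patterns are deliberately naive and will over- or under-match in
# real-world data; they are here to show the kind of data at stake.

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def scrub(text: str) -> str:
    """Replace obvious personal identifiers with placeholders."""
    text = EMAIL_RE.sub("[REDACTED_EMAIL]", text)
    text = PHONE_RE.sub("[REDACTED_PHONE]", text)
    return text

if __name__ == "__main__":
    row = "Jane Doe, jane.doe@example.com, +44 20 7946 0958, prefers email contact"
    print(scrub(row))
    # -> Jane Doe, [REDACTED_EMAIL], [REDACTED_PHONE], prefers email contact
```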
But that isn’t the only danger presented by Shadow IT. Today’s security software is struggling to cope with the sprawl of tools used by employees during their workday, and in some cases cannot provide protection – especially if an employee is using an unauthorized app.
“People will always avoid friction, creating their own solutions when support isn’t clear. Today that shows up in the complexity of SaaS and AI implementations,” said Dave Lewis, Global Advisory CISO at 1Password.
“The issue isn’t the SaaS and AI tools enterprises use in their corporate environments; it’s our assumptions. Organizations are asking yesterday’s identity tools to govern a cloud-native, AI-accelerated workplace. That disconnect has caused the Access-Trust Gap. If organizations want resilience and speed, the industry must treat access as continuous, context-aware, and largely invisible, protecting every app, every tool, and every identity while letting employees get on with the work.”
The CISO conundrum
The Shadow IT problem is more than just a business problem; in some cases, it’s a matter of livelihood.
Studies have shown cybersecurity professionals are increasingly stressed by the breadth of external threats businesses now face, so also having to contend with a threat from within doesn’t help.
This is especially true for CISOs, who are now faced with the challenge of approving a wave of new tools for company use while continuing their eternal battle against the forces of evil.
In a media briefing attended by TechRadar Pro, Mark Hazleton, CSO at Oracle Red Bull Racing, noted, “just the simple breadth of what’s out there and what our employees are going out and looking for – ninety nine times out of a hundred, it’s for purely business purposes and absolute common sense. But whether those sites are as secure as you would want them to be, et cetera. It’s a difficult challenge.”
In many cases, an employee could find a tool that would significantly improve their productivity, but the tool has to be audited, tested, and approved in a process that could take weeks. So in order to harness the benefits immediately, they may just use the tool without considering the risks.
Mark Hillick, CISO at Brex, echoed Hazleton’s sentiments, adding, “it’s very nuanced. You’re trying to enable the business, but also not take on undue risk. And similar to Red Bull, we have employees that are very security conscious. So while they want to innovate and while they want to move, we have the problem of keeping up.”
A lesson to be learned
Security tools often don’t have visibility into Shadow IT, especially if an employee is entering company information into an app on a personal device. In fact, 1Password’s report found 43% of employees use AI apps on personal devices, and 25% use unapproved AI apps in the workplace.
Moreover, 22% of employees have shared company data with an AI tool, 24% have shared customer call details, and 19% have shared employee data.
When it comes to the security tools businesses rely on most, such as Single Sign-On (SSO), 74% of security and IT professionals say it isn’t enough to protect employees and the business, with 30% of applications sitting outside the SSO bubble. Employees are also turning to tools or data provided by previous workplaces, with 34% admitting to the practice.
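As a rough, hypothetical illustration of what that gap looks like in practice (the app names and figures below are invented for this sketch, not drawn from the report), comparing an inventory of apps in use against the set enrolled in SSO immediately surfaces the unprotected remainder:

```python
# Hypothetical sketch: given an inventory of SaaS apps in use (e.g. from expense
# reports or network logs) and the set of apps enrolled in the company's SSO,
# report which apps sit outside the SSO "bubble". In reality this data would
# come from a CASB, an identity provider export, or similar discovery tooling.

apps_in_use = {"crm-suite", "notes-ai", "design-board", "chat-assistant", "hr-portal"}
apps_behind_sso = {"crm-suite", "hr-portal", "design-board"}

outside_sso = sorted(apps_in_use - apps_behind_sso)
coverage = 100 * len(apps_behind_sso & apps_in_use) / len(apps_in_use)

print(f"SSO coverage: {coverage:.0f}% of apps in use")
print("Outside SSO:", ", ".join(outside_sso))
# -> SSO coverage: 60% of apps in use
# -> Outside SSO: chat-assistant, notes-ai
```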
So what can be done?
The fix to this problem, as is so often the case, relies on education. Employees need to have a clear knowledge of which tools they can use, what data they can enter into them, and how to get a tool or application approved for use at work. But security teams also have gaps in their knowledge.
Susan Chiang, CISO at Headway, spoke on this point during the media briefing, stating, “I think overall there’s a lot of traditional levers and visibility points that we have become accustomed to relying on, albeit imperfectly, that are increasingly not fit for this new age of software adoption.”
AI tools are increasingly being used by IT and security professionals to improve visibility and threat detection, and the C-Suite needs to be aware of both the threats and benefits of AI – CISOs especially.
“You’re a business leader, so that’s how you have to think and really encourage your team to take an approach based on curiosity and learning,” explained Hillick. “I think ultimately taking that a little bit further to the next generation of CISOs, they are going to inherit an AI native landscape, so they need to focus on how AI can be a solution, not a problem.”