The huge growth of public cloud computing providers in recent years has given rise to all kinds of digital-first businesses, and taken much of the hassle out of establishing on-premises storage and computing power for others. But in moving to the cloud, organizations accept additional complexity around where and how their data is stored, encrypted and accessed, in exchange for the flexibility and scalability that the cloud can offer them.
Data sovereignty is centered around these kinds of considerations, and represents a given country or territory’s ability to control the data within its borders. Whether it’s regulatory compliance, seeking a competitive advantage, or simply addressing the basics of building trust with customers and partners and ensuring data security, more and more businesses are turning to data sovereignty as a discipline to help give some of this control back.
SVP for Cyber Security Products at Thales.
What are some of the common business challenges when it comes to data sovereignty?
One significant influence is the fact that many laws around data, such as the US CLOUD Act or the European Union’s GDPR, extend beyond the borders of their originating countries. These impact how companies store and process data globally, and can often create conflicting legal obligations and complexities. At the same time, the increased focus on compliance can’t come at the cost of cybersecurity in the cloud, which still needs to be maintained. Security needs to be tight, but the data still needs to be accessible enough for business operations to stay agile.
Challenges also arise around data sovereignty efforts due to the need to classify data by sensitivity. Doing this can be nuanced and challenging – not to mention the fact that the sensitivity of data can change over time. Working with such a constantly moving target, it’s therefore no surprise that data sovereignty was ranked among the top three emerging security concerns for enterprises in the annual Thales Data Threat Report. Businesses need tools that can help them continuously monitor, protect and classify data by risk as contexts change.
How do organizations operationalize data sovereignty?
Typically, organizations start by classifying their data. This can become extremely complex when you consider the different business functions, deliverables and divergent needs – but generally organizations use a three- or four-level system.
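As a minimal sketch, a four-level scheme might be modeled as below. The level names and the classification rules are illustrative assumptions, not a standard – real classifiers key off many more signals:

```python
from enum import IntEnum

class Sensitivity(IntEnum):
    """Illustrative four-level classification scale (names are assumptions)."""
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4

def classify(record: dict) -> Sensitivity:
    """Toy rule-based classifier: tag a record by the most sensitive content it holds."""
    if record.get("contains_pii") or record.get("contains_keys"):
        return Sensitivity.RESTRICTED
    if record.get("customer_data"):
        return Sensitivity.CONFIDENTIAL
    if record.get("internal_only"):
        return Sensitivity.INTERNAL
    return Sensitivity.PUBLIC

print(classify({"contains_pii": True}).name)  # RESTRICTED
print(classify({}).name)                      # PUBLIC
```

Using an ordered scale (here an `IntEnum`) means levels can be compared and a "most sensitive" level computed, which becomes useful when choosing protection measures later.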
Next can come an analysis of applicable regulations to the business, such as the GDPR, CCPA or other national laws, to understand what data protection requirements are in play. From there, IT teams need to assess where data is geographically located, as in some jurisdictions, regulations require residents’ data to be stored and processed in the country where the legislation applies. As a result, this may impact the choice of cloud storage, or mandate additional measures around protection to ensure compliance is achieved.
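The residency assessment step can be sketched as a simple inventory check. The rule table below is a deliberately simplified assumption (for example, the GDPR does not strictly mandate EU-only storage); a real policy engine would encode far more nuance:

```python
# Assumed, simplified policy: regulation -> required storage-region prefix.
RESIDENCY_RULES = {
    "GDPR": "eu",
    "CCPA": "us",
}

# Hypothetical dataset inventory: where data lives vs. which regulation applies.
datasets = [
    {"name": "eu_customers", "region": "us-east-1", "regulation": "GDPR"},
    {"name": "us_customers", "region": "us-west-2", "regulation": "CCPA"},
]

def residency_violations(inventory):
    """Flag datasets stored outside the region their regulation requires."""
    flagged = []
    for ds in inventory:
        required = RESIDENCY_RULES.get(ds["regulation"])
        if required and not ds["region"].startswith(required):
            flagged.append(ds["name"])
    return flagged

print(residency_violations(datasets))  # ['eu_customers']
```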
Taking the time to do this allows organizations to determine the extent of their needs, and the resulting encryption key management strategy that’s right for them. There are broadly three types of key management strategy: Bring Your Own Key (BYOK), where an organization generates keys in its own environment and imports them into the cloud for use; Hold Your Own Key (HYOK), where keys remain stored within the organization’s domain; and finally Bring Your Own Encryption (BYOE), which offers the highest level of security, with organizations supplying their own cryptography as well as the keys themselves.
These all offer varying levels of control and security depending on the level of assurance required for a given dataset. Often organizations find it easiest to take a high-watermark approach: secure everything at the level required for the most sensitive data, and avoid the complexity of managing different standards.
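The choice between per-level strategies and a high-watermark approach can be sketched as below. The mapping from sensitivity level to strategy is an illustrative assumption, not Thales guidance:

```python
# Illustrative mapping from sensitivity level (1 = lowest) to key management strategy.
STRATEGY_BY_LEVEL = {
    1: "cloud-managed",   # provider-generated and provider-held keys
    2: "BYOK",            # keys generated on-premises, imported into the cloud
    3: "HYOK",            # keys never leave the organization's domain
    4: "BYOE",            # organization supplies the cryptography and the keys
}

def pick_strategy(levels, high_watermark=False):
    """Choose a strategy per sensitivity level, or apply one strategy to
    everything at the level of the most sensitive dataset (high-watermark)."""
    if high_watermark:
        top = max(levels)
        return {lvl: STRATEGY_BY_LEVEL[top] for lvl in levels}
    return {lvl: STRATEGY_BY_LEVEL[lvl] for lvl in levels}

print(pick_strategy([1, 3]))                      # {1: 'cloud-managed', 3: 'HYOK'}
print(pick_strategy([1, 3], high_watermark=True))  # {1: 'HYOK', 3: 'HYOK'}
```

The high-watermark path trades extra cost (everything gets the strictest treatment) for operational simplicity (one standard to manage), which mirrors the trade-off described above.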
How does encryption tie into data sovereignty needs?
Fundamentally, encryption asserts control over who can access data – meaning data sovereignty can’t exist without it. In many legal jurisdictions, actively using it is also a compliance requirement. But encryption is only as good as the key management in place allows it to be. The key management infrastructure is essential in ensuring proper control over the encryption mechanism. The value of any sovereignty program comes from controlling which users, entities and applications can access and use encrypted data.
A good encryption strategy should cover the whole key lifecycle, from generation through distribution and usage to disposal at end-of-life. That encryption should also be flexible enough to cover data in transit across networks as well as at rest in storage.
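The lifecycle stages can be sketched as a minimal key record. This is an illustrative model only – the rotation period is an assumption, and note that in a garbage-collected language like Python, overwriting a reference does not truly scrub key material from memory, which is one reason real deployments use hardware security modules or dedicated key managers:

```python
import secrets
from datetime import datetime, timedelta, timezone

class ManagedKey:
    """Minimal key lifecycle record: generate -> active -> destroyed."""

    def __init__(self, rotation_days=90):
        self.material = secrets.token_bytes(32)  # 256-bit key, generated locally
        self.created = datetime.now(timezone.utc)
        self.expires = self.created + timedelta(days=rotation_days)
        self.state = "active"

    def is_expired(self, now=None):
        """Past the rotation deadline, the key should no longer encrypt new data."""
        return (now or datetime.now(timezone.utc)) >= self.expires

    def destroy(self):
        """End-of-life: drop the key material and mark the record."""
        self.material = None
        self.state = "destroyed"

key = ManagedKey(rotation_days=90)
print(key.state, key.is_expired())  # active False
key.destroy()
print(key.state, key.material)      # destroyed None
```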
What other measures should businesses be considering to protect their data?
Alongside encryption, strict access control and security policies are essential to limit the risk and impact of data breaches. Regularly reviewing who has access to what data, and revoking permissions that are no longer needed, can reduce some of the most common sources of breaches.
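An access review pass can be sketched as below. The grant records, names and 90-day idle threshold are all illustrative assumptions; real reviews would draw on an identity provider's audit logs:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical grants: (user, dataset) -> when the access was last exercised.
grants = {
    ("alice", "payroll"): datetime.now(timezone.utc) - timedelta(days=3),
    ("bob",   "payroll"): datetime.now(timezone.utc) - timedelta(days=200),
}

def stale_grants(grant_map, max_idle_days=90, now=None):
    """Return grants unused for longer than the idle threshold, as revocation candidates."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_idle_days)
    return [pair for pair, last_used in grant_map.items() if last_used < cutoff]

print(stale_grants(grants))  # [('bob', 'payroll')]
```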
Upgrading the authentication technologies used beyond passwords alone also makes it harder for account takeover attacks or credential theft to impact the business. Two-factor authentication, passkeys or biometrics can help overcome some of the common security risks that come with using solely passwords, and relying on users to remember and regularly change them.
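As an example of one such second factor, the time-based one-time passwords used by authenticator apps follow RFC 6238: an HMAC over a 30-second counter, truncated to a short code. A minimal sketch using only the standard library (the secret below is the RFC's published test value, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, interval=30, digits=6, at=None):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter, truncated to N digits."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // interval)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" base32-encoded), evaluated at t = 59s.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # 287082
```

Because the code depends on a shared secret and the current time rather than something the user chooses and remembers, a stolen password alone is no longer enough to take over the account.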
Cloud backups and snapshots are the final piece of an effective data protection strategy, with rapid restoration and resumption capability in the event of an emergency. Businesses must also take care to regularly test this data to ensure it is immediately operable in such a high-pressure situation.
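Part of that restore testing is verifying that what comes back is byte-for-byte what was saved. A minimal integrity check records a digest at backup time and compares it after restore (the paths and payloads here are illustrative):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest of a backup payload, recorded at snapshot time."""
    return hashlib.sha256(data).hexdigest()

# At backup time, store a digest alongside the snapshot (path is illustrative).
original = b"customer-table-export-v1"
manifest = {"snapshots/2024-06-01.bak": fingerprint(original)}

def verify_restore(path: str, restored: bytes) -> bool:
    """Restore test: the restored bytes must hash to the digest recorded in the manifest."""
    return fingerprint(restored) == manifest[path]

print(verify_restore("snapshots/2024-06-01.bak", original))      # True
print(verify_restore("snapshots/2024-06-01.bak", b"corrupted"))  # False
```

A digest check catches silent corruption, but a full restore test also needs to confirm the data is operable – that applications can actually start against it – which no checksum alone can prove.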
What role might quantum computing have on the future of data sovereignty?
Quantum computing is not yet a mainstream prospect, but in the wrong hands it has the power to break the cryptography that underpins the encryption we all rely on today. This has driven research into post-quantum cryptography (PQC): algorithms designed to resist this threat.
The other notable risk around quantum computing is ‘harvest now, decrypt later’ activity, in which cybercriminals steal and stockpile encrypted data today. The eventual goal is to leverage the power of quantum computers at some point in the future to decrypt that stolen data.
The risk from quantum, though seemingly far off, represents a huge change in how organizations approach encryption, and it will take some time to switch. It emphasizes the importance for businesses to begin planning for this transition as soon as possible, staying informed about advancements in quantum computing, and exploring PQC solutions to minimize the chances of any operational disruptions or compliance failures.
How could next-generation mobile connectivity like 5G and 6G impact data sovereignty measures?
With enhanced bandwidth and spectrum efficiency, the evolution from 5G to 6G standards and beyond addresses the ever-growing demand for higher throughput and reduced latency in both consumer and enterprise applications. But as these network capabilities expand, so do the complexities of managing privacy and data sovereignty.
To make the best use of the available bandwidth, network operators will likely have to be more intrusive about the data they’re collecting on usage as well as the devices used. 5G’s very architecture also makes edge computing – the practice of decentralizing data processing and moving it closer to the data source – a reality. For enterprises, this means greater flexibility about how and where they decide to base their operations, and for the countries and territories they’re operating in, greater control around how this data is being generated and processed.
At the same time, 5G’s high-speed connectivity also means significantly more data being generated, transmitted and processed, making the control and enforcement of data residency laws and associated regulation more complex. The overall decentralization of data processing that 5G makes possible can make it harder for CSOs to enforce consistent security protocols.
What is AI’s possible impact on data sovereignty?
The main concern here is what the underlying data that AI draws on is, and where it comes from. Emphasis should be placed on the importance of protecting the personal data that is used to train AI. As the rollout of this technology continues, robust frameworks around data governance must be established so that data privacy and integrity become foundational aspects of how AI models are developed.
Such is the speed of technological development in this area that it will take some time for legal frameworks to catch up, and this lag creates a vacuum where data can be misused or mishandled. In the meantime, enterprises must establish guardrails of their own in their investigations and research with AI, as they would with any other technology that collects and analyses data to power decision-making.
This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro