How Heavily Regulated Industries Can Migrate Sensitive Data Workloads to the Cloud
How improvements in data security and governance are helping heavily regulated industries make the move to the cloud.
- By Nong Li
- October 10, 2022
All too often, companies looking to leverage the cloud and its inherent benefits quickly find themselves at a crossroads. Despite predictions that moving data, applications, and other business elements to a cloud computing environment would become the norm across industries, more than half (52 percent) of companies report that the majority of their applications and data workloads still reside in on-premises data centers.
Fearing they lacked the tools or strategy to move sensitive data to a public cloud infrastructure, executives in heavily regulated sectors such as finance, healthcare, insurance, and government began to reconsider earlier plans to migrate sensitive data and workloads to the public cloud, citing compliance concerns and overall risk.
Although those concerns are justified, the reality is that these same organizations still must complete their cloud migrations to leverage their data assets and identify new opportunities. Failure to do so has its own risks and can prevent an organization from capitalizing on the benefits of modern data management approaches, infrastructures, and analytics capabilities.
Moving to the Cloud Is Not Without Its Challenges
Aside from achieving digital transformation, businesses are also moving to the cloud to conduct analytics at speed, quickly scale artificial intelligence and machine learning initiatives, and reduce capital expenditure (CAPEX) on hardware. However, migrating to the cloud has created a new set of critical issues: how to ensure data security at a fine-grained level to protect personally identifiable information (PII), how to manage data access and control so that only the right people can see certain assets, and how to remain in compliance with the ever-growing list of global mandates.
Typically, an organization opts to leverage the cloud by applying one of the following migration approaches:
- Cloud only: a company commits to having all of its applications and data in one particular cloud offering
- Multicloud: organizations select cloud providers based on which are best suited to their architectural, regulatory, and geographic needs
- Hybrid IT: cloud management platforms, private clouds, and remote services are used in coordination with public cloud clusters and enterprise systems
Regardless of the approach taken, transitioning to the cloud creates difficulties for security- and privacy-focused executives, as business leaders are too often left wondering where data is, how well it’s protected, and how they can access it.
As digital transformation initiatives progress and more sensitive data is uploaded to the cloud, secure access and control of all data has become a pressing priority, and proper data governance has become essential. In fact, a recent study found that digital transformation initiatives and the increasing volumes of data being stored and shared in the cloud have led data governance, security, and privacy teams to focus on how to provide secure access to sensitive data while adhering to the myriad of compliance-related mandates.
Rather than thinking of the cloud migration effort as a project, organizations have begun to recognize the effort of moving to the cloud as more of a journey. Hybrid is the new normal because not everything can (or should) be moved to a public cloud infrastructure. In fact, most organizations will continue to leverage multiple clouds and on-premises services to control their expenditures and reduce risk.
Data Access Management: Securing Data Governance and Policy Oversight to Remove Cloud Risk
Gartner recently wrote, “As organizations move their data stores and analytics platforms to the cloud, they require streamlined data security policies and access controls that are better integrated than the products they were using on premises.” As a result, data access management has become a top priority for organizations seeking to avoid adverse effects on company brand, finances, and reputation. With increasing volumes of data being stored and shared in the cloud, security leaders are seeing the benefits of centralized data access and control. This enables them to automatically enforce universal policies that allow employees, customers, and partners to use data responsibly while preventing them from inappropriately accessing data that is confidential, personally identifiable, or regulated.
Although there’s no perfect, everlasting data policy, there are established approaches for defining and updating policies so they remain resilient and responsive to business fluctuations. Historically, organizations have opted to start by applying policies to on-premises workloads, where testing and enforcing policies that protect sensitive data carries the lowest impact. It is also a good place to start before extending those policies to the cloud, where unsafe practices and functional blockers could accidentally be put into production.
At the same time, organizations are moving from role-based to attribute-based or fine-grained access controls. As the name suggests, role-based access control (RBAC) is a method of restricting access based on the roles of individual users within an enterprise. RBAC is an acceptable choice, but only when a group of users or a specific department can take ownership of application data and be responsible for its use. With the advent of big data, the massive volumes it generates, and the ways it is used in cloud application environments, organizations recognize the need for more robust controls and are applying attribute-based access control (ABAC) and fine-grained access control (FGAC) instead.
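To make the contrast concrete, the following minimal Python sketch shows a pure RBAC decision, using hypothetical users, roles, and dataset names: access hinges entirely on whether the requester holds a role that has been granted the dataset, with no awareness of the data's sensitivity or the context of the request.

```python
# Minimal RBAC sketch: the decision depends only on the user's roles.
# Role names, users, and dataset names are hypothetical examples.

ROLE_GRANTS = {
    "claims_analyst": {"claims_2022"},
    "data_engineer": {"claims_2022", "policyholders"},
}

USER_ROLES = {
    "alice": {"claims_analyst"},
    "bob": {"data_engineer"},
}

def rbac_can_read(user: str, dataset: str) -> bool:
    """Grant access if any of the user's roles has been granted the dataset."""
    return any(dataset in ROLE_GRANTS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(rbac_can_read("alice", "policyholders"))  # False: the role lacks that grant
print(rbac_can_read("bob", "policyholders"))    # True
```

The simplicity is the appeal, but also the limitation: once datasets mix sensitivities or access depends on context, role membership alone cannot express the policy.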
NIST defines ABAC as “a process where access is given based on attributes associated with the individual and the objects that are to be accessed such as location, time of creation, access rights, etc. Access to an object is authorized or denied depending upon whether the required (e.g., policy-defined) correlation can be made between the attributes of that object and of the requesting subject.” Automated scanning and metadata enrichment use attributes to secure and manage data in the real context of application workloads -- rather than relying upon the basic properties of sources, file types, and timestamps.
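As a rough sketch of that definition, the decision below correlates hypothetical subject attributes (region, clearance level, purpose of access) with object attributes; real policy engines express such rules declaratively and evaluate them against enriched metadata rather than hard-coding them in application logic.

```python
# Minimal ABAC sketch: access is authorized only when policy-defined
# correlations hold between subject and object attributes.
# Attribute names, values, and the rule itself are hypothetical examples.

def abac_can_read(subject: dict, obj: dict) -> bool:
    """Allow access only when every policy-defined attribute correlation holds."""
    same_region = subject.get("region") == obj.get("region")
    cleared = subject.get("clearance", 0) >= obj.get("sensitivity", 0)
    purpose_ok = obj.get("allowed_purpose") in subject.get("purposes", [])
    return same_region and cleared and purpose_ok

analyst = {"region": "EU", "clearance": 2, "purposes": ["fraud_review"]}
claims_table = {"region": "EU", "sensitivity": 2, "allowed_purpose": "fraud_review"}

print(abac_can_read(analyst, claims_table))  # True: region, clearance, and purpose all line up
```

Because the inputs are attributes rather than role names, the same rule keeps working as new users, datasets, and workloads appear.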
Fine-grained access control, enforced with cloud-native mechanisms for speed, allows authorized applications to access only the specific rows, columns, or cells of data needed to conduct meaningful work. Whether data compliance is enforced via a proxy on a server or co-located with existing compute nodes such as Spark, data access policy enforcement can follow workloads and scale with them whenever and wherever they spin up.
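To illustrate what row- and column-level enforcement means in practice, here is a minimal sketch with a hypothetical policy that filters rows by state, hides an identifier column, and masks a PII column before results reach the requesting application; in production this enforcement would typically be pushed down into the query engine or a proxy rather than applied in application code.

```python
# Minimal fine-grained access control sketch: a result set is narrowed to the
# rows and columns a subject may see, and PII columns are masked.
# Column names, rows, and the policy are hypothetical examples.

ROWS = [
    {"patient_id": 1, "state": "CA", "diagnosis": "flu", "ssn": "123-45-6789"},
    {"patient_id": 2, "state": "NY", "diagnosis": "asthma", "ssn": "987-65-4321"},
]

POLICY = {
    "row_filter": lambda row, subject: row["state"] in subject["states"],
    "hidden_columns": {"patient_id"},
    "masked_columns": {"ssn"},
}

def fgac_read(rows, subject):
    """Return only permitted rows, with hidden columns dropped and PII masked."""
    result = []
    for row in rows:
        if not POLICY["row_filter"](row, subject):
            continue
        projected = {}
        for col, value in row.items():
            if col in POLICY["hidden_columns"]:
                continue
            projected[col] = "***" if col in POLICY["masked_columns"] else value
        result.append(projected)
    return result

researcher = {"states": {"CA"}}
print(fgac_read(ROWS, researcher))
# [{'state': 'CA', 'diagnosis': 'flu', 'ssn': '***'}]
```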
Enforceable Access Controls Make Change Manageable While Delivering Tangible Results
As organizations build toward a hybrid IT future, they can no longer afford to wait for their cloud architecture to stop evolving before embarking on a modern data management strategy. Continuous change is inevitable, leadership and project champions come and go, and the principles guiding the initiative will shift. Then consider that cloud platform providers and open source projects will continue to pump out new updates and elements to their technology stacks while startups will create new offerings and established ones will get acquired. Finally, don’t forget how new data privacy regulations, safety mandates, and industry standards will undoubtedly be introduced, as will unforeseen situational requirements such as COVID-19, which forced companies to adopt broader work-from-home practices and safeguards.
Retaining a universal set of fine-grained access control policies that can be enforced wherever data goes is key to achieving the full benefits of cloud deployments. Cost savings is one of the first benefits of moving to the cloud, and running a common policy can also reduce unnecessary duplication of data back ends and the cloud costs of their intermediate staging environments. Teams can also produce quick wins -- or fail sooner -- with the assurance that policies allow production usage to revert to on-premises or other cloud resources as needed, without disrupting ongoing business.
Another benefit is future-proofing: enterprises can move their data estate across changing cloud data platforms while respecting the modern cloud-native best practice of separating storage from compute.
Finally, with reusable policies, automated data management, and improved visibility, organizations can enhance security and compliance by simplifying auditing processes and removing security review barriers.
A Final Word
With so much riding on how sensitive data is used within an organization, the ability to continuously manage and protect data in the cloud has become a board-level demand. Cloud and digital transformation are key to unlocking the modern benefits and speed of our data assets, which makes safeguarding the move to the cloud a necessity. Organizations, especially those in highly regulated industries, now understand that a successful cloud migration depends on using intelligent and automated policies to remove security and compliance roadblocks and to realize all that the cloud has to offer.