Cybersecurity
Written by: CDO Magazine Bureau
Updated 4:43 PM UTC, Thu November 21, 2024
(US & Canada) Sid Dutta, CEO and Founder of Privaclave, speaks with Robert Lutton, VP, Sales and Marketing at Sandhill Consultants, in a video interview about the fundamental elements of a proactive data protection program and balancing the need for data usability with stringent data protection measures.
At the outset, Dutta states that organizations are currently focused on a reactive approach to preventing data breaches because of struggles around detection, response, and recovery. With organizations constantly bombarded by advanced persistent threats, he adds, there isn’t enough time to apply a proactive approach to preventing data breaches.
According to Dutta, a proactive approach to protecting data requires the following fundamental elements:
Knowing the data well
Having proper access management
Sufficient protection of sensitive data
Following a risk-based approach while implementing security controls
Elaborating, he notes that some companies are efficient at application inventory management, understanding lineage, determining what talks to what, and finding where data stores are. It is critical to know where the data is stored. However, even with Data Security Posture Management (DSPM) tools, it is challenging to go across all the stores to start inventorying, says Dutta. Another problem is that scanning and inventorying are always point-in-time exercises, he says.
Knowing and understanding the data must therefore be an ongoing practice, not a one-time project. After establishing where the data is and what data is there, the next step for an organization should be to determine who has access to it, he adds.
One of the fundamental challenges organizations face is appropriate access management, says Dutta. He maintains that a business rarely reduces access voluntarily; instead, it ends up granting more. This creates a massive challenge in demarcating who needs access to the data and who does not.
Additionally, the decision to grant access falls upon data stewards or owners. According to Dutta, establishing this ownership is challenging, especially for shared data stores. He says that security or IT professionals often approve access in the name of operational necessity.
Granting inappropriate access due to a lack of robust approval processes is a common issue across enterprises, affirms Dutta. To address this, organizations must have proper access management while continuously assessing risks and communicating them to stakeholders.
Moreover, organizations need sufficient protection for sensitive data such as PII. Shifting the focus to security controls, Dutta states that implementing every possible control is not feasible, as it would impact the business usability of data.
Therefore, he recommends a risk-based approach, focusing on the high-risk areas—the crown jewels—and protecting what matters most. Too much security also creates its own challenges, and with limited resources and budgets, prioritizing the critical areas is essential.
When it comes to balancing the need for data usability with stringent data protection measures, Dutta advocates focusing on “protect surfaces” rather than “attack surfaces.” To strike a balance between security measures and usability of data, it is crucial to understand the data with proper taxonomy, he notes.
For this, Dutta mentions doing data segmentation, which leads to the categorization of data and eventually classification. From the risk and sensitivity perspective, the classification levels determine which areas need stringent security controls.
For instance, the different data classification levels could be restricted, confidential, internal, and public, and the first focus goes to the restricted data store, where the organization would elevate its security controls.
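The classification-driven approach described above can be sketched in code. This is a hypothetical illustration (the level names come from the article; the specific controls listed are illustrative assumptions, not anything Dutta prescribed): each classification level maps to a baseline set of controls, with the most stringent set on restricted data.

```python
# Hypothetical sketch: mapping data classification levels to baseline
# security controls. The control names are illustrative placeholders.
CONTROLS_BY_CLASSIFICATION = {
    "restricted":   {"encryption_at_rest", "tokenization", "mfa", "continuous_monitoring"},
    "confidential": {"encryption_at_rest", "mfa"},
    "internal":     {"access_logging"},
    "public":       set(),
}

def required_controls(classification: str) -> set:
    """Return the baseline controls for a given classification level."""
    return CONTROLS_BY_CLASSIFICATION[classification.lower()]

# A restricted data store gets elevated controls; public data gets none.
assert "tokenization" in required_controls("restricted")
assert required_controls("public") == set()
```

In practice such a mapping would live in policy tooling rather than application code, but the principle is the same: controls are derived from classification, so effort concentrates where risk is highest.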
Furthermore, Dutta shares that when it comes to applying such stringent controls, technology has come a long way. For instance, the capability to perform computations in an encrypted mode ensures there is no risk of data leakage, even when it is a multiple-party collaboration.
In other scenarios, more practical security measures can be applied, such as FFX Format-Preserving Encryption (FPE) or tokenization. These methods preserve the analytical capability of data and the referential integrity of data points, enabling workloads to run on obfuscated data without impacting business functionality.
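The referential-integrity property Dutta mentions can be illustrated with a minimal sketch. This is not FPE (which requires a dedicated cipher such as NIST's FF1 mode); it is a simpler keyed, deterministic tokenization using HMAC, chosen only to show the idea that identical plaintext values always map to the same token, so joins and analytics still work on obfuscated data.

```python
import hashlib
import hmac

SECRET_KEY = b"demo-key"  # illustrative only; use a managed secret in practice

def tokenize(value: str) -> str:
    """Deterministic tokenization: the same input always yields the
    same token, preserving referential integrity across data stores."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# The same SSN tokenizes identically in both tables, so a join on the
# obfuscated column still matches without exposing the plaintext.
orders    = [{"ssn": tokenize("123-45-6789"), "item": "laptop"}]
customers = [{"ssn": tokenize("123-45-6789"), "name": "Alice"}]
assert orders[0]["ssn"] == customers[0]["ssn"]
```

Format-preserving encryption goes one step further than this sketch: the ciphertext keeps the original format (e.g., a 9-digit number stays a 9-digit number), so legacy schemas and validation rules continue to work unchanged.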
Tightening security controls can sometimes raise usability concerns, says Dutta, specifically when incorporating advanced access controls, whether role-based (RBAC), attribute-based (ABAC), or context-based (CBAC).
Reiterating the zero-trust principle of continuous validation of user access, he states that in the end, no one is trusted, be it an internal or external user. Instead, access requests are continuously verified before access is granted. This principle advocates a “verify first, then trust” approach, says Dutta.
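The continuous-verification idea above can be sketched as an attribute-based check. This is a hedged illustration, not a real policy engine: the attribute names (role, device state, MFA status) and the rules are assumptions made for the example. The key point is that every request is evaluated on its current attributes, with no standing trust carried over from earlier requests.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    role: str             # e.g., "data_steward", "analyst"
    classification: str   # classification of the requested data
    device_managed: bool  # context attribute: is the device managed?
    mfa_passed: bool      # has this session passed MFA?

def is_allowed(req: AccessRequest) -> bool:
    """Evaluate each request on its attributes alone ("verify first,
    then trust"); nothing is granted based on who the user was before."""
    if req.classification == "restricted":
        return req.role == "data_steward" and req.device_managed and req.mfa_passed
    if req.classification == "confidential":
        return req.mfa_passed
    return True  # internal/public data

# The same user is denied the moment context changes (unmanaged device).
assert is_allowed(AccessRequest("data_steward", "restricted", True, True))
assert not is_allowed(AccessRequest("data_steward", "restricted", False, True))
```

Real ABAC/zero-trust deployments externalize such rules into a policy decision point and re-evaluate them continuously during a session, but the shape of the check is the same.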
In conclusion, he states that these measures need to be adjusted while involving the business in the conversation and considering business needs. In many cases, establishing the right practices and processes enables businesses to get comfortable with the change. Therefore, gradually scaling up controls makes sense for balancing data usability with stringent security measures.
CDO Magazine appreciates Sid Dutta for sharing his insights with our global community.