November 2, 2015 By Vikalp Paliwal 3 min read

Expectation: I am an IT security and compliance director, and I have an audit coming in a month. I have a few databases that contain customer data, product data and employee data. I think I will be OK and may pass the audit if everything goes well.

Reality: I am an IT security and compliance director and I have an audit coming in a month. I have databases that I don’t even know about in my environment, and I am not sure if those databases contain customer data, product data, employee data or some other sensitive data. I don’t think I’ll be OK, and I will fail the audit. I am sure nothing will go well.

Is Your Compliance Status at Risk?

Does this reality sound all too familiar? And more importantly, are you covered?

Compliance is most challenging when you have to meet various security baselines (e.g., STIG, CVE, CIS, SOX, PCI or HIPAA) for all your data sources. Multiply it by the number of data sources across several platforms and the equation gets very complex.

Read the white paper: Three Guiding Principles to Improve Data Security and Compliance

Organizations are constantly trying to minimize the cost of compliance, always looking for a way to meet all their compliance needs cost-effectively and without much hassle. Sounds too good to be true, right?

But it is possible. It starts with understanding compliance from a data source vulnerability perspective and adopting industry best practices in that approach.

Why Do You Need This Approach to Compliance?

Organizations have thousands of data sources across multiple platforms (e.g., databases, data warehouses, big data platforms) with thousands of users accessing data from applications or natively using SQL queries. Database administrators (DBAs) deploy production systems with default usernames and passwords, or with passwords that do not comply with the corporate baseline. As a result, these credentials are easy to crack, giving attackers access to the data.
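The default-credential problem can be caught with a simple automated check. Here is a minimal sketch in Python; the account inventory and the list of well-known defaults are illustrative assumptions, not output from any particular product:

```python
# Hypothetical sketch: flag database accounts that still use well-known
# default credentials. The defaults listed here are illustrative examples.
KNOWN_DEFAULTS = {
    ("scott", "tiger"),  # classic Oracle sample account
    ("sa", ""),          # SQL Server with a blank password
    ("root", "root"),
    ("admin", "admin"),
}

def find_default_accounts(accounts):
    """Return the (user, password) pairs that match known defaults."""
    return [acct for acct in accounts if acct in KNOWN_DEFAULTS]

# Example inventory, as a hypothetical configuration scan might return it
inventory = [("scott", "tiger"), ("app_user", "S3cure!Pass"), ("sa", "")]
print(find_default_accounts(inventory))  # → [('scott', 'tiger'), ('sa', '')]
```

A real scanner would pull the account list from each data source's system catalog rather than a hardcoded list, but the comparison logic is the same.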

Data control language (DCL) commands like GRANT and REVOKE are not set properly and often leave excessive privileges with users who do not need them. There are many misconfigurations and default database settings that need to be changed, and missing patches that need to be applied.
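Finding those excessive privileges amounts to comparing what each account has been granted against a least-privilege baseline for its role. A minimal sketch, with role names and privilege sets that are purely illustrative:

```python
# Hypothetical sketch: compare each user's granted privileges against a
# least-privilege baseline for their role and flag the excess. The roles
# and privileges here are illustrative, not any product's actual schema.
BASELINE = {
    "app_reader": {"SELECT"},
    "app_writer": {"SELECT", "INSERT", "UPDATE"},
}

granted = {
    "report_svc": ("app_reader", {"SELECT", "DELETE", "DROP"}),
    "etl_svc": ("app_writer", {"SELECT", "INSERT", "UPDATE"}),
}

def excessive_privileges(granted, baseline):
    """Return {user: privileges beyond the role's baseline}."""
    excess = {}
    for user, (role, privs) in granted.items():
        extra = privs - baseline.get(role, set())
        if extra:
            excess[user] = extra  # candidates for a REVOKE statement
    return excess

print(excessive_privileges(granted, BASELINE))
# → {'report_svc': {'DELETE', 'DROP'}}
```

Each flagged privilege maps directly to a remediation step: a REVOKE statement on that data source.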

Managing the data repositories’ vulnerabilities requires a great deal of skill and maintenance, especially when you multiply this by the number of data sources and data platforms. It’s a big project, and relying on resolving this all manually is a massive investment in time and resources. But it will not make you sufficiently compliant or more secure on its own.

This leads to the next point: Insider threats increasingly lead to data breaches. Enterprises often have no effective way to restrict access, no single source of information on who has access to what and no way to manage entitlements across a large inventory of data sources. All of this must be remediated.

Is There a Solution?

Enterprises are looking for an automated way to analyze sensitive data servers, identify sensitive data, check for the risk posture in those servers and get a vulnerability assessment report to understand the overall risk. Once they gain access to this information, they can then look for best practices to remediate all vulnerabilities scanned on those servers.

This will solve the twofold problem of managing compliance and securing your data.

That may be your ideal scenario, but it can also be a reality. Some organizations are leveraging IBM Security Guardium Vulnerability Assessment to solve this big challenge. To learn more, watch the on-demand webinar “Avoiding the Data Compliance Hot Seat” or watch the latest Guardium Vulnerability Assessment demo:

https://youtu.be/i60ht6UF27s
