Is The Sensitive Data In Your Oracle Database Really Safe?

Discovering Sensitive Data

Oracle has long had strong database encryption functionality, and it keeps getting better. But it has lacked a simple way to identify sensitive data in the database so you know what to encrypt, audit, or protect via policies. I thought that might have improved recently with the release of a new database security assessment tool called DBSAT.

New Oracle Security Tool

I was very excited when Oracle released its new free database security assessment tool (DBSAT). It is very easy to install and use, so it lowers the barriers to properly securing your data. This is good, and I think that if you use no other tool, this is a good place to start with your database security if you are using Oracle. The tool helps you find many common vulnerabilities relating to missing patches, poor configuration, excessive access rights, and other database vulnerabilities that can be very dangerous if they go unnoticed.

However, I found what I consider to be a serious deficiency in the tool, and a missed opportunity.

There are three parts to the DBSAT tool. These are called the Collector, the Reporter, and the Discoverer.

The Collector collects information about vulnerabilities such as missing patches, poor configuration, etc. It does a pretty good job here, although I did find some vulnerabilities identified by other scanning tools that it did not catch. More on that in a future article.

The Reporter takes the results of the Collector step and presents them in an easy-to-read PDF report. This is very helpful, but it lacks the ability to sort and filter vulnerabilities that some of the other tools have, and there can be a very large number of vulnerabilities, so this can be a material deficiency as you begin to take steps to mitigate the vulnerabilities that are reported. More on that in a future article.

Discoverer Doesn’t Discover Everything

The Discoverer is the real topic of this article. The Discoverer is intended to identify sensitive data in the database. I was very excited to see this feature in DBSAT because the only Oracle tool previously available to do this was a feature of Application Data Modeling (ADM), which is an element of the Data Masking and Subsetting package. This package requires an extra license and is a bit cumbersome to use for this purpose. ADM gives the user the ability to identify sensitive data and, together with Transparent Sensitive Data Protection (TSDP), the ability to enforce rules protecting that data.

As I tested DBSAT, I was disappointed to find that it is significantly less robust than the alternate method using ADM. The more robust ADM method allows the user to specify three kinds of patterns to search for sensitive data. These patterns are checked against the column names, the column comments, and the data itself. You can specify a regular expression looking for certain patterns or words in the data, for example, ^\d{3}-\d{2}-\d{4}$ for a string that looks like a Social Security number. Unfortunately, DBSAT seems to allow searching for patterns only within the column name and the column comments, and does not provide the ability to search for patterns in the data. This approach relies heavily on the designers having used descriptive column names or comments, and I think it leaves significant potential to miss sensitive data.

Application Data Modeling

To use ADM to find the sensitive data in your database, you must license the Data Masking and Subsetting package. You find ADM in OEM by navigating to Enterprise/Quality Management/Data Discovery and Modeling. The process involves creating a data model, which collects metadata about your schema, including table and column information. Then you define patterns for finding sensitive data in the tables, and a job is executed in the background that identifies columns with names, comments, or data that match the patterns you specify. The job produces an XML file that can be used with Transparent Sensitive Data Protection or for other purposes.

If you specified a robust set of sensitive data patterns, then this XML file is a much more robust list of potential sensitive data than the list provided by DBSAT Discoverer.

Example

Let's say you have the following table in an organization that uses an employee's Social Security number as his employee number. For simplicity, let's also assume that no column comments have been assigned.

Employee (EmployeeName, EmployeeAddress, EmployeeNbr)

“John Smith”, “18 Main Street, Exton, PA”, “721-99-2998”

“Peter Jones”, “20-42 Broadway, NYC, 10019”, “782-48-2332”

"Anna Newman", "Anchorage AK, 22884", "773-33-2002"

Without knowing in advance what the EmployeeNbr column is called, or that it contains SSNs, we might set up search criteria similar to this:

  • Column Name:  Contains SSN, Identifier, Social
  • Column Comment:  Contains SSN, Identifier, Social

Using DBSAT, you can specify only these two criteria, so we would miss the entire EmployeeNbr column because it matches neither the name criteria nor the column-comment criteria.

But if you use the Application Data Model functionality to identify the sensitive data, you can specify a pattern to match against the data in the table. It will then pick up this column despite the vague column name. A pattern might look like this:

Data Pattern: ^\d{3}-\d{2}-\d{4}$
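To make the difference concrete, here is a small Python sketch of the two scanning approaches applied to the example table above. This is purely an illustration, not DBSAT or ADM code: the name-based scan (the only kind the DBSAT Discoverer performs) misses the EmployeeNbr column, while the data-pattern scan catches it.

```python
import re

# The name/comment criteria from the example above
NAME_PATTERNS = ["ssn", "identifier", "social"]

# The data pattern for an SSN-shaped string: ^\d{3}-\d{2}-\d{4}$
DATA_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

# The Employee table from the example; the column name gives no hint of SSNs
columns = {
    "EmployeeName": ["John Smith", "Peter Jones", "Anna Newman"],
    "EmployeeAddress": ["18 Main Street, Exton, PA",
                        "20-42 Broadway, NYC, 10019",
                        "Anchorage AK, 22884"],
    "EmployeeNbr": ["721-99-2998", "782-48-2332", "773-33-2002"],
}

def name_based_scan(cols):
    """Flag columns whose *name* matches a sensitive-name pattern
    (the only kind of check the DBSAT Discoverer can make)."""
    return [name for name in cols
            if any(re.search(p, name, re.IGNORECASE) for p in NAME_PATTERNS)]

def data_based_scan(cols):
    """Also flag columns whose *values* match the data pattern
    (the additional check that ADM provides)."""
    return [name for name, values in cols.items()
            if any(DATA_PATTERN.match(v) for v in values)]

print(name_based_scan(columns))  # [] -- the SSN column is missed entirely
print(data_based_scan(columns))  # ['EmployeeNbr'] -- caught by the data pattern
```

The point is not the code itself but the gap it demonstrates: any discovery tool that never looks at the data can only be as good as the naming conventions of the schema it scans.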

Use ADM With Transparent Sensitive Data Protection

If you can license Data Masking and Subsetting, then I recommend using ADM to identify the sensitive data in your database. Then encrypt that data and/or protect it with Transparent Sensitive Data Protection.

Either way, the other features of DBSAT such as finding vulnerabilities related to patching and poor configuration make it a good addition to the database security administrator’s toolkit.

Other Tools

This article only discusses tools provided by Oracle. There are numerous other vendors that provide external tools to discover sensitive data including Imperva and Spirion.

If you would like more information about how to keep your Oracle data safe, contact us or request an Oracle database health check.

Critical Oracle Security and Stability Flaw Discovered

InfoWorld announced today that it has been researching a flaw in Oracle that can result in the inability to restore your database. Large Oracle shops with interconnected databases are most at risk.

The problem revolves around Oracle's SCN (system change number). The SCN has a limit that, if exceeded, can render a database unusable, even after the database is restored. Under normal circumstances, this limit would never be reached. However, there are two ways that the limit can be reached accidentally.

  • A bug in Oracle’s hot backup mechanism results in a sharp increase in the SCN number under certain conditions.
  • The SCN can be increased manually, causing all connected databases to increase their SCNs as well.

The January Oracle Critical Patch Update has a patch that resolves the hot backup problem. We recommend that this patch be applied immediately, especially if you are a large shop or use hot backups. Another fix increases the limit and makes it less likely to be reached, but the accidental (or deliberate) modification of an SCN remains a vulnerability. Extra care should be taken with all databases that connect to any critical databases in your environment.
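The limit in question is a soft ceiling that grows over time: public write-ups of the flaw report a maximum "reasonable" growth rate of 16,384 SCNs per second, measured from January 1, 1988. Assuming those reported constants (they come from coverage of the flaw, not from Oracle source), a back-of-the-envelope headroom check looks like this:

```python
from datetime import datetime, timezone

SCN_RATE_PER_SEC = 16_384  # reported maximum "reasonable" SCN growth per second
SCN_EPOCH = datetime(1988, 1, 1, tzinfo=timezone.utc)  # reported start of the clock

def reasonable_scn_limit(at=None):
    """Approximate the soft SCN ceiling at a given moment in time."""
    at = at or datetime.now(timezone.utc)
    return int((at - SCN_EPOCH).total_seconds()) * SCN_RATE_PER_SEC

def scn_headroom(current_scn, at=None):
    """How far the current SCN sits below the soft ceiling."""
    return reasonable_scn_limit(at) - current_scn

# A healthy database's SCN is typically far below the ceiling; a sharp,
# unexplained jump toward it is the warning sign the article describes.
print(f"Approximate ceiling right now: {reasonable_scn_limit():,}")
```

You can compare the output of this calculation against the current SCN of each database (for example, from V$DATABASE) to see how much margin remains across your interconnected environment.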

Read the full article for more details.

If you have any questions or need assistance, please contact us.

Database Downtime: Prepare For The Unexpected

Test your Assumptions: Database Backup and Recovery

Every now and then something happens that really drives home the need to test and validate the assumptions that we have about the availability of our IT assets. I had an experience this week that brought to mind the importance of testing backup and recovery and business continuity plans.

Planning

At the beginning of each week, I look at each day's activities and any travel that I need to do that week, and I plan out how I can be most productive each day. I find myself on the train often now between our offices in New Jersey and Boston, and I have come to take my wifi service for granted. I rarely have downtime any longer when traveling that way.

Last month, while traveling to San Antonio by air, I was able to use wifi in the air, just as I can on the ground on the train.

Then last week, while planning a trip by air from Philadelphia to Austin, I decided to make good use of the flight time. I planned to use the roughly four hours in the air to get some work done that I would need the next day.

Assuming

After I boarded the flight, however, I found out that a huge assumption I had made was not correct: not all airlines have wifi!

So now, as I sit on the plane writing this post into a Word document, I am completely disconnected from the web, from my office, and from my clients!

The problem here is not that I am not connected for a few hours; it is that I did not anticipate that this might happen, and so I planned to use that time to get some important work done. I assumed that the flight had wifi, and I did not validate that assumption!

Think about what will happen if you (or your customers) don't have access to your servers for a few hours. It could be that connectivity to the servers is interrupted, as in my case, or that the servers are completely down, or that your database software is hanging. Ask yourself what will happen during those hours, and what you can do to avoid them in the first place.

Validating

Validating your assumptions is key to productivity. In this case, it is only one person whose productivity is compromised today, but consider the cost if your whole company is down for a few hours. What are you taking for granted?

So what does this have to do with the database?

In the database business, we see people fail to validate assumptions all the time. A typical (and very dangerous) assumption people make is that their database is properly backed up and ready to restore in a disaster. As I describe in this blog post, that is rarely the case, and it is one of the most important assumptions for any company to validate.

If you haven’t tested your backup procedures lately, we can help you validate that your database is indeed protected the way you assume it is.

Database Security: Is Your Database Vulnerable To Internal Attack?

Enforcing Least Privilege To Enhance Database Security

The principle of least privilege refers to the practice of ensuring that each individual has only the privilege and access that is necessary to perform their job function.

In most IT shops that run an Oracle database, there is a group of individuals who need administrative access to the operating system and the database. These individuals include both operating system administrators and database administrators.

In order to ensure Database Security, particularly Oracle Security, it is critical that privileges for these individuals be limited and managed properly.

In many cases, these individuals are all given the username and password of a shared, privileged operating system account, such as root on the Unix platform. Often, these individuals are also given the password to the oracle account, which owns the Oracle binaries.

The Risks

There are two critical problems with this approach that significantly reduce your ability to ensure your Database Security.

  1. By granting multiple users access to these shared operating system accounts, you compromise the effectiveness of any database auditing or operating system auditing that you may have in place. If data is accessed or modified, future forensics would only yield the shared account name, not the name of the individual that took the rogue actions.
  2. Oracle’s operating system authentication mechanism enables a user, once connected to an operating system account, to connect to Oracle without a password. Furthermore, if the operating system account is highly privileged, such as an account in the osdba group (which the oracle account is), then the user can connect to Oracle as sysdba (using the Oracle SYS account), the most privileged account in Oracle. This leads to numerous risks, including an oradebug vulnerability, described in an excellent post by Pete Finnigan, that makes it possible to turn off auditing completely (without the turn-off action being recorded).
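As a quick check of the second risk, you can list who belongs to the osdba group on a given host. This minimal Python sketch assumes a Unix-like system and the conventional group name dba; your installation may have chosen a different osdba group name at install time, so treat the default here as an assumption.

```python
import grp  # Unix-only standard library module for reading the group database

def osdba_members(osdba_group="dba"):
    """Return the OS accounts that belong to the osdba group.

    Anyone listed here can connect to Oracle as sysdba without a password
    via OS authentication, so this list should be as short as possible.
    The default group name "dba" is the usual convention, not a certainty.
    """
    try:
        return sorted(set(grp.getgrnam(osdba_group).gr_mem))
    except KeyError:
        return []  # the group does not exist on this host

for user in osdba_members():
    print(user)
```

Running a check like this periodically, and comparing the result against the short list of DBAs who genuinely need sysdba, is a simple way to catch privilege creep before it becomes an audit finding.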

How to mitigate these risks:

There are a number of steps we can take to significantly reduce (though not eliminate) the risks outlined above.

  1. Grant all system and database administrators their own operating system account and do not grant access to any shared administration account (including root and oracle) except when absolutely necessary.
  2. Ensure that the passwords of administrative accounts such as root and oracle are changed regularly. This protects you in the event that passwords had to be given out to resolve an emergency.
  3. Restrict Oracle DBAs who perform operational functions, such as backups or space management, to connecting as sysoper or asmadmin instead of sysdba. This can be controlled by placing their operating system accounts into the osoper or osasm group instead of the dba group.

These steps will not completely eliminate the risk of a data breach. However, they will significantly reduce the potential for a breach, and will further reduce the potential for a breach that avoids detection.

Contact Buda Consulting for more information about how to enhance your Database Security and Database Compliance.

Please reply with other risks that you have found that are caused by similar database management issues.

Database Security Issues in the Cloud, Part 2: Regulatory Compliance

As the number of databases moving to public, private and hybrid cloud computing infrastructure increases, security concerns are a significant and growing problem. Organizations will do well to scrutinize the security practices of cloud providers and other third parties that store their data. But wherever databases are running, responsibility for the security and integrity of data ultimately rests with the organization that owns the data – even when it resides with a service provider.

As I outlined in Part 1 of this post, cloud database security concerns fall into three basic categories: data access control (covered in Part 1), regulatory compliance, and physical/network controls. This post discusses regulatory compliance issues.

Regulatory compliance issues in the cloud

Much has been written about concerns with physical control of data in cloud environments. Cloud providers frequently need to reconfigure and/or move the virtual servers hosting your data, possibly across multiple data center locations.

How can you demonstrate to auditors that your data is secure if you don’t know exactly where it resides? The answer lies in having clear visibility into database activity relative to applicable regulations. You need to:

  • Put the necessary policies in place to meet compliance requirements;
  • Audit your databases against your policies and against all the regulations that apply to you, whether the data resides in a cloud environment or not; and
  • Make sure you can generate all the reports on database activity that you need to demonstrate compliance to auditors.

At Buda Consulting we use automated tools, including Application Security’s AppDetective Pro, to assess the vulnerability of clients’ databases and audit them against a host of regulations. The following list from the AppDetective Pro documentation describes some of the key audit policies that we check in regulated environments:

  • Basel II – ideal for a Basel II compliance assessment
  • Best Practices for Federal Government
  • DISA-STIG Database Security Configuration – leverages the configuration parameters outlined by the DISA-STIG for SQL Server and Oracle
  • Gramm-Leach-Bliley Act – structured according to GLBA standards and recommended for GLBA compliance assessment
  • HIPAA – structured following NIST standards and best practices for database security; highly recommended for use in a HIPAA compliance assessment
  • PCI Data Security Standard – recommended for use in PCI compliance assessments
  • Sarbanes-Oxley – follows CoBIT and ISO 17799 standards; recommended for use in a SOX compliance assessment

Using tools like AppDetective Pro, auditors and advisors can perform a database security assessment against the organization’s policies and against applicable regulations, capture results for manual verification, and generate compliance reports.

Some of the scans will be difficult or impossible to run in a cloud environment without the assistance of the cloud provider. In particular, scans that require privileged operating system accounts will not be possible without cloud provider cooperation.

Therefore, it is important to obtain documentation from the cloud provider ensuring that they have the necessary controls in place to satisfy the applicable regulations.

This may be more difficult than it sounds. Some cloud providers refuse to give out any information about their security policies and procedures, indicating that doing so may compromise security. Others may withhold specifics but instead point to the fact that they have undergone a SAS 70 Type II audit. While passing a SAS 70 Type II audit can be a valuable criterion to use when evaluating a provider, you must be sure to review which controls are included in that audit. These audits do not have to include every control that may be important to the pertinent regulations impacting your business.

Contact Buda Consulting to learn more about how to ensure the security of your data in the cloud.