When You Should Use TDE vs Always Encrypted

Microsoft SQL Server and Microsoft Azure SQL Database offer two complementary encryption options: Transparent Data Encryption (TDE) and Always Encrypted. This blog post will help you decide when to use TDE versus Always Encrypted, and when to combine them for a “defense in depth” security and compliance strategy.

When to use Transparent Data Encryption

Transparent Data Encryption (TDE) protects data at rest, such as backups on physical media. It prevents access to data in scenarios like improper disposal of disk drives or attempts to restore databases from snapshots or copies.

TDE helps companies comply with regulations that mandate encryption of data at rest, such as HIPAA and GDPR. As a general rule, it’s appropriate to enable TDE for any SQL database, unless its data has no protection requirement at all. 

TDE encrypts the full SQL Server database in a manner that doesn’t require changes to the application. Encryption and decryption of the data and log files are performed in real-time. 
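As a rough illustration, here is a minimal T-SQL sketch of how TDE is typically enabled on a SQL Server instance you manage yourself; the database name (SalesDB), certificate name, and password below are placeholders rather than values from this article.

  -- In master: create a database master key and a certificate to protect the DEK
  USE master;
  CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<UseAStrongPasswordHere>';
  CREATE CERTIFICATE TDE_Cert WITH SUBJECT = 'TDE protector certificate';

  -- In the user database: create the database encryption key and turn TDE on
  USE SalesDB;
  CREATE DATABASE ENCRYPTION KEY
      WITH ALGORITHM = AES_256
      ENCRYPTION BY SERVER CERTIFICATE TDE_Cert;
  ALTER DATABASE SalesDB SET ENCRYPTION ON;

Be sure to back up the certificate and its private key; without them, encrypted backups cannot be restored. In Azure SQL Database, service-managed TDE is enabled by default on new databases, so no setup script is needed.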

However, TDE offers no protection for the data once it resides in memory. This leaves it vulnerable to insider threats and credential-theft attacks that leverage administrator (DBA) accounts such as sysadmin, or other roles and applications that are authorized to access the database.

When to use Always Encrypted

To protect data in memory from identity/credential-based attacks, businesses can use Always Encrypted, which keeps sensitive data in specific database columns encrypted even while it is in memory or “in use” during computations. The data remains protected even if the entire system is compromised, e.g., by ransomware. Attacks that involve scanning the memory of the SQL Server process or attempting to extract data from a memory dump are also ineffective against Always Encrypted.

 Always Encrypted allows SQL Server users to reduce the risk of storing data in the cloud, or to leverage third-party vendors for DBA services without violating compliance requirements.

However, Always Encrypted relies on a client-side database driver within an application to encrypt sensitive data before sending it to the database and to decrypt encrypted data in query results. Reliance on a client-side driver means that applications may require changes to work within Always Encrypted’s requirements and restrictions. For example, Always Encrypted supports only a few simple operations on encrypted database columns. This tends to limit its use to higher-risk sensitive data and scenarios, such as:

  • Protecting personal data like customer names and credit card numbers, especially in regulated industries
  • Improving security when outsourcing DBA services
  • Improving security of data in transit and in use beyond what TLS alone can offer

A good rule of thumb: Always Encrypted works best for protecting sensitive data that you need to store but don’t need to search on or display to application users. Beginning with SQL Server 2019 (15.x), Always Encrypted supports secure enclaves, which removes some of the limitations on operations you can perform on encrypted data.
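As a hedged sketch (with hypothetical table and key names), an Always Encrypted column is declared in T-SQL with an ENCRYPTED WITH clause that references a column encryption key; the encryption and decryption themselves happen in the client driver:

  -- Assumes a column master key and column encryption key (CEK_Auto1) already exist
  CREATE TABLE dbo.Customers (
      CustomerId   INT IDENTITY(1,1) PRIMARY KEY,
      CustomerName NVARCHAR(100),
      CreditCardNo CHAR(16) COLLATE Latin1_General_BIN2   -- BIN2 collation required for deterministic encryption
          ENCRYPTED WITH (
              COLUMN_ENCRYPTION_KEY = CEK_Auto1,
              ENCRYPTION_TYPE = DETERMINISTIC,
              ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
          )
  );

The application then turns on column encryption in its connection settings (for example, Column Encryption Setting=enabled with the .NET driver) so the driver can encrypt parameters and decrypt results transparently.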

Using Transparent Data Encryption and Always Encrypted together

To create a “defense in depth” or layered encryption protocol for your data, TDE and Always Encrypted can be used together alongside Transport Layer Security (TLS). 

In this scenario, TDE acts as the defensive front line by encrypting the full database at rest, and may suffice to meet compliance requirements. TLS then encrypts data as it is transferred over a network. Finally, Always Encrypted protects the most sensitive data from privileged user attacks, malware that has compromised the database environment, and other threats against the data while it is in use.
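As a quick sanity check of the data-in-transit layer, a DBA can confirm that sessions really are using encrypted connections. A small SQL Server sketch (it verifies usage, not the TLS configuration itself):

  -- Lists current connections and whether each one is encrypted (TRUE/FALSE)
  SELECT session_id, net_transport, encrypt_option, client_net_address
  FROM sys.dm_exec_connections;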

TDE works with SQL Server 2008 and above as well as Azure SQL Database, but on SQL Server it requires Enterprise Edition (Standard Edition adds TDE support beginning with SQL Server 2019). Always Encrypted works with all editions of SQL Server 2016 (13.x) SP1 and above, plus Azure SQL Database. Both TDE and Always Encrypted are free in Azure SQL Database.

Next steps

Want to talk with a database security expert before you implement TDE versus Always Encrypted? Contact Buda Consulting to schedule a free consultation.

Oracle Database Assessment: Here’s What to Focus On

Organizations need to keep a close watch on Oracle operations to ensure agreed service levels are always being met. Database downtime can quickly lead to financial and reputational impacts, making periodic Oracle database assessments integral to the smooth operation of your most critical business systems—and thus your company itself.

Also called Oracle database health checks, Oracle database assessments are part of creating what we like to call a boring database environment: no surprises and no downtime. This peaceful state doesn’t happen by accident, but requires planning and commitment to best practices.

This post explains what an Oracle database assessment should mainly focus on.

What to Check

Oracle database assessments can potentially include a wide range of tasks and probes, some of which might come under the heading of performance tuning, security vulnerability testing, or everyday DBA tasks (e.g., patching).

But to be effective, an Oracle database assessment needs to cover all the key installation, configuration, and policy factors that help improve uptime and/or prevent downtime. Even currently minor issues can cascade towards failure if left unchecked.

Some of the most important parameters and elements in your database environment to review and optimize include the following (a few sample queries appear after the list):

  • Alert logs and trace files, to see if any events show up that point to potential database problems
  • Database maintenance procedures, to validate that best practices are being consistently followed
  • Parameter settings, to look for values that can negatively impact performance, security, stability, etc.
  • Data block validation, to identify corrupt blocks and missing files, which are prime causes of database outages
  • Invalid objects, which can proliferate and hurt performance and stability
  • Index and tablespace fragmentation, both top causes of degrading database performance
  • Important file configurations (datafiles, redo log files, and archive log files), to ensure database file and backup file integrity and prevent data loss and crashes
  • Memory, CPU, and disk usage, to proactively address low-resource conditions that can impact performance and stability
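To make a few of these checks concrete, here is a hedged sketch of the kinds of queries a DBA might run during an assessment of an Oracle database; it assumes a reasonably current Oracle release and access to the DBA and V$ views:

  -- Invalid objects that can hurt performance and stability
  SELECT owner, object_type, object_name
  FROM   dba_objects
  WHERE  status = 'INVALID';

  -- Corrupt blocks recorded by RMAN backups or validation
  SELECT file#, block#, blocks, corruption_type
  FROM   v$database_block_corruption;

  -- Simplified look at free space per tablespace
  SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024) AS free_mb
  FROM   dba_free_space
  GROUP  BY tablespace_name
  ORDER  BY free_mb;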

In-house or Outsource?

Oracle database assessments require significant expertise and attention to detail, especially if your environment is complex with many interrelationships. While in-house DBAs can perform Oracle database assessments, a fresh set of unbiased eyes from outside your organization can add a valuable perspective, while also offering expert guidance and sharing best practices.

Expect a Detailed Report

Whether you perform your Oracle database assessment in-house or outsource it, stakeholders should expect a comprehensive report that documents and prioritizes areas of concern and recommends best-practice next steps in line with business goals. 

What About Database Security?

In our experience, database security is often overshadowed by other security priorities. Yet database security protects the lifeblood of your business—its sensitive data—and must be a core part of your overall cybersecurity program and strategy.

Because of database security’s importance and complexity, it makes sense to conduct Oracle database security assessments as an adjunct to your Oracle database assessments. A holistic approach that secures the data, the database configuration, identities and access, the network, the database server, and the physical environment is key to eliminating vulnerabilities and mitigating business risk.

Some database security “quick wins” we often recommend to clients include making the best use of Oracle’s built-in security features, which you’re already paying for as part of your database package. This includes downloading the Oracle Database Security Assessment Tool (DBSAT). This free tool scans your database and gives you a security profile covering your overall database security configuration, users and entitlements, and sensitive data identification.
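Alongside a DBSAT scan, a couple of simple queries can surface common quick-win findings. A minimal sketch, assuming DBA-level access on a supported Oracle version:

  -- Accounts still using a default password shipped with Oracle
  SELECT username
  FROM   dba_users_with_defpwd;

  -- Who holds the powerful DBA role
  SELECT grantee
  FROM   dba_role_privs
  WHERE  granted_role = 'DBA';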

What’s Next?

Based on decades of experience helping our clients keep their databases stable and running optimally, Buda Consulting offers a 35-point Oracle database assessment that is reliable, thorough, and unbiased, and that keeps your in-house DBAs focused on other essential tasks. Contact us to schedule time with an Oracle expert to talk over your situation, goals, and concerns.

How CIS benchmarks plug Cybersecurity Framework Gaps

While a good cybersecurity framework specifies controls to mitigate information-related risk across the full life cycle of critical data, in practice I have observed that many organizations’ framework implementations tend to focus on the networks, the servers, and the applications. This lack of database focus exposes an organization’s mission-critical data to unnecessary risk. This post is intended to bring the issue to the forefront and to suggest that having a database professional implement the relevant CIS database benchmark can help ensure the database is secure even if a particular risk was not identified by the security framework implementation team.

 Server, Network, and Application Bias

In my experience, most organizations pay more attention to the security of servers, networks, and, to a lesser degree, applications than to the security of their databases. In fact, a firm we work with that specializes in helping customers implement security frameworks told me that they see database administrators involved in only about 5% of cybersecurity framework implementations!

 Because of this bias, and because of the absence of database experts in the process, when security implementers examine the controls in the cyber security frameworks and specify corrective or preventive actions to take, they tend to neglect the database.

This bias toward non-database components of an organization’s IT infrastructure is evident even in the introduction of the well-respected NIST Special Publication 800-53A r5, which describes its target audience as follows:

  • Individuals with system development responsibilities (e.g., program managers, system designers and developers, systems integrators, information security engineers and privacy engineers);
  • Individuals with information security and privacy assessment and monitoring responsibilities (e.g., Inspectors General, system evaluators, assessors, independent verifiers/validators, auditors, analysts, system owners, and common control providers);
  • Individuals with system, security, privacy, risk management, and oversight responsibilities (e.g., authorizing officials, chief information officers, senior information security officers, senior agency officials for privacy/chief privacy officers, system managers, information security and privacy managers); and
  • Individuals with information security and privacy implementation and operational responsibilities (e.g., system owners, common control providers, information owners/stewards, mission and business owners, system administrators, system security officers, and system privacy officers).

 Conspicuously missing from that long list of individuals mentioned as responsible for information security are Database Administrators. But the database is arguably the most important part of the environment to secure. This is where the data lives!

  Why choose CIS benchmarks as database security guidelines?

 The Center for Internet Security is an independent non-profit organization that provides frameworks for keeping organizations safe from cyber threats. These frameworks include lists of controls that protect the organization from internal or external threats.  CIS also provides benchmarks that are essentially configuration guides used to assess and improve the security of specific applications, databases, or operating systems.

 Fortunately, the CIS database benchmarks are just that — database benchmarks. They prescribe vendor-specific configuration settings that need to be set to mitigate known vulnerabilities. 

CIS benchmarks are a faster and more certain path to database security. They provide a more prescriptive approach to satisfying the key data security objectives of cybersecurity frameworks like NIST, ISO 27001, and CMMC.

The CIS database security benchmarks provide a specific set of configuration guidelines one must follow to eliminate or mitigate known vulnerabilities in the target database, operating system, or application. Carefully following these guidelines can fill potential gaps that may remain after an organization determines which controls need to be implemented to satisfy the requirements of the framework and manage information-related risk effectively.
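To give a flavor of how prescriptive the benchmarks are, here is a sketch of a typical SQL Server check in the spirit of the CIS recommendation that the xp_cmdshell option be disabled; the authoritative wording and expected values are in the benchmark document itself:

  -- A hardened instance is expected to return value_in_use = 0
  SELECT name, CAST(value_in_use AS INT) AS value_in_use
  FROM   sys.configurations
  WHERE  name = 'xp_cmdshell';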

Database-First Security

I believe that an important part of the fight against ever-increasing cyber threats is to focus intently on securing the database.   Applying proper controls at the database level first ultimately requires that controls be applied properly at other layers required by the frameworks.

For example, properly limiting user privileges inside the database (by role) forces designers and administrators to implement role-based security and OS authentication in a more thoughtful way. Likewise, thoughtfully limiting OS system privileges and DBA privileges at the database level forces system administrators to allocate privileged accounts more carefully, enforcing principles like segregation of duties.
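As a simple sketch of what limiting privileges by role can look like (the role, schema, table, and user names here are hypothetical), privileges are granted to a role rather than directly to individual accounts:

  -- Oracle-style syntax: create a role, grant it only what the job requires,
  -- then grant the role to users
  CREATE ROLE claims_readonly;
  GRANT SELECT ON claims.policies TO claims_readonly;
  GRANT SELECT ON claims.payments TO claims_readonly;
  GRANT claims_readonly TO jsmith;

SQL Server works the same way conceptually, although the last step uses ALTER ROLE ... ADD MEMBER rather than a GRANT.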

If you are leveraging Oracle, MS SQL Server, MySQL, or MongoDB to hold mission-critical or sensitive data, I strongly recommend that you use the CIS benchmarks as a complement to any cybersecurity framework.

The CIS benchmarks are available for free here. Contact us today for a free, no-obligation consultation.

5 Database Security Risks You Probably Don’t Know You Have

I recently appeared on an episode of The Virtual CISO Podcast hosted by my friend John Verry titled “Confronting the Wild West of Database Security.” In our conversation, I emphasized that despite the criticality of the data involved, many companies fail to appreciate the cybersecurity risks associated with their databases. They simply don’t realize how big their database attack surface really is.

Here are 5 significant threats to your databases that we often find our clients are unaware of.

One: Inconsistent user account management

A great many of the database vulnerabilities we see relate to sloppy, inconsistent, or ad hoc management of user accounts and login profiles. Issues with privileged users, obsolete accounts, and default passwords in use very often slide under the radar. This potentially leaves the door open for unwelcome guests to pay a visit to your database. 
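A practical first step is simply to inventory accounts and their status. For example, on Oracle 12c or later (a sketch, assuming access to the DBA views):

  -- Locked, expired, and long-dormant accounts are candidates for cleanup
  SELECT username, account_status, profile, last_login
  FROM   dba_users
  ORDER  BY last_login NULLS FIRST;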

Two: Non-masked data in QA and dev environments

It’s scary how often we see non-masked data used in dev/test scenarios. In many cases, the production environment is well secured, but the development and QA environments are much less well secured. Yet the same data is being used in both. There’s no reason for this given the plethora of tools available for masking or obfuscating data.

Besides increasing the risk of data exfiltration, this is a potential compliance violation. Depending on your regulatory environment and the nature of the non-masked data (e.g., financial, medical, or other sensitive personal data), just the fact that you’re retaining that data outside the production environment, where it’s accessible to QA engineers and others who don’t have a legitimate reason to access it, could be deemed a data breach.
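As a minimal illustration of the idea (using a hypothetical Customers table in a dev schema), a refresh script can overwrite sensitive columns with synthetic values before the copy is handed to dev or QA; purpose-built masking tools do this far more thoroughly:

  -- Obfuscate personal data in the dev copy right after it is restored
  UPDATE dev.Customers
  SET    email       = CONCAT('user', CustomerId, '@example.com'),
         phone       = '555-0100',
         credit_card = NULL;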

Another danger is that code and data in dev/test environments frequently end up on developers’ local machines, which greatly increases the risk of data loss or a breach. On the podcast, John recalled an incident where a developer working for the City of New York dumped about 500,000 unmasked HR records onto his laptop, which he then left behind at a Korean restaurant. That ended up costing the city $23 million.

Three: Database sprawl

An extremely common but frequently disregarded threat to database security is database sprawl. The more databases you have, the more likely some will have unmitigated vulnerabilities that lead to compromise.

And as bad as database sprawl is on-premises, it’s exponentially worse in the cloud where everything is virtualized. It’s just too easy sometimes to spin up databases and then forget about them. Organizations need policies and processes to reduce the risk (not to mention the wasted money) from database sprawl.

Four: Pipeline leakage

A little-known database security concern that we are seeing more and more frequently is what I call “pipeline leakage.” I’m not a DevOps expert, but in my view pipeline leakage creates a very significant risk in the DevOps/CI/CD and data engineering worlds.

Here’s what happens: Data gets taken out of a very well-protected database. Then, teams create XML, CSV, or JSON files that hold some of the data and put it somewhere else. Now it’s in temporary files or holding areas or spreadsheets that are scattered all over the place. Is the data still secure? Who knows? Teams need to be aware of this issue and clean up their processes to close this hole.

Five: Insider threats

Insider threats, both intentional and unintentional, are the root cause of something like 50% of data breaches. Whether they result from revenge, greed, or a user clicking a malicious link designed to harvest their credentials, insider attacks often target databases because of all the valuable data they contain. Yet many organizations underestimate the prevalence of insider threats and their potential impact.

To protect a database from insider threats, you need a way to log and detect activity against the database, both authorized and unauthorized (i.e., user activity monitoring). Then you need a way to alert on potential issues and investigate them. Finally, you need preventive controls like robust identity & access management (IAM) policies, such as quickly deleting unused accounts and only authorizing access to sensitive data for those who really need it. 
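On SQL Server, for example, the user activity monitoring described above can start with a server audit plus a database audit specification on the most sensitive tables. A sketch with placeholder object names and file path:

  -- In master: define a server-level audit that writes to a file target
  CREATE SERVER AUDIT InsiderThreatAudit
      TO FILE (FILEPATH = 'D:\SQLAudit\');
  ALTER SERVER AUDIT InsiderThreatAudit WITH (STATE = ON);

  -- In the user database: capture reads and changes on a sensitive (hypothetical) table
  CREATE DATABASE AUDIT SPECIFICATION SensitiveDataAccess
      FOR SERVER AUDIT InsiderThreatAudit
      ADD (SELECT, UPDATE, DELETE ON dbo.Customers BY public)
      WITH (STATE = ON);

Oracle offers the equivalent through unified audit policies, and either way the audit trail only helps if someone reviews it and alerts on anomalies.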

Now That You Know About These Database Security Risks, What’s Next?

The most comprehensive way to identify and prioritize your database security risks is a database security assessment. This cost-effective process covers everything from policies to user rights to auditing your databases for vulnerabilities with automated tools.

For more information on how a database security assessment can reduce your security and compliance risk, contact Buda Consulting.

 

How is Database Security NOT Like a Bank Vault?

As John Verry and I discussed in a recent Virtual CISO Podcast episode, many people think of database security the way they think of bank vaults. They secure the perimeter, place the valuables in the vault (the database), and then assume those valuables are as safe as they would be in a bank vault.

How Database Security and Bank Vaults Differ

If a thief gets past the physical security at the bank (doors, locks, window bars, alarms), they will still have a very difficult time getting into the vault (unless it is an inside job — more on that later).

But when we think of a database as a bank vault, we are missing an important difference. Bank vaults have a single point of entry, and it is secured by a complex locking mechanism that will thwart all but the most talented criminals.

Databases, on the other hand, have many points of entry: numerous administrator accounts, potentially hundreds of database user accounts, application accounts, operating system accounts with access to the underlying data files, accounts in other databases that have access to database links into your database, network sniffers. The list goes on and on.

The Manufacturer Takes Care of That! Or Do They?

A bank vault manufacturer ensures that all of the seams on the vault are sealed properly and that all of the walls are resistant to power tools. They ensure that, in essence, there is only one point of entry. Providing more than one point of entry would render the vault less secure, and therefore less useful.

Manufacturers of database software, on the other hand, work hard to provide as many points of entry as possible: user accounts, web services, database links, export utilities. Providing only one point of entry would render the database much less useful.

If Not the Manufacturer, Then Who?

It is clear from these competing interests that the database software manufacturer is not, and cannot be, responsible for securing the database. Nor can the database host, whether that is an MSP or cloud provider supplying the server on which the database runs, or the database host in a PaaS (platform as a service — think RDS or Aurora) environment. All of these parties must provide as many points of entry as possible in order to make their databases valuable to the broadest set of customers.

Bringing the security of a database even within reach of that of a bank vault requires database and security professionals who have a clear understanding of all of the entry points and who work closely with the data owners or the data security team.

More on the Inside Job

I promised more on the inside job earlier: Whether we are talking about a bank vault or a database, an insider with bad intent can render many security controls ineffective.

In a bank, an insider with the vault combination can easily bypass the most challenging part of getting to the valuables. They don’t even need a blow torch or explosives.

With a database, it is more complicated than that: An insider with a password can easily bypass both the perimeter security and the database security. At first glance this makes the database appear less secure than the bank vault, and it is — most of the time.

How to Close the Gap

There are many things that can and must be done to ensure that a database is and remains secure, including patching to remove known vulnerabilities, ensuring proper disaster recovery is in place, and ensuring that encryption at rest and in transit is used. This article is about how a database differs from a bank vault, so I will only mention the points relevant to that comparison here.

In order to close the gap between the security of the bank vault and that of the database, we must eliminate or lock all unused entry points, and restrict access and track use of the remaining entry points.

While databases have many entry points, most enterprise-level database platforms, when configured properly, offer very granular levels of control. By combining these granular controls with solid security procedures, we can significantly tighten the security of the database.

Some Examples:

  • Restrict the times of day during which a given username can access the database and the machines from which it can connect (see the sketch after this list).
  • Audit all activity in the database, including the username, machine used, action performed, timestamp, and other details, and take action quickly when suspicious activity appears in order to limit the damage.
  • Insist that all privileged database users have individual usernames, that they are protected by two-factor authentication, and that their passwords are robust.
  • Have procedures in place and enforced to remove access immediately as part of the termination process for employees or contractors.
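As a hedged sketch of the first item above, an Oracle logon trigger can enforce time-of-day and source-machine restrictions for a specific username; the account and host names here are hypothetical:

  -- Block a particular account outside business hours or from unapproved machines
  CREATE OR REPLACE TRIGGER restrict_report_user
  AFTER LOGON ON DATABASE
  BEGIN
    IF SYS_CONTEXT('USERENV', 'SESSION_USER') = 'REPORT_USER' THEN
      IF TO_NUMBER(TO_CHAR(SYSDATE, 'HH24')) NOT BETWEEN 8 AND 18
         OR SYS_CONTEXT('USERENV', 'HOST') NOT IN ('APPSRV01', 'APPSRV02')
      THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Logon not permitted at this time or from this machine.');
      END IF;
    END IF;
  END;
  /

Profiles, SQL Server logon triggers, and built-in auditing features cover similar ground on other platforms.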

These practices, together with others, can render a database as safe (or perhaps even safer) than a bank vault.

But when a database instance is spun up by a developer with no Database Administration or security expertise, it is more accurate to compare the database to a cash register at the local convenience store than to a bank vault.

Listen to the conversation that John Verry and I had about this and other database security topics, and let us know what you think about how well people are protecting their critical data assets.

For more information or to schedule a consultation, please click here to contact Buda Consulting.

Need Continuous Database Protection across Oracle and SQL Server? Consider Dbvisit Standby MultiPlatform.

The availability of your database environment is business-critical, and without continuous database protection you can’t ensure business continuity. Yet it’s only a matter of time before you experience a failure. When (not if) that happens, will you be ready?

When it comes to disaster recovery, many businesses rely on conventional backup/restore procedures to protect their database from risks like operational failures, cyber-attack, disaster impacts, and data corruption. But restoring from traditional backups can be slow, taking hours or even days. Restoring from backups is also notoriously failure-prone because testing and validation are usually infrequent. Plus, depending on how frequently backups occur, you could lose hours’ worth of the most recent changes to your data.

If your organization requires rapid, resilient disaster recovery and business continuity capabilities and/or cannot tolerate data loss, you may want to consider a standby database configuration. A standby database is a copy of the primary database, usually at a remote location. It updates continuously to minimize data loss and can quickly “failover” to support ongoing operations if the primary database goes down or is corrupted.

Why use a standby database for disaster recovery and continuous database protection?

A standby server has several important advantages over traditional backup/restore tools for disaster recovery and data loss prevention:

  • It is always operational and available in seconds, not hours or days, so you can recover more quickly.
  • It minimizes potential data loss by updating continuously with minimal time lag.
  • Its operational readiness is constantly verified, which guarantees database integrity after failover.
  • It enables you to test your disaster recovery plan much more easily, with minimal risk or impact to your primary database and the applications that rely on it.
  • It can be offsite, geographically distant, and running on separate infrastructure from your primary database, which reduces disaster risk in the event of operational failure at your production site.
  • You can enjoy peace of mind knowing that your database is always backed up and can be restored or recovered at any time with no surprises.

In short, a standby database can be an ideal solution for organizations that want to ensure continuous database protection to minimize downtime, data loss, and business risk.

Meet Dbvisit, Buda Consulting’s standby database partner

Buda Consulting has considerable experience helping organizations implement backup/restore, high availability and disaster recovery solutions for their databases on Oracle, Microsoft SQL Server and open-source platforms. We have found our longtime partner Dbvisit to be a world-class standby database solution provider whose solutions are easy to use, cost-effective and backed by great customer service. Our customers of all sizes love Dbvisit, which is why we’re sharing this blog post.


We’re especially excited to share with our client base that Dbvisit now offers the industry’s first multiplatform option. Called StandbyMP, it enables you to manage standby databases for Oracle and SQL Server through a single pane of glass. Imagine confronting an outage and being able to failover all your databases automatically or with a single click! PostgreSQL support is also coming soon in 2022.

Another big advantage of Dbvisit solutions is you can deploy them on-premises, in a public cloud or on hybrid cloud. Supported public clouds include Amazon Web Services (AWS), Microsoft Azure and Oracle Cloud.

Gold Standard Disaster Recovery and Continuous Database Protection

The folks at Dbvisit are disaster recovery specialists, with thousands of customers in 120 countries and offices in North America, Europe and Asia Pacific. While they serve some of the world’s leading enterprises, including Verizon, Barclays, 7-Eleven, the US Navy, Volkswagen, PWC and CBS, Dbvisit’s exceptional support and industry-leading total cost of ownership (TCO) make them a great choice for small to midsized businesses (SMBs) as well.

According to Neil Barton, CTO of Dbvisit, “Dbvisit Standby guarantees database continuity through a verified standby database that is always available and ready to take over at the moment you need it.” Even if your most trusted DBA is on vacation when an emergency occurs at 3AM, your database(s) will be protected from contingencies ranging from human error to hardware failure to hurricanes to hackers.

Dbvisit Standby solutions for Oracle and/or SQL Server promise minimal data loss (a maximum of approximately 10 minutes) and fast database recovery/failover (within a few minutes). Continuous exercising and testing maintains and validates the integrity of your standby database 24×7. This is what Dbvisit calls “Gold Standard Disaster Recovery.” It offers the following value propositions:

  • Database integrity with a verified standby database that is identical to the primary database and fully operational to ensure successful failover
  • Resilience to meet your recovery requirements across all outage and disaster scenarios
  • Automated and intuitive to eliminate manual processes, opportunities for error and dependence on highly skilled staff
  • Decision simplification to “de-stress DR”
  • Near-zero data loss
  • Cost-efficiency and low risk


Dbvisit lives up to its motto: “We believe nothing should stand in the way of your business moving forward.”

Dbvisit StandbyMP: Enterprise-class DR for multiple database platforms

Using different disaster recovery tools and processes across multiple database types has always been complex. Dbvisit’s new StandbyMP offering promises to reduce this complexity and for the first time allow customers to manage DR processes for SQL Server and Oracle SE databases through a single console. We are very excited about the multi-platform concept and are looking forward to the addition of PostgreSQL and other popular databases soon.

Prioritizing risk reduction, disaster resiliency, recovery speed and ease of use, StandbyMP delivers rapid time-to-benefit, ease of administration and automated, on-demand failover. Dbvisit guarantees database continuity and radically reduces database risk with a consistent, “Gold Standard” approach to protecting both Oracle and SQL Server databases.

“Our software costs the equivalent of two minutes’ downtime,” said Tim Marshall, Product Marketing Manager, in a recent Dbvisit blog post. “Great doesn’t have to be expensive.”

Dbvisit highlights these key value propositions for its StandbyMP solution:

  • Simplify – Control your Oracle and SQL Server disaster recovery configurations from a single central console
  • Speed up – Multi/concurrent database actions accelerate recovery across both Oracle and SQL Server
  • Risk down – Automation removes manual processes, hard-to-maintain scripts, and opportunities for error
  • Level up – Simplify your disaster recovery plans and ensure best practices are implemented across all your databases

Next steps

An industry-leading standby database solution like Dbvisit StandbyMP can be the perfect way to continuously protect your critical data—but it’s not right for every database. To connect with an expert on whether a standby database makes sense for your business, contact Buda Consulting to schedule a 15-minute conversation.

For more information on Dbvisit solutions and services, check out Dbvisit.com.