Advantages and Disadvantages of Thin Provisioning

Thin provisioning uses virtualization technology to allocate disk storage capacity on demand as your needs increase. Thick provisioning is the counterpart strategy of pre-allocating storage capacity upfront when you create a virtual disk drive.

Thin provisioning creates the illusion of more physical resources than are available in reality. For example, you can assign 1TB of virtual disk space to each of the 2 development teams, while actually allocating only 500GB of physical storage. With thick provisioning, you would need to start with 2TB of physical storage if you wanted to assign 1TB to each of those 2 teams.
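
To make that arithmetic concrete, here is a toy Python sketch of the oversubscription ratio that thin provisioning creates. The function name and numbers are illustrative only, matching the example above:

```python
# Toy illustration of thin-provisioning oversubscription.
def oversubscription_ratio(virtual_allocations_gb, physical_capacity_gb):
    """Ratio of provisioned virtual capacity to physical capacity.
    A ratio greater than 1.0 means the pool is thin-provisioned."""
    return sum(virtual_allocations_gb) / physical_capacity_gb

# Two teams, 1TB (1000GB) each, backed by only 500GB of physical storage:
ratio = oversubscription_ratio([1000, 1000], 500)
print(ratio)  # 4.0 (the pool is oversubscribed 4:1)
```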

Before you make a decision, here are its top 2 advantages and its top 2 disadvantages:

Advantage #1: Optimizing your storage utilization

In environments where storage is shared, thin provisioning lets you optimize the use of your available storage, so it’s not sitting idle. For example, say you assign a 2TB drive to an application that ends up using only 1TB. In this configuration, another application can leverage the unused storage. With thick provisioning, that unused capacity is never utilized.

Advantage #2: Scaling up cost-effectively

As long as you are monitoring and managing storage effectively and can confidently predict usage trends, thin provisioning lets you incrementally add more storage capacity as needed and not buy more than you need for the immediate future. 

Disadvantage #1: Increased downtime and data loss potential

Most approaches don’t automatically account for your growing storage needs—putting your environment at significant risk for storage shortages and associated downtime issues when the volume of virtual storage provisioned exceeds the physical disk space available. This includes crashes and/or data loss on your virtual drives that can hurt user productivity and customer experience while leaving your DBAs with a big mess to clean up.
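
As a rough illustration of the kind of monitoring this requires, a script might flag an oversubscribed pool before it runs out of physical space. This Python sketch is purely illustrative; the function name and the 80% warning threshold are assumptions, not part of any particular storage product:

```python
# Hypothetical thin-pool monitor; names and thresholds are illustrative only.
def pool_at_risk(provisioned_gb, physical_used_gb, physical_capacity_gb,
                 warn_fraction=0.8):
    """Return True when the pool is oversubscribed AND physical usage has
    crossed the warning threshold, i.e. growth could exceed real capacity."""
    oversubscribed = sum(provisioned_gb) > physical_capacity_gb
    nearly_full = physical_used_gb >= warn_fraction * physical_capacity_gb
    return oversubscribed and nearly_full

# 2TB provisioned against 1TB physical, 850GB already consumed:
print(pool_at_risk([1000, 1000], 850, 1000))  # True (time to add disks)
```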

Disadvantage #2: Lack of elasticity

As just noted, thin provisioning is great for helping you scale up your storage environment cost-effectively. But it doesn’t work in reverse. If your applications need less storage, you may need to reduce the allocations manually as part of your storage monitoring and management program, unless your array controller or other technology can handle that for you.

Choose thin provisioning based on the use case

As a general rule, whether to use thin or thick provisioning depends on the balance of resources used versus allocated for your specific use case. Thin provisioning is much safer and more efficient when the resources you actually need are significantly less than what you plan to allocate. Thick provisioning is a better choice when the resources you use are close to what you allocate.

This is why Buda Consulting doesn’t recommend thin provisioning for production storage except for clients that can consistently manage and forecast data storage needs. Otherwise, thin provisioning can lead to major problems and expenses that outweigh the cost savings associated with improved storage utilization. However, it can be a good option for many businesses when used in development, testing, or other non-production scenarios.

Next steps

For expert advice on how you can best leverage thin or thick provisioning, or to explore a range of options for making the best use of physical and virtual storage in your unique environment, contact Buda Consulting.


When You Should Use TDE vs Always Encrypted

Microsoft SQL Server and Microsoft Azure SQL Database offer two complementary encryption options: Transparent Data Encryption (TDE) and Always Encrypted. This blog post will help you decide when to use TDE versus Always Encrypted, and when to combine them for a “defense in depth” security and compliance strategy.

When to use Transparent Data Encryption

Transparent Data Encryption (TDE) protects data at rest, such as backups on physical media. It prevents access to data in scenarios like improper disposal of disk drives or attempts to restore databases from snapshots or copies.

TDE helps companies comply with regulations that mandate encryption of data at rest, such as HIPAA and GDPR. As a general rule, it’s appropriate to enable TDE for any SQL database, unless its data has no protection requirement at all. 

TDE encrypts the full SQL Server database in a manner that doesn’t require changes to the application. Encryption and decryption of the data and log files are performed in real-time. 

However, TDE offers no protection for the data once it resides in memory. This leaves it vulnerable to “insider threats” and credential-theft attacks via administrator (DBA) accounts, such as sysadmin, or via other roles and applications that are authorized to access the database.

When to use Always Encrypted

To protect data in memory from identity/credential-based attacks, businesses can use Always Encrypted, which encrypts sensitive data in specific database columns in memory or “in use” during computations. The data remains protected even if the entire system is compromised, e.g., by ransomware. Attacks that involve scanning the memory of the SQL Server process or attempting to extract data from a memory dump are also ineffective against Always Encrypted.

 Always Encrypted allows SQL Server users to reduce the risk of storing data in the cloud, or to leverage third-party vendors for DBA services without violating compliance requirements.

However, Always Encrypted relies on a client-side database driver within an application to encrypt the requested data before sending it to the database and to decrypt encrypted data in query results. Reliance on a client-side driver means that applications may require changes to work with Always Encrypted’s requirements and restrictions. For example, Always Encrypted supports only a few simple operations on encrypted database columns. This tends to limit its use to only higher-risk sensitive data and scenarios, such as:

  • Protecting personal data like customer names and credit card numbers, especially in regulated industries
  • Improving security when outsourcing DBA services
  • Improving the security of data in transit and in use beyond what SSL alone can offer

A good rule of thumb for Always Encrypted: it works best to protect sensitive data that you need to store but don’t need to search on or display to application users. Beginning with SQL Server 2019 (15.x), Always Encrypted supports secure enclaves, which removes some of the limitations on operations you can perform on encrypted data.
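
To see why only simple operations such as equality comparison are practical on encrypted columns, consider this toy Python sketch. It uses HMAC as a stand-in for real encryption (this is not how Always Encrypted is actually implemented) purely to illustrate the trade-off between deterministic encryption, which supports equality lookups, and randomized encryption, which does not:

```python
import hmac, hashlib, os

KEY = b"demo-key"  # stand-in for a column encryption key; illustrative only

def deterministic_token(value: str) -> bytes:
    """Same plaintext -> same token, so the server can match on equality
    without ever seeing the plaintext (HMAC stands in for encryption)."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).digest()

def randomized_token(value: str) -> bytes:
    """A fresh random salt each time: stronger privacy, but server-side
    equality lookups are no longer possible."""
    salt = os.urandom(16)
    return salt + hmac.new(KEY, salt + value.encode(), hashlib.sha256).digest()

ssn = "123-45-6789"
assert deterministic_token(ssn) == deterministic_token(ssn)  # searchable
assert randomized_token(ssn) != randomized_token(ssn)        # not searchable
```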

Using Transparent Data Encryption and Always Encrypted together

To create a “defense in depth” or layered encryption protocol for your data, TDE and Always Encrypted can be used together alongside Transport Layer Security (TLS). 

In this scenario, TDE acts as the defensive front line by encrypting the full database at rest, and may suffice to meet compliance requirements. TLS then encrypts data as it is transferred over a network. Finally, Always Encrypted protects the most sensitive data from privileged user attacks, malware that has compromised the database environment, and other threats against the data while it is in use.

TDE works with SQL Server 2008 and above as well as Azure SQL Database, but prior to SQL Server 2019 it requires SQL Server Enterprise Edition (from SQL Server 2019 on, it is also available in Standard Edition). Always Encrypted works with all editions of SQL Server 2016 (13.x) SP1 and above, plus Azure SQL Database. Both TDE and Always Encrypted are free in Azure SQL Database.

Next steps

Want to talk with a database security expert before you implement TDE versus Always Encrypted? Contact Buda Consulting to schedule a free consultation.

Why We Don’t Always Do What the Customer Asks

When we ask our customers why they love working with Buda Consulting, the answer we typically get is that we listen to them and we do what they ask. (Apparently this is rare in the database support field.) And we almost always do. But not always!

The Request

When I was about fourteen, I delivered the morning paper before school in Staten Island.  One of my customers, a very nice man named Mr. Olsen, asked me to make sure he knew when the paper arrived in the morning by making noise. I would throw the paper onto the porch from my bike and he asked me to throw it harder so it would hit the door and he would hear it from inside the house. 

The Result

What he didn’t count on is that I would do exactly what he asked. The next day as I passed by his house on my bike, I threw that paper as hard as I could. It sailed over the walkway, past the steps, along the porch, and crashed right through the glass panel on the front door.  It sure made noise!

Mr. Olsen came out a bit shocked, and then said something like “a bit softer next time!” I offered to pay for the door (not knowing how I could ever afford to do that), but I was fortunate that my customer took responsibility for making a request without thinking it through, because he was a nice guy, and because I was only a kid. But I understand now that it was my responsibility to evaluate the request, make sure it was in the customer’s best interest, and either mitigate any risks or suggest alternatives.

These Days

Now, all these years later, we sometimes have customers who make requests that could end up harming them. Rather than performing the action without question, we inform them of the risk, suggest ways to mitigate that risk, and at times respectfully decline to perform the action if we feel the risk is too great.

The Takeaway

I have learned that my job was not just to deliver the newspaper. My job was to deliver the newspaper without causing damage to my customer’s home. These days, the same applies to our customers’ database systems.

Oracle Database Assessment: Here’s What to Focus On

Organizations need to keep a close watch on Oracle operations to ensure agreed service levels are always being met. Database downtime can quickly lead to financial and reputational impacts, making periodic Oracle database assessments integral to the smooth operation of your most critical business systems—and thus your company itself.

Also called Oracle database health checks, Oracle database assessments are part of creating what we like to call a boring database environment: no surprises and no downtime. This peaceful state doesn’t happen by accident, but requires planning and commitment to best practices.

This post explains what an Oracle database assessment should mainly focus on.

What to Check

Oracle database assessments can potentially include a wide range of tasks and probes, some of which might come under the heading of performance tuning, security vulnerability testing, or everyday DBA tasks (e.g., patching).

But to be effective, an Oracle database assessment needs to cover all the key installation, configuration, and policy factors that help improve uptime and/or prevent downtime. Even currently minor issues can cascade toward failure if left unchecked.

Some of the most important parameters and elements in your database environment to review and optimize include:

  • Alert logs and trace files, to see if any events show up that point to potential database problems 
  • Database maintenance procedures, to validate best practices are being consistently followed
  • Parameter settings, to look for values that can negatively impact performance, security, stability, etc.
  • Data block validation, to identify corrupt blocks and missing files, which are prime causes of database outages
  • Finding invalid objects, which can proliferate and hurt performance and stability
  • Identifying index and tablespace fragmentation, both top causes of degrading database performance 
  • Validating important file configurations like datafiles, redo log files, and archive log files to ensure database file and backup file integrity and prevent data loss and crashes
  • Memory, CPU, and disk usage review, to proactively address low resource conditions that can impact performance and stability

In-house or Outsource?

Oracle database assessments require significant expertise and attention to detail, especially if your environment is complex with many interrelationships. While in-house DBAs can perform Oracle database assessments, a fresh set of unbiased eyes from outside your organization can add a valuable perspective, while also offering expert guidance and sharing best practices.

Expect a Detailed Report

Whether you perform your Oracle database assessment in-house or outsource it, stakeholders should expect a comprehensive report that documents and prioritizes areas of concern and recommends best-practice next steps in line with business goals. 

What About Database Security?

In our experience, database security is often overshadowed by other security priorities. Yet database security protects the lifeblood of your business—its sensitive data—and must be a core part of your overall cybersecurity program and strategy.

Because of database security’s importance and complexity, it makes sense to conduct Oracle database security assessments as an adjunct to your Oracle database assessments. A holistic approach that secures the data, the database configuration, identities and access, the network, the database server, and the physical environment is key to eliminating vulnerabilities and mitigating business risk.

Some database security “quick wins” we often recommend to clients include making the best use of Oracle’s built-in security features, which you’re already paying for as part of your database package. This includes downloading the Oracle Database Security Assessment Tool (DBSAT). This free tool scans your database and gives you a security profile, including your overall database security configuration, users and entitlements, and sensitive data identification.

What’s Next?

Based on decades of experience helping our clients keep their databases stable and running optimally, Buda Consulting offers a 35-point Oracle database assessment that is reliable, thorough, and unbiased, while keeping your in-house DBAs focused on other essential tasks. Contact us to schedule time with an Oracle expert to talk over your situation, goals, and concerns.

How CIS Benchmarks Plug Cybersecurity Framework Gaps

A good cybersecurity framework specifies controls to mitigate information-related risk across the full life cycle of critical data. In practice, however, I have observed that in many organizations the framework implementation tends to focus on the networks, the servers, and the applications. This lack of database focus exposes the organization’s mission-critical data to unnecessary risk. This blog is intended to bring the issue to the forefront, and to suggest that having a database professional implement the relevant CIS database benchmark can ensure that the database is secure even if a particular risk was not identified by the security framework implementation team.

Server, Network, and Application Bias

In my experience, security considerations of servers, networks, and to a lesser degree applications are given more attention in most organizations than databases. In fact, a firm we work with that specializes in helping customers implement security frameworks told me that they see database administrators involved in only about 5% of cybersecurity framework implementations!

Because of this bias, and because of the absence of database experts in the process, when security implementers examine the controls in cybersecurity frameworks and specify corrective or preventive actions to take, they tend to neglect the database.

This bias toward non-database components of an organization’s IT infrastructure is evident even in the introduction of the well-respected NIST Special Publication 800-53A r5 document. The target audience is described as follows:

  • Individuals with system development responsibilities (e.g., program managers, system designers and developers, systems integrators, information security engineers and privacy engineers);
  • Individuals with information security and privacy assessment and monitoring responsibilities (e.g., Inspectors General, system evaluators, assessors, independent verifiers/validators, auditors, analysts, system owners, and common control providers);
  • Individuals with system, security, privacy, risk management, and oversight responsibilities (e.g., authorizing officials, chief information officers, senior information security officers, senior agency officials for privacy/chief privacy officers, system managers, information security and privacy managers); and
  • Individuals with information security and privacy implementation and operational responsibilities (e.g., system owners, common control providers, information owners/stewards, mission and business owners, system administrators, system security officers, and system privacy officers).

Conspicuously missing from that long list of individuals mentioned as responsible for information security are Database Administrators. But the database is arguably the most important part of the environment to secure. This is where the data lives!

Why choose CIS benchmarks as database security guidelines?

The Center for Internet Security is an independent non-profit organization that provides frameworks for keeping organizations safe from cyber threats. These frameworks include lists of controls that protect the organization from internal or external threats. CIS also provides benchmarks that are essentially configuration guides used to assess and improve the security of specific applications, databases, or operating systems.

Fortunately, the CIS database benchmarks are just that — database benchmarks. They prescribe vendor-specific configuration settings that need to be set to mitigate known vulnerabilities.

CIS benchmarks are a faster and more certain path to database security. They provide a more prescriptive approach to satisfying the key data security objectives of cybersecurity frameworks like NIST, ISO 27001, and CMMC.

The CIS database security benchmarks provide a specific set of configuration guidelines one must follow to eliminate or mitigate known vulnerabilities in the target database, operating system, or application. Carefully following these guidelines can fill potential gaps that may remain when an organization determines which controls need to be implemented to satisfy the requirements of the framework and manage information-related risk effectively.
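
Conceptually, applying a benchmark boils down to comparing each configuration setting against its recommended value and reporting the exceptions. The following Python sketch illustrates the idea; the parameter names and expected values are illustrative examples in the spirit of an Oracle benchmark, not quoted verbatim from any CIS document:

```python
# Illustrative benchmark items: setting name -> recommended value.
BENCHMARK = {
    "remote_login_passwordfile": "EXCLUSIVE",
    "audit_trail": "DB",
    "sec_max_failed_login_attempts": "3",
}

def run_checks(actual: dict) -> list:
    """Return (setting, expected, actual, passed) for each benchmark item."""
    return [
        (name, expected, actual.get(name), actual.get(name) == expected)
        for name, expected in BENCHMARK.items()
    ]

# Hypothetical current settings pulled from the target database:
current = {"remote_login_passwordfile": "EXCLUSIVE",
           "audit_trail": "NONE",
           "sec_max_failed_login_attempts": "3"}
failures = [row for row in run_checks(current) if not row[3]]
print(failures)  # [('audit_trail', 'DB', 'NONE', False)]
```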

Database-First Security

I believe that an important part of the fight against ever-increasing cyber threats is to focus intently on securing the database. Applying proper controls at the database level first ultimately forces controls to be applied properly at the other layers the frameworks cover.

For example, properly limiting user privileges inside the database (by role) forces designers and administrators to implement role-based security and OS authentication in a more thoughtful way. Also, thoughtfully limiting OS privileges and DBA privileges at the database level forces system administrators to allocate privileged accounts more thoughtfully, enforcing principles like segregation of duties.

If you are leveraging Oracle, MS SQL Server, MySQL, or MongoDB to hold mission-critical or sensitive data, I strongly recommend that you leverage CIS benchmarks as a complement to any cybersecurity framework.

The CIS benchmarks are available for free from the Center for Internet Security. Contact us today for a free, no-obligation consultation.