Managing Server Sprawl With AWS Management Console Alerts

A DBA’s Transition Guide for Hosting on the AWS Cloud

So your organization has decided to migrate its traditional on-premises IT infrastructure to the AWS Cloud, hoping to realize cost savings and to cut the time it takes to provision and configure services for new and changing application workloads. Over time, applications can also evolve toward cloud-centric architectures that deliver further savings. But what about all the extra administrative tasks and pressures that come with the additional speed and agility of cloud hosting? How do you keep a handle on all the new instances and know when server sprawl is becoming a problem? Or, better yet, avoid server sprawl in the first place?

Every DBA knows that whenever anything goes wrong, it is always the database that is guilty until proven innocent. So how can DBAs adapt to the new challenges of AWS hosting and remain valuable assets to their organizations?

For the purposes of this blog we will focus on database monitoring and management using the Amazon CloudWatch service. CloudWatch ingests performance data from a wide range of AWS resources, applications and services, sends alerts when needed, and retains historical performance data for up to 15 months (finer-grained data points are kept for shorter periods). You can even configure CloudWatch with alarm actions to automatically take corrective measures in response to certain predefined event types (but that is a blog for another time). As an added bonus, the CloudWatch “free tier” should be sufficient to perform the heavy lifting of issue detection and identification for most application databases.

Monitoring Performance Metrics of Databases Managed with Amazon RDS

As with traditional on-premises databases, CPU utilization and available memory are two sides of the same performance tuning coin for databases in the AWS Cloud.

You can use the CPUUtilization metric in CloudWatch to keep a historical view of CPU usage for databases managed with Amazon Relational Database Service (Amazon RDS). To get a more complete picture of how an RDS database instance is performing, you can combine CPU monitoring with these additional metrics:

  • FreeableMemory, which shows the amount of available memory
  • SwapUsage, which shows how much data in memory is being paged to disk due to memory shortages

You can also configure CloudWatch to send alerts when thresholds are crossed.
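
To make this concrete, here is a minimal sketch using boto3 (the AWS SDK for Python) that creates CloudWatch alarms on CPUUtilization and FreeableMemory for a single RDS instance. The instance identifier, SNS topic, thresholds, and evaluation periods are illustrative assumptions, not recommendations; substitute values that fit your workload.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Hypothetical identifiers -- substitute your own DB instance and SNS topic.
    DB_INSTANCE_ID = "prod-db-1"
    SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:dba-alerts"

    # Alert when average CPU utilization stays above 80% for 15 minutes.
    cloudwatch.put_metric_alarm(
        AlarmName=f"{DB_INSTANCE_ID}-high-cpu",
        Namespace="AWS/RDS",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": DB_INSTANCE_ID}],
        Statistic="Average",
        Period=300,                # 5-minute data points
        EvaluationPeriods=3,       # three consecutive periods = 15 minutes
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[SNS_TOPIC_ARN],
    )

    # Alert when freeable memory drops below roughly 1 GiB (value is in bytes).
    cloudwatch.put_metric_alarm(
        AlarmName=f"{DB_INSTANCE_ID}-low-memory",
        Namespace="AWS/RDS",
        MetricName="FreeableMemory",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": DB_INSTANCE_ID}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=3,
        Threshold=1024 ** 3,
        ComparisonOperator="LessThanThreshold",
        AlarmActions=[SNS_TOPIC_ARN],
    )

The same put_metric_alarm pattern works for SwapUsage and for most of the other metrics discussed below.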

One of the best features of cloud hosting is that you are no longer locked into a specific database footprint based on hardware that was purchased. If you see CPU utilization consistently trending above 80%, or free memory running short, it could be time to take advantage of the cloud’s on-demand scalability and plan to scale up your DB instance class to increase capacity. Likewise, if your databases consistently show a large amount of free memory and idle CPU, think about scaling down the instance class to save money.
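
When the metrics point to a capacity problem (or to an over-provisioned instance), the resize itself is a single API call. The sketch below assumes boto3 and a hypothetical instance identifier and target class; deferring the change to the maintenance window avoids an unplanned restart.

    import boto3

    rds = boto3.client("rds")

    # Hypothetical resize: move "prod-db-1" to a different instance class.
    # ApplyImmediately=False defers the change to the next maintenance window.
    rds.modify_db_instance(
        DBInstanceIdentifier="prod-db-1",
        DBInstanceClass="db.m5.large",
        ApplyImmediately=False,
    )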

Storage Monitoring and Auto Scaling To Avoid Server Sprawl

In the AWS Cloud, there is never a good reason to run out of available storage on a production database, or any database for that matter. For example, you can use the CloudWatch FreeStorageSpace metric to measure the amount of storage space available to a database instance and trigger alerts as needed. Amazon RDS also supports storage auto scaling on all major RDS database engines; this option automatically increases the allocated storage by 5 GB or 10% of currently allocated storage, whichever is higher.
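
The alarm pattern shown earlier applies unchanged to FreeStorageSpace; the sketch below instead enables storage auto scaling by setting a storage ceiling. The instance identifier and the ceiling value are illustrative assumptions.

    import boto3

    rds = boto3.client("rds")

    # Enable storage auto scaling by setting MaxAllocatedStorage (in GiB)
    # above the currently allocated storage; RDS then grows the volume
    # automatically as free space runs low.
    rds.modify_db_instance(
        DBInstanceIdentifier="prod-db-1",
        MaxAllocatedStorage=500,
        ApplyImmediately=True,
    )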

The number of input/output operations per second (IOPS) available to a given database is determined by the storage type you are using together with the amount of storage allocated. It is important to know what IOPS your current storage supports, and you can set CloudWatch alarms on the ReadIOPS and WriteIOPS metrics to notify you when you are approaching that level, so you can act before it becomes an issue.

You can get additional IOPS by moving to faster storage or growing your storage footprint to a certain degree. If you exhaust those options and are certain that poor application coding is not leading to excessive read/write activity, it may be time to start thinking about moving to the Provisioned IOPS (PIOPS) storage type, which can provide a higher level of guaranteed I/O for an additional cost.
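
As a rough sketch of what that change looks like, the call below moves a hypothetical instance to Provisioned IOPS (io1) storage with a fixed IOPS target. The identifier and the 5,000 IOPS figure are assumptions for illustration only, and the allowed IOPS-to-storage ratio depends on the engine and volume size.

    import boto3

    rds = boto3.client("rds")

    # Hypothetical move to Provisioned IOPS storage with 5,000 guaranteed IOPS,
    # applied during the next maintenance window.
    rds.modify_db_instance(
        DBInstanceIdentifier="prod-db-1",
        StorageType="io1",
        Iops=5000,
        ApplyImmediately=False,
    )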

CloudWatch also offers ReadLatency, WriteLatency, and DiskQueueDepth metrics that you can alarm on if you want to keep a closer eye on I/O behavior.
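
You don’t have to wait for an alarm to look at these numbers; CloudWatch will also hand you the raw history on demand. Here is a small sketch, again assuming boto3 and a hypothetical instance identifier, that pulls the last 24 hours of average read latency in one-hour buckets.

    import boto3
    from datetime import datetime, timedelta, timezone

    cloudwatch = boto3.client("cloudwatch")

    end = datetime.now(timezone.utc)
    start = end - timedelta(hours=24)

    # Average ReadLatency (in seconds) for the past day, one data point per hour.
    response = cloudwatch.get_metric_statistics(
        Namespace="AWS/RDS",
        MetricName="ReadLatency",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "prod-db-1"}],
        StartTime=start,
        EndTime=end,
        Period=3600,
        Statistics=["Average"],
    )

    for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
        print(point["Timestamp"], round(point["Average"], 4))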

Monitoring Database Connections

The CloudWatch DatabaseConnections metric lets you monitor the number of active connections to your database and can alert you when the value approaches the max_connections setting for the database.

The default value of max_connections is derived from the instance’s total memory and varies by database engine, so it is important to check the setting for each database. You can also override the default through a custom DB parameter group if required.
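
If you are not sure what limit applies to a particular instance, you can read it out of the attached parameter group. The sketch below assumes boto3 and a hypothetical instance identifier; note that for many engines the default is stored as an expression based on the instance class memory rather than a plain number.

    import boto3

    rds = boto3.client("rds")

    # Find the parameter group attached to the instance...
    instance = rds.describe_db_instances(DBInstanceIdentifier="prod-db-1")
    group = instance["DBInstances"][0]["DBParameterGroups"][0]["DBParameterGroupName"]

    # ...then look up max_connections in that group.
    paginator = rds.get_paginator("describe_db_parameters")
    for page in paginator.paginate(DBParameterGroupName=group):
        for param in page["Parameters"]:
            if param["ParameterName"] == "max_connections":
                print("max_connections:", param.get("ParameterValue", "<engine default>"))

Once you know the limit, the same put_metric_alarm pattern shown earlier, pointed at DatabaseConnections with a threshold somewhat below that limit, closes the loop.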

As you can see, CloudWatch simplifies a number of key database monitoring and management tasks. But CloudWatch is just one of several DBA support options available on the AWS Cloud. You can also subscribe to Amazon RDS events to be notified about changes to a database instance, leverage the Performance Insights dashboard to help analyze database issues, and more.

If your company is thinking of migrating your databases to a cloud or managed hosting provider, Buda Consulting can help you choose the best option for your workloads, and can act as your “first line of defense” when problems like server sprawl arise. We also offer “personalized” managed database environments for Oracle, SQL Server and MySQL workloads.

Contact us to schedule a free consultation today.


Cloud Customers Need to Secure Their Own Data

In response to the recent Capital One data breach, where a hacker exploited a misconfigured open-source Web Application Firewall hosted within Amazon Web Services (AWS), the Amazon CTO reminded customers that they must secure their own data when housed on AWS infrastructure.

This seems obvious, but it is a very important point.

When you move your data into AWS or any cloud provider, because it is no longer in your own data center, and because you often no longer employ full-time staff to manage the server hardware and software that house that data, you might get the feeling that someone else is managing your data just as carefully as your own staff once did.

That may be true for some dimensions of data management. For example, the cloud provider is responsible for making sure that the hardware stays up and running. Likewise, when you use software as a service (SaaS) or platform as a service (PaaS), the service provider is responsible for making sure that the application software and/or operating system stays up and running. In the case of infrastructure as a service (IaaS) offerings, the customer is still responsible for the latter two functions.

But in all the above cases, the customer is ultimately responsible for the security of their own data. The AWS security documentation describes it this way:

    • Security of the cloud – AWS is responsible for protecting the infrastructure that runs AWS services in the AWS Cloud. AWS also provides you with services that you can use securely. Third-party auditors regularly test and verify the effectiveness of our security as part of the AWS Compliance Programs. To learn about the compliance programs that apply to Amazon EC2, see AWS Services in Scope by Compliance Program.
    • Security in the cloud – Your responsibility is determined by the AWS service that you use. You are also responsible for other factors including the sensitivity of your data, your company’s requirements, and applicable laws and regulations.

The key takeaway is that if you have your data in any cloud service, you must be as rigorous in securing that data as if it were in your own data center—and in some cases even more so.

Following are some key things to think about when moving your data into the cloud. These are the same considerations you need to focus on when keeping your data in-house. Some of these concerns only apply to IaaS in the cloud, while others are relevant to all cloud service scenarios.

    • Is the underlying storage configured properly so that only the database software, application software and authorized users can access it?
    • Is the data encrypted both at rest and in transit? (See the sketch after this list.)
    • Is proper authentication in place for the application or database, to ensure that only proper users can gain access to the system?
    • Is proper authorization in place for the application or database, such that data is exposed only to the proper users?
    • Is the network configured properly to reduce the surface area vulnerable to attack?
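
As a starting point for the encryption question above, here is a minimal sketch using boto3 that flags any RDS instance whose storage is not encrypted at rest. It is an illustration only: encryption in transit, storage-level access controls, and non-RDS data stores all need their own checks.

    import boto3

    rds = boto3.client("rds")

    # List RDS instances whose storage is not encrypted at rest.
    paginator = rds.get_paginator("describe_db_instances")
    for page in paginator.paginate():
        for db in page["DBInstances"]:
            if not db["StorageEncrypted"]:
                print("NOT encrypted at rest:", db["DBInstanceIdentifier"])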

These considerations are not new. In fact, our Database Security Roadmap, initially created primarily with on-premises systems in mind, is still completely relevant with cloud technology.

Download the roadmap to make sure you have all the bases covered.


Database Security Issues in the Cloud, Part 2: Regulatory Compliance

As the number of databases moving to public, private and hybrid cloud computing infrastructure increases, security concerns are a significant and growing problem. Organizations will do well to scrutinize the security practices of cloud providers and other third parties that store their data. But wherever databases are running, responsibility for the security and integrity of data ultimately rests with the organization that owns the data – even when it resides with a service provider.

As I outlined in Part 1 of this post, cloud database security concerns fall into three basic categories: data access control (covered in Part 1), regulatory compliance, and physical/network controls. This post discusses regulatory compliance issues.

Regulatory compliance issues in the cloud

Much has been written about concerns with physical control of data in cloud environments. Cloud providers frequently need to reconfigure and/or move the virtual servers hosting your data, possibly across multiple data center locations.

How can you demonstrate to auditors that your data is secure if you don’t know exactly where it resides? The answer lies in having clear visibility into database activity relative to applicable regulations. You need to:

  • Put the necessary policies in place to meet compliance requirements;
  • Audit your databases against your policies and against all the regulations that apply to you, whether the data resides in a cloud environment or not; and
  • Make sure you can generate all the reports on database activity that you need to demonstrate compliance to auditors.

At Buda Consulting we use automated tools, including Application Security’s AppDetectivePro, to assess the vulnerability of clients’ databases and audit them against a host of regulations. The following list from the AppDetectivePro documentation describes some of the key audit policies that we check in regulated environments:

  • Basel II – ideal for a Basel II compliance assessment
  • Best Practices for Federal Government
  • DISA-STIG Database Security Configuration – leverages the configuration parameters outlined by the DISA-STIG for SQL Server and Oracle
  • Gramm-Leach-Bliley Act – structured according to GLBA standards and recommended for GLBA compliance assessment
  • HIPAA – structured following NIST standards and best practices for database security; highly recommended for use in a HIPAA compliance assessment
  • PCI Data Security Standard – recommended for use in PCI compliance assessments
  • Sarbanes-Oxley – follows CoBIT and ISO 17799 standards; recommended for use in a SOX compliance assessment

Using tools like AppDetectivePro, auditors and advisors can perform a database security assessment against the organization’s policies and against applicable regulations, capture results for manual verification, and generate compliance reports.

Some of the scans will be difficult or impossible to run in a cloud environment without the assistance of the cloud provider. In particular, scans that require privileged operating system accounts will not be possible without cloud provider cooperation.

Therefore, it is important to obtain documentation from the cloud provider demonstrating that they have the necessary controls in place to satisfy the applicable regulations.

This may be more difficult than it sounds. Some cloud providers refuse to give out any information about their security policies and procedures, indicating that doing so may compromise security. Others may withhold specifics, but instead point to the fact that they have undergone a SAS 70 type II audit. While passing a SAS 70 type II audit can be a valuable criterion to use when evaluating a provider, you must be sure to review which controls are included in that audit. These audits do not have to include every control that may be important to the pertinent regulations impacting your business.

Contact Buda Consulting to learn more about how to ensure the security of your data in the cloud.

Database Security Issues in the Cloud: Part 1

Cloud Database Security Issues And Challenges

The benefits of cloud computing, including reduced IT ownership and operating costs and improved resource utilization, are just too good for many organizations to pass up. More and more businesses of all sizes are moving a wide range of applications to cloud environments.

But database security concerns remain a significant barrier to cloud adoption. When your applications are running on a cloud provider’s infrastructure, the provider is responsible for ensuring that its operations, facilities, network, hosts and other components are secure. But responsibility for securing your data ultimately rests with you, as do the consequences of failure.

Will a public cloud be more or less secure than your on-premises environment? What new and different security issues do cloud environments present?

Cloud database security issues and challenges fall into three basic categories:

  • Data access control
  • Regulatory compliance
  • Physical/network control

In this post I’ll talk about access controls, and will touch on the other issues in subsequent posts.

Access Control Issues In The Public Cloud

Loss of access control is a primary threat to data security. External threats are certainly a concern, but more and more studies show that the majority of access control threats (some say up to 80%) are internal. In a public cloud environment, internal threats can potentially come not only from your employees who have (or had) valid access to your DBMS, but also employees of the cloud provider.

You can address some access control concerns as you’re evaluating cloud providers. Find out how the DBA services that providers offer are structured, not only in terms of services provided but also around Segregation of Duties. Do the provider’s DBAs (and operating system administrators) have full DBA privileges giving them easy access to your data? How well vetted are the people working with your data? Is there full transparency around how many are involved, who they are, and where they’re located?

The bottom line: if the cloud provider will be performing database administration, how much control do you have over the level of access its administrators have to your data? It may be more secure to maintain your own trusted database administration services.

Database Auditing Can Help!

How can you really know who is accessing your cloud-based data? Robust database auditing is vitally important. You need full visibility into database activity regardless of where your data resides.

What is database auditing? Basically, it’s the ability to consistently (and securely) record and report on all the actions of database users. Audited databases produce audit trails that can specify what objects were accessed or changed, how they were changed, and when and by whom.

Auditing is especially crucial when you need to pinpoint an unauthorized access from an authorized user. But it’s also helpful for compliance with regulations and corporate governance policy.

Of course, a weakness of database auditing is that it tracks what’s already happened. Ideally, your cloud-based database security solutions will include intrusion detection capabilities such as Pivot Point Security’s Oscar to identify suspicious activity before it results in any data loss or theft.

Another challenge around database auditing is performance degradation. Auditing needs to be scoped appropriately so that useful details aren’t lost in a sea of data, and so that collecting the audit trail doesn’t bog down performance or consume excessive storage.

Auditing By A Trusted Third Party

In addition to auditing databases with software, a trusted third party should conduct an audit to find vulnerabilities in your databases and environment, including a cloud environment. Third-party auditors use specialized tools like AppDetectivePro to identify and address security challenges like the following (and many others):

  • Whether the database software is patched and configured in the most secure manner.
  • Whether default passwords have been changed.
  • Whether the users that have access to the data are the ones who actually should have access, according to the business security policy.
  • Whether all machines in an environment (development, QA and production) have the same configuration and the same level of protection against vulnerabilities.

Contact Buda Consulting to learn more about how to secure your data, whether it is housed in-house or in the cloud.