Managing Server Sprawl With AWS Management Console Alerts

A DBA’s Transition Guide for Hosting on the AWS Cloud

So your organization has decided to migrate its traditional on-premises IT infrastructure to the AWS Cloud, hoping to realize cost savings and to cut the time it takes to provision and configure services for new and changing application workloads. Over time, your applications can also evolve toward cloud-centric architectures that yield further savings. But what about the extra administrative tasks and pressures that come along with the speed and agility of cloud hosting? How do you keep a handle on all the new instances and know when server sprawl is becoming a problem? Or, even better, avoid server sprawl in the first place?

Every DBA knows that whenever anything goes wrong, the database is guilty until proven innocent. So how can DBAs adapt to the new challenges of AWS hosting and remain valuable assets to their organizations?

For the purposes of this blog we will focus on database monitoring and management using the Amazon CloudWatch service. CloudWatch ingests performance data from a wide range of AWS resources, applications, and services, sends alerts when needed, and retains a historical record of performance data (one-minute data points are kept for 15 days, with lower-resolution data retained longer). You can even configure CloudWatch with alarm actions to automatically take corrective measures in response to certain predefined event types (but that is a blog for another time). As an added bonus, the CloudWatch free tier should be sufficient to perform the heavy lifting of issue detection and identification for most application databases.

Monitoring Performance Metrics of Databases Managed with Amazon RDS

As with traditional on-premises databases, CPU utilization and available memory are two sides of the same performance tuning coin for databases in the AWS Cloud.

You can use the CPUUtilization metric in CloudWatch to keep a historical view of CPU usage for databases managed with Amazon Relational Database Service (Amazon RDS). To get a more complete picture of how an RDS database instance is performing, you can combine CPU monitoring with these additional metrics:

  • FreeableMemory, which shows the amount of available memory
  • SwapUsage, which shows how much data in memory is being paged to disk due to memory shortages

You can also configure CloudWatch to send alerts when thresholds are crossed.
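For instance, here is a minimal sketch using the boto3 Python SDK that raises an alert when average CPU stays above 80% for 15 minutes. The instance identifier, SNS topic ARN, and threshold values are illustrative placeholders, not prescriptions:

    # Minimal boto3 sketch: alarm when an RDS instance averages more than
    # 80% CPU for 15 minutes. Instance name, SNS topic ARN, and threshold
    # are placeholders; adjust for your environment.
    import boto3

    cloudwatch = boto3.client("cloudwatch")

    cloudwatch.put_metric_alarm(
        AlarmName="mydb-high-cpu",
        Namespace="AWS/RDS",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "mydb"}],
        Statistic="Average",
        Period=300,                # five-minute evaluation windows
        EvaluationPeriods=3,       # three consecutive breaches = 15 minutes
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:dba-alerts"],
    )
    # A FreeableMemory alarm is identical except for the MetricName, a
    # byte-valued Threshold, and ComparisonOperator="LessThanThreshold".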

One of the best features of cloud hosting is that you are no longer locked into a specific database footprint based on the hardware you purchased. If you see a trend of CPU utilization consistently running above 80%, or a persistent shortage of free memory, it could be time to take advantage of the cloud's on-demand scalability and plan to grow your DB instance to increase capacity. Likewise, if your databases consistently show a large amount of free memory and idle CPU, think about scaling down the database instance class to save money.

Storage Monitoring and Auto Scaling To Avoid Server Sprawl

In the AWS Cloud, there is never a good reason to run out of available storage on a production database, or any database for that matter. For example, you can use the CloudWatch FreeStorageSpace metric to measure the amount of storage space available to a database instance and trigger alerts as needed. Amazon RDS also supports storage auto scaling on all major RDS database engines. This option automatically increases allocated storage by 10 GiB or 10% of currently allocated storage, whichever is greater.
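As a hedged boto3 sketch, enabling storage auto scaling amounts to setting a storage ceiling on the instance; the identifier and ceiling below are placeholders:

    # Hedged sketch: enable RDS storage auto scaling by setting a storage
    # ceiling (MaxAllocatedStorage). Identifier and size are placeholders.
    import boto3

    rds = boto3.client("rds")

    rds.modify_db_instance(
        DBInstanceIdentifier="mydb",
        MaxAllocatedStorage=500,   # GiB; auto scaling may grow storage up to this
        ApplyImmediately=True,
    )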

The input/output operations per second (IOPS) available to a given database are determined by the storage type you use together with the amount of storage allocated. It is important to know what IOPS your current storage supports, and you can set CloudWatch alarms on the ReadIOPS and WriteIOPS metrics to warn you before you approach that ceiling.

You can get additional IOPS by moving to faster storage or, up to a point, by growing your storage footprint. If you exhaust those options and are certain that poor application coding is not driving excessive read/write activity, it may be time to consider the Provisioned IOPS (PIOPS) storage type, which provides a higher level of guaranteed I/O for an additional cost.

CloudWatch also offers ReadLatency, WriteLatency, and DiskQueueDepth metrics if you want to keep a closer eye on I/O behavior.
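As a sketch of how these can all be wired up in one pass, the boto3 loop below registers an alarm per I/O-related metric. Every threshold shown is an arbitrary example; derive real values from the IOPS rating of your actual storage:

    # Sketch: one alarm per I/O-related RDS metric. Thresholds are
    # arbitrary examples, not recommendations.
    import boto3

    cloudwatch = boto3.client("cloudwatch")

    IO_ALARMS = {
        "ReadIOPS": 2500.0,       # count/second
        "WriteIOPS": 2500.0,      # count/second
        "ReadLatency": 0.02,      # seconds
        "WriteLatency": 0.02,     # seconds
        "DiskQueueDepth": 10.0,   # outstanding I/O requests
    }

    for metric, threshold in IO_ALARMS.items():
        cloudwatch.put_metric_alarm(
            AlarmName=f"mydb-{metric}-high",
            Namespace="AWS/RDS",
            MetricName=metric,
            Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "mydb"}],
            Statistic="Average",
            Period=300,
            EvaluationPeriods=3,
            Threshold=threshold,
            ComparisonOperator="GreaterThanThreshold",
            AlarmActions=["arn:aws:sns:us-east-1:123456789012:dba-alerts"],
        )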

Monitoring Database Connections

The CloudWatch DatabaseConnections metric lets you monitor the number of active connections to your database, and you can configure an alarm to alert you when the value approaches the database's max_connections setting.

The default value for max_connections is typically derived from the instance's total memory and varies by database engine, so it is important to check the setting for each database. You can also modify the default value of this parameter if required.
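For engines such as MySQL, you can override the default in a custom DB parameter group. Below is a hedged boto3 sketch; the group name and value are placeholders, and "pending-reboot" is used because the parameter is static on some engines:

    # Hedged sketch: raise max_connections in a custom DB parameter group
    # (the default group cannot be modified). Group name and value are
    # placeholders.
    import boto3

    rds = boto3.client("rds")

    rds.modify_db_parameter_group(
        DBParameterGroupName="mydb-params",
        Parameters=[{
            "ParameterName": "max_connections",
            "ParameterValue": "500",
            "ApplyMethod": "pending-reboot",   # safe for static parameters
        }],
    )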

As you can see, CloudWatch simplifies a number of key database monitoring and management tasks. But CloudWatch is just one of several DBA support options on the AWS Cloud. You can also subscribe to Amazon RDS events to be notified about changes to a database instance, leverage the Performance Insights dashboard to help analyze database issues, and more.

If your company is thinking of migrating your databases to a cloud or managed hosting provider, Buda Consulting can help you choose the best option for your workloads, and can act as your “first line of defense” when problems like server sprawl arise. We also offer “personalized” managed database environments for Oracle, SQL Server and MySQL workloads.

Contact us to schedule a free consultation today.

Cloud Customers Need to Secure Their Own Data

In response to the recent Capital One data breach, where a hacker exploited a misconfigured open-source Web Application Firewall hosted within Amazon Web Services (AWS), the Amazon CTO reminded customers that they must secure their own data when housed on AWS infrastructure.

This seems obvious, but it is a very important point.

When you move your data into AWS or any cloud provider, because it no longer sits in your own data center, and because you often no longer employ full-time staff to manage the server hardware and software that house it, you might get the feeling that someone else is managing your data just as carefully as your own staff once did.

That may be true for some dimensions of data management. For example, the cloud provider is responsible for making sure that the hardware stays up and running. Likewise, when you use software as a service (SaaS) or platform as a service (PaaS), the service provider is responsible for making sure that the application software and/or operating system stays up and running. In the case of infrastructure as a service (IaaS) offerings, the customer is still responsible for the latter two functions.

But in all the above cases, the customer is ultimately responsible for the security of their own data. The AWS security documentation describes it this way:

    • Security of the cloud – AWS is responsible for protecting the infrastructure that runs AWS services in the AWS Cloud. AWS also provides you with services that you can use securely. Third-party auditors regularly test and verify the effectiveness of our security as part of the AWS Compliance Programs. To learn about the compliance programs that apply to Amazon EC2, see AWS Services in Scope by Compliance Program.
    • Security in the cloud – Your responsibility is determined by the AWS service that you use. You are also responsible for other factors including the sensitivity of your data, your company’s requirements, and applicable laws and regulations.

The key takeaway is that if you have your data in any cloud service, you must be as rigorous in securing that data as if it were in your own data center—and in some cases even more so.

Following are some key things to think about when moving your data into the cloud; they are the same considerations you need to focus on when keeping your data in-house. Some apply only to IaaS in the cloud, while others are relevant to all cloud service scenarios. (After the list, a short script sketch shows how a couple of these checks can be automated on AWS.)

    • Is the underlying storage configured properly so that only the database software, application software and authorized users can access it?
    • Is the data encrypted both at rest and in transit?
    • Is proper authentication in place for the application or database, to ensure that only proper users can gain access to the system?
    • Is proper authorization in place for the application or database, such that data is exposed only to the proper users?
    • Is the network configured properly to reduce the surface area vulnerable to attack?
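As a concrete starting point, here is a hedged boto3 sketch that flags RDS instances with unencrypted storage or public network accessibility. It spot-checks two items from the list above and is in no way a complete security audit:

    # Hedged sketch: flag RDS instances failing two basic checks --
    # encryption at rest and public accessibility.
    import boto3

    rds = boto3.client("rds")

    for page in rds.get_paginator("describe_db_instances").paginate():
        for db in page["DBInstances"]:
            name = db["DBInstanceIdentifier"]
            if not db.get("StorageEncrypted", False):
                print(f"{name}: storage is NOT encrypted at rest")
            if db.get("PubliclyAccessible", False):
                print(f"{name}: instance is publicly accessible")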

These considerations are not new. In fact, our Database Security Roadmap, initially created primarily with on-premises systems in mind, remains completely relevant to cloud technology.

Download the roadmap to make sure you have all the bases covered.


The Cloud Does Not Exist!

“Let’s Move to The Cloud!”

We often hear people talk about moving their business, their data center, or their software to “The Cloud.” For many, this concept seems confusing and vague.

That’s because The Cloud does not exist!

There is no “The Cloud.” In reality, there are many clouds. And therefore we can’t simply decide to move to “The Cloud.”

Instead, there are many clouds, with services offered by many vendors. A cloud, at its core, is a collection of hardware and software that a vendor combines into a service offering providing some level of computing service for its customers. Depending on our risk tolerance, bandwidth requirements, data custody and security requirements, level of technical expertise, and business model, one or more of these levels may make sense for our organization. These levels are known as Infrastructure as a Service, Platform as a Service, and Software as a Service. The list below gives a sense of each level; the technical reader will recognize that the boundaries are fuzzy, and that exactly what is included, and what we control, at each level varies from vendor to vendor.

  • Infrastructure as a Service (IaaS) – the vendor provides the physical hardware, storage, networking, and virtualization layer; we manage everything above that, from the operating system up through our databases and applications.
  • Platform as a Service (PaaS) – the vendor also manages the operating system and platform software such as database management systems or development frameworks; we manage our applications, configuration, and data.
  • Software as a Service (SaaS) – the vendor manages the entire stack, including the application itself; we manage our data and how our users work with the application.

A source of confusion when thinking about the cloud is that it is often equated with an external organization abstracting away the underlying technical details of our computing environment. For example, PaaS (Platform as a Service) offerings abstract away everything from the physical hardware up through the operating system, leaving us to manage only the software frameworks we use, which may be database management systems like Oracle or software development platforms like Visual Studio. But in reality, it is the abstraction that is the essence of a cloud, not the fact that an external party provides it. Therefore we can have on-premises clouds hosted at our own data center, and private clouds managed by our own team but hosted at external data centers. It is the stack of software providing the abstraction that makes it a cloud, not the vendor. The foundation of this stack is mostly virtualization and automation software.

Journey into the Clouds

Jumping right into the clouds is difficult and scary. We can’t see things clearly with all these clouds around. We don’t know what and who we can trust.

The good news is that we can take advantage of some of the huge benefits of cloud computing without some of the riskier aspects. When we run a private or on-premises cloud, we still benefit from virtualization and automation when provisioning servers, databases, and so on, while minimizing the risk that shared services or external vendors may introduce. Additionally, if we transition our software to use our own private cloud services, it will be much further along when the time comes to move to public cloud services.

There are other options that make the journey less scary as well: some vendors provide ways to take meaningful steps toward the public cloud while staying on premises. Oracle offers the Oracle Cloud Machine, which lives in our own data center behind our own firewalls, offers IaaS and PaaS capabilities, and is installed and managed by Oracle. When we are comfortable moving to the public cloud, the entire environment can be picked up and moved to Oracle's public cloud. And Microsoft has announced the Azure Stack, which enables us to run the same software stack that runs in Microsoft's public Azure cloud in our own data centers, on our own hardware. Again, after transitioning our software to use cloud services, a future shift to the Azure public cloud will be greatly simplified.

Clouds are not one size fits all

So when we think about how to transition to cloud-based technology, we should stop thinking about moving to “The Cloud,” because that is too simplistic. Instead, we need to look at each component of our IT services, think about what level of computing resources we would like abstracted away for that component, and then choose from the available clouds that provide that level of service.

For example, we may decide that for our customer relationship management system, SaaS is the proper level, because we are comfortable with the cloud vendor providing all of the IT administration, disaster recovery, and security services. But for a chemical inventory management system that holds highly sensitive formula information, we may choose a PaaS or even an IaaS solution, because we want more control over network and data security. And for a financial trading system, we may insist on a private cloud IaaS solution so we have full control over all aspects of redundancy, connectivity, and security.

Get your head out of “The Cloud”

We are all thinking about the cloud these days. I recently heard a talk by the great physicist and author Michio Kaku, who predicts that through artificial intelligence and technology that can read and write our memories, we will all essentially think in “The Cloud” some day.

But for our businesses today, we have to think about the individual clouds so that we don't get lost. For each service we provide to our employees, partners, customers, regulators, and others, we must think about the appropriate level of service and abstraction (IaaS, PaaS, SaaS), and then evaluate the offerings of the cloud vendors at that level.

So when we think about “The Cloud,” we should instead think of “The Clouds.” Then we may see things a bit more clearly.

If you would like to discuss your Journey into the Clouds, please give us a call at (888) 809-4803 x 700, and if you have further thoughts on the topic, please add a comment!

If you enjoyed this article please like and share!


4 Key Use Cases for Oracle’s Multitenant Architecture

If you’re thinking of moving to Oracle Database 12c, “the first database designed for the cloud,” one of the most compelling reasons could be the Oracle Multitenant Architecture option. In this revolutionary new architecture, you can “plug” many Oracle databases into a single container database—no application changes required.

Let’s quickly head off any potential confusion around the term “multitenant.” That word has been used for a while in relation to sharing data elements (records) across databases, especially in contexts like Software-as-a-Service (SaaS) delivery. That scenario is now better described as a tenant-striped database. With Oracle’s multitenancy, you can run many databases within one container, with several databases potentially sharing a common set of metadata.

The advantages of Oracle’s multitenant architecture are sweeping, driving economies of scale across both capital and operating expenses. First, plugging multiple databases into a single, multitenant container creates the highest density yet possible, with shared memory and background processes to further improve hardware utilization. The advantages over the schema-based consolidation possible with Oracle Database 11g are that 1) no application changes are required; and 2) pluggable databases are isolated from one another, for improved reliability and security.

Next, multitenancy enables rapid provisioning and cloning. Creating, moving and cloning pluggable databases takes just seconds with new SQL commands. Patching and upgrades are also simplified and accelerated—just unplug/plug to an upgraded container! (What will you do with all the time that will save?) The overall theme is “manage many as one” across tasks like backup and recovery. You even get new capabilities in the Resource Manager to optimize allocation of resources among pluggable databases.
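To make the “seconds” claim concrete, here is a hedged sketch that clones a master PDB into a dev copy with a single SQL statement, issued here through the python-oracledb driver. The credentials, DSN, and PDB names are placeholders; it assumes Oracle Managed Files (db_create_file_dest), so no FILE_NAME_CONVERT clause is shown, and on pre-12.2 containers the source PDB must first be opened read-only:

    # Hedged sketch: clone a pluggable database with one SQL statement.
    # Connection details and PDB names are placeholders.
    import oracledb

    conn = oracledb.connect(
        user="sys",
        password="change_me",            # placeholder
        dsn="dbhost:1521/CDB1",
        mode=oracledb.AUTH_MODE_SYSDBA,
    )
    with conn.cursor() as cur:
        cur.execute("CREATE PLUGGABLE DATABASE pdb_dev FROM pdb_master")
        cur.execute("ALTER PLUGGABLE DATABASE pdb_dev OPEN")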

What are the best use cases for Oracle Multitenant? There are quite a few but these four stand out:

  1. Application development/testing
    Multitenant makes it very quick, simple, safe and efficient for individual engineers to rapidly provision and recycle private copies of a few “master test databases.” Just the productivity benefits of this one use case might be sufficient to justify implementing multitenancy.
  2. Infrastructure consolidation
    Multitenancy supports the use of fewer, more powerful physical servers.
  3. Delivering and supporting SaaS applications
    Multitenancy is ideal for deploying separate instances of identical applications to individual “tenants.” This model is predicted to be popular among cloud vendors in particular.
  4. Enabling Database-as-a-Service (DBaaS) in a private or hybrid cloud
    Multitenancy includes a built-in provisioning mechanism that makes it straightforward to offer self-service database provisioning; e.g., in development and test environments.

Can you upgrade to Oracle Database 12c and not deploy the multitenant option? Yes… but why would you want to? You can even dip a toe in the water by plugging just one database into a container, which requires no additional license.

If you’re considering upgrading to Oracle Database 12c and want to talk over the architecture and design considerations, I invite you to contact Buda Consulting for guidance on analyzing your requirements and architecting an optimal solution. 

5 Reasons Not to Put Your Oracle Databases in the Cloud

As I blogged about recently, Database-as-a-Service (DBaaS) is an important option for many Oracle database customers. But “data in the cloud” is not right for every business, and there are important issues to be aware of as you consider how or if to leverage DBaaS.

Following are five concerns that might limit your use of DBaaS. This is not an exhaustive list, but it covers the most critical areas:

One: Security constraints

Your data is your company’s most valuable asset. But in the wrong hands, it can also be a major threat to your reputation and even your operations. Likewise, many organizations in healthcare, financial services and other verticals face strict compliance requirements that encompass both data security and third-party relationships involving hosting or even transmitting data. When your data is in the cloud, security is largely outside your control. When (let’s be realistic: it’s not a question of “if”) the DBaaS provider’s comparatively large attack surface is breached, will your data be safe? How can you know for certain whether it was or wasn’t compromised? What, if any, additional precautions can you take when data is at rest or in transit to and from the cloud?

Two: Data governance/access

When your data is in the cloud and a third party is managing it, who has access to it? And what levels of access do they have? With DBaaS, you don’t have direct control over that. People outside your organization, very possibly on another continent, could have access for a wide range of reasons; e.g., database administration or IT infrastructure monitoring. Privilege escalation vulnerabilities can be greater in DBaaS scenarios. Some classes of data (test/development data, non-sensitive data, etc.) are better suited to this scenario than others. What is your level of comfort with how your data is stored and accessed in the cloud?

Three: Database availability and uptime

When you move a database from on-premises into the cloud, you increase the distance between your application and your data. You also introduce factors outside your control that could make your data unavailable or impact database uptime; WAN link failures and service provider outages are probably the two main concerns. What are the service provider’s availability and uptime guarantees? Do you know that your data is being properly and securely backed up? If problems occur, what is your recourse? What is your recovery plan?

Four: Database performance and user experience

When a database resides in a public cloud environment, there are multiple shared infrastructure components supporting it, including the Internet itself, which you can’t control and in many cases would be challenged to monitor. As a result, users’ experience with accessing the data can be very inconsistent. This issue can be made worse when the volumes of data involved are large. You or your users might also experience poor performance as a result of any number of issues with the service provider’s infrastructure. Will the data service provider do a good job with data management? If they don’t, how much time, energy and money will it take to rectify the situation?

Five: Vendor lock-in and interoperability concerns

DBaaS customers naturally want to be able to shift from one provider to another without a lot of trouble and expense. But cloud vendor lock-in can be burdensome and costly. It may not even be a matter of dissatisfaction with the provider; you may simply want to use different providers that are closer to your diverse customers, in order to minimize latency issues. Or your provider could go out of business. If you write/optimize application components for one DBaaS environment, will they run on another DBaaS provider’s infrastructure? Or are you stuck with that provider’s APIs?

Still thinking about using DBaaS? To get some expert advice on whether DBaaS is right for your Oracle databases, contact Buda Consulting. Concerned about data security? Sign up for a database security audit.


Database Security Issues in the Cloud, Part 2: Regulatory Compliance

As the number of databases moving to public, private and hybrid cloud computing infrastructure increases, security concerns are a significant and growing problem. Organizations will do well to scrutinize the security practices of cloud providers and other third parties that store their data. But wherever databases are running, responsibility for the security and integrity of data ultimately rests with the organization that owns the data – even when it resides with a service provider.

As I outlined in Part 1 of this post, cloud database security concerns fall into three basic categories: data access control (covered in Part 1), regulatory compliance, and physical/network controls. This post discusses regulatory compliance issues.

Regulatory compliance issues in the cloud

Much has been written about concerns with physical control of data in cloud environments. Cloud providers frequently need to reconfigure and/or move the virtual servers hosting your data, possibly across multiple data center locations.

How can you demonstrate to auditors that your data is secure if you don’t know exactly where it resides? The answer lies in having clear visibility into database activity relative to applicable regulations. You need to:

  • Put the necessary policies in place to meet compliance requirements;
  • Audit your databases against your policies and against all the regulations that apply to you, whether the data resides in a cloud environment or not; and
  • Make sure you can generate all the reports on database activity that you need to demonstrate compliance to auditors.

At Buda Consulting we use automated tools, including Application Security’s AppDetective Pro, to assess the vulnerability of clients’ databases and audit them against a host of regulations. The following list from the AppDetective Pro documentation describes some of the key audit policies that we check in regulated environments:

  • Basel II – ideal for a Basel II compliance assessment
  • Best Practices for Federal Government
  • DISA-STIG Database Security Configuration – leverages the configuration parameters outlined by the DISA-STIG for SQL Server and Oracle
  • Gramm-Leach-Bliley Act – structured according to GLBA standards and recommended for GLBA compliance assessment
  • HIPAA – structured following NIST standards and best practices for database security; highly recommended for use in a HIPAA compliance assessment
  • PCI Data Security Standard – recommended for use in PCI compliance assessments
  • Sarbanes-Oxley – follows COBIT and ISO 17799 standards; recommended for use in a SOX compliance assessment

Using tools like AppDetective Pro, auditors and advisors can perform a database security assessment against the organization’s policies and against applicable regulations, capture results for manual verification, and generate compliance reports.

Some of the scans will be difficult or impossible to run in a cloud environment without the assistance of the cloud provider. In particular, scans that require privileged operating system accounts will not be possible without cloud provider cooperation.

Therefore, it is important to obtain documentation from the cloud provider demonstrating that they have the necessary controls in place to satisfy the applicable regulations.

This may be more difficult than it sounds. Some cloud providers refuse to give out any information about their security policies and procedures, arguing that doing so could compromise security. Others may withhold specifics and instead point to the fact that they have undergone a SAS 70 Type II audit. While passing a SAS 70 Type II audit can be a valuable criterion when evaluating a provider, be sure to review which controls that audit actually covered; such audits do not have to include every control that matters for the regulations affecting your business.

Contact Buda Consulting to learn more about how to ensure the security of your data in the cloud.