Best Practices for Error Handling in Database Stored Procedures

I recently performed an assessment of a large set of Oracle stored procedures and batch scripts that a client uses as part of their build and update process. The problem was that the build process was not very stable. During each build, something would go wrong, such as a table not being created or loaded as it should be, and it would take the development team a long time to figure out just what went wrong.

When doing an assessment like this, I look to see how well the code is formatted, structured (modularized), documented, and organized. I also look to see if it is robust (that is, it can gracefully handle failure), and how well the process is logged.

In this case, I found that the processes that were followed were well documented and well modularized. The batch scripts were organized well and used a good naming convention that made it pretty easy to understand the process flow. The PL/SQL stored procedures themselves were quite well written. They were formatted nicely and properly used some of the more sophisticated capabilities of Oracle PL/SQL, such as cursor variables and CURSOR FOR LOOPs when necessary. The developer clearly had a mastery of SQL and of PL/SQL.

But there was one glaring hole in the quality of the work. There was absolutely no exception handling or logging anywhere in the entire process. Not one stored procedure or PL/SQL script checked for errors.

So even though the build system itself was developed by a highly skilled developer, the absence of good exception handling diminished the quality of the whole system. There are two basic exception handling practices that I believe belong in all PL/SQL code, and that are essential in any large or repeating process:

  • Include an exception handler for every logical grouping of PL/SQL statements (BEGIN/END block). This does two things. First, it forces the developer to think through what might go wrong so they can handle those cases appropriately; this alone improves the quality of the code and the stability of the system. Second, it offers an opportunity to decide at what level each exception should be handled, which is critical to ensuring that the process continues when it safely can and stops when it must. This is especially important when nesting procedures or PL/SQL scripts.
  • Log everything, especially exceptions. By creating a separate logging procedure that you call at the beginning, at various points within the code, and in exception blocks, you can always know exactly which processes completed before a failure. An added benefit of this approach is that the record of when each process took place helps a great deal with performance tuning, because you can easily tell which processes are taking the most time. A procedure like this might take the following parameters:
    • The name of the stored procedure it is being called from
    • An indicator of the position within that procedure
    • A text string that consists of a concatenation of the values of key variables that are used in the procedure
    • A text string indicating the error if called within an exception block

These parameter values are simply written to a table and optionally to a text file via DBMS_OUTPUT. This helps enormously in troubleshooting later, and it is very easy to do. Write the logging procedure once and call it from everywhere.
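
To make this concrete, here is a minimal sketch of what such a logging procedure and a calling block might look like. The table name (PROCESS_LOG), procedure names (LOG_STEP, LOAD_CUSTOMERS) and column names are hypothetical; adapt them to your own conventions.

-- Hypothetical logging table and procedure (all names are illustrative only)
CREATE TABLE process_log (
  log_time       TIMESTAMP DEFAULT SYSTIMESTAMP,
  proc_name      VARCHAR2(128),
  proc_position  VARCHAR2(128),
  key_values     VARCHAR2(4000),
  error_text     VARCHAR2(4000)
);

CREATE OR REPLACE PROCEDURE log_step (
  p_proc_name  IN VARCHAR2,
  p_position   IN VARCHAR2,
  p_key_values IN VARCHAR2 DEFAULT NULL,
  p_error_text IN VARCHAR2 DEFAULT NULL
) IS
  PRAGMA AUTONOMOUS_TRANSACTION;  -- the log row survives even if the caller rolls back
BEGIN
  INSERT INTO process_log (proc_name, proc_position, key_values, error_text)
  VALUES (p_proc_name, p_position, p_key_values, p_error_text);
  COMMIT;
  DBMS_OUTPUT.PUT_LINE(p_proc_name || ' | ' || p_position || ' | ' ||
                       p_key_values || ' | ' || p_error_text);
END log_step;
/

-- Typical usage inside a procedure
CREATE OR REPLACE PROCEDURE load_customers IS
  v_cust_id NUMBER;
BEGIN
  log_step('LOAD_CUSTOMERS', 'start');
  -- ... do the real work here, logging key points along the way ...
  log_step('LOAD_CUSTOMERS', 'after staging load', 'cust_id=' || v_cust_id);
EXCEPTION
  WHEN OTHERS THEN
    log_step('LOAD_CUSTOMERS', 'exception', 'cust_id=' || v_cust_id, SQLERRM);
    RAISE;  -- decide deliberately whether the caller should continue or stop
END load_customers;
/

Note the RAISE at the end of the exception block: logging an error is not the same as handling it, so at each level decide whether to re-raise or carry on.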

There are other steps that you can take to enhance your exception handling further. But the simple steps I just described will significantly improve the stability of your code and your ability to find and resolve any errors that do arise.

If you have used other strategies to create robust, reliable code, please comment and share your ideas.


Ever Dropped an Oracle Table and Wish You Hadn’t?

If you’ve ever dreamed of “undropping” an Oracle database table from a recycle bin, the way you can on Windows… Oracle has made your dream come true!

Introduced with Oracle 10g, Oracle’s recycle bin works a lot like what you’re used to in Windows. When this feature is enabled (the default setting), dropped tables don’t actually get deleted. Instead, they “land” in the recycle bin and can be restored.

The recycle bin is actually a data dictionary table containing data about dropped objects. What actually happens when recycling is enabled is that Oracle renames each dropped table and associated objects (indexes, LOB segments, triggers, etc.) with a system-generated name that starts with “BIN$.” The table data is still available and you can query it like any other table.
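
For example, you can see what is sitting in your recycle bin with a quick query (or, in SQL*Plus, the SHOW RECYCLEBIN command):

SQL> SHOW RECYCLEBIN

SQL> SELECT original_name, object_name, type, droptime, can_undrop
       FROM user_recyclebin;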

To “undrop” a dropped table that’s “in the recycle bin,” you perform an operation known as a “flashback drop.” The command syntax is: FLASHBACK TABLE <name> TO BEFORE DROP. This command simply renames the BIN$<name> table back to its original name.
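
For example, assuming a hypothetical table named EMPLOYEES_ARCH that was just dropped:

SQL> DROP TABLE EMPLOYEES_ARCH;

SQL> FLASHBACK TABLE EMPLOYEES_ARCH TO BEFORE DROP;

If you have since re-created a table with the original name, you can restore the dropped copy under a new name with FLASHBACK TABLE EMPLOYEES_ARCH TO BEFORE DROP RENAME TO EMPLOYEES_ARCH_OLD.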

The downside of the recycle bin is that dropped tables are really only renamed. Their table segments are still taking up space in your tablespace, which still counts against your user tablespace quotas. To recover the space associated with dropped tables you need to explicitly “purge” unwanted tables and associated objects from the recycle bin.

Fortunately, purging is pretty easy and flexible. If you have the SYSDBA privilege, you can purge everything from all the recycle bins with PURGE DBA_RECYCLEBIN. Or you can purge just your own recycle bin with PURGE RECYCLEBIN. You can even purge “recycled” objects by tablespace, optionally limited to a single user, with PURGE TABLESPACE <tablespace> [USER <user>]. Users have access in the recycle bin only to those objects that they themselves dropped.

It’s normal to end up with multiple versions of a table in the recycle bin. In these situations, Oracle always restores the newest version. To restore an earlier version, you can simply refer to it by its unique, BIN$-prepended name. Alternatively, you can use FLASHBACK multiple times until you restore the version you want.
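
For example, if a hypothetical EMPLOYEES_ARCH table has been dropped more than once, you can target a specific copy by its system-generated name (the BIN$ value below is purely illustrative; use the one your recycle bin actually shows, and keep it in double quotes):

SQL> SELECT object_name, original_name, droptime
       FROM user_recyclebin
      WHERE original_name = 'EMPLOYEES_ARCH';

SQL> FLASHBACK TABLE "BIN$8h3kTz1aRBmdYwFh2kQ1wA==$0" TO BEFORE DROP
       RENAME TO EMPLOYEES_ARCH_V1;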

The RECYCLEBIN initialization parameter turns the recycle bin feature on or off at the system or session level. Turning recycling off doesn’t prohibit you from restoring objects that were in the recycle bin before you disabled it.
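
For example (the exact scoping options vary a bit by Oracle version, so treat this as a sketch):

SQL> ALTER SESSION SET recyclebin = OFF;                 -- this session only

SQL> ALTER SYSTEM SET recyclebin = ON SCOPE = SPFILE;    -- system-wide, after the next restart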

If you want to keep the recycle feature enabled but bypass it when dropping specific tables, just add the PURGE clause to the DROP TABLE statement; e.g.:

SQL> DROP TABLE TABLE_XYZ PURGE;

If you’re not already using the recycle bin feature, why not turn it on and get it working for you today? There’s really nothing to lose—literally!

For expert advice on how to fine-tune use of Oracle’s recycle bin for your environment, schedule a free consultation with Buda Consulting.

Architecting to Maximize Recovery Options in Oracle

I recently received a frantic call from a client that believed they had been hacked and needed to recover data quickly. They said that data belonging to two or more of their customers had been lost.

Our client functions essentially as an application service provider (ASP). Their customers’ data resides in an Oracle database that our client manages, organized so that each customer’s application is served by its own schema, with all of the schemas sharing a single database.

We have advised this client on numerous occasions to separate each of their customers’ data into separate databases, or at least separate tablespaces. This is a good idea for several reasons, one of which is recoverability. Unfortunately, they resisted our suggestions and today are probably regretting that decision.

Oracle Recovery Manager (RMAN) offers a few different options for recovery. You can recover an entire database, an individual tablespace or an individual data file. But you cannot recover an individual schema (table owner) and its objects unless they are in their own tablespace.

In the case of our client, it seems that the tables were lost at some time on a Sunday night, just prior to the nightly logical backup (export). The last good logical backup was from Saturday night.

The database is in ARCHIVELOG mode, meaning that RMAN could restore the database to any point in time, including right up to the point of the data loss. However, since the schemas (each of which serves a different customer) all share the same set of tablespaces, this type of recovery would wipe out any data since that point in time—even for the schemas (customers) that were not impacted by the loss.

Because the customers that were not impacted had activity since the data loss event, we had one less tool in our recovery arsenal. If our client’s customer data had been separated into separate tablespaces or databases, we could have recovered the data for the customers that suffered the loss without impacting the others at all.
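
Had that separation been in place, an RMAN tablespace point-in-time recovery (TSPITR) could have rolled just the affected customer’s tablespace back to a moment before the loss. A rough sketch of the command (the tablespace name, timestamp and auxiliary destination are hypothetical):

RMAN> RECOVER TABLESPACE cust_a_data
        UNTIL TIME "TO_DATE('2015-03-08 21:30:00','YYYY-MM-DD HH24:MI:SS')"
        AUXILIARY DESTINATION '/u01/app/oracle/aux';

RMAN restores and recovers the tablespace in an auxiliary instance and then plugs it back into the database, leaving every other tablespace untouched.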

We are now in the process of recovering the lost data from the Saturday logical backups. When that is complete, we will be doing a full RMAN restore to another location, where we will attempt to recover any lost data since the logical backup was taken. This will be a very arduous and time-consuming process.

The moral of the story is to consider recoverability when architecting your Oracle database. If you have users or applications that stand alone and may need to be recovered without impacting others, separate them at least by tablespace; and, if possible, use a separate database. The more you separate, the greater your recoverability options are.
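
Separating by tablespace doesn’t have to be complicated. A minimal sketch of a per-customer setup (the names, file path and size are purely illustrative):

SQL> CREATE TABLESPACE cust_a_data
       DATAFILE '/u02/oradata/prod/cust_a_data01.dbf' SIZE 500M AUTOEXTEND ON;

SQL> CREATE USER cust_a IDENTIFIED BY "ChangeMe#2015"
       DEFAULT TABLESPACE cust_a_data
       QUOTA UNLIMITED ON cust_a_data;

With each customer’s objects confined to their own tablespace, tablespace-level recovery remains an option when disaster strikes.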

It’s worth noting that the pluggable database option in Oracle 12c might assist in recoverability even further, if the reason for not separating your schemas into separate databases was ease of maintenance or resource concerns. With 12c you can create a pluggable database for each of your logically separate applications or users, while keeping the administration centralized and the overhead minimized. Check it out.
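
For example, creating one pluggable database per customer might look roughly like this (the names and file paths are hypothetical, and the exact clauses depend on whether you use Oracle Managed Files):

SQL> CREATE PLUGGABLE DATABASE cust_a
       ADMIN USER cust_a_admin IDENTIFIED BY "ChangeMe#2015"
       FILE_NAME_CONVERT = ('/u02/oradata/cdb1/pdbseed/', '/u02/oradata/cdb1/cust_a/');

SQL> ALTER PLUGGABLE DATABASE cust_a OPEN;

Each pluggable database can then be backed up and restored independently of its neighbors.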

If you have had similar difficulties with restoring data due to the database architecture, please share your experiences. To talk over your database architecture considerations, contact Buda Consulting.

Tracking Oracle Database Access for Regulatory Compliance

Data stored in corporate databases is subject to increasing regulatory scrutiny. To ensure compliance with security guidelines in major regulations like SOX, PCI, HIPAA and FISMA, you need to implement controls not only to protect data from unauthorized access, but also to monitor and report on access when it occurs.

This capability, usually referred to as data access auditing, enables you to produce an audit trail regarding reads and writes to your Oracle database data. An audit trail can tell you after the fact what database objects were acted upon, who acted upon them, and when the action(s) occurred. Taken together, this data creates a state of non-repudiation, whereby a user cannot effectively deny that they performed the action in question.

Without this kind of comprehensive data, there’s no way you can effectively detect and deal with vulnerabilities or breaches related to your Oracle data, or pass a security audit (e.g., for ISO 27001 certification). You also can’t comply with regulations.

For example, the PCI Data Security Standard (PCI DSS) emphasizes the need to track access to cardholder data in real time. PCI Requirement 10, in particular, requires companies to “track and monitor all access to network resources and cardholder data.” Data access tracking is critical both for alerting and for after-the-fact analysis whenever there is a concern; without it, you simply cannot demonstrate compliance with this requirement.

HIPAA likewise mandates that Covered Entities and Business Associates be prepared to deliver an accounting of every time a patient record was viewed, let alone altered. Can you do that? If not, you might end up like the UCLA Health System, which paid an $865,500 fine for potential HIPAA violations after celebrity patients alleged that UCLAHS staff were looking at their protected health information (PHI) without permissible reason.

If you’re still not convinced, consider SOX Section 302.4.B – Establish verifiable controls to track data access. SOX mandates internal controls over all relevant data so that officers of public companies can’t plausibly deny that they are aware of, and in control of, changes.

So that’s the bad news: if you don’t have some kind of database auditing software in place for your Oracle data, you probably need it. The good news is that robust data auditing software, when properly configured and enabled for your environment, can reliably and comprehensively track the usage of your Oracle database resources. Then you can analyze and report on audit trail data anytime to respond to questions like, “When were Jane Smith’s payment account details last accessed?” or “Who changed Joe Jones’ appointment time?” Having a solid answer in a legal or regulatory context sure beats excuses…

But implementing database auditing can be tricky. Issues include what levels within the database to audit (database level, object level, user level), managing performance impacts, and storing the audit data efficiently and securely while keeping it accessible for reporting.
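
To give a sense of what object-level auditing looks like in practice, here is a minimal sketch using Oracle’s traditional (pre-unified) auditing; the APP_OWNER.PATIENTS table is hypothetical:

-- Enable the database audit trail (static parameter; requires an instance restart)
SQL> ALTER SYSTEM SET audit_trail = DB,EXTENDED SCOPE = SPFILE;

-- Record every read and write of a sensitive table, one audit record per access
SQL> AUDIT SELECT, INSERT, UPDATE, DELETE ON app_owner.patients BY ACCESS;

-- Later, answer "who touched it, and when?"
SQL> SELECT username, action_name, timestamp, sql_text
       FROM dba_audit_trail
      WHERE obj_name = 'PATIENTS';

Oracle 12c’s unified auditing and the Audit Vault products build on the same idea, adding policy-based configuration, consolidated storage and reporting.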

To provide database auditing for its customers, Oracle offers Oracle Audit Vault and Database Firewall, a unified solution that monitors database activity, provides the full spectrum of audit capabilities for compliance reporting and also detects and blocks unauthorized database activity like SQL injection attacks. There’s also Oracle Database Vault, which supports separation of duties and privileged user access controls. Oracle Database 12c (and older versions) also offer a range of security and compliance supports that complement database audit logs.

Do you have questions or concerns regarding your organization’s ability to track Oracle database access for compliance purposes? Contact Buda Consulting to discuss your environment and your needs.

7 New Offerings: Oracle Steps Up Its Big Data Game

For any big data effort to succeed, an organization needs to figure out how to combine the right data from the right sources to generate the right insights to achieve its goals. Transactional applications might hold data on customer purchases, for instance. But their browsing patterns, loyalty interactions and responses to tweeted offers are probably in web-based systems. You need to pull all the pieces together to solve the puzzle and exploit new opportunities.

Oracle understands this problem well, as evidenced by the four new products and three new services they’ve announced in recent weeks.

The new products include:

  1. Oracle Big Data Discovery—variously dubbed “the visual face of Hadoop” and “the foundation for data innovation,” it offers a straightforward, unified way for business users to explore data from multiple sources and then analyze it and share the actionable results… in minutes, says Oracle. The benefits include radically accelerated time-to-value for big data “projects” plus increased participation by a wider range of business users, adding up to bigger insights all around.
  2. Oracle GoldenGate for Big Data—a tool that streams real-time transactional data from multiple source systems straight into popular big data environments like Apache Hadoop, Hive, HBase and Flume. Essentially it replicates data between systems in real time in your choice of formats, without impacting source system performance.
  3. Oracle Big Data SQL 1.1—said to offer a query performance boost of up to 40% over previous versions.
  4. Oracle NoSQL Database 3.2.5—which includes several new features including new APIs, as well as improved security, usability and performance.

Together these new products “further Oracle’s vision to enable Hadoop, NoSQL, and SQL technologies to work together and be deployed securely in any model—whether public cloud, private cloud or an on-premises infrastructure.” It’s all about “operationalizing insights” by integrating new data sources with existing infrastructure, applications and databases.

The new big data services, all cloud-based, are geared toward helping companies leverage big data specifically for marketing:

  1. Oracle Data as a Service for Marketing is aimed at generating sales leads. It offers a staggering 300 million profiles of business users and companies, which can be used to prospect for new business-to-business customers as well as improve your insight into your customer base and drive smarter cross-channel marketing.
  2. Oracle Data as a Service for Customer Intelligence is designed to provide a clearer picture of customer feedback on products and services, as well as offer insights into emerging trends or customer concerns. Among other data sources, it uses public information from 700 million social networking messages that Oracle collects daily.
  3. Oracle Marketing Cloud for Student Engagement offers templates that universities and others can use to attract students and improve retention among enrolled students. It essentially packages for the higher education vertical a range of existing Oracle cloud services. Similar packages are already available for a wide range of verticals including manufacturing, insurance, entertainment, nonprofits and many others.

These new offerings join two recent Oracle acquisitions (I know I promised I’d stop at seven, sorry…):

  1. The BlueKai platform, “the industry’s leading cloud-based big data platform that enables companies to personalize online, offline and mobile marketing campaigns…”
  2. Datalogix, whose technology “connects offline purchasing data to digital media to improve audience targeting and measure sales impact.”

All these offerings are aimed at helping Oracle customers advance their big data capabilities faster and with greater ease and success. “More people want to use Oracle software without having to run Oracle software,” said Thomas Kurian, Oracle’s VP of product development, at his Oracle OpenWorld 2014 keynote.

Is your IT department looking to respond to business demands for big data analytics that wring new insights and competitive momentum from your Oracle databases? Do you have the expertise you need in-house to address these new challenges while continuing to maintain current databases and applications? Contact Buda Consulting to discuss options for augmenting your core team with an expert Oracle DBA partner that can help with new demands or backstop everyday processes. 

 

Still Think Your Business is Too Small for Big Data?

Just because your business isn’t that big doesn’t mean you don’t have “big data” issues or a need for big data-style analytics to remain competitive. “Big data” is a relative term—relative to needs and capabilities for making business decisions, that is. Nearly every organization, whatever its size, will sooner or later reach the point where the volume, variety and velocity of the data it needs to analyze exceeds its storage and/or computational capacity, such that accurate and timely decision-making is impacted.

At that point, you have a big data challenge/opportunity. You can throw your data aside and fail to leverage it. Or you can find a cost-effective way to apply analytics to give your company new insights and answers that will help you compete and grow.

Understanding how to manage and analyze data to meet your evolving needs is critical, because big data analytics aren’t one-size-fits-all. You’ll potentially need to capture and integrate diverse sources of structured and unstructured data across everything from standalone department-level transactional systems to social media feeds to web forms to smart devices to external vendors’ services. Once you’ve maneuvered all this data into a form that business users can query, they can begin to exploit it, both in planned and newly perceived ways.

As you begin thinking about a big data analytics application, top-level planning considerations include:

  • Focus on business value first. What questions do you need to answer? Where does the data reside that you’ll need to process? In blogs, Facebook and other social media? In your customer transactions? Do you want to cut costs? Predict consumer buying patterns? Accelerate time-to-market for new innovations? Forecast supply and demand? Let business needs drive the technical approach.
  • Analytics capability is the bottom line. How will you determine what data is relevant and how it should be extracted, stored, transformed, etc.? If you think you have “too much data,” the real problem is that your analytics environment isn’t properly tuned.
  • Rely on data architecture best practices. “Big” shouldn’t mean “out of control.” Whatever the technology and implementation involved, keep your Oracle DBAs in the loop so that data movement and transformation are effectively planned for and handled.
  • Make sure you have the right resources in place to deliver the business value you’re looking for. Many companies don’t have in-house expertise to manage data effectively, for example. Data analytics expertise is also scarce. Augmenting your in-house skills with an outsourced Oracle DBA can yield the most value in the shortest time at the lowest cost.

Once you know what you want to accomplish you can more effectively plan for implementation. Any big data initiative involves the acquisition, transformation and storage of large volumes of data from multiple source systems, which is then analyzed.

Where will you get the source data you need? How will you handle data queries? How will you optimize performance? How will you manage testing, and against what data? Will you need more network capacity to handle the data movement? How do you integrate the new analytics solution with any existing data warehouse or other key data sources? As your new capabilities are used more and more, how do you monitor performance and plan for growth?

These are some of the central implementation questions your team will need to answer before your big data “challenge” can become an “opportunity” for decision-makers. Planning at the IT infrastructure level is, of course, critical—but knowing what business questions you want to answer should be the driving force behind the initiative.

To ensure your big data analytics application can generate reports quickly, in appropriate formats, which provide the insights your business demands, contact Buda Consulting. A free consultation with us can be an ideal way to explore your big data analytics challenges, with an eye toward best-practice data modeling, database design, performance and more.