5 Steps to Finding the Right SQL Server Consultant for Your Business

Sometimes the need for a Microsoft SQL Server consultant seems to sneak up on you. Your database responsiveness and availability appear to be okay. Then, somewhere in the organization, someone deploys a new database alongside a new application. All of a sudden, you’re having performance and/or reliability issues.

What just happened and how do we fix it? That’s the kind of question an expert SQL Server consultant answers every day. If you need database guidance, here are 5 steps to finding the right SQL Server consultant:

One: Evaluate your database needs

Yes, you’re hiring a database expert to help assess and validate your current and future needs. But you need to do some fact-finding before you choose a consultant.

Start with a list of your pain points and a description of your desired future state. Quantify your hoped-for outcomes as much as possible.

Drilling down into your pain points as far as your expertise permits will both identify core issues and help your SQL Server consultant solve them faster. You might even discover that you really need a network expert or a storage expert, not a database expert. 

Two: Match your needs with consultant offerings

To pick the right SQL Server consultant from among many available options, start with basic questions, like:

  • Are they recommended by one or more peers?
  • Do they have current reference clients in your industry?
  • How long have they been in business?
  • Do they have the right expertise on tap? 
  • Are they local to you and able to work on-site if needed?
  • Are they a good match for your business size-wise? 
  • Do their rates fit your budget?
  • Do they have solid Microsoft credentials?

Keep in mind also that your needs will change. You may want a SQL Server consultant that is “a partner not a vendor”—one you can trust to cover any database problem you’re likely to have. 

Three: Talk to the firms on your short list

Most reputable SQL Server consultants offer a free introductory call to talk over your unique challenges and goals. This is a great way to compare ideas and advice, while identifying the consultants you click with. Be ready to discuss the problems you’re having with your database environment, plans for moving workloads to the cloud, compliance concerns, security questions, and so on. 

Do the people you’re talking to sound knowledgeable and confident? Do they ask good questions? Do their initial recommendations make sense? Are they “good listeners” who are motivated to work with you? Or do they sound like they just want to close a deal?

Four: Hire the winner for an assessment or point project

If you have time and/or want to minimize business risk, hire your top pick to perform a database health check, compliance assessment, or security evaluation. These assessment projects will give you critical information about your environment, while providing an accurate indication of whether this SQL Server consultant is right for your business. 

If you’re pleased with the results, you’ve got a winner! If not, at least you learned a lot. Go back to your short list and ask the top contenders a few more questions. 

Five: Write a win-win contract

A win-win contract is one that both parties can live with: it unambiguously defines each party's responsibilities, so there is clarity whenever questions arise. Weak contracts can create financial risk on both sides of the table and can lead to damaged business relationships and lawsuits.

Most contracts with a SQL Server consulting partner should describe (as applicable) the scope of work to be performed, agreed deliverables, project timelines, compensation and fee structures, and any special equipment or other resource needs. Some contracts also include termination clauses that describe stipulations for ending the engagement.

Most consultants will also need to sign a confidentiality agreement to protect your intellectual property and other sensitive data.

What’s next?

With so many possibilities, choosing the right SQL Server consultant can seem bewildering. 

At Buda Consulting, we’re small enough to really know your business, but have the expertise to handle whatever comes up with your SQL Server database environment. From a health check to a cloud migration, we can provide the planning, maintenance, optimization, and specialized skills you need to take your data’s business value to the next level. We’re located in central New Jersey and serve companies of all sizes across the US.

Contact us to schedule a free call with a SQL Server specialist.

The Real Risks of AI

The recent release of ChatGPT, Bard, DALL-E 2, and Stable Diffusion has caused a lot of excitement and a lot of fear. In this article, we are going to dive deep into the real risks of AI.

The fear takes a few forms. Some are afraid that this will lead to computers becoming sentient and taking over the world. Others are afraid that it will lead to the elimination of jobs. Still others are afraid that it will lead us all to stop thinking and to depend too much on these new tools, resulting in less creativity and less progress. 

I think all of the above fears are exaggerated and that the benefits of AI will likely outweigh the risks in the long run. 

But there is another kind of risk that I am very concerned about and that is the topic of this article.

The Risks of AI: Garbage In – Garbage Out

When computers first took hold in corporations around the world, when applications started performing calculations on data held in databases and reports were generated from that data, a common refrain was heard in IT departments: "Garbage In – Garbage Out." (Come to think of it, I have not heard that sentiment spoken very often recently.) The saying was shorthand for the idea that the quality of the output of any computer program is only as good as the data entered into the database used by the program.

Initially, all data for a given program was entered by and specifically for the company that was using the application, and often by the specific department using the data.  This meant that the company had significant control over the quality of the data and an understanding of the origins of the data.  Many companies did a poor job of controlling the quality of the data and of the entry of the data, but they did have visibility into the source of the information and could determine the quality of it when needed.

AI Data Sources Have Changed

In the many years since then, much has changed. Now in addition to data generated by and entered by a company, its applications use a great deal of data that is gathered from outside sources. There are thousands of available data sources, both public and private, that can be purchased and used by internal applications, with very little control or visibility into the quality of that data. While the quality of internally generated data has tended to improve over time with better controls at the database and application level, there is far less visibility or control of the quality of the external data sources. 

The inclusion of these outside data sources makes validating the results of calculations, reports, and other outputs critical. Today, there is still generally a human interpreting these reports and calculations and making decisions based on them. This interpretation is the last line of defense against "Garbage Out." In almost any business application, an experienced user can spot an incorrect result. They might not know what is wrong, but they can tell something is wrong, sending developers and database administrators back into the data to figure out where the problem is. This plays out every day in every organization.

AI Magnifies Garbage In – Garbage Out

Traditional applications present the data they ingest in different forms (reports, graphs) so that humans can make decisions based on those presentations. 

AI applications take it a very significant step further. They don't present data that helps the user make a decision; instead, they make decisions for the user. This is a very significant difference because the user no longer has visibility into the semi-processed data that could have clued them into a problem with the data. Some AI models will list the sources and logic they used to reach a decision, but even that does not give visibility into the actual data used. 

If a well-developed and tested AI model had perfect data, the results would be, well, perfect. AI models learn from the data that they ingest. But AI models, especially general-use models like the ones mentioned above, rely on free, publicly available datasets. The quality of this data is suspect at best. And the organization of the data (in effect, the underlying data models) can influence the inferences drawn by the models. 

Other Risks of AI: A Real-World Example

I listen to a podcast called the All-In Podcast, which features four well-known investors who talk about politics, investing, and other interesting topics. During an episode shortly after the release of ChatGPT, they asked the AI chat tool to give a profile of one of the hosts (David Sacks). The model created a very accurate-looking profile of Sacks, but in the footnotes it attributed a number of articles to him that he did not write. I suspect he had commented on those articles, and the model made inaccurate inferences about his involvement in them. 

This is a perfect example of the risk of not having control or visibility into the data sets used by the AI models. 

The Real Risks of AI

So in my opinion, the real risk of AI is humans taking action based on the decisions and answers generated by models that use uncontrolled data. The example I gave is a trivial one, but I am sure you can imagine many scenarios where an action taken as a result of a faulty AI decision could be disastrous (choosing the proper building materials, as one example).

So after all these years, awareness of Garbage In – Garbage Out is more important than ever.

Securing Your Database: The Importance of SQL Server Audit to Safeguard CUI

Businesses that handle Controlled Unclassified Information (CUI) or other sensitive data need to comply with applicable information security and privacy regulations to minimize the risk of a data breach, data loss, and other threats to data confidentiality, integrity, and availability. This generally includes regularly or continuously monitoring and auditing all the activities taking place in your Microsoft SQL Server environment. 

To help automate this critical monitoring process, Microsoft provides SQL Server Audit, a tool built into SQL Server that captures server-level and database-level events to provide information about data and object changes affecting the database. By keeping tabs on how a database is being used, DBAs or security teams can spot suspicious actions that could indicate a potential incident, such as a data breach or cyber attack. 

How SQL Server Audit Works

SQL Server Audit lets you track and analyze events taking place on Microsoft SQL Server instances to reveal potential vulnerabilities and threats to CUI. It enables you to log changes to server settings, as well as record server activities, to an audit file or the Windows event log.

For example, you can check SQL Server Audit data for suspicious log events that point to unauthorized network access. Other activities you can log with SQL Server Audit include:

  • Insert, update, and delete attempts against database tables
  • Connection and login attempts, including both failed and successful logins
  • Database object access attempts
  • Database management activities
  • Admins and other users who connected to the database engine
  • Creating new logins and databases

You can choose from among several levels of auditing with the SQL Server Audit tool, depending on your specific compliance requirements (e.g., compliance with CMMC Level 2 versus CMMC Level 3). You can create server audits to log server-level events, and/or database audits for database-level events. 
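As a sketch of how the pieces fit together, the T-SQL below creates a server audit and a server-level audit specification. The audit name, file path, and choice of action groups are placeholders to adapt to your own compliance requirements.

```sql
USE master;
GO

-- Define where audit records are written (path and size are placeholders)
CREATE SERVER AUDIT CUI_Audit
    TO FILE (FILEPATH = 'D:\SQLAudit\', MAXSIZE = 256 MB);
GO

-- Track server-level events: logins and server role membership changes
CREATE SERVER AUDIT SPECIFICATION CUI_Server_Spec
    FOR SERVER AUDIT CUI_Audit
    ADD (FAILED_LOGIN_GROUP),
    ADD (SUCCESSFUL_LOGIN_GROUP),
    ADD (SERVER_ROLE_MEMBER_CHANGE_GROUP)
    WITH (STATE = ON);
GO

-- Audits are created in a disabled state; turn this one on
ALTER SERVER AUDIT CUI_Audit WITH (STATE = ON);
GO
```

Records written to the audit file can later be queried with the built-in `sys.fn_get_audit_file` function.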

SQL Server Audit Benefits

The overall goal of SQL Server audits is to track how database records are used, who accessed them, and when. This data can help you comply with data protection and privacy regulations, including those governing CUI on non-government systems. It can also improve your information security and incident response—the ability to prevent, detect and contain an attack or data breach impacting your database.

Database auditing also improves your confidence in the accuracy, consistency, and completeness of your data for analytics purposes. Finally, it helps you chart a path of continuous improvement by uncovering problems with your database security, administration, and/or monitoring.

Most common SQL Server Audit levels to protect CUI

Guidance on safeguarding CUI generally recommends implementing either of two SQL Server Audit levels as part of your SQL database audit program: C2 Audit or Common Criteria Compliance. These are the most widely used international standards for SQL auditing.

C2 Audit records data beyond SQL Server itself, such as who triggered what events in which database, the event type, the server name, and the event outcome. To get started, you assign an audit ID to each group of related processes, starting at login. System calls that these processes perform are thereafter logged with that audit ID. Examples include calls to open or close files, calls to change directories, and failed or successful login attempts.

Common Criteria Compliance replaces C2 Audit processes in many compliance frameworks. This approach uses Extended Events (superseding SQL Trace) to gather audit event details. To further protect CUI, you can filter specific events out of the trace and subsequently use them in applications that manage SQL Server. Note that Common Criteria Compliance can impact SQL Server performance and should ordinarily be enabled only if your guidance on safeguarding CUI mandates it.

Key SQL Server Audit actions to protect CUI

These are some of the most critical SQL Server events to log for most organizations:

  1. Failed login attempts. This data is vital to identify attempted or successful attacks on your database.
  2. Role member changes. This tells you when a login is added to or removed from a server or database role, so you can track your privileged users and know if an unauthorized user was added.
  3. Database user changes. As with role member changes, this event tells you when users are created, changed, or deleted from a database, so you know who has access within a SQL Server instance.
  4. Database object adds/deletions/changes. While this can create bulky audit logs, guidance on safeguarding CUI frequently mandates it.
  5. AUDIT_CHANGE_GROUP. Logging this event lets you identify when a user is altering or disabling your audit logs to “cover their tracks,” and is often required in audit guidance on safeguarding CUI. Or, this event may just alert you if a DBA disables auditing to temporarily improve SQL Server performance and forgets to re-enable it. 

It's important to carefully choose the SQL Server events you audit based on your compliance requirements, so you don't have to filter out unnecessary data later. Be sure, however, to log unsuccessful as well as successful events, as failures are a top way to spot attacks in progress and identify abuse of privileges.
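To illustrate, a database-level audit specification tied to an existing server audit can cover several of the events above in one statement. The database, schema, and audit names below are placeholders, and the example assumes a server audit named CUI_Audit has already been created. (Failed login attempts, item 1 above, are captured at the server level rather than here.)

```sql
USE YourCuiDatabase;  -- placeholder database name
GO

CREATE DATABASE AUDIT SPECIFICATION CUI_DB_Spec
    FOR SERVER AUDIT CUI_Audit                 -- assumes this server audit exists
    ADD (DATABASE_ROLE_MEMBER_CHANGE_GROUP),   -- role member changes
    ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),     -- database user changes
    ADD (SCHEMA_OBJECT_CHANGE_GROUP),          -- object adds/deletions/changes
    ADD (AUDIT_CHANGE_GROUP),                  -- changes to the audit itself
    ADD (INSERT, UPDATE, DELETE ON SCHEMA::dbo BY public)  -- DML against dbo tables
    WITH (STATE = ON);
GO
```

Auditing DML on an entire schema, as in the last line, is what can produce the bulky logs mentioned in item 4, so scope it to the schemas that actually hold CUI.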

Guidance on Safeguarding CUI Data: Next steps

Most organizations that handle CUI or other sensitive data are subject to one or more regulations like NIST 800-171, the Cybersecurity Maturity Model Certification (CMMC), HIPAA, Sarbanes-Oxley (SOX), and PCI DSS. The inability to pass a compliance audit puts you at significant risk of fines, legal sanctions, or potentially even prosecution under the False Claims Act, which the US Department of Justice actively enforces.

A database vulnerability assessment performed by Buda Consulting experts will identify any compliance issues with your database environment. This will provide the guidance on safeguarding CUI and other sensitive data that you need to achieve—and demonstrate—compliance to regulators and other stakeholders. 

Contact us to schedule a free 15-minute call to discuss how a database vulnerability assessment can help your business meet its compliance goals.

Oracle SQL Firewall: A New Feature That Blocks Top Database Attacks in Real-Time

Oracle 23c introduces a very powerful and easy-to-use database security feature that many users will want to try, especially for web application workloads. Called Oracle SQL Firewall, it offers real-time protection from within the database kernel against both external and insider SQL injection attacks, credential attacks, and other top threats. 

Oracle SQL Firewall should be a huge help in reducing the risk of successful cyber attacks on sensitive databases. For example, injection flaws, including SQL injection due to improperly sanitized inputs, rank #3 among web application security weaknesses in the latest OWASP Top 10. Wherever it is deployed, this tool can effectively eliminate SQL injection as a threat.

SQL Firewall is intended for use in any Oracle Database deployment, including on-premises, cloud-based, multitenant, clustered, etc. It is compatible with other Oracle security features like Transparent Data Encryption (TDE), Oracle Database Vault, and database auditing.

How Oracle SQL Firewall works

SQL Firewall provides rock-solid, real-time protection against some of the most common database attacks by restricting database access to only authorized SQL statements or connections. Because SQL Firewall is embedded in the Oracle database, hackers cannot bypass it. It inspects all SQL statements, whether local or network-based, and whether encrypted or unencrypted. It analyzes the SQL, any stored procedures, and related database objects. 

The new tool works by monitoring and blocking unauthorized SQL statements before they can execute. To use it, you first capture, review, and build a list of permitted or approved SQL statements that a typical application user would run. These form the basis of an allow-list of permitted actions, akin to a whitelist. 

You can also specify session context data like client IP address, operating system user, or program type on the allow-list to preemptively block database connections associated with credential-based attacks. This includes mitigating the risk of stolen or misused credentials for application service accounts.

Once enabled, Oracle SQL Firewall inspects all incoming SQL statements. Any unexpected SQL can be logged to a violations list and/or blocked from executing. Though the names are similar, Oracle SQL Firewall is much simpler architecturally than the longstanding Oracle Database Firewall (Audit Vault and Database Firewall or AVDF) system. You can configure the new SQL firewall at the root level or the pluggable database (PDB) level.
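As a sketch of that capture-then-enforce workflow, the PL/SQL below uses the DBMS_SQL_FIREWALL package (which requires the SQL_FIREWALL_ADMIN role mentioned below). The application account name APP_USER is a placeholder.

```sql
-- Enable SQL Firewall for the database (run as a user with SQL_FIREWALL_ADMIN)
EXEC DBMS_SQL_FIREWALL.ENABLE;

-- Start capturing the SQL that the application account normally runs
BEGIN
  DBMS_SQL_FIREWALL.CREATE_CAPTURE(
    username       => 'APP_USER',   -- placeholder account name
    top_level_only => TRUE,         -- capture only directly issued statements
    start_capture  => TRUE);
END;
/

-- After a representative workload has run: stop the capture,
-- generate the allow-list, and begin enforcing it
BEGIN
  DBMS_SQL_FIREWALL.STOP_CAPTURE('APP_USER');
  DBMS_SQL_FIREWALL.GENERATE_ALLOW_LIST('APP_USER');
  DBMS_SQL_FIREWALL.ENABLE_ALLOW_LIST(
    username => 'APP_USER',
    enforce  => DBMS_SQL_FIREWALL.ENFORCE_ALL,  -- check SQL and session context
    block    => TRUE);                          -- block (not just log) violations
END;
/
```

Setting `block => FALSE` instead logs unexpected statements without blocking them, which is a sensible first step; violations can then be reviewed in the DBA_SQL_FIREWALL_VIOLATIONS view before turning on blocking.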

Is there a downside to using Oracle SQL Firewall?

In part because it is still so new, Oracle SQL Firewall performance data is not widely reported online. Transaction throughput is vitally important for many applications, so even a modest overhead from SQL Firewall could prove unacceptable for some workloads. The good news is that "before and after" performance testing in your environment should be straightforward using best-practice testing techniques.

Oracle SQL Firewall administrative security is robust and logically integrated with other Oracle Database admin security, so it does not introduce new security risks. For example, only the SQL_FIREWALL_ADMIN role can administer the tool or query the views associated with it. SQL Firewall metadata is stored in dictionary tables in the SYS schema, which rely on dictionary protection like other such tables in SYS.

Who should use Oracle SQL Firewall?

For any business that needs to improve application security, such as for compliance with US government supply chain regulations or as part of a Zero Trust initiative, Oracle SQL Firewall could be a good choice. It could prove especially useful in DevOps environments due to its minimal impact on application development and testing timelines.

What’s next?

A goal for this blog post is to encourage organizations using Oracle 23c to implement SQL Firewall. It is a low-effort way to improve application and database security and significantly reduce information security risk associated with the sensitive data it protects.

To speak with an expert on how Oracle SQL Firewall could improve your database security, and how it might fit with your overall security goals and challenges, contact Buda Consulting.

Navigating Database Cloud Migration: How to Choose the Best Cloud Migration Services

Thinking of moving your database from your data center to a cloud or managed hosting provider? There are lots of options, and choosing the right cloud migration services for your workload takes research and planning. To get the most business value from your move to the cloud, you need a strategy that minimizes both time to benefit and business risk.

Why move a database to the cloud?

Common reasons for undertaking a cloud database migration include:

  • Reduced operating costs. In the cloud, the cloud service provider (CSP) bears the cost of maintaining, securing, and supporting the physical and virtual infrastructure your databases will run on.
  • Simplified remote access. The public cloud makes it easy to provide database access to remote workers and services.
  • Less security responsibility. Leading public clouds offer comprehensive, multi-layered security controls like data encryption, network protection for remote workers, user activity monitoring (UAM), and threat monitoring/intelligence.
  • Improved scalability. Most clouds can automatically scale data storage and workloads on demand, reducing the overhead associated with manually scaling your infrastructure. 

But the process of migrating databases to the cloud can often exceed time and cost estimates and even lead to security and compliance issues if badly executed. Choosing the right cloud migration services can help streamline key steps and make progress easier to track and manage.

What public cloud should you move to?

A primary consideration that largely dictates what cloud migration services you can pick from is the cloud environment you want to move to.

In some cases, this choice is effectively predetermined. For example, if you are running Microsoft SQL Server workloads and want to keep them in the Microsoft ecosystem, you’ll want to move to Microsoft Azure.  

Similarly, if you use Oracle Database and want to take advantage of the sophisticated cloud migration services that Oracle offers its customers, the best cloud for your workloads might be Oracle Cloud Infrastructure (OCI).

Or maybe you want to use Amazon Web Services with its rich landscape of services. If so, you might benefit from expert guidance from a trusted partner on how to structure your Amazon environment, including networking, storage, and server components. For example, not every business is ready to fully leverage the ephemeral nature of some AWS constructs. The best approach might be to move your database workloads to their own individual instances in Amazon EC2. Or for workloads that don’t require their own instances, Amazon RDS can be a good option.

Finally, if a powerful range of cloud migration services is a deciding factor in your choice of a public cloud, consider Google Cloud. Google Cloud offers multiple approaches for migrating Oracle, SQL Server, and other database workloads. Google’s highly rated cloud migration services use AI to help automate repeatable tasks, saving time and reducing the risk of errors.

What is your database migration strategy?

Another factor in choosing cloud migration services is your database migration strategy. Which strategy you pick will depend on related issues, such as whether you plan to clean up your data or institute new data governance processes as part of the migration.

The three basic database migration strategies are:

  1. Big bang—where you transfer all your data from the source database to the target environment in one “all hands on deck” operation, usually timed to coincide with a period of low database usage, like over a weekend. The advantage of a big bang migration is its simplicity. The downside is that downtime will occur, making this approach unsuitable for databases that require 24×7 availability.
  2. Zero-downtime—where you replicate data from the source to the target. This allows you to use the source database during the migration, making it ideal for critical data. This choice can be fast, overall cost-effective, and generally non-disruptive to the business. The downside of the zero-downtime option is the added complexity of setting up replication, and the risk of possible data loss or hiccups in the data movement if something goes wrong.
  3. Trickle—where you break the migration down into bite-sized sub-migrations, each with its own scope and deadlines. This approach makes it easier to confirm success at each phase. If problems occur, at least their scope is limited. Plus, teams can learn as they go and improve from phase to phase. The problem with a trickle migration is it takes more time and also more resources, since you have to operate two systems until completion.

Cloud migration services examples

Once you’ve identified your target cloud environment and your migration strategy, you can start choosing cloud migration services options.

For example, say you plan to move a business-critical Oracle database to Oracle Cloud Infrastructure using a zero-downtime strategy. One of the best cloud migration services options in this case is Oracle Cloud Zero Downtime Migration (ZDM).

ZDM is Oracle's preferred automated tool for migrating a database to OCI with no changes to the database type or version, and a great feature is its ability to fall back if necessary. Using a "controlled switchover" approach that includes creating a standby database, ZDM can dynamically move database services to a new virtual or bare metal environment, synchronize the two databases, and then make the target database the primary database.

At the opposite end of the cloud migration services spectrum from Oracle Cloud ZDM is Oracle Cloud Infrastructure Database Migration—a fully managed service that gives customers a self-service experience for migrating databases to OCI. Oracle Cloud Database Migration runs as a managed cloud service separate from the customer’s OCI tenancy and associated resources. Businesses can choose a simple offline migration option (similar to a “big bang” migration) or an enterprise-scale logical migration with minimal downtime (similar to a “trickle” migration). Teams can pause and resume a migration job as needed, such as to conform to a planned maintenance window.

If you want to move your Oracle, SQL Server, or other database workloads to AWS, Amazon offers a comprehensive set of cloud migration services to help automate the process. However, these tools are complex and powerful, and best used by experienced technologists. Be sure to confirm that AWS database sizing and capacity growth parameters meet your needs. You’ll also need to decide whether to use Amazon Relational Database Service (RDS) or RDS Custom, depending on the kinds of applications your database supports.

Next steps

While moving databases to the cloud offers many benefits, a high percentage of cloud database migrations falter or fail due to inadequate planning and/or a lack of specific expertise. The top public cloud environments offer purpose-built cloud migration services to streamline the process, but these are not always easy to use. The largest CSPs also support millions of users, so your business may struggle to get the individual attention you need in a timely way.

Whether your databases reside in a major public cloud or a smaller cloud or managed hosting environment, Buda Consulting is always the first point of contact for our clients. Personalized service by someone who knows your business is guaranteed. If there is ever a problem, you call us and we take it from there. 

Contact Buda Consulting to discuss how our cloud and managed hosting migration services can help your business get maximum value from moving to the cloud.