Three Database Fixes Capable of Improving App Performance Overnight


Database applications have vastly improved in recent years, but despite our best efforts, they can still feel frustratingly slow. A report that is supposed to take minutes might instead (for unknown reasons) take 2 hours. Corporate teams complain that every time they work with the database, it gums up the works, leaving users grumpy and managers suspicious.

This is no small matter. According to the SolarWinds Query Report 2021, nearly one-third of data pros surveyed manage more than 300 databases in their organization’s environment. Yet for database administrators, time is at a premium, and they’re frequently asked to do more with less.

DBAs are under constant pressure and forced to cope with a skills gap. Their responsibilities have also grown, as most companies now recognize that their data is their most important asset. Or, as Harvard Business Review puts it, data is the “centerpiece of corporate value creation.”

DBAs know data is inert if it isn’t securely provided to the right people at the right time and in the right format. By extension, DBAs touch the very core of the company’s business processes and help turn its data into useful information. This fact needs wider acknowledgment, and DBAs need more support. It should also be noted that they performed these tasks during the worst of the pandemic, when their responsibilities only increased.

Despite their importance to the company, databases are often still slow, and there’s usually an identifiable cause. For example, developers may have been given only functional requirements during the build process, with no expectations set for the performance of the application and database. Or the team that wrote the query may have asked for far more data than was needed: rather than crafting a careful, selective query, the team asked for something scattershot, expecting the end user to filter and analyze the results in a spreadsheet. Rule of thumb: If you ask for more data, you’ll spend more time waiting.
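The cost of a scattershot query can be sketched with a toy example (the table and column names here are hypothetical, and SQLite stands in for whatever database you run): pulling every row and filtering in a spreadsheet moves far more data than letting the database do the filtering.

```python
import sqlite3

# Hypothetical orders table: half the rows belong to the EU region.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, "EU" if i % 2 else "US", i * 1.5) for i in range(10_000)])

# Scattershot: fetch everything and filter client-side (e.g., in a spreadsheet).
everything = conn.execute("SELECT * FROM orders").fetchall()

# Selective: ask the database to return only what the report needs.
eu_totals = conn.execute(
    "SELECT id, total FROM orders WHERE region = 'EU'").fetchall()

print(len(everything))  # 10000 rows moved to the client
print(len(eu_totals))   # 5000 rows, and only two columns each
```

The selective query here halves the row count and drops an unused column; against a real table with millions of rows and dozens of columns, the difference is what turns minutes into seconds.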

But even when users carefully craft a well-written SQL query, the database response can still be slow. In this case, the application may perform well, but the reason for poor performance lies within the database itself.

Look first in the three areas below if you have a slow database—and see if you can practically hear the squeaky wheels turning while the horse and buggy plugs along.

  1. Review Your Indexes

Indexes matter! It’s common to hear that an important query or report that normally takes 2 hours to run could instead take 2 seconds if the database had proper indexes. Without indexes, the application will be molasses-slow, causing user tempers to run high.

To improve performance, look at the database design and make sure it has appropriate indexes. For example, your tables in a SQL Server or Azure SQL database should all have a primary key and clustered index, and you should apply non-clustered indexes to all foreign key columns as well as columns frequently used in WHERE and JOIN clauses. For custom-built applications, it’s also common to see duplicate indexes, which can hurt performance. Be aware that when you buy third-party products, the vendor verifies only that the product runs on specific database platforms; they rarely do more than make sure it can run. Performance tuning, especially for third-party products built to run on multiple database platforms, is extremely rare. Your third-party product may be good enough to run on multiple platforms, but it’s unlikely to run well on any of them.

Check the indexes on all your databases—you’ll likely make positive gains.

  2. Check for a Data Type Mismatch

Developers and DBAs can dramatically improve database performance by understanding data types, the attribute telling the database query optimizer how the programmer intends to use the data. By contrast, novice data pros frequently choose broad, general data types, such as a text data type with no limit on how much text it can hold (e.g., VARCHAR(MAX)), not realizing a shortcut such as this often degrades performance.

Choosing the right data type is important, but it’s even more important to ensure the data type is used consistently throughout the database and the applications using it. For example, assume the database architect designed a table with a column called Employee_ID defined as a 5-character text data type, i.e., CHAR(5). The column itself holds numbers, such as 09508 or 00113, but the architect deliberately chose a character data type instead of an integer data type because the CHAR data type will record any leading zeros, whereas integer data types will drop the leading zeros of an Employee_ID. However, on multi-person development efforts, it’s not uncommon to find an Employee_ID that appears as CHAR(5) in some tables, as INT in other tables and programs, and as SMALLINT in yet others. Everywhere there’s a mismatch between the data types of Employee_ID, whether between tables or between tables and programs, performance degrades.
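The leading-zero problem the architect was guarding against is easy to demonstrate. A small sketch (the Employee_ID values are the ones from the example above):

```python
# The same IDs stored two ways: integers drop leading zeros,
# while a fixed-width character representation preserves them.
employee_ids = ["09508", "00113"]

as_int = [int(e) for e in employee_ids]       # leading zeros are lost
as_char = [e.zfill(5) for e in employee_ids]  # zero-padded to CHAR(5) width

print(as_int)   # [9508, 113]
print(as_char)  # ['09508', '00113']
```

Worse than the cosmetic loss, a join between a CHAR(5) column and an INT column forces the database to convert one side on every comparison, which typically prevents the optimizer from using an index on the converted column.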

For developers to choose the correct data type, they must ask the right questions during the design process: how will the data be used, how much space is needed to store it properly, and how is the data likely to be used in the future? From there, developers and DBAs should review their database and application design to ensure a given column definition is used consistently everywhere, whether in tables or in SQL code. Doing this improves performance and leads to cost savings.

  3. Investigate the Entity Framework

The Entity Framework (EF) is an open source object-relational mapper (ORM) designed to enable .NET developers to work with a database using .NET objects. EF is our example, but this advice applies to all ORMs, such as TypeORM, Waterline, Hibernate, and NHibernate.

EF makes building the front-end app easier and faster, especially when compared with hand-writing SQL for the simplest types of data access. Developers become much more productive for simple CRUD requirements (CRUD stands for create, read, update, and delete). With the right skills, a talented developer using EF can quickly and effectively build most database-driven web applications.

Note that “simple” was stated more than once in the previous paragraph. This is because ORMs are good at making simple CRUD screens. But if you want to build sophisticated user interfaces, ORMs generate lower-quality SQL than hand-written code. Though EF has significantly improved over the years, there’s still a problem: It has multiple areas capable of tripping up developers. Learn to spot these traps and avoid them. Here be dragons.
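The best-known of these traps is the “N+1 query” pattern, where a lazy-loading ORM silently issues one query per parent row instead of a single JOIN. EF’s version requires a .NET project to show, but the pattern itself can be sketched with plain sqlite3 (the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
""")
conn.executemany("INSERT INTO authors VALUES (?, ?)",
                 [(i, f"author_{i}") for i in range(100)])
conn.executemany("INSERT INTO books VALUES (?, ?, ?)",
                 [(i, i % 100, f"title_{i}") for i in range(300)])

# N+1: one query for the authors, then one more query per author --
# the shape a lazy-loading ORM produces behind your back.
queries = 1
authors = conn.execute("SELECT id, name FROM authors").fetchall()
for author_id, _name in authors:
    conn.execute("SELECT title FROM books WHERE author_id = ?", (author_id,))
    queries += 1
print(queries)  # 101 round trips to the database

# Eager alternative: one JOIN fetches the same data in a single query.
rows = conn.execute("""
    SELECT a.name, b.title
    FROM authors a JOIN books b ON b.author_id = a.id
""").fetchall()
print(len(rows))  # 300 rows, 1 query
```

In EF the fix is usually to load related data eagerly (e.g., via its Include mechanism) or to drop to hand-written SQL for the hot path; every ORM in the list above has an equivalent knob.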

If you have application performance problems, check whether the cause is EF or another ORM used during the development process.

Bonus: Get a Database Performance Monitoring Tool

A good database performance monitoring (DPM) tool doesn’t just tell you when there are error conditions or problems. Putting a DPM tool in place can improve database and app performance by providing easier and faster access to vital performance data. It also makes applications behave better by closely and carefully measuring performance and resource consumption during the development process and in production. By using a DPM tool during development, data pros can learn what a normal performance profile is and determine when performance becomes abnormal. This can lead to better-optimized, more efficient database design, since fixing performance problems is an order of magnitude easier during development than after deployment into production.
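The core of what such a tool automates, baselining a query during development and flagging runs outside the normal profile, can be sketched in a few lines (the query, threshold, and names here are illustrative only; real DPM tools capture far richer metrics):

```python
import sqlite3
import statistics
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO metrics VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(10_000)])

def time_query(sql, runs=20):
    """Run a query repeatedly and return the mean and stdev of its latency."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql).fetchall()
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), statistics.pstdev(samples)

mean, stdev = time_query("SELECT * FROM metrics WHERE value > 2500")
# A simple "normal profile": anything slower than mean + 3 sigma is abnormal.
threshold = mean + 3 * stdev
print(f"baseline mean: {mean:.6f}s, alert threshold: {threshold:.6f}s")
```

A production DPM tool does the same thing continuously, stores the history in its own database, and correlates the timings with wait states and resource metrics, which is what makes long-term trending possible.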

DPM tools speed problem resolution and troubleshooting in production environments and can strongly guide the development of database queries and applications toward better-performing alternatives. DBAs can use DPM tools to compare specific queries and database applications before and after changes with a complete view of all empirical performance metrics. DPM tools also monitor all the organization’s databases from a single pane of glass, whether they’re on-prem, in the cloud, or in a multi-cloud environment.

Because a good DPM tool uses its own database to store performance metrics produced automatically by the operating system, database server, and the application, it can also provide long-term trending information. It does this all while adding relatively little overhead to ongoing database activity. Additionally, make sure the DPM tool you choose can monitor cloud-native, on-prem, and hybrid production and test databases at scale without too much performance overhead.

Database professionals know many of these problems lessen or even disappear when their organization has sufficient staff (and a narrower skills gap). While we wait for the boardroom to understand this, look to the fixes that provide the most improvement for the least effort. And know that databases don’t have to be slow; we’ve moved on from the era of the horse and buggy. It’s time to speed things up.

