Editorials

About Denormalization of Your SQL Server Databases

Announcing: DBA School – Learning SQL Server for the Rest of Us
A three-day course taught by Stephen Wynkoop – centered around real-world experiences, how-to information, and specific demonstrations and labs that will let you use and explore the topics covered. We’ll be covering Performance, Management, Security, Disaster Recovery, and more. Find out more at the class site here.

The catch? There are a maximum of 13 seats in this class. If you’re interested in attending, please be sure to register as soon as possible. The class is in January and I’d love to work with you!

This will be a great way to get some hands-on training in working with SQL Server. We’ll have labs for each segment and lots of discussion on the application of the items covered.

For more information, and to register to save your seat, please click here.

Featured Article(s)
Migrating from Oracle to EnterpriseDB
This article shows you, step by step, how to migrate an Oracle 10g XE table to Postgres Plus using the Migration Studio. You can migrate not only a single table with its indexes and constraints, but entire sets of tables.

About Denormalization of Your SQL Server Databases
It seems that a number of you are facing exactly this challenge in building out your systems. Your online transaction processing databases need to be normalized for performance and data protection. But when it comes to your data warehouse, or simply fulfilling reporting requirements, you’re in a position of having to denormalize. Several people pointed out that this is really the point of views, fact tables and the like. Very true.
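As a minimal sketch of the "views" approach, here's what a denormalized reporting view over a normalized pair of tables might look like. The schema (Orders, Customers and their columns) is entirely hypothetical, just to illustrate the shape of the idea:

```sql
-- Hypothetical normalized tables flattened into a single reporting view,
-- so report writers query one object instead of repeating the join.
CREATE VIEW dbo.vw_OrderReport
AS
SELECT
    o.OrderID,
    o.OrderDate,
    o.Amount,
    c.CustomerName,
    c.City,
    c.State
FROM dbo.Orders AS o
JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;
```

The base tables stay normalized; only the reporting surface is denormalized.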

I also heard from people who have had to do this and automated it, using SSIS to create the new tables from the originals. This seems to be a very popular approach: pulling the information from the original tables into new work tables, with SSIS modifying and re-jigging the data along the way for reporting.
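A package like that often boils down to a rebuild step – something an SSIS Execute SQL task, or a plain Agent job step, might run. This is only a sketch; all table and column names are invented for illustration:

```sql
-- Rebuild a denormalized work table for reporting from the normalized
-- source tables. Names are illustrative, not from any real system.
IF OBJECT_ID('dbo.rpt_OrderDetail', 'U') IS NOT NULL
    DROP TABLE dbo.rpt_OrderDetail;

SELECT
    o.OrderID,
    o.OrderDate,
    c.CustomerName,
    p.ProductName,
    d.Quantity,
    d.Quantity * d.UnitPrice AS ExtendedPrice
INTO dbo.rpt_OrderDetail
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
JOIN dbo.OrderDetails AS d ON d.OrderID = o.OrderID
JOIN dbo.Products AS p ON p.ProductID = d.ProductID;
```

Because the work table is rebuilt on a schedule rather than queried live, the reporting joins run once per refresh instead of once per report.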

Keith provided some additional real-world feedback: "We have multiple applications for financial services and with that comes multiple implementation models of databases. Each client obviously has their own needs but reporting and even analysis of real time data is required. From a functional standpoint normalization has helped with the analysis aspect of real time data because roll ups of the data are faster and easier. But then when reporting is required where addresses and names are required from banks, companies, accounts, etc… the normalized model starts to slow dramatically.

I’ve been advocating normalization for our de-normalized system but not entirely to the 3rd degree. Even though a bank, company, beneficiary are all ‘parties’ we like to keep them separate rather than a party table with views because the views end up in views and the execution plan doesn’t always meet the needs because merge joins seem to be used a lot more when not needed. Most of the time we force the nested loop because most of the time the entities are the same or are at most a handful of required data instead of the sampling value that SQL Server may expect.

Another normalization method I found that I don’t like is an address table. The addresses are never shared within entities so there’s always a unique row that matches a one to one with all the other tables that may reference it, so from my opinion even though multiple tables may have street, city, state, zip… it’s a lot easier to keep it in the data table of that entity rather than joining since it will always be a one to one.

It could be that some entities could share an address but would only be more confusing later and even possible that the address is the same but the suite may be different and managing that would even be more difficult with trying to share that information."
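On Keith's point about forcing the nested loop: SQL Server does let you override the optimizer's join choice with a join hint, either inline or via the OPTION clause. A hedged sketch, with made-up table names – hints should always be verified against your actual execution plans before going into production:

```sql
-- Force a nested loops join where the optimizer would otherwise pick
-- a merge join based on row estimates that don't match reality.
-- Table and column names are illustrative.
SELECT a.AccountID, b.BankName
FROM dbo.Accounts AS a
INNER LOOP JOIN dbo.Banks AS b
    ON b.BankID = a.BankID;

-- Equivalent query-level form, leaving join order to the optimizer:
-- SELECT ... FROM dbo.Accounts AS a
-- JOIN dbo.Banks AS b ON b.BankID = a.BankID
-- OPTION (LOOP JOIN);
```

Note that an inline join hint like INNER LOOP JOIN also enforces the join order as written, which is a stronger constraint than the OPTION (LOOP JOIN) form.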

Featured White Paper(s)
Developing Something for Nothing with SQL Server: A Closer Look at SQL Server Express and How it Can Work for You
Software development is an expensive and time-consuming process, not just in terms of time and salaries but also in terms of … (read more)