
Does the Cloud Mark the End of the Production DBA?

  • 1 July 2011
  • Author: Brian Knight

While I’ve been on the Expedition Denali roadshow, a portion of the event has been about SQL Azure, Microsoft’s cloud database platform. I always enjoy watching and surveying the DBAs in the audience during and after that session to see how they feel it’s going to affect their jobs. Microsoft has been pitching Azure as a fundamental shift in the way we manage and store data, pushing the onus of managing the database, patches, and operating system onto Microsoft. Reading between the lines, does that mean this will cost DBAs their jobs?

Before you get too concerned, you must first consider the adoption rates of Azure, which has been targeted mostly at small to medium businesses, which typically don’t have a DBA staff on board. The present SQL Azure platform has a limit of 50 GB per database, which rules out anything beyond small to medium-sized databases. There is also the cost, which is steep for many companies. When Pragmatic Works looked at Azure for our internal business, our BIDN database was going to be close to $500/month, plus usage fees on top of that. In our case the ROI wasn’t there, since that database costs us only a fraction of our time to manage.

If you’re a DBA, you have nothing to fear, but you should start making tweaks to your resume. If you’re a production-only DBA, for example, you must become a specialist in VLDBs (very large databases) or in managing large numbers of databases more effectively. I think you’ll see a fundamental shift for DBAs over the next decade, where we split into two camps: one that manages multi-terabyte databases and another that manages hundreds of smaller databases across the company very efficiently.

During the recession, business intelligence projects were some of the only projects that had double-digit growth, because companies were trying to become more productive and get a leg up on their competition. This spurred the growth of multi-terabyte databases and cubes. We have several customers whose data warehouses are now crossing tens of terabytes. Databases of this size call for a new style of database administrator. For example, backing up a 10 TB database can’t be done the traditional way.
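To make that concrete, here’s a minimal T-SQL sketch, not taken from the post, of one common VLDB technique: striping a compressed full backup across several files so the I/O is parallelized across drives. The database name and file paths are hypothetical, and filegroup-level (piecemeal) backups are another option at this scale.

    -- Hypothetical sketch: stripe a compressed full backup across multiple
    -- files on separate drives so the backup I/O runs in parallel.
    BACKUP DATABASE [BigDW]
    TO  DISK = N'G:\Backups\BigDW_1.bak',
        DISK = N'H:\Backups\BigDW_2.bak',
        DISK = N'I:\Backups\BigDW_3.bak',
        DISK = N'J:\Backups\BigDW_4.bak'
    WITH COMPRESSION,   -- native backup compression (SQL Server 2008 and later)
         CHECKSUM,      -- verify page checksums while reading the data
         STATS = 5;     -- report progress every 5 percent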

With this type of new data load on servers, we’re also seeing very specialized appliances such as the PDW (Parallel Data Warehouse) and Fast Track architectures. This BI architecture depends on sub-second response time from server to end user, a level of demand and experience that isn’t possible in the cloud today. DBAs who specialize in delivering this kind of response time will be even more in demand, but they must get their hands dirty and think outside the box to accomplish it.

Many companies have also seen a huge explosion of smaller databases as more of their world becomes electronic. Much of this comes from third-party applications that have been installed, and the rest from small internal databases. While this load may seem like a fit for the cloud, the ROI won’t be there when a company has hundreds of databases that could be consolidated onto just one or two servers.
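If you’re weighing consolidation, a hypothetical first step (again, a sketch rather than anything from the post) is to inventory database sizes on each instance so you can spot the small databases that could share a server:

    -- Hypothetical consolidation-planning query: total size per user database
    -- on one instance. Run on each instance and aggregate the results centrally.
    SELECT  d.name AS database_name,
            CAST(SUM(mf.size) * 8 / 1024.0 AS DECIMAL(12,1)) AS size_mb
    FROM    sys.databases d
    JOIN    sys.master_files mf ON mf.database_id = d.database_id
    WHERE   d.database_id > 4          -- skip the system databases
    GROUP BY d.name
    ORDER BY size_mb DESC;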

Whether you believe in the cloud database story or not, things are going to change over the next decade. You will see some shops dip their toes into the cloud, and some will have success doing so, but that will remain the territory of small to medium databases with lower performance SLAs. So the DBA job will change over the next decade, yet it will remain strong.
