Business Intelligence Blogs

View blogs by industry experts on topics such as SSAS, SSIS, SSRS, Power BI, Performance Tuning, Azure, Big Data and much more! You can also sign up to post your own business intelligence blog.


DirectQuery in Power BI Desktop

In the latest Power BI Desktop, a new Preview feature was released that allows you to connect using DirectQuery to either SQL Server or Azure SQL Database. DirectQuery is a really neat feature that lets you point to the live version of the data source rather than importing the data into a data model in Power BI Desktop.

Normally, when you want an updated dataset in Power BI Desktop you have to manually click the refresh button (this can be automated in the Power BI Service), which initiates a full reimport of your data. This refresh can take a variable amount of time depending on how much data you have. For instance, if you're refreshing a very large table you may be waiting quite a while to see the newly added data.

With DirectQuery, data imports are not required because you're always looking at a live version of the data. Let me show you how it works!

Turning on the DirectQuery Preview

Now, because DirectQuery is still in Preview, you must first activate the feature by navigating to File -> Options and settings -> Options -> Preview Features, then check DirectQuery for SQL Server and Azure SQL Database.


Once you click OK you may be prompted to restart Power BI Desktop to use the feature.

Using DirectQuery in Power BI Desktop

Next, make a connection to either an on-premises SQL Server or an Azure SQL Database.

Go to the Home ribbon and select Get Data then SQL Server.


Provide your Server and Database names, then click OK. ***Do not use a SQL statement. It is not currently supported with DirectQuery.***
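
Behind the scenes, this connection is just a short Power Query (M) expression. Here is a minimal sketch of what the generated query might look like when you supply only the server and database names; the names below are placeholders, not real servers:

    let
        // Connect using server and database names only. The optional Query
        // argument (a native SQL statement) is exactly what is not supported
        // with DirectQuery, so it is omitted here.
        Source = Sql.Database("MyServer", "AdventureWorksDW")
    in
        Source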


From the Navigator pane, choose the table(s) you would like to use. I'm just going to pick the DimProduct table for this example and then click Load. You could instead select Edit, which would launch the Query Editor, where you could manipulate the extract. This would allow you to add any business rules needed to the data before visualizing it.
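
Any steps you add in the Query Editor are also expressed in M. Here is a minimal sketch, assuming the same placeholder server and database, that navigates to DimProduct and applies an illustrative business rule (the column and filter value are examples only):

    let
        Source = Sql.Database("MyServer", "AdventureWorksDW"),
        // Navigate to the dbo.DimProduct table selected in the Navigator
        DimProduct = Source{[Schema = "dbo", Item = "DimProduct"]}[Data],
        // Example business rule: keep only finished goods (illustrative filter)
        FinishedGoods = Table.SelectRows(DimProduct, each [FinishedGoodsFlag] = true)
    in
        FinishedGoods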


Next you will be prompted to select how you want to connect to the data. Again, Import means the data…

Read more

The Big Data Blog Series

Over the last few years I’ve been speaking a lot on the subject of Big Data. I started by giving an intermediate session called “Show Me Whatcha’ Workin’ With”. This session was designed for people who had attended a one-hour introductory session that showed them how to load data, to look at possible applications … Continue reading The Big Data Blog Series
Read more

Maximum Insert Commit Size

  • 2 July 2010
  • Author: ShawnHarrison

A few days ago, I was working with a client who received the following warning on the connection managers in his SSIS packages.

[OLE DB Destination [40]] Information: The Maximum insert commit size property of the OLE DB destination "component "OLE DB Destination" (40)" is set to 0. This property setting can cause the running package to stop responding. For more information, see the F1 Help topic for OLE DB

The packages were executing successfully, but he wanted to prevent that warning message. This was an easy fix.

When you are using an OLE DB destination and you choose 'Table or View - fast load' as your data access mode, you will see the 'Maximum Insert Commit Size' property. This is where you can specify the maximum number of rows that can be processed and inserted into the destination in one batch insert. By default, this property is set to 2147483647. There are two ways to change this value.

You can right-click the destination in the data flow task and click Edit. You will see it on the Connection Manager tab.

Connection Manager Properties


The other way to change it is in the Properties pane for the destination. You can find it under the Custom Properties section.

Properties Pane


I don't know how the default value was changed to 0, but it may have had something to do with the fact that he was converting these packages from DTS to SSIS.



