Business Intelligence Blogs

View blogs by industry experts on topics such as SSAS, SSIS, SSRS, Power BI, Performance Tuning, Azure, Big Data and much more! You can also sign up to post your own business intelligence blog.


DirectQuery in Power BI Desktop

In the latest Power BI Desktop release, a new Preview feature allows you to connect using DirectQuery to either SQL Server or Azure SQL Database. DirectQuery is a really neat feature that lets you point to the live version of the data source rather than importing the data into a data model in Power BI Desktop.

Normally, when you want an updated dataset in Power BI Desktop you have to manually click the refresh button (this can be automated in the Power BI Service), which initiates a full reimport of your data. This refresh can take a variable amount of time depending on how much data you have. For instance, if you're refreshing a very large table you may be waiting quite a while to see the newly added data.

With DirectQuery, data imports are not required because you're always looking at a live version of the data. Let me show you how it works!
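The import-versus-live distinction can be sketched with a toy example. This is a minimal sketch only, using sqlite3 as a stand-in for the source database; the table and values are made up, not from Power BI:

```python
import sqlite3

# Hypothetical stand-in for a source database (sqlite3 here; Power BI
# would be talking to SQL Server or Azure SQL Database).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE Sales (Amount INT)")
source.execute("INSERT INTO Sales VALUES (100)")

# Import mode: copy the data into a local model once.
imported = source.execute("SELECT SUM(Amount) FROM Sales").fetchone()[0]

# The source changes after the import...
source.execute("INSERT INTO Sales VALUES (50)")

# ...so the imported copy is stale until the next refresh, while a
# DirectQuery-style query hits the live source every time.
live = source.execute("SELECT SUM(Amount) FROM Sales").fetchone()[0]
print(imported)  # 100 (stale snapshot)
print(live)      # 150 (current data)
```

The imported value stays at 100 until you refresh; the live query always reflects the current rows, which is exactly the trade DirectQuery makes.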

Turning on the DirectQuery Preview

Because DirectQuery is still in Preview, you must first activate the feature by navigating to File -> Options and settings -> Options -> Preview Features, then checking DirectQuery for SQL Server and Azure SQL Database.


Once you click OK you may be prompted to restart the Power BI Desktop to utilize the feature.

Using DirectQuery in Power BI Desktop

Next, make a connection to either an on-premises SQL Server or an Azure SQL Database.

Go to the Home ribbon and select Get Data then SQL Server.


Provide your Server and Database names, then click OK. ***Do not use a SQL statement; it is not currently supported with DirectQuery.***


From the Navigator pane, choose the table(s) you would like to use. I'm just going to pick the DimProduct table for this example and then click Load. You could instead select Edit, which launches the Query Editor where you can manipulate the extract and apply any business rules needed before visualizing the data.


Next you will be prompted to select how you want to connect to the data. Again, Import means the data…

Read more

The Big Data Blog Series

Over the last few years I’ve been speaking a lot on the subject of Big Data. I started by giving an intermediate session called “Show Me Whatcha’ Workin’ With”. This session was designed for people who had attended a one hour introductory session that showed you how to load data, to look at possible applications … Continue reading The Big Data Blog Series
Read more

Enabling Checkpoints in your SSIS Packages

  • 7 November 2009
  • Author: DevinKnight
  • Number of views: 27679

Checkpoints are a great tool in SSIS that many developers go years without even experimenting with. I hope to enlighten you on what Checkpoints are and why it is beneficial to use them. Also, I will walk you through a basic example package where they have been implemented.

What does it do?

With Checkpoints enabled on a package, SSIS saves the state of the package as it moves through each task and writes it to an XML file when the package fails. If your package does fail, you can correct the problem and rerun the package from the first task that did not complete successfully. Once the package completes successfully, the file is no longer needed and is automatically deleted.

How does this benefit you?

Just imagine your package loads a table with 10 million records. The Data Flow that performs this huge load completes without any problem (other than the fact that it took two hours). The next task in your package is a Send Mail Task, and for some reason it fails.

You correct the problem in the Send Mail Task, but without Checkpoints your package would still have to run the Data Flow that loads the 10 million records again (taking another two hours), even though it already succeeded. If you had enabled Checkpoints on this package, you could simply correct the problem in the Send Mail Task and rerun the package starting at that task. Sounds great, right?
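The restart behavior can be simulated outside SSIS. Here is a minimal Python sketch of the idea; a JSON file stands in for SSIS's checkpoint XML, and all the names are invented for illustration:

```python
import json
import os

CHECKPOINT = "checkpoint.json"  # stand-in for the SSIS checkpoint XML file

def run_package(tasks):
    """Run (name, callable) tasks, skipping any recorded in the checkpoint."""
    done = set()
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            done = set(json.load(f))  # tasks that succeeded on a prior run
    for name, task in tasks:
        if name in done:
            continue  # already succeeded before the failure; skip it
        task()  # may raise, simulating a task failure
        done.add(name)
        with open(CHECKPOINT, "w") as f:
            json.dump(sorted(done), f)  # persist progress after each task
    os.remove(CHECKPOINT)  # package succeeded; checkpoint no longer needed
```

On the first run the two-hour load succeeds and is recorded before the mail step raises, so the checkpoint file survives the failure; the next run reads it and resumes at the mail step instead of reloading everything.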

How do I configure it?

This example walks you through a very basic package that uses Checkpoints.

Example Overview

  •  Use three Execute SQL Tasks with the AdventureWorks2009 database (it can really be any database for this example) as a connection manager.
  •  Configure the package to handle Checkpoints
  •  Configure the individual tasks to handle Checkpoints

Step 1: Configure Execute SQL Tasks

  •  Drag three Execute SQL Tasks on your Control Flow.
  •  Use any database for the Connection property on all three tasks
  •  Configure Execute SQL Task SQLStatement property: Select 1
  •  Configure Execute SQL Task 1 SQLStatement property: Select A (set to fail intentionally; column A does not exist)
  •  Configure Execute SQL Task 2 SQLStatement property: Select 1

Step 2: Configure Package to enable Checkpoints

  •  Open the properties menu at the package level (Just open properties in the Control Flow without any task or connection manager selected)
  •  Set the CheckpointFileName property: c:\Checkpoint.xml (feel free to use the .txt extension when naming the checkpoint if you want to open it in Notepad and look at it!)
  •  Set the CheckpointUsage property: IfExists
  •  Set the SaveCheckpoints property: True

Step 3: Configure Each Task

  •  Select each task individually and open the task-level properties menu (just click the task once, then hit F4)
  •  Change the FailPackageOnFailure property to True

Step 4: Run the Package

  •  Run the package and you will see it fail on the second task
  •  This also creates the file c:\Checkpoint.xml. Feel free to open it and take a look! I use the free tool XML Notepad to view XML files.
  •  You could also save this file with the .txt extension and view it in regular Notepad; it still works as a Checkpoint.
  •  If you run the package a second time, it will skip the first task that was successful and start right at the second task
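If you would rather not install XML Notepad, a few lines of Python will pretty-print the checkpoint file just as well. This is a generic helper, not part of SSIS; the path in the usage comment is the CheckpointFileName configured earlier:

```python
import xml.dom.minidom

def show_checkpoint(path):
    """Return a pretty-printed copy of an SSIS checkpoint file."""
    with open(path) as f:
        return xml.dom.minidom.parse(f).toprettyxml(indent="  ")

# Usage: print(show_checkpoint(r"c:\Checkpoint.xml"))
```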

Step 5: Correct the Problem and Rerun Package

  •  Open Execute SQL Task 1 (the task that failed) and change the SQLStatement property to: Select 1
  •  Rerun the package. It now completes, skipping the first task, which had already succeeded. Imagine if that first task normally took two hours to run!