Business Intelligence Blogs

View blogs by industry experts on topics such as SSAS, SSIS, SSRS, Power BI, Performance Tuning, Azure, Big Data and much more! You can also sign up to post your own business intelligence blog.


DirectQuery in Power BI Desktop

In the latest Power BI Desktop, a new Preview feature was released that allows you to connect using DirectQuery to either SQL Server or Azure SQL Database. DirectQuery is a really neat feature that allows you to point to the live version of the data source rather than importing the data into a data model in Power BI Desktop.

Normally, when you want an updated dataset in Power BI Desktop, you have to manually click the refresh button (this can be automated in the Power BI Service), which initiates a full reimport of your data. This refresh can take a variable amount of time depending on how much data you have. For instance, if you're refreshing a very large table you may be waiting quite a while to see the newly added data.

With DirectQuery, data imports are not required because you're always looking at a live version of the data. Let me show you how it works!

Turning on the DirectQuery Preview

Now, because DirectQuery is still in Preview, you must first activate the feature by navigating to File->Options and settings->Options->Preview Features, then checking DirectQuery for SQL Server and Azure SQL Database.


Once you click OK, you may be prompted to restart Power BI Desktop to use the feature.

Using DirectQuery in Power BI Desktop

Next, make a connection to either an on-premises SQL Server or an Azure SQL Database.

Go to the Home ribbon and select Get Data then SQL Server.


Provide your Server and Database names, then click OK. ***Do not use a SQL statement. It is not currently supported with DirectQuery.***


From the Navigator pane, choose the table(s) you would like to use. I'm just going to pick the DimProduct table for this example and then click Load. You could instead select Edit, which would launch the Query Editor, where you could manipulate the extract. This would allow you to apply any business rules needed to the data before visualizing it.


Next you will be prompted to select how you want to connect to the data. Again, Import means the data…

Read more

The Big Data Blog Series

Over the last few years I’ve been speaking a lot on the subject of Big Data. I started by giving an intermediate session called “Show Me Whatcha’ Workin’ With”. This session was designed for people who had attended a one-hour introductory session that showed them how to load data, to look at possible applications … Continue reading The Big Data Blog Series
Read more

Using an SSRS Report to run SQL Agent Jobs

  • 12 November 2009
  • Author: DevinKnight

Data Warehouse latency is a complaint I have often heard from end users trying to access data via either Reporting Services reports or Excel. Generally, I promise 24-hour latency unless the job mandates hourly or even more frequent updates.

With these complaints in mind, I decided to create a report that could kick off the SQL Agent job that processes my Data Warehouse load and Cube update. It is a pretty simple report to create. Here are the steps I took:

Step One

Create a Data Source that points to the MSDB database on the server where the SQL Agent job you want to run is located.
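For example, the data source's connection string might look something like the snippet below; the server name is a placeholder, and credentials are configured separately in the data source dialog.

    Data Source=YourServerName;Initial Catalog=msdb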

Step Two

Create a DataSet that runs the system stored procedure sp_start_job with the name of the job.
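The dataset query is just a single call to sp_start_job in msdb. A minimal example, where the job name is a placeholder you would replace with your own SQL Agent job name:

    -- Queues the SQL Agent job. sp_start_job returns as soon as the job is
    -- started, not when it finishes, so the report renders immediately.
    EXEC msdb.dbo.sp_start_job @job_name = N'DW Nightly Load';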


Step Three

Add some text! Let the user know what's going on after they click on the report; otherwise it will just show a blank report. Drag a textbox over and add the appropriate text.

Deploy the report and test!

There are some circumstances where you would not want to use this method:

  • Running the job in the middle of the day could severely cripple a transactional system that the Data Warehouse load pulls from.
  • The job takes longer than just a few minutes to process. Remember, you are trying to improve latency. You don't want to expose a poorly performing load process (even if the long run time is due to the size of the load, not bad code).
  • You haven't trained your end users in what the report does. You don't want end users clicking this report over and over again because it is running a major process; a simple guard like the one sketched below can help.
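If repeated clicks are a concern, one mitigation (a sketch of my own, not from the original steps) is to have the dataset check whether the job is already running before starting it. The job name here is again hypothetical:

    -- A minimal sketch: start the job only if no execution is in flight.
    -- N'DW Nightly Load' is a hypothetical job name; substitute your own.
    DECLARE @job_id UNIQUEIDENTIFIER;

    SELECT @job_id = job_id
    FROM msdb.dbo.sysjobs
    WHERE name = N'DW Nightly Load';

    -- A running job has a start time but no stop time in sysjobactivity.
    IF NOT EXISTS (
        SELECT 1
        FROM msdb.dbo.sysjobactivity
        WHERE job_id = @job_id
          AND start_execution_date IS NOT NULL
          AND stop_execution_date IS NULL
    )
        EXEC msdb.dbo.sp_start_job @job_id = @job_id;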

