Business Intelligence Blogs

View blogs by industry experts on topics such as SSAS, SSIS, SSRS, Power BI, Performance Tuning, Azure, Big Data and much more! You can also sign up to post your own business intelligence blog.


DirectQuery in Power BI Desktop

In the latest Power BI Desktop a new Preview feature was released that allows you to connect using DirectQuery to either SQL Server or Azure SQL Database. DirectQuery is a really neat feature that allows you to point to the live version of the data source rather than importing the data into a data model in Power BI Desktop.

Normally, when you want an updated dataset in Power BI Desktop you have to manually click the refresh button (this can be automated in the Power BI Service), which initiates a full reimport of your data. This refresh can take a variable amount of time depending on how much data you have. For instance, if you’re refreshing a very large table you may be waiting quite a while to see the newly added data.

With DirectQuery, data imports are not required because you’re always looking at a live version of the data. Let me show you how it works!

Turning on the DirectQuery Preview

Now, because DirectQuery is still in Preview you must first activate the feature by navigating to File->Options and settings->Options->Preview Features, then check DirectQuery for SQL Server and Azure SQL Database.


Once you click OK you may be prompted to restart Power BI Desktop to utilize the feature.

Using DirectQuery in Power BI Desktop

Next, make a connection to either an on-premises SQL Server or an Azure SQL Database.

Go to the Home ribbon and select Get Data then SQL Server.


Provide your Server and Database names, then click OK. ***Do not use a SQL statement; it is not currently supported with DirectQuery.***


From the Navigator pane choose the table(s) you would like to use. I’m just going to pick the DimProduct table for this example and then click Load. You could select Edit instead, which would launch the Query Editor where you could manipulate the extract. This would allow you to add any business rules needed to the data before visualizing it.


Next you will be prompted to select how you want to connect to the data. Again, Import means the data…

Read more

The Big Data Blog Series

Over the last few years I’ve been speaking a lot on the subject of Big Data. I started by giving an intermediate session called “Show Me Whatcha’ Workin’ With”. This session was designed for people who had attended a one hour introductory session that showed you how to load data, to look at possible applications …
Read more

SQL Internals Reading Data Records Part 5: Variable Offset Array

  • 6 July 2012
  • Author: BradleyBall
  • Number of views: 3449


Welcome back, Dear Reader, to Part 5 in our series on how to read a Data Record. In Part 1 we covered the Tag Bytes. In Part 2 we covered the Null Bitmap Offset. In Part 3 we covered the Fixed Data portion of a record. And in Part 4 we talked about the Null Bitmap. Today we will be discussing the Variable Offset Array. This part of the record is another optimization: the Variable Offset Array will only show up IF we have variable length data types in our table and they are not all NULL. If they are all NULL then we will not get an offset array.


“So Balls”, you say, “Why no offset array if it is NULL?”


Excellent question, Dear Reader, and the answer is in the name itself. About as clear as mud, right? Let’s get to the demos and clear this up.




First let’s update our chart so we know what part of the Data Record we are tackling.  Once again these images come by way of Paul Randal (@PaulRandal | Blog) and the MCM Video series on Data Structures and the good people from Microsoft. 



What the Variable Offset Array does is house the offset to the end of each variable length value that we want to read. So if we have three variable length columns set to a max of 15, 20, and 25, but we only use a portion of them (8 out of 15), (10 out of 20), and (20 out of 25), then we would have unused portions, right? Wrong.


If we used the metadata and just read the full declared length of each column, we would be reading bytes that don’t belong to us and getting things wrong; not to mention that, by acting like fixed length data, our records would bloat and there would always be wasted space. This is VARIABLE length data; we only use what we need. Think of it as the Green Data Type. All this recycling is cool and all, but it means we need a little extra data to read these values efficiently. And voila, we get our Variable Offset Array. It tells us the ending offset of each variable length value, one value at a time, so we read from the end of the previous value forward to that point, as the sketch below shows.
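To see the space savings for yourself, here is a minimal sketch using a hypothetical table, varDemo, shaped like the example above (the table name and the DETAILED scan level are mine, not part of the original demo):


CREATE TABLE varDemo
             (col1 VARCHAR(15)
             ,col2 VARCHAR(20)
             ,col3 VARCHAR(25))
INSERT INTO varDemo VALUES (REPLICATE('A',8), REPLICATE('B',10), REPLICATE('C',20))

-- avg_record_size_in_bytes reflects the 38 characters actually stored plus
-- row overhead, not the 60 bytes the declared maximums would suggest
SELECT avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('varDemo'), NULL, NULL, 'DETAILED')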


So without further ado let’s get right to it. We’ll use the same code that we used in Part 4, and we’ll create our table and one record of data.


DROP TABLE dataRecord2
GO
CREATE TABLE dataRecord2
             (myID INT
             ,myfixedData CHAR(4)
             ,myVarData1 VARCHAR(6)
             ,myVarData2 VARCHAR(6)
             ,myVarData3 VARCHAR(6))
GO
INSERT INTO dataRecord2(myID, myfixedData, myVarData1, myVarData2, myVarData3)
VALUES (7, 'XXXX', NULL, NULL, NULL)
GO


Now let’s do a DBCC IND and get our page numbers.


DBCC IND(demoInternals, 'dataRecord2',  1)



Remember that page type 10 is an allocation page, and we are reading a data page. So look for the page number that has PageType=1. We’ll follow that up with a DBCC PAGE on page 296, remembering to turn on Trace Flag 3604 so that we get our output in our SSMS window. *Remember, your page numbers may be different than mine.


DBCC TRACEON(3604)
DBCC PAGE('demoInternals', 1, 296, 1)



Now that we’ve got our page let’s take a look at it. I’m only going to post the relevant output.

Record Type = PRIMARY_RECORD        Record Attributes =  NULL_BITMAP    Record Size = 15


Memory Dump @0x0000000010E3A060


0000000000000000:   10000c00 07000000 58585858 05001c             ........XXXX...


You may be wondering where our Variable Offset Array is. And the answer is, there isn’t one. Our initial record had only NULL values in its variable length columns, and so the record only took up the space that it needed. No variable length values means no need for a variable column offset array. So let’s update one of those columns and see what happens.


UPDATE dataRecord2
SET myVarData2='WWWWWW'
DBCC PAGE('demoInternals', 1, 296, 1)



Record Type = PRIMARY_RECORD        Record Attributes =  NULL_BITMAP VARIABLE_COLUMNS

Record Size = 27                   

Memory Dump @0x000000000A03A060


0000000000000000:   30000c00 07000000 58585858 05001402 0015001b  0.......XXXX........

0000000000000014:   00575757 575757                               .WWWWWW


We have several values in our Variable Offset Array. First we have 02 00, then 15 00, and finally 1b 00. As you might have guessed, these are hex pairs, stored byte-swapped. Our first hex pair tells us the number of variable length columns in use, in this case 02 00, which byte-swaps to 0x0002, or just 2. We have three columns, but remember we just updated the middle column, so we have to acknowledge the column in front of it but not the one behind it, because we do not yet need that space. 15 00 is our next hex pair, which byte-swaps to 0x0015; in this case we are translating hex to decimal and not binary, remember to use our handy tool, and this translates to 21.
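By the way, if your handy tool isn’t handy, T-SQL will do the byte-swapped hex to decimal conversion for you; a quick sketch:


SELECT CONVERT(INT, 0x0002) AS VariableColumnCount  -- 02 00 byte-swapped
      ,CONVERT(INT, 0x0015) AS Column1EndOffset     -- 15 00 byte-swapped, 21
      ,CONVERT(INT, 0x001B) AS Column2EndOffset     -- 1b 00 byte-swapped, 27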


So how do we get to 21? 2 bytes for our Tag Bytes + 2 bytes for the Null Bitmap Offset + 8 bytes of Fixed Length Data + 3 bytes for the Null Bitmap (a 2 byte column count and 1 bitmap byte) + 6 bytes for our Variable Offset Array (a 2 byte column count and two 2 byte offsets), or simply (2+2+8+3+6) = 21. And at offset 21 there is nothing, because our first variable length column is NULL. Our remaining value is 1b 00, or 0x001b, also known as 27. Remember we put six W’s into our variable length field, so just add 6 to 21 and we get 27, the end of our current record.
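The same byte math, sketched out so SQL Server does the adding (the column aliases are just my labels for the record parts described above):


SELECT 2             AS TagBytes
      ,2             AS NullBitmapOffset
      ,4 + 4         AS FixedData            -- INT plus CHAR(4)
      ,2 + 1         AS NullBitmap           -- 2 byte column count plus 1 bitmap byte
      ,2 + 2 + 2     AS VariableOffsetArray  -- 2 byte count plus two 2 byte offsets
      ,2+2+8+3+6     AS EndOfMyVarData1      -- 21, nothing stored, the column is NULL
      ,2+2+8+3+6+6   AS EndOfMyVarData2      -- 27, six W's past offset 21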


Just to double check, let’s put some data into our final variable length column and look at the updates.


UPDATE dataRecord2
SET myVarData3='BBBB'
DBCC PAGE('demoInternals', 1, 296, 1)


Remember, I’m just putting out the relevant data.

Record Type = PRIMARY_RECORD        Record Attributes =  NULL_BITMAP VARIABLE_COLUMNS

Record Size = 33                   

Memory Dump @0x000000000A03A060


0000000000000000:   30000c00 07000000 58585858 05000403 0017001d  0.......XXXX........

0000000000000014:   00210057 57575757 57424242 42                 .!.WWWWWWBBBB


Right away we see our values have changed: 03 00, 17 00, 1d 00, and 21 00 are our new values. We now have 3 columns in use, so we see that reflected in 0x0003, or 3. Our first offset incremented by 2, to 0x0017 or 23, because adding our 3rd column added another 2 byte entry to the offset array. Our second offset has incremented as well, from 0x001b to 0x001d, or 29. The new value for our 3rd column is 21 00, which translates to 0x0021, or 33. If we add 4, for the four B’s we added to column 3, to 29, the ending value of column 2, we get 33 (4+29=33). So one last update and we’ll have gotten rid of all the NULLs.


UPDATE dataRecord2
SET myVarData1='SSSS'
DBCC PAGE('demoInternals', 1, 296, 1)

Once again I’m only going to copy out the relevant output.


Record Type = PRIMARY_RECORD        Record Attributes =  NULL_BITMAP VARIABLE_COLUMNS

Record Size = 37                   

Memory Dump @0x000000001398A060


0000000000000000:   30000c00 07000000 58585858 05000003 001b0021  0.......XXXX.......!

0000000000000014:   00250053 53535357 57575757 57424242 42        .%.SSSSWWWWWWBBBB


And now we’ve shifted our values one more time. The relevant values are 03 00, 1b 00, 21 00, and 25 00. We still have 3 columns, 0x0003, but now our first variable length column has grown, so its ending point is 0x001b, or 27. The ending point of our second variable length column is 0x0021, or 33, and the ending point of our 3rd and final column is now 0x0025, or 37.


And that, as they say Dear Reader, is that. I hope you enjoyed the read, and as always, thanks for stopping by.





Categories: Analysis Services


