Context Transition in SUMX Function

Yesterday I read the article “Currency conversion in DAX for Power BI and SSAS” by Kasper de Jonge. He used a very neat technique there.

But I think some people might be confused about why this technique works, especially how the filter context flows down to the lookup tables.


There are only 2 measures:

  1. TrnExchangeRate = MIN ( FactExchangeRate[Factor] )
  2. TotalSales = IF ( HASONEVALUE ( ReportCurrency[ReportCurrency] ), SUMX ( FactSales, [Sales] * [TrnExchangeRate] ) )

The pivot table:


Some people might ask: for the measure “SUMX ( FactSales, [Sales] * [TrnExchangeRate] )”, why does iterating the FactSales table filter down to the “1-side” lookup tables TransactionCurrency and ExchangeDate?

The trick here is that [TrnExchangeRate] is a measure. When a measure is referenced inside another measure, it is automatically wrapped in CALCULATE, which performs a context transition from row context to filter context. Importantly, the fact table FactSales in SUMX is an expanded table that includes all of its 1-side lookup tables. So when the context transition happens, the resulting filter context also affects the lookup tables.
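To make the hidden step visible, here is a sketch of the equivalence, using the names from the post: referencing the measure is the same as wrapping its definition in CALCULATE.

```dax
-- Equivalent form of TotalSales with the implicit CALCULATE written out.
-- CALCULATE turns the current FactSales row (expanded to its 1-side
-- lookup tables, including TransactionCurrency and ExchangeDate) into
-- a filter context, so MIN sees only the matching exchange-rate rows.
TotalSales =
IF (
    HASONEVALUE ( ReportCurrency[ReportCurrency] ),
    SUMX (
        FactSales,
        [Sales] * CALCULATE ( MIN ( FactExchangeRate[Factor] ) )
    )
)
```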


Power BI Visual Embedded in SSRS (Preview)

At PASS, Microsoft announced the “Technical Preview of Power BI reports in SQL Server Reporting Services now available” (https://blogs.msdn.microsoft.com/sqlrsteamblog/2016/10/27/technical-preview-of-power-bi-reports-in-sql-server-reporting-services-now-available).

For now we can only taste it via the Azure Marketplace. I have the VM ready (when building this VM, make sure to use the default size; be careful, it is costly, so stop the VM after testing).

First impression: fantastic! Now we can really control the PBI version:).

Some screenshots:

Loading speed was pretty quick.

Next step: I will get the VHD from Azure and load it into my test machine to reduce cost.

Really excited to see this feature; it is a milestone for SSRS.


What is the probability of winning the lucky draw?

Yesterday we had our annual dinner with all colleagues. Of course, a lucky draw had to be part of it.

Each of our names was written on a small piece of paper and thrown into a box. The lucky draw ran for about 30 rounds. Each round a paper was pulled out, and the person whose name was on it got a gift. That paper was then discarded (no replacement).

Everyone was excited; all hoped to be drawn. Suddenly I wondered: what is my chance of getting a gift?

I didn’t need a precise number, just a rough one, so I reached for this Excel formula:

BINOM.DIST.RANGE(30,1/500,1) = 5.66%

Of course, it is not exact: it ignores the “no replacement”, among other minor things. But the answer should surely lie between BINOM.DIST.RANGE(30,1/450,1) and BINOM.DIST.RANGE(30,1/500,1), i.e. between 5.66% and 6.25%. Not too bad :)
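As a quick check (a sketch in Python; the headcount of 500 is the rough estimate from above), the without-replacement answer is actually exact and simple: with n draws from N slips, one particular slip is drawn with probability n/N, since it is equally likely to land in any of the N positions of a random ordering. The binomial formula is just an approximation of this:

```python
from math import comb

n_people = 500  # rough headcount estimate (somewhere between 450 and 500)
n_draws = 30

# Excel's BINOM.DIST.RANGE(30, 1/500, 1): exactly 1 success in 30
# independent trials with p = 1/500 (i.e. drawing WITH replacement).
p = 1 / n_people
binom = comb(n_draws, 1) * p * (1 - p) ** (n_draws - 1)

# Exact answer WITHOUT replacement: my single slip is among the 30 drawn.
exact = n_draws / n_people

print(f"binomial approximation: {binom:.4f}")  # ~0.0566
print(f"exact (no replacement): {exact:.4f}")  # 0.0600
```

The approximation slightly understates the true chance because once other slips are removed, mine becomes more likely each round.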

And, I got one. Bingo! Lucky enough!

Power BI Desktop October update

The Microsoft Power BI team is really amazing! This month they shipped 23 new features. Among them, I ticked what I thought were the best:

PowerBI Oct

More details:


SQL Server Analysis Services 2016 CTP 2.3 DirectQuery in action (by Kasper de Jonge)

In my previous blog post “What’s new in Microsoft SQL Server Analysis Services Tabular models in SQL Server 2016 CTP 2.3” I introduced some of the enhancements coming to SQL Server Analysis Services Tabular –and many more will be coming in the next CTPs. In this article, I’d like to show you what the combination of Super DAX and improved DirectQuery could mean to you.

For illustration, I loaded a dataset with 220 million rows in a fact table and several dimension tables into a relational SQL Server 2016 CTP 2.3 database. I then enabled columnstore indexes on all the tables. The hardware was a Lenovo W530 laptop, so no server hardware by any means. Your mileage might vary, but I would imagine much better performance on a proper server with lots of RAM.

Next, I built a Tabular model in DirectQuery mode on top of this relational database by using SSDT for Visual Studio 2013 (this version also works with SQL16). For comparison, I installed two instances of Analysis Services, SQL Server 2014 and SQL Server 2016 CTP 2.3, and deployed the same model to both. All running on the same laptop.

Finally, I built a report by using Power BI Desktop. I connected to the SQL Server 2014 SSAS instance and started building my report:

Looks beautiful, doesn’t it :). I added some realistic visuals, like a matrix, as these are usually the worst for performance.

Now, let’s take a look at a SQL profiler trace to see what is generated when I refresh the entire report:

The complete refresh of the report took 3 minutes and 34 seconds. 15 source queries were sent to SQL Server.

Let’s see what happens when I use the same report against my SQL Server 2016 CTP 2.3 instance:

The same report refreshed in 10 seconds instead of 3 min and 34 seconds, a whopping 20 times faster! And Analysis Services only generated five SQL queries!

That is a significant improvement you can start experiencing today. You will see the most improvement when using Power BI Desktop and Power BI service together with the Analysis Services connector but improvements can also be expected when using Power View in Excel and SharePoint. And as mentioned in my previous blog post, we are planning to ship even more performance updates in the next SQL Server 2016 CTPs.

So start using SQL Server 2016 CTP2.3 today and see for yourself! Here are the steps you can try out:

Use SuperDAX and the new DirectQuery with an existing model:

  1. Download SQL Server 2016 CTP2.3 here.
  2. Install Analysis Services from the 2016 setup on your development machine. You can install it as a new instance so you can run it side by side with SQL Server 2014.
  3. Backup the database you want to use and restore it to the SQL Server 2016 CTP instance.
  4. Point Power BI, Power BI Desktop, or any other client tool to the restored database.
  5. You can now enjoy improved performance with your tabular model.

To develop a new model using the new DAX functions or DirectQuery functionality:

  1. Download SQL Server 2016 CTP2.3 here.
  2. Install Analysis Services from the 2016 setup on your development machine. You can install it as a new instance so you can run it side by side with SQL Server 2014.
  3. Use SSDT for Visual Studio 2013 and point the workspace database to the newly installed SQL 2016 instance.
  4. Open SSDT and then open your Tabular model solution.
  5. Start using the new DAX expressions.
  6. Deploy to your newly installed SQL Server 2016 CTP instance.
  7. Be advised that the new DAX expressions will not work with previous versions of Analysis Services; SQL Server 2016 is required.
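
As a taste of the new DAX surface, here is a sketch using variables (VAR/RETURN), one of the constructs introduced with SQL Server 2016; the Sales and Date tables below are hypothetical examples, not part of the model in this post.

```dax
-- Hypothetical model: a Sales table and a marked Date table.
-- VAR/RETURN avoids repeating subexpressions and makes measures readable.
Sales YoY Growth % =
VAR CurrentSales = SUM ( Sales[Amount] )
VAR PriorSales =
    CALCULATE ( SUM ( Sales[Amount] ), SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
RETURN
    DIVIDE ( CurrentSales - PriorSales, PriorSales )
```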

I hope you are excited about the upcoming SQL Server 2016 release. This is just the beginning. More will come over the next couple of months, but I’m repeating myself. Stay tuned!

From <http://blogs.msdn.com/b/analysisservices/archive/2015/09/08/sql-server-analysis-service-2016-ctp-2.3-directquery-in-action.aspx>