Running Sensitivity Analysis For Power BI Using DAX
Have you ever thought it would be nice to work out what the most optimal outcome is from your scenario analysis work that you’re doing inside of Power BI? In this tutorial, I show you exactly how you can discover this and also include sensitivity analysis techniques to your what-if parameter and scenario analysis work. You may watch the full video of this tutorial at the bottom of this blog.
So we go through how you can build and optimize your model through running scenarios, and then explore or run sensitivity on those scenarios.
Incorporating sensitivity analysis and relevant visualizations to your reports enables consumers to see what would happen if multiple scenarios occurred at once versus just a singular result based on a selection.
By utilizing this technique in Power BI, you’re giving the consumer a chance to see what the most optimal outcome is based on the scenarios that could occur in your data.
The key thing for this analysis is to set up our data model correctly. Inside our data model, we have our Lookup tables — Dates, Customers, Products, and Regions — that are related to our Sales table.
More importantly, we create these three scenario tables, or scenario supporting tables. In older versions of Power BI, we had to build these manually, but in recent versions we can create them using the What-If parameter feature.
In these scenario tables, we can shock the demand, cost, and price.
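When Power BI generates a What-If parameter, it creates a disconnected series table plus a harvesting measure behind the scenes. As a rough sketch (the name, range, and step here are illustrative, not taken from the example model), the demand parameter looks something like this:

```dax
-- Disconnected parameter table generated by the What-If parameter feature
Demand Change = GENERATESERIES ( -0.20, 0.20, 0.05 )

-- Harvests the current slicer selection; defaults to 0 (no shock)
Demand Change Value =
    SELECTEDVALUE ( 'Demand Change'[Demand Change], 0 )
```

The cost and price parameters follow the same pattern.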
I call this multi-layering of scenarios or a multi-layered approach to scenarios because we can thread through these three variables or elements into our calculations, allowing us to run multiple scenarios.
This is how we’re going to run sensitivity analysis to then see which is the most optimized scenario in this current environment we have here in this example.
Once we have this forecast or this sort of scenario in our demand, pricing, and costs, we will then see its ultimate impact on our total profits or sales.
We then thread all of our elements through our formulas. In our Scenario Profits calculation, we use iterating functions. We can isolate any element that we’re looking at in a particular row of a table.
In this case, we iterate through every single row of the Sales table, and then we shock each row with the change in demand, price, and cost.
So, if you think about it, these elements or scenario tables are not even connected to anything in our model, since they're supporting tables. We use this formula to integrate them into our model.
Instead of just showing the overall results, we show the sensitivity. We’re using that multi-layering of scenario approach inside of iterating functions in this particular formula to then create these sensitivities.
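The exact measure isn't reproduced here, but a Scenario Profits calculation of roughly this shape shows the multi-layered approach: SUMX iterates the Sales table while the three harvested parameter values shock each row. The column and measure names below are assumptions, so adapt them to your own model:

```dax
Scenario Profits =
SUMX (
    Sales,
    -- Shocked revenue: quantity adjusted for demand, price adjusted for price change
    ( Sales[Order Quantity] * ( 1 + [Demand Change Value] ) )
        * ( Sales[Unit Price] * ( 1 + [Price Change Value] ) )
        -- Shocked costs: same adjusted quantity, cost adjusted for cost change
        - ( Sales[Order Quantity] * ( 1 + [Demand Change Value] ) )
            * ( Sales[Total Unit Cost] * ( 1 + [Cost Change Value] ) )
)
```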
In this matrix, we've brought the Change in Price onto the rows, and the Demand Changes across the columns. We can see what the change in demand, as well as the change in price, would actually do to our results. Within the matrix, we can use conditional formatting to color the cells in, which is another really awesome element for better visualization.
In this chart below, we can see the Change in Cost. So when costs decrease, for example, our demand increases.
We might as well add more elements to get a more comprehensive analysis. We can put in our Dates, Regions, etc.
We add our Dates slicer here, so we can change the time frame, which is seriously amazing. This will enable us to drill into a specific time frame, and then it will change the results we have in our charts.
With the power of the data model, we can also include any element in a dynamic way, so we can really drill into aspects of our data. We can utilize anything inside our model: we can filter our Customers, Products, Regions, etc.
We can still run these sensitivities in these very specific regions. Once you select a region, it will dynamically change the results as well.
And that’s how you can optimize these scenarios. It’s basically running sensitivity analysis easily and effectively.
As you can see, this is really powerful stuff. This is really high quality analytical work that is going to impress anyone if you put this in front of them.
This work historically was very difficult to achieve. In Power BI, just like magic, you can create this work. You can create these insights in a really intuitive, effective and a scalable way.
I hope you can see how quickly you can do this. There are not many complex formulas involved. It just requires a really good understanding of iterating functions, and that is the key to implementing this technique.
Good luck using techniques like this in your own analysis.
In this tutorial, we’re going to focus on some customer attrition analysis in Power BI. You may watch the full video of this tutorial at the bottom of this blog.
Power BI is an amazing tool for high-quality analytics. In my view, it doesn’t have a close competitor at this point in time.
The dashboard we’re using here is part of the Learning Summit I had on attrition analysis, but this tutorial focuses on finding our lost customers.
In this demo, we will find out who our lost customers are and we’ll see the total sales lost from them.
We apply complex DAX formulas to achieve this powerful attrition analysis inside Power BI.
In this example, a lost customer is considered to be a customer who purchased something in the ten months prior to the last two months, but nothing in the last two months.
We need to find a list of our customers who purchased in the last 2 months or 60 days, and a list of customers who purchased in the 10 months before that.
Then, we’ll compare these tables of customers and see which customers don’t exist in the table from the last 2 months but did in the past 10 months prior to that.
And that’s what this particular formula is doing. The CALCULATETABLE function enables us to do that in any particular month.
So we’re creating virtual tables and CALCULATETABLE is a perfect function to use because it’s very similar to CALCULATE where you can change the context of a calculation. But in this case, we’re changing the context of a table (Customer Name Index).
It shows us a list of customers who purchased between 365 days ago to 60 days ago. That’s going to give us a 10-month window and also a list of those people that purchased something in those months.
We do exactly the same for the second variable, PriorCustomers, but we base it on the last 60 days from the first day of the month. So we're looking two months back, not at the current month.
COUNTROWS then shows how many of these customers purchased in the period before (the CustomersPurchased variable) but didn't purchase in this particular period (the PriorCustomers variable).
The EXCEPT function is going to return another virtual table of just the customers whom we consider to be lost customers. And we’re going to multiply it by -1, and that’s how we get the particular number here in the chart.
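Putting the pieces described above together, the measure looks roughly like this. The table and column names (Dates, Sales[Customer Name Index]) are assumptions based on the description, not a copy of the original formula:

```dax
Lost Customers =
VAR StartOfMonth = MIN ( Dates[Date] )
-- Customers who purchased between 365 and 60 days before the current month
VAR CustomersPurchased =
    CALCULATETABLE (
        VALUES ( Sales[Customer Name Index] ),
        DATESBETWEEN ( Dates[Date], StartOfMonth - 365, StartOfMonth - 60 )
    )
-- Customers who purchased in the 60 days leading up to the current month
VAR PriorCustomers =
    CALCULATETABLE (
        VALUES ( Sales[Customer Name Index] ),
        DATESBETWEEN ( Dates[Date], StartOfMonth - 60, StartOfMonth )
    )
RETURN
    -- Customers in the earlier window but missing from the recent one,
    -- shown as a negative number for the chart
    COUNTROWS ( EXCEPT ( CustomersPurchased, PriorCustomers ) ) * -1
```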
Who are these -9, -15, -8, etc. customers? We need another formula to get this list of customers.
To show the list of customers we consider lost, we use the same virtual tables in the Lost Customers formula. The difference is that we are now trying to calculate an amount: how much do we actually consider lost?
The CALCULATE function here brings a value into this table and we’ll blank out every other value. We’re not showing all the customers who we’re not considering lost. All we’ll see are the sales amount of customers who we’re considering lost.
So instead of going COUNTROWS EXCEPT, we use the variables as some context inside of a CALCULATE function. Then, we work out the Total Sales between the time period we identified using the DATESBETWEEN function.
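A sketch of that second measure, reusing the same virtual tables but returning a sales amount instead of a count (again, the names are assumptions):

```dax
Lost Customer Sales =
VAR StartOfMonth = MIN ( Dates[Date] )
VAR CustomersPurchased =
    CALCULATETABLE (
        VALUES ( Sales[Customer Name Index] ),
        DATESBETWEEN ( Dates[Date], StartOfMonth - 365, StartOfMonth - 60 )
    )
VAR PriorCustomers =
    CALCULATETABLE (
        VALUES ( Sales[Customer Name Index] ),
        DATESBETWEEN ( Dates[Date], StartOfMonth - 60, StartOfMonth )
    )
RETURN
    -- Total Sales filtered down to lost customers within the earlier window;
    -- customers who are not considered lost return blank
    CALCULATE (
        [Total Sales],
        EXCEPT ( CustomersPurchased, PriorCustomers ),
        DATESBETWEEN ( Dates[Date], StartOfMonth - 365, StartOfMonth - 60 )
    )
```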
I hope you enjoy exploring this topic more!
In this tutorial, you’ll learn about the different Power BI compression techniques in DAX Studio that help optimize your report.
After data is loaded segment by segment by the Analysis Services engine (in Power BI, Power Pivot, and SSAS), two events occur. First, it tries different encoding methods to compress columns and reduce the overall RAM footprint. Second, it tries to find the best sort order, one that places repeating values together. This also increases compression and, in turn, reduces the pressure on memory.
There are different compression techniques used by Analysis Services. This tutorial covers three of them: Value Encoding, Run Length Encoding, and Dictionary Encoding. The last section covers how sort order works in Analysis Services.
The first one is called Value Encoding.
Value Encoding seeks out a mathematical relationship between each value in a column to reduce memory. Here’s an example in Microsoft Excel:
This column requires 14 bits to store each value.
To compute the bits required, first use the MAX() function in Excel to get the highest value in the column. In this case, it's 9,144. Then use the POWER() function: find the smallest positive X for which POWER(2, X) returns an answer greater than the MAX value. That X is the number of bits required. In this example, X is 14, since 2^14 = 16,384 is the first power of two greater than 9,144. Therefore, the column requires 14 bits per value.
To reduce the required bits using Value Encoding, VertiPaq seeks out the MIN value in the column and subtracts it from each value. In this case, the MIN value in the column is 9003. If you subtract this from the column, it’ll return these values:
Using the same functions and arguments, you can see that for the new column, the MAX value is 141. And using 8 as the value of X results in 256. Therefore, the new column only requires 8 bits.
You can see how much more compressed the second column is compared to the first.
Once the data is compressed and you query the new column, the storage engine (VertiPaq) scans it. It won't simply return the stored values; instead, it adds the subtracted value back before returning the result to the user.
However, Value Encoding only works on columns containing integers or values with fixed decimal numbers.
The second encoding method is called Run Length Encoding.
Run Length Encoding creates a data structure that contains the distinct value, a Start column, and a Count column.
Let’s have an example:
In this case, it identifies that one Red value is available in the first row. It then finds out that the Black value starts at the second row and is available for the next four cells. It proceeds to the third value, Blue, which starts at the sixth row and is available for the next three rows. And this goes on until it reaches the last value in the column.
So instead of storing the entire column, it creates a data structure that only contains information about where a particular value starts and where it ends, and how many duplicates it has.
For a column like this, data can be further compressed by arranging the values in either ascending or descending order.
With this properly sorted column, you can see that the Run Length Encoding method now returns a data structure with one row less.
So if you're dealing with many distinct values, it's recommended to sort the column in the most optimal way possible. This gives you a data structure with fewer rows, which in turn occupies less RAM.
Run Length Encoding can’t be applied to primary keys because primary key columns only contain unique values. So instead of storing one row for each value, it’ll store the column as it is.
The third encoding method is called Dictionary Encoding.
Dictionary Encoding creates a dictionary-like structure that contains the distinct value of a column. It also assigns an index to that unique value.
Using the previous example, let’s look at how Dictionary Encoding works. In this case, the values Red, Black, and Blue are assigned an index of 0, 1, and 2, respectively.
It then creates a data structure similar to that of Run Length Encoding. However, instead of storing the actual values, Dictionary Encoding stores the assigned index of each value.
This further reduces the RAM consumed, because numbers take up less space than string values.
Dictionary Encoding also makes the tabular model data-type independent. That is, even if a column could be stored in different data types, it doesn't matter, since the data structure only stores the index values.
However, even if it’s independent, the data type will still have an effect on the size of the dictionary. Depending on the data type you choose to save the column in, the dictionary (or data structure) size will fluctuate. But the size of the column itself will remain the same.
Depending on the data type you choose, once Dictionary Encoding is applied to the column, Run Length Encoding can be applied on top of it.
In this case, Analysis Services will create two data structures. It’ll first create a dictionary and then apply Run Length Encoding on it to further increase the compression of the column.
For the last part of this tutorial, let’s discuss how Analysis Services decides on the most optimal manner to sort data.
As an example, let’s look at a column containing Red, Blue, Black, Green, and Pink values. The numbers 1 to 5 have also been assigned to them. This acts as the dictionary of our column.
Now, fill an entire column in Excel with these values, generated at random.
Next, copy the entire column and paste it as a Value.
To reduce the amount of RAM consumed, you can sort the column from A to Z. If you check the size again, you can see that it’s been reduced to 12.5 MB.
The 1.9 MB reduction may not seem like much, but that's because the example used a single column in Excel, which is limited to roughly a million rows. In Power BI, your data can contain billions of rows, and the space saved grows accordingly.
Once your data is sorted in the most optimal manner, Analysis Services applies one of the three compression techniques depending on the data type. Doing so increases the compression of your data, which greatly reduces the amount of memory consumed. This makes your report easier to run and load.
Enterprise DNA Experts
I stumbled across a tool that is just so good and useful for some really common tasks in power BI when creating reports and visuals. It’s a color hex codes picker that you can use to easily get the colors for your Power BI reports. You may watch the full video of this tutorial at the bottom of this blog.
The tool is by a firm called Anny software and I want to give them a shoutout not only for making an awesome tool but for making it completely free. It’s got tremendous functionality and it solves some common problems and annoyances when dealing with hex codes and color themes in Power BI.
Let me show you some of the things that I’ve been using it for.
One of the things that happens to me all the time is needing to figure out the hex code of a color I'm using in my own report. For example, in this report, I would have to figure out what this blue color is. This is a PowerPoint background, so Power BI has no idea what that color is.
I would typically have to get a screenshot and then go into the effects tool on the eyedropper. But with this Just Color Picker, you don’t need to do any of that.
You just point to the blue and it shows you what the hex code is. And then from here, you can copy that value.
This tool makes it really easy to build out a complete theme, and it dovetails perfectly with the Enterprise DNA Colour Theme Generator.
With the Color Theme Generator, you can go to Image to Colours, and then you can upload your own image here. I’ve uploaded an image here that I like, and I think it’s going to make a good theme.
And so the Image to Colours pulls out five colors from that theme. Then, I’ll use the color hex codes picker (Just Color Picker) to get the hex codes and then copy them.
I like this red from the lighthouse, so we can get that one as well. I also like the sand color from the promenade, so we can add that one to the list too. There’s a white from the waves that I also think will look good.
We can take all that. Then, I can take the blue base color that I like and from the sliders (HSV and RGB), we can take and create additional colors from that where we can change the saturation, etc. And we now have this dark blue-green color.
We can easily create a really nice color theme right away from this.
It has some built-in tools that are really handy. If you go to Tools and Color Wheels, it'll show you, for a given color, what the complementary colors are and what the triad colors are.
You can choose more colors from here.
Then, you can go to the Tools again and there’s a function called Text.
This will show us different combinations of background and text colors. Clearly, the example color combination here doesn't work at all.
Let’s take our text and change that to our promenade sand color and now it looks pretty good.
We can also change the background if we want to. We can just play around with the different colors in our theme to see if we need some additional complementary colors.
Then, copy and paste this into a JSON file. We just need to get rid of the spaces and add quotes and pound signs. This is an example of what it should look like.
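For reference, a minimal Power BI theme file has this shape. The theme name and hex values below are placeholders, not the actual colors picked in this example:

```json
{
  "name": "Custom Colour Theme",
  "dataColors": [ "#1F4E5F", "#C34A36", "#E8D5B7", "#FFFFFF", "#2E5E4E" ]
}
```

Power BI reads the `dataColors` array in order when assigning colors to visuals.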
After that is saved, we can then pull it into Power BI.
Let me show you the final thing I do with this. This is something that I commonly do with my Power BI Challenge entries and reports. It’s modifying icons, particularly for navigation.
I’m using a service called Flaticon that has millions of really high-quality icons.
I’ve saved a bunch in a mini collection. These are ones that are frequently used. Let’s take this icon as an example. I use this as my standard for clear filters.
Let’s say we want to change that color from that greyish-black into something that matches the color theme that we just created. We’ll pop up the color hex codes picker, and in Flaticon, we can go to Edit Icon.
In the upper left corner, we can choose the color that we want to change. Then, we choose the color that we want from our color theme in the Just Color Picker. Copy the values of that color.
Paste the values or the hex code into the “new color” area. Format it correctly, and then we have the desired color in our icon.
Those are just three easy ways to use this color hex codes picker. It’s a great tool that you can use with your Power BI reports. I’m sure you’ll come up with many more ways to use this tool.
I hope this is helpful for you. Check out the links below for more relevant content.
Recently, I built and showed a Budgeting Analysis Dashboard in one of Enterprise DNA's workshops. One feature of that dashboard is the Cumulative Budget view. The webinar is linked here. You may watch the full video of this tutorial at the bottom of this blog.
Product Budgeting Analysis Dashboard
The dashboard itself is dynamic so I can change the time frame and select the products I want to track. This makes exploring the data extremely efficient if you’re comparing it against a benchmark.
Dynamic selection of time frame and products.
There was a seasonality aspect to my budget data and I needed to display it cumulatively. The visualization I created compares the BUDGET against SALES and SALES LAST YEAR. The dark blue line represents BUDGET and gives a good indication of how sales are performing against it.
In my view, using cumulative totals is the best way to evaluate trends. How your actual results compare versus your budget is ultimately what we want to look at.
In this tutorial, we will discuss setting up cumulative totals in detail, covering the formula and technique I used for this dashboard.
First, we need to go to another page to set up the scenario and data table. This makes it easier to see what’s going on with the data itself. We then create a data table with DATE, the TOTAL SALES from the Key Measures and the BUDGET ALLOCATION from the Budget Measures.
The budget is set to allocate for every single day because the data context is by DATE. At the moment, the budget is not cumulative. We are going to use DAX formulas to make it so.
The formula looks complicated, but if you work out how it's set up, it will make sense. It uses variables; I will link a tutorial that covers the formula in detail.
Budget Allocation formula.
In your data, the budget can come in different granularities. It can be monthly like in the example above, it can also be yearly or weekly – this depends on how you define your data in the beginning.
Power BI allocates the budget based on how you set up your formulas.
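As an illustration of a monthly-to-daily allocation, a formula along these lines spreads each month's budget evenly across its days. The Budgets table and its columns here are assumptions, and the actual measure from the dashboard may differ:

```dax
Budget Allocation =
VAR DaysInContext = COUNTROWS ( Dates )
-- Total days in the month(s) covered by the current context
VAR DaysInMonth =
    CALCULATE (
        COUNTROWS ( Dates ),
        ALL ( Dates ),
        VALUES ( Dates[Month & Year] )
    )
-- Budget for the month(s) in context; TREATAS bridges the disconnected Budgets table
VAR MonthlyBudget =
    CALCULATE (
        SUM ( Budgets[Amount] ),
        TREATAS ( VALUES ( Dates[Month & Year] ), Budgets[Month & Year] )
    )
RETURN
    DIVIDE ( DaysInContext, DaysInMonth ) * MonthlyBudget
```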
The Product Budgets table is even more complex because, aside from the Amount, it also includes the Product ID.
We can switch the DATE field into MONTH & YEAR instead and still get the correct breakdown because our formula is dynamically set up.
Using Date to group data.
Using Month & Year as the date field.
We’ll use bar charts to visualize this data and compare the daily performance to our budget allocation. This already gives us a good insight in itself – however, this is still not cumulative.
Next, we duplicate this chart and turn the duplicate into a table to see the actual values.
We've talked about Cumulative Sales many times before; it follows the standard cumulative total pattern.
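That standard pattern, sketched here with generic table and measure names, looks like this:

```dax
Cumulative Sales =
CALCULATE (
    [Total Sales],
    -- Accumulate all dates up to the last date visible in the current context
    FILTER ( ALLSELECTED ( Dates ), Dates[Date] <= MAX ( Dates[Date] ) )
)
```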
The Cumulative Budgets has a slightly different approach because we need to use complex DAX formulas.
The big difference when calculating Cumulative Budgets is that we can’t use the Budget Allocation by itself. It needs to pass through several DAX formulas to refine it.
Notice that on the right-hand side of the SUMX formula we have the Budgets column. What's interesting here is that we declared Budgets inside the SUMMARIZE function; DAX can immediately reference a column you created virtually. You'll see it's similar to our other cumulative formulas, except for the SUMX portion.
To review, we SUMMARIZE the Budget Allocation at the same time, creating the Budgets variable. We then use SUMX on this Budgets variable to create the virtual table where we get the cumulative totals.
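The resulting measure, sketched here with assumed names, combines SUMMARIZE, SUMX, and the same cumulative filter pattern:

```dax
Cumulative Budgets =
CALCULATE (
    SUMX (
        -- Virtual table: one row per month, with the Budgets column created on the fly
        SUMMARIZE ( Dates, Dates[Month & Year], "Budgets", [Budget Allocation] ),
        [Budgets]
    ),
    -- Same cumulative filter pattern as Cumulative Sales
    FILTER ( ALLSELECTED ( Dates ), Dates[Date] <= MAX ( Dates[Date] ) )
)
```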
Now we add the Cumulative Budgets column to the table and we see that it adds the budgets cumulatively on all dates. This is a great way to represent seasonality in your data.
We then remove the columns we don’t need and change the table into a graph. This represents the data effectively in a cumulative way and shows the deviation better.
From the visualization perspective, you will identify trends better by using different elements together. I went through many other samples in the Advanced Budgeting Session. I’ll put a link below to the replay which is up on Youtube as well.
If you want to play around with this sample file, it is up on the Showcase page.
All the best
Power BI – Uses in Finance
A Microsoft-owned business analytics service that offers a wide range of data visualization and data warehousing services
Published April 27, 2023
Updated July 7, 2023
What is Power BI?
The tagline for Power BI, “Bring Your Data to Life,” very clearly demonstrates the purpose of the Microsoft-owned business analytics tool. Power BI is an assortment of several data analytics-based services and systems that primarily focuses on visualizing business data and making it more interactive for organizations.
Created by Microsoft, Power BI is a cloud-based service provider offering data visualization and data warehouse services aimed at making data more interactive for the user. The application includes a wide range of data analytics services such as data preparation, custom visualization, data discovery, data warehousing, data reporting, interactive data sharing, data organization, and many others.
Owing to its substantial popularity, Power BI is available across different usage platforms. Power BI Desktop is used to access the analytics tool using a Windows-based desktop. It can also be accessed online using Power BI online SaaS, which is the application's online software service. In addition, it can also be accessed using Android and iOS devices using the Power BI applications developed for the purpose.
Key Highlights
Power BI is an assortment of several data analytics-based services and systems that primarily focuses on visualizing business data and making it more interactive for organizations.
Power BI is a cloud-based service provider offering data visualization and data warehouse services. The services include data preparation, custom visualization, data discovery, data warehousing, data reporting, interactive data sharing, data organization, and many others.
Owing to its substantial popularity, Power BI is available across several different platforms.
How is Power BI used in finance?
1. Easy consolidation of large to very large data sets
Power BI is a very efficient tool for business data organization. There are generally different kinds of limitations regarding the volume, nature, and complexity of data, and its reporting and organization across various data management mediums. However, this application offers exceptional financial data management services, with absolutely no limitations on the reporting of financial data, no matter how large the company is or how complex its data.
2. Excellent projections mechanisms
Power BI offers exceptional data projection systems. Financial projections are an integral part of a business’ operations, and several vital decisions are drawn essentially from financial projections. Hence, it is an integral part of any organization’s data management processes.
The application offers a data projections function called "what-if parameters" that creates interactive data projections and is very efficient for comparison. It is a vital tool for drawing up projection statements from any number and kind of assumptions.
3. Data trends and patterns
Power BI comes with a built-in time intelligence feature. It also provides the ability to arrange data in accordance with different data dimensions and parameters. Using such features, it is very easy to spot data trends or data patterns over several years or across the market competition. They are very useful in drawing important conclusions about business operations and making important financial decisions about profitability, budgeting, business expenses, etc.
4. Quick Insights feature
Power BI runs on powerful data analysis algorithms, which fuel the efficient working of the Quick Insights feature of the software. The tool provides data implications and draws various facts and conclusions from the dataset provided by the user. It is very useful for a financial planner who wants insights or conclusions drawn from the financial statements for the year, and so on.
5. Power View
Power View is an essential feature of Power BI that allows generating interactive charts, graphs, and data maps. It helps generate visually interactive charts and graphs from the financial dataset and consolidate that information to draw conclusions and make important financial decisions.
6. Collaboration
Power BI is a collaborative platform, i.e., it is easily shareable and accessible across different users, while, at the same time, offering high security and protective measures. Hence, the finance department of the business or financial management teams of projects can collaborate. They can work together in sharing financial datasets, publishing financial reports and dashboards, exchanging vital data conclusions, and working collaboratively and efficiently.
Conclusion
The aforementioned uses of Power BI serve the purpose of financial data management and analysis exceptionally well. However, Power BI is not limited to just such uses; it offers a wide array of data management services. It is considered an integral financial management tool for large businesses and corporations, where financial data is voluminous and highly complex.
To learn more about the dashboarding tool, check out CFI's Power BI Fundamentals course! Learn more about telling meaningful stories with data, organizing and manipulating large and complex datasets, and creating powerful dashboards.