Simulated Streaming Data -> Power BI Embedded

This post is about something new I tried last week. The goal was to create a simulated streaming data source, feed it into Power BI as a streaming dataset, create a report from the streaming dataset, and then embed it into a web application. With proper directions provided by my teammates, I finished the implementation end to end within 1.5 hours. I was super impressed by how awesome it is and how easy it is to implement, so I want to share those directions with you.

Main Step 1: Create simulated streaming data and push it to Power BI as a streaming dataset

  1. The basic steps include:
    1. Set up an Azure IoT Hub to capture the data
    2. Use the IoT simulator (https://www.azureiotsolutions.com/Accelerators#dashboard) to simulate streaming data
    3. Create a Power BI app workspace on PowerBI.com
    4. Set up a Stream Analytics job to route the data coming from the IoT hub to a Power BI streaming dataset

  2. There is a free tutorial course from edX that covers the above steps in detail. You do not have to finish the whole tutorial; you can just go through the sections with green check marks, as shown below.

https://www.edx.org/course/iot-business-intelligence-for-iot-solutions

                 edx tutorial for iot

Main Step 2: Create a Power BI report using the streaming dataset

  1. Open Power BI Desktop
  2. Click on Get Data
  3. Choose ‘Power BI dataset’
  4. Choose the Power BI streaming dataset you just created from Main Step 1

powerbi datasets

  5. Once you are connected to the streaming dataset, you can use it to create your graphs and charts.
  6. Lastly, click ‘Publish’ to save your PBIX file back to PowerBI.com.

Main Step 3: Embed your Power BI report into your application

  1. If you need to embed your report into your application, please follow this tutorial https://docs.microsoft.com/en-us/power-bi/developer/embed-sample-for-customers

This tutorial introduces the following important topics, which we have not covered in the previous steps:

  1. How to register an application in Azure Active Directory (Azure AD), which allows your application to interact with Power BI.
  2. It provides a sample application package from GitHub. You just need to update the web.config file of the sample package with the application ID you just created, the workspace ID where your report is located, the report ID of your report, and your Power BI credentials.

Lastly, to set up automatic refresh of your streaming dataset in your application, please try the following blog:

http://radacad.com/integrate-power-bi-into-your-application-part-4-refresh-data-set

 

Thank you,

Your friend, Annie

 

My first week at Microsoft

On Monday, March 3rd, 2019, I started my journey at my dream company, Microsoft. I would love to share the experiences I have acquired over the past few days.

How did I get in

You may be wondering why I’ve decided to join Microsoft and how I got here, so let me tell you my story leading up to today. After finishing my master’s degree in Information Systems from Indiana University, I joined Anthem (one of the largest health insurance companies in the industry).

My team back then relied heavily on Microsoft products and worked closely with its supporting team to build its world class data infrastructure to support Anthem’s customer operation business. From that experience, I was not only fascinated by its products and services but also its community, filled with passionate and supportive individuals who love to learn and share.

Feeding off of this passion, I began joining local PASS SQL groups, attended and presented at SQL Saturdays, got my Microsoft certification in SQL Server 2016, and started this very tech blog you are reading. After a few years in the workplace, I started to sense that cloud, big data, and AI were and are the future of our world, and that the space provided me with a unique set of challenges to solve problems and share my knowledge with others, both of which are things I love.

In order to broaden my cloud technology skills and get more involved with direct customer interactions so that I could help them with my skills, I accepted an offer from MCAConnect, a Microsoft Gold Partner consulting firm, as a senior consultant. From that experience, I got the opportunity to leverage Microsoft cloud technologies to help customers of different sizes, working directly with executive- or management-level clients to build solutions for their business needs. I took great pride in being able to leverage my knowledge of Microsoft technologies to help impact, change, and improve the business processes of various clients.

Yet despite the opportunities I was given, I still casually found myself daydreaming of one day pacing around the Seattle office with a fresh, hot-off-the-printer Microsoft badge, swimming in one of the richest pools of knowledge in the world. It wasn’t until the end of 2018 that my daydreaming became a reality. I had attended SQL Saturday in Washington DC, and after the event, the Microsoft One Commercial Partner (OCP) group reached out to me about a few openings. I reached out to my Microsoft friends, my mentor James and the hiring manager Justin, to better understand the job and Microsoft’s working culture. I found that the Cloud Solution Architect for Big Data and AI role under Justin’s team was the best fit for me and aligned well with my personal goals, and shortly after our discussion, I applied for the position.

After 1 general interview and 2 technical interviews, I got the offer.

To be honest, I felt like I had bombed my technical interviews, as I wasn’t able to answer a ton of the questions asked. It wasn’t until later, after reading Microsoft CEO Satya Nadella’s book ‘Hit Refresh’, that I realized why it actually made sense that I got the offer. Microsoft doesn’t hire know-it-alls but learn-it-alls, welcoming people with a growth mindset. The moment I got the Microsoft offer, I admit I was ecstatic, more so than I had ever been in my professional career, as it was a truly elusive dream-come-true moment. Of course, after waiting through a lengthy hiring process, done through an internal career system which tracks progress and collects the necessary documents, I found myself talking with my HR manager about onboarding, a complicated process due to the size of the company. Along the way, my manager and the HR manager worked diligently and patiently, communicating with me regularly throughout the one-month process, reducing my anxiety levels significantly.

Redmond, Seattle and Microsoft Campus

My New Employee Orientation was set to be at Microsoft’s headquarters in Redmond, WA. My flight, hotel, and rental car were pre-booked by the company based on my schedule. Last Sunday evening, I landed at the Seattle airport, and from the rental car office to the hotel, I got all kinds of Welcome to Microsoft signs and greetings from the service companies, an experience comparable to a homecoming. The hotel I stayed at was the Marriott in Redmond Town Center, a lovely shopping center with all kinds of restaurants, fitness studios, and shops.

The next morning, I drove to campus to attend New Employee Orientation (NEO). The building was called Building 92, and at first, I thought it must have been a random number, as I could not imagine any company having more than 90 buildings on its headquarters campus alone. It was, however, true, and to my astonishment, the campus was still expanding. It was so large that transportation between buildings often relied on shuttle cars. It had multiple walking and running trails, football and basketball courts, and on-campus shopping centers, like those in Disneyland, and for every 3-4 buildings there was a food court, adding up to over 30 food courts on campus providing all kinds of cuisines.

The campus was covered with trees, creating the feeling of working in a park. Later, my manager showed me and one of my teammates the on-campus treehouses designed by someone famous, though I couldn’t remember the name, and those treehouses were accessible to all employees by reservation. Over the next few days, I visited multiple buildings. The offices were illuminated with bright natural light and were separated into different rooms, enhancing my mood as well as providing a feeling of privacy. It fit the culture of Microsoft: formal but fun.

Also, very importantly, every floor and corner was stocked with a variety of drinks and office supplies, free for the taking.

New Employee Orientation (NEO)

NEO happened in Building 92, the visitor center. It started with new employees gathering in the product showcase room, where we had access to coffee, Xbox, VR, and networking opportunities. Some posed for a picture in front of a photo featuring the first team of Microsoft employees, all of different ages, genders, and cultural backgrounds.

At around 8:00 a.m., we were led through different stations to process our I-9, get our photos taken for employee badges, and have breakfast before entering the conference room where the NEO meeting was hosted. In the room, we were randomly assigned to different tables, on purpose, so that we would interact with individuals from different areas and gain exposure to the different groups in the company. The hosts of the event were senior leaders and employees, all of whom were volunteers. It started with simple greetings and storytelling exercises at each table. Then one of the hosts delved into the company culture and his experiences at Microsoft, followed by the HR manager introducing the benefits which Microsoft provides. During this session, nearly everyone around me, including myself, whispered amongst ourselves, doubting what we heard and checking our sanity with each other. Are all of those really free and reimbursable? I have to say that although the general benefits, including 401(k), health insurance, family support, Health Savings Accounts, and the employee stock purchase plan, were amazing, the ones that caught my attention were the fitness benefit, tuition support, and donation matching, as my health and growth are the two stabilizing factors of my career.

Following the benefits introduction, there were more stories and program introductions from different speakers, all of which reinforced my excitement about finally being here. During lunch, hiring managers came to the conference room to greet their new hires before taking them to lunch. My manager Justin came to welcome me and my new teammate Alex, and proceeded to take us to the café building nearby and provide us with our new PCs. After lunch, we were sent back to the NEO building to finish up our security settings and the benefits vendor tour.

NEO officially ended at around 2 pm. Justin soon after gave us a brief introduction meeting and provided some tips on what to expect for the next couple of days and weeks. He also told us there was a TechFest happening on campus that week, so in lieu of a short Q&A meeting with him each day, we were welcome to attend the event or leave earlier to check out Seattle and its surroundings.

TechFest and Seattle area

TechFests, in conjunction with other events, happen every now and then and offer various keynote sessions and presentations. Here, you can learn about the latest research happening in the tech world as well as inside different teams at Microsoft. This week’s TechFest was about AI and machine learning. I was so impressed by all the new inventions and studies out there in our field, and felt so fortunate to be so close to the inventors and researchers, able to talk directly with them about their ideas and ask questions. It was quite an experience to be surrounded by such creative and curious individuals, all so very accomplished and intelligent, all leaders of the industry.

After each day, I had a chance to visit the Seattle area, a place I had never been before. It was such a beautiful city with lakes, ocean, and mountains, home to multiple large corporations such as Microsoft, Amazon, and Starbucks. There were so many hiking, running, and biking trails and various outdoor activities, a paradise for the adventurous soul. It was also one of the more foodie cities I have seen, as all of those companies and universities embrace diversity, offering a vast choice of ethnic cuisines from around the world.

My experience during my first week at Microsoft truly reflects its mission statement, “Empower every person and every organization on the planet to achieve more,” and its culture of valuing a growth mindset. Microsoft is a place that an aspiring technology professional like myself can use as a platform to amplify my skill set and impact the world.

I am ready for this new chapter of my career.

Thanks for reading,

Your friend, Annie

You may need a table variable for IRR and NPV for DAX

I finished my Fundamental Finance Accounting course last week. I learned two awesome functions that help decide which projects to undertake. The first is NPV (Net Present Value) and the second is IRR (Internal Rate of Return). I found these two so powerful for deciding which investment to choose that I wanted to implement them in DAX. Fortunately, DAX has two functions, XNPV and XIRR, which I can use. XNPV and XIRR also exist in Excel and can serve as replacements for NPV and IRR. They have slightly different calculation details, but the purpose is the same. You can read http://www.differencebetween.net/business/finance-business-2/difference-between-npv-and-xnpv/ and https://corporatefinanceinstitute.com/resources/knowledge/modeling/xirr-vs-irr-excel-financial-model/ for more details.

The Excel XNPV function is a financial function that calculates the net present value (NPV) of an investment using a discount rate and a series of cash flows that occur at irregular intervals. The syntax is =XNPV(rate, values, dates).

The Excel XIRR (Internal Rate of Return) is the discount rate that sets the net present value (XNPV) of all future cash flows of an investment to zero. If the NPV of an investment is zero, it doesn’t mean it’s a good or bad investment; it just means you will earn the IRR (the discount rate) as your rate of return. The syntax is =XIRR(values, dates, guess).

The challenge in DAX, I found, is that we may not have a date field. For example, we may just have an initial investment and an estimate that assumes an amount paid every year or every month in the future. Something like this:

Assumption - IRR NPV
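The original screenshot of the assumption table isn’t included here, so here is a minimal sketch of what the ‘Considered Project’ table could look like if entered as a small calculated table (in practice you might use ‘Enter data’ or load it from Excel instead). The column names, including their spelling, follow the measures below; the example values are hypothetical, and [Investment Years] and [Interest Rate] are assumed to live in separate measures or what-if parameters:

Considered Project =
SELECTCOLUMNS (
    // one example project: a 100,000 outflow today, 15,000 per year, and a 60,000 exit value
    // the initial investment is negative because XIRR needs at least one outflow and one inflow
    { ( -100000, 15000, 60000 ) },
    "Initial Inventestment", [Value1],
    "Annual Cash flow", [Value2],
    "Exit Value", [Value3]
)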

So in DAX, we can create a table variable inside the measure to leverage XIRR and XNPV.

XIRR =
VAR cashflowtable =
    UNION (
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ), YEAR ( NOW () ) ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Initial Inventestment] )
                )
        ),
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ) + 1, YEAR ( NOW () ) + [Investment Years] ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Annual Cash flow] )
                )
        ),
        GENERATE (
            GENERATESERIES (
                YEAR ( NOW () ) + [Investment Years],
                YEAR ( NOW () ) + [Investment Years]
            ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Exit Value] )
                )
        )
    )
RETURN
    XIRR ( cashflowtable, [Values], [date] )
XNPV =
VAR cashflowtable =
    UNION (
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ), YEAR ( NOW () ) ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Initial Inventestment] )
                )
        ),
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ) + 1, YEAR ( NOW () ) + [Investment Years] ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Annual Cash flow] )
                )
        ),
        GENERATE (
            GENERATESERIES (
                YEAR ( NOW () ) + [Investment Years],
                YEAR ( NOW () ) + [Investment Years]
            ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Exit Value] )
                )
        )
    )
RETURN
    XNPV ( cashflowtable, [Values], [date], [Interest Rate] )

Hopefully, you find this helpful.

Thanks,

your friend, Annie

Migrating Local Power Query to Power BI Dataflows and Linking Them to an Azure Data Lake Storage Account

Among Microsoft’s latest Power BI updates, the one that excites me most is that we can link Power BI dataflows to an Azure Data Lake Storage account.

Benefits are:

  1. When you have a PBIX file in import mode that grows beyond 1 GB and you are not a Premium member, you cannot publish it to PowerBI.com. However, with dataflows in Power BI, you can store up to 10 GB of data.
  2. Your workspace members can reuse the same dataflow as the source for their own Power BI files.
  3. When linked to Azure Data Lake Storage Gen2, you can use the data from Power BI dataflows as a data source across your Azure applications.

To try this, you need the following:

You have a Power BI Pro license or above.

You are an admin of your Azure portal (or you are a good friend of your admin ^_^).

There are a few things I found out when I did my experiment that I think you might face too:

  1. You can use a blank query to copy the M code from your Power Queries.
  2. If you have a table function to transfer and you don’t have a Premium account, you can still do so by selecting ‘Do not load’.
  3. To configure the Azure Data Lake Storage Gen2 connection to Power BI, follow this document from Microsoft: https://docs.microsoft.com/en-us/power-bi/service-dataflows-connect-azure-data-lake-storage-gen2
  4. Even when you have configured everything correctly according to the above Microsoft document, you may still have issues like I did. Please follow this thread to see the latest updates from the Power BI community discussion: https://community.powerbi.com/t5/Service/Power-BI-Dataflow-to-Azure-Data-Lake-Gen2-Setup/td-p/583564

Latest update: the Microsoft team has fixed the above issue. I have now tested it and it works fine.

Thanks,

Your friend, Annie

How to calculate PMT and CUMPRINC in DAX

PMT is an Excel function that calculates the periodic payment for a loan.

In Excel, the syntax is PMT(rate, nper, pv, [fv], [type])

  • Rate    Required. The interest rate for the loan.
  • Nper    Required. The total number of payments for the loan.
  • Pv    Required. The present value, or the total amount that a series of future payments is worth now; also known as the principal.

Yet in DAX, there is no built-in PMT function yet. To calculate it, we need to reapply its underlying math, the standard annuity payment formula.

So, you can use

DAXVersionPMT = ( ( Pv * Rate ) / ( 1 - ( 1 + Rate ) ^ ( -1 * Nper ) ) ) * ( -1 )
// Reminder: if you are calculating a monthly payment but the rate is an annual rate, convert it to a monthly rate by dividing by 12
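As a sketch of how this could look as an actual measure (the [PV], [Rate], and [Nper] measure names follow the CUMPRINC example below; the divide-by-12 is only an assumption that [Rate] is annual and the payments are monthly):

DAXVersionPMT =
VAR p = [PV]
VAR r = [Rate] / 12    // periodic (monthly) rate; drop the / 12 if [Rate] is already the periodic rate
VAR n = [Nper]         // total number of payment periods
RETURN
    ( ( p * r ) / ( 1 - ( 1 + r ) ^ ( -1 * n ) ) ) * ( -1 )
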
CUMPRINC is the accumulated principal paid on a loan between two periods.

In Excel, the syntax is CUMPRINC(rate, nper, pv, start_period, end_period, type)

The CUMPRINC function syntax has the following arguments:

  • Rate    Required. The interest rate.
  • Nper    Required. The total number of payment periods.
  • Pv    Required. The present value.
  • Start_period    Required. The first period in the calculation. Payment periods are numbered beginning with 1.
  • End_period    Required. The last period in the calculation.
  • Type    Required. The timing of the payment.

In DAX right now, there is no built-in function for CUMPRINC yet, so you can use the following DAX:
Accumulated Principal =
VAR p = [PV]
VAR i = [Rate]
VAR n = [Nper]
VAR b = [Start_period]
VAR e = [End_period]
VAR m = [DAXVersionPMT] * ( -1 )
VAR q = 12    // if the rate is an annual rate and your PMT is a monthly payment, convert to a monthly rate by dividing by 12; if the rate is already a monthly rate, use q = 1
RETURN
    ( ( p - m * q / i ) * ( 1 + ( i / q ) ) ^ ( e ) + m * q / i )
        - ( ( p - m * q / i ) * ( 1 + ( i / q ) ) ^ ( b - 1 ) + m * q / i )
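For reference, the RETURN line is the standard closed-form remaining-balance identity for an annuity loan, written with the same symbols as the variables above and evaluated at the end period and just before the start period:

B_k = \left( p - \frac{mq}{i} \right) \left( 1 + \frac{i}{q} \right)^{k} + \frac{mq}{i}, \qquad \text{Accumulated Principal}(b, e) = B_e - B_{b-1}

where B_k is the remaining loan balance after k payment periods (so B_0 = p), and the difference between the two balances gives the cumulative principal over the window, with the same sign behavior as Excel’s CUMPRINC.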

Thanks,
Your friend,Annie

An easy way to compare all tables’ DDL between two SQL environments

To continue my story about the data warehouse migration from the UAT environment to the PROD environment:

A DDL difference in the same source loading table between the two environments will cause my data warehouse load to fail, so I needed an easy way to compare the two environments for all my source tables. There are a lot of good tools with free trials out there to choose from. However, my client has some regulations about installing third-party tools, so I needed to create a query to do it instead.

Inspired by an answer from this post, which compares one table, I wrote the following query, which loops through all of my targeted source tables.

DECLARE @Table VARCHAR(100);
DECLARE @i int = 1;
-- @Server1/@DB1 and @Server2/@DB2 are just labels for the output rows; set them to match the linked server and database names used below
DECLARE @Server1 VARCHAR(100) = 'ServerName01', @DB1 VARCHAR(100) = 'DatabaseNameO1';
DECLARE @Server2 VARCHAR(100) = 'ServerName02', @DB2 VARCHAR(100) = 'DatabaseNameO2';
While @i <= (select max([RANKColumn]) from [dbo].[EvaluationTableList]) -- this is the list table which contains all the tables you want to validate
Begin
SET @Table = (select [TABLENAME] from [dbo].[EvaluationTableList] where [RANKColumn] = @i)
insert into [dbo].[EnviromentObjectDifference] -- this is the output table
SELECT Table1.ServerName,
Table1.DBName,
Table1.SchemaName,
Table1.TableName,
Table1.ColumnName,
Table1.name DataType,
Table1.Length,
Table1.Precision,
Table1.Scale,
Table1.Is_Identity,
Table1.Is_Nullable,
Table2.ServerName as T2ServerName,
Table2.DBName as T2DBName,
Table2.SchemaName as T2SchemaName,
Table2.TableName as T2TableName,
Table2.ColumnName as T2ColumnName,
Table2.name as T2DataType,
Table2.Length as T2Length,
Table2.Precision as T2Precision,
Table2.Scale as T2Scale,
Table2.Is_Identity as T2Is_Identity,
Table2.Is_Nullable as T2Is_Nullable
FROM
(SELECT @Server1 ServerName,
@DB1 DbName,
SCHEMA_NAME(t.schema_id) SchemaName,
t.Name TableName,
c.Name ColumnName,
st.Name,
c.Max_Length Length,
c.Precision,
c.Scale,
c.Is_Identity,
c.Is_Nullable
FROM [ServerName01].[DatabaseNameO1].sys.tables t
INNER JOIN [ServerName01].[DatabaseNameO1].sys.columns c ON t.Object_ID = c.Object_ID
INNER JOIN sys.types st ON St.system_type_id = c.System_Type_id AND st.user_type_id = c.user_type_id
WHERE t.Name =@Table) Table1
FULL OUTER JOIN
(SELECT @Server2 ServerName,
@DB2 DbName,
SCHEMA_NAME(t.schema_id) SchemaName,
t.name TableName,
c.name ColumnName,
st.Name,
c.max_length Length,
c.Precision,
c.Scale,
c.Is_Identity,
c.Is_Nullable
FROM [ServerName02].[DatabaseNameO2].sys.tables t
INNER JOIN [ServerName02].[DatabaseNameO2].sys.columns c
ON t.Object_ID = c.Object_ID
INNER JOIN sys.types st
ON St.system_type_id = c.System_Type_id
AND st.user_type_id = c.user_type_id
WHERE t.Name = @Table) Table2
ON Table1.ColumnName = Table2.ColumnName
where Table1.ColumnName is null or Table2.ColumnName is null;
SET @i=@i+1
END

Hopefully, you find this helpful.

Thanks,

Your friend, Annie

How to use T-SQL to validate all SSIS packages at once

Recently, I needed to do a data warehouse migration for a client. Since there might be some differences between the Dev environment source databases and the Prod environment source databases, the migrated SSIS packages that build the data warehouse might fail because of those changes. So the challenge was: how can I validate all my DW packages (100+) at once?

There are a good number of posts out there about validating one package at a time using the Management Studio GUI (just right-click the package under the SSIS Catalog).

validate SSIS package

Or you can use the T-SQL stored procedure [SSISDB].[catalog].[validate_package]. Here is the explanation of this stored procedure from Microsoft.

To run the validation all at once, we can use a loop to execute [SSISDB].[catalog].[validate_package] for each package, one by one.

Here is the code I created and would love to share.

DECLARE @i INT= 1;
DECLARE @validation_id BIGINT;
DECLARE @packageName NVARCHAR(250);
WHILE @i <= (SELECT MAX(package_id)
FROM SSISDB.catalog.packages)
BEGIN
SET @packageName = (SELECT concat(NAME, '.dtsx')
FROM SSISDB.catalog.packages
WHERE package_ID = @i);
EXEC [SSISDB].[catalog].[validate_package]
@package_name = @packageName,
@folder_name = N'(the folder name which the packages are in)',
@project_name = N'(the project name which the packages are in)',
@use32bitruntime = False,
@environment_scope = N'S',
@reference_id = 2,   /* this is the reference id of the environment the package may use; if there is no environment reference, you can assign NULL */
@validation_id = @validation_id OUTPUT;
SET @i = @i + 1;
END;
After you run the above statements, you can use the following query to check the validation results.
SELECT  object_name,
                CASE status
                    WHEN 4
                    THEN 'Fail'
                    WHEN 7
                    THEN 'Success'
                END AS 'Status'
FROM SSISDB.catalog.validations;
Thanks for viewing, and I would love to hear any feedback.
Your friend, Annie

Create Dynamics CRM Online – User Activity PowerBI Report

A couple of days ago, a client of mine asked me to build a Dynamics CRM Online user activity dashboard. In the Dynamics CRM on-premises version, I know this information resides in the audit table. However, with Dynamics CRM Online, I am not able to access the backend database directly. Fortunately, there are a few Power BI content packs for Dynamics CRM Online out there in the marketplace, so I was able to read through a PBIX file and find out how Power BI connects to the Dynamics CRM Online data.

Power BI uses OData as the data source to connect to the Dynamics CRM Online web services, which gives you access to almost all of the data (there are still some important fields and tables that only exist in the on-premises version, which makes it challenging to build reports effectively). You just need to change the highlighted piece below to your organization’s CRM web name.

CRM Online Audit Report snapshots

 

Odata Tables available

I did find the ‘Audit’ table available. However, there are a few challenges in interpreting the data. I am listing the challenges below, along with the steps I took to handle them.

  1. Initially, the ‘Audit’ table is blank.
    • This is because in CRM Online you need to enable the Audit Tracking setting. You can follow this blog to enable it.
  2. Every field is in its code or ID format, and there is no mapping table in the OData connection to map between field IDs and field values.
    • I checked the Microsoft-provided content pack and found out those mapping tables are hard-coded. So I had to search the Microsoft documentation for the CRM Audit entity to get the hard-coded mappings for the following fields:
      • Operation – The action that causes the audit – it will be create, delete, or update
      • Action – Actions the user can perform that cause a change
      • Audit Table fields
  3. You can expand the userID field to get the information (such as full name, login account, address, etc.) of the user who caused a change. This, however, only provides information for users who make changes like ‘create’, ‘update’, and ‘delete’. For the ‘access’ action, this userid only shows ‘SYSTEM’ as the user.
    • To get the name of the user who accessed CRM, we need to link the ‘_objectid_value’ field from the Audit table to the ‘UserId’ field from the systemusers table.
  4. The ‘ObjectTypeCode‘ field is missing. This key field is missing from the table in the OData connection data pull, so we are not able to tell which entities the changes were made on.
    • Unfortunately, I have not found a solution for this unless the CRM team exposes this field for the CRM Audit entity.
    • If this field becomes available in the future, we can use it in combination with the ‘Attributemask’ field to get the column names of the objects that have been changed, so that we can build further analysis reports.

For now, with the information we can get, we can start to build a dashboard of user login activities.

Final Result.PNG

Thanks.

PowerBI – two ways to dynamically change measures in the same visual

I love that you can always find multiple ways to solve one problem. Inspired by a YouTube video from Guy in a Cube and blog reading from SQLBI.com, I found there are two ways to dynamically change measures in the same visual. The first way leans more towards front-end report developers and uses bookmarks, and the second way leans more towards data modelers and uses a technique called disconnected tables.

Bookmarks method:

Bookmarks

The basic steps are:

  1. Enable Bookmarks under File -> ‘Options and Settings’ -> Options -> Preview Features -> ‘Bookmarks’.
  2. Create one visual, for example a bar chart, with the first measure, for example Revenue by Sales Territory.
  3. Copy and paste the visual two times, and in the copied visuals swap in the other measures you want to show, for example Cost by Sales Territory.
  4. Overlay the three visuals on top of each other.
  5. Go to ‘View’ on the top tool panel and make sure ‘Bookmarks Pane’ and ‘Selection Pane’ are checked.
  6. Add bookmarks, and for each bookmark select which visuals should be visible.
  7. Bookmark setting
  8. Add the images/shapes that represent the measures you want to switch between and link each one to the corresponding bookmark.
  9. Image link

Disconnected Table method:

This method leans more towards Power BI modelers. Basically, the idea is to have a field in an independent table (with no relationship to other tables) as a slicer holding your measure choices, and then create a measure using the SELECTEDVALUE function that dynamically switches which underlying measure it refers to based on the choice made on the slicer.

Step 1. Create a new table called ‘Measurechoice’ like the image below.

Disconnected table
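The screenshot isn’t reproduced here, so as a minimal sketch (the measure names match the SWITCH below; the exact column names and layout are an assumption), the table could be entered as a calculated table like this:

Measurechoice =
DATATABLE (
    "MeasureChoice", STRING,
    "Index", INTEGER,
    {
        { "Total Revenue", 1 },
        { "Total Cost", 2 },
        { "Total Profit", 3 }
    }
)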

Step 2. Create a measure called ‘Selected Value’

Selected Value = SWITCH ( SELECTEDVALUE ( 'Measurechoice'[Index] ), 1, [Total Revenue], 2, [Total Cost], 3, [Total Profit], BLANK () )

Step 3. Create a slicer using ‘MeasureChoice’ from the new table created.

Step 4. Create a visual using ‘Selected Value’ as the value.

Selection Result

Personally, I like the disconnected table method because I don’t need to spend time creating images and linking bookmarks. However, the bookmarks option can dynamically change the title of the visual and gives you the ability to change the layout of your bookmark views. Now it is your choice to pick whichever one fits your reporting needs.

Thanks for spending time reading this blog.
