How to rename a Microsoft Flow

Just a reminder for my future-me!

If you ever wanted to rename an existing Microsoft Flow and found no way of doing it, here is the solution:


  • Just click the name of the flow and edit it – it is as easy as that! 😉


Posted in Best Practices, Power Platform, Tips | 1 Comment

SQLKonferenz 2019 – Session on Power BI Dataflows

Next week it is that time again: the SQLKonferenz takes place once more. Again in Darmstadt, again with three days full of sessions!

This year is a special one for PASS Deutschland – together we are celebrating 15 years of PASS! A big party is planned for Wednesday evening.

I am happy that I get to present a session again this year, on one of the (for me) currently most exciting topics: Power BI Dataflows


My session takes place on Thursday (21 February) at 11:00 – I would be delighted to see you there!

It is not too late yet – register quickly and see you next week in Darmstadt!



Posted in Power Platform, PowerBI | Leave a comment

Power BI Dataflows – My Session at the Virtual Power BI Days (2019-01)

This morning (8:00 a.m. Austrian time, on a Sunday) I had the pleasure of presenting a session at the Virtual Power BI Days (2019-01 edition).

Power BI Days are an event (mainly) organized by Jan Mulkens (t) – One day of “real” conference in Belgium on Sunday and one day of virtual conference filled with webinars.

My session today focused on Power BI dataflows, the new self-service data preparation option in Power BI.

Let your data flow – Introducing Power BI dataflows

Power BI serves as a self-service BI platform with a strong focus on data preparation and interactive analysis. With the introduction of Power BI dataflows, self-service data preparation is brought to a new level. The main concepts used are:

* Usage of common and mature technologies: data is stored as entities following the Common Data Model in Azure Data Lake Storage Gen2;
* Integration: dataflows are created and managed in Power BI app workspace
* Self-Service and low-code/no-code – Power Query is used as data preparation engine
* Connectivity: dataflows will support a variety of different data sources (including cloud-based and on-premises sources)

Join this session if you would like to learn more about the basic concepts and especially see Power BI dataflows in action.

Jan will put up a recording of the session (and the slides) at

If you do not want to wait for the slides, you can download them directly from here:

2019-01-27 pbi dataflows virtual conference

Happy data flowing,



Posted in Power Platform, PowerBI | Leave a comment

24 Days of PowerPlatform – Day 24 – Wrap Up


This is it – It’s day 24 of my #24DaysPowerPlatform! 

When I had the first idea for this blog series, it was just a vision – and now, at the end of the series, I can hardly believe that I kept up and wrote all 24 posts.

It was

… more work than expected,

… more feedback than expected and

…  it was more fun than expected!


As a wrap-up, it is now up to me to summarize my PowerPlatform for beginners series:

PowerPlatform introduction

PowerPlatform examples

Power BI focused posts


Thanks for your continued visits – I hope you liked the series and I am happily waiting for your comments and feedback!

Happy PowerPlatform-ing,








Posted in 24 Days Of PowerPlatform, Power Platform | 1 Comment

24 Days of PowerPlatform – Day 23 – Flow and Cognitive Services


Hi and welcome to the almost last edition of my #24DaysPowerPlatform series!

In today’s demo I will continue the story of Excel Online and Microsoft Flow integration.

Today’s demo: Sentiment Analysis of text stored in an Excel file using Microsoft Flow

During (or better said, just before) my Power Platform sessions I encourage my attendees to fill out a short questionnaire about their knowledge of the PowerPlatform, and in addition I ask them to describe their current emotions in one sentence.


Excerpt of the PowerPlatform questionnaire (Microsoft Forms)

The demo setup looks as follows:

  • Microsoft Forms for the questionnaire
  • A Microsoft Flow that handles all the answers and a) pushes the data into a Power BI streaming dataset (and further into a Power BI dashboard) and b) writes the results into an Excel file in SharePoint Online.


The results in the Excel file look like this:


As you can see in the screenshot, there is already a column Score in the Excel file. This column gets its value from a Microsoft Flow that uses Microsoft Cognitive Services to do text sentiment analysis. Cognitive Services is a collection of multiple different services that allow your applications to “see, hear, speak, understand and interpret your user needs through natural methods of communication” (source: MS Cognitive Services homepage). There are many services available, grouped into the areas Vision, Speech, Language, Knowledge, and Search.


The Language services provide several functions to better understand text; for this demo I will use text sentiment analysis.


Flow for Sentiment Analysis

I will not cover the steps of how to create a new Flow within Excel Online (see this blog post for a step-by-step description) – but the final flow looks as follows:


  1. The trigger for this Flow is coming from Excel Online (For a selected row)
  2. The first action calls the Sentiment Analysis from Microsoft Cognitive Services (the connection and configuration of Cognitive Services is shown in a later paragraph). This action takes the text from the currently selected row and passes it to the sentiment analysis. The output of this action is a score (ranging from 0 = negative to 1 = positive).
  3. The score is written back to the Excel file in the last action.
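To make step 2 more tangible, here is a minimal Python sketch of what the Cognitive Services sentiment action does behind the scenes: it wraps the row's text into the documents payload the Text Analytics sentiment endpoint expects and reads a score between 0 (negative) and 1 (positive) out of the response. The function names and the sample response below are illustrative assumptions, not an official SDK.

```python
# Sketch of the sentiment-scoring step of the Flow. The helper names and the
# simplified sample response are assumptions for illustration only; the real
# call goes to the Text Analytics sentiment REST endpoint with your access key.

def build_sentiment_request(row_id: str, text: str, language: str = "en") -> dict:
    """Build the JSON documents payload the sentiment endpoint expects."""
    return {"documents": [{"id": row_id, "language": language, "text": text}]}

def extract_score(response: dict, row_id: str) -> float:
    """Pull the sentiment score for one document out of the response."""
    for doc in response.get("documents", []):
        if doc["id"] == row_id:
            return doc["score"]
    raise KeyError(f"no sentiment result for document {row_id}")

# A simplified response for a clearly positive sentence might look like this:
payload = build_sentiment_request("1", "I am really excited about the Power Platform!")
sample_response = {"documents": [{"id": "1", "score": 0.97}], "errors": []}
print(extract_score(sample_response, "1"))  # close to 1 = positive sentiment
```

In the Flow, the connector handles the HTTP call and authentication for you; this sketch only shows the shape of the data going in and out.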

Configuration of Cognitive Services – Text Sentiment Analysis

Cognitive Services resources are created in the Azure Portal.


What you need to use Text Analytics in your Flow are the access/usage keys. You can copy them from the Keys page of your Cognitive Services resource in the Azure Portal:


Flow Connector configuration:


Let’s see it in Action

You can start your workflow for a single line:


Or you can even start it for multiple lines:



Additional reading:


Posted in 24 Days Of PowerPlatform, Power Platform | 1 Comment

24 Days of PowerPlatform – Day 22 – What are Power BI dataflows?


In the world of data analysis and business intelligence, preparing the data you work with is seen as the most time-consuming task of any BI project. Some estimates attribute up to 80% of a total project budget to the ETL (extract, transform and load) work.

In the past, these ETL tasks were mainly driven by IT professionals who produced a huge amount of data transformation logic. In the Microsoft ETL universe, SQL Server Integration Services (SSIS) packages/projects and T-SQL logic were the main choices for solving these requirements.

During the last few years, the trend has pointed in one direction: self-service BI! And self-service BI is mainly driven by business analysts and key users from the departments, often without solid knowledge of data modeling and data preparation. Microsoft addressed this challenge by introducing PowerPivot (the BI modeling tool integrated into Excel) – a completely new story, with a new data modeling approach and a new formula language, DAX. The self-service trend for reporting and data modeling was definitely addressed with Power Pivot, but for the ETL part no real solution was provided…

“Until now”  (quote out of the “Dataflows in Power BI” Whitepaper by Amir Netz)

Introducing Power BI dataflows

This is where Power BI dataflows come into play (at least they now have a name that has not changed within the last few months – if you want to hear more about the story of this naming process, ping me and I will happily talk about it 🙂).

Power BI dataflows provide the next step for re-using Power BI data preparation logic. In the past, data preparation implemented for a Power BI dataset was limited to that dataset – there was no way of reusing the logic from one dataset in another!

Power BI dataflows are designed to overcome these data preparation limitations and become first-class citizens in the Power BI information hierarchy (figure taken from the whitepaper).


Power BI information hierarchy (source: Dataflows in Power BI whitepaper)

Dataflows read data from source systems, use Power Query to implement the ETL logic and store the results of these transformation steps in entities (you can also call them tables).

Amir mentions in the whitepaper five core principles that describe the main pillars of Power BI dataflows:

  • Intuitive authoring: dataflows are created using Power Query – the tool that millions of Excel and Power BI users are already familiar with.
  • Auto orchestration: dataflow authors do not have to think about the right order of data loads; dataflows are supposed to handle it the right way (the future will prove whether this holds).
  • Big data in a data lake: dataflows are designed to work with huge amounts of data. Results of the transformations are stored in Azure Data Lake Storage Gen2.
  • Common Data Model (CDM): dataflows support the Common Data Model – you can use standardized entities, extend them or create your own entities to store dataflow results. See my blog post for an explanation of the CDM!
  • Native integration into the Power BI system AND Azure data services:
    • Dataflows are a native artifact in the Power BI ecosystem like datasets, reports and dashboards (see figure above).
    • As announced in December, dataflows can now be even more deeply integrated with other Azure data services. You can choose to store dataflow data in your Power BI environment OR you can select your own Azure Data Lake Storage. With the second approach, you can exchange data and work together with other data analysts and data scientists!
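To illustrate the "big data in a data lake" and CDM points above: dataflow results land in a so-called CDM folder in Azure Data Lake Storage Gen2 – CSV files per entity plus a model.json file describing the entities. Here is a minimal Python sketch of reading such a model.json to list entities and their attributes; the sample document below is a heavily simplified illustration of the format, not a complete model.json.

```python
import json

# Simplified stand-in for a dataflow's model.json in a CDM folder. Entity and
# attribute names (and the "..." in the locations) are made-up placeholders.
SAMPLE_MODEL_JSON = """
{
  "name": "Sales Dataflow",
  "entities": [
    {"name": "Customer",
     "attributes": [{"name": "CustomerId"}, {"name": "Name"}],
     "partitions": [{"location": "https://.../Customer/part00.csv"}]},
    {"name": "Order",
     "attributes": [{"name": "OrderId"}, {"name": "Amount"}],
     "partitions": [{"location": "https://.../Order/part00.csv"}]}
  ]
}
"""

def list_entities(model_json: str) -> dict:
    """Map each entity name in the model document to its attribute names."""
    model = json.loads(model_json)
    return {entity["name"]: [attr["name"] for attr in entity.get("attributes", [])]
            for entity in model.get("entities", [])}

print(list_entities(SAMPLE_MODEL_JSON))
# {'Customer': ['CustomerId', 'Name'], 'Order': ['OrderId', 'Amount']}
```

This is exactly what makes the "bring your own data lake" option interesting: other tools and data scientists can read the same entities directly from the CDM folder.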

Is it a Premium only feature? NO – it’s PRO and Premium!

Power BI dataflows are available for both Pro and Premium users. Nevertheless, there are differences in the feature set – as usual, the more advanced features are only available in Premium capacities (e.g. incremental refresh, referenced entities, …).

The following table is again taken from the whitepaper (checked again as of 2018-12-21):


Sounds interesting?

It is.. 🙂

If you want to read more about it, here are some links:

Be (data) prepared and let your data flow!





Posted in 24 Days Of PowerPlatform, Power Platform | 1 Comment

24 Days of PowerPlatform – Day 21 – Powervember


Hi & Welcome back to #24DaysPowerPlatform!

November 2018 can go down in (Power BI) history as one of the biggest update and extension months for Power BI.

The excitement was huge, and even the Power BI leadership added fuel to the fire to increase it further. 🙂

One member of the Power BI team, Matthew Roche, introduced the hashtag #Powervember and proposed to use it whenever a new Power BI feature was announced in November 2018.

I have tried to capture (hopefully) all of the Power BI November 2018 announcements. Have a look at the summary blog post:



This post is part of my 24DaysOfPowerPlatform blog series. Have a look at the other blog posts as well!



Posted in 24 Days Of PowerPlatform, Power Platform | 1 Comment