
Shine with PowerBI round the clock with TDS for Common Data Service

TDS for CDS in PowerBI DirectQuery Connection Mode

The new Tabular Data Stream (TDS) endpoint for Common Data Service, currently in preview, is not just another access point for CDS: it addresses several shortfalls in live, secure reporting on the Power Platform. With the DirectQuery option in PowerBI over a TDS connection, reporting is live and the current user's access and roles are respected.

Waiting for data refresh, and securing it

Early in this awesome year of 2020, one of my customers, who had very finely crafted the security in their model-driven app using roles and organisational hierarchy, wanted to use PowerBI for reporting. To their surprise (and it was a hard sell), in PowerBI we had to configure row-level security (RLS) and then assign roles to people at the PowerBI service level to present only the authorised data. One unaddressed question was: if PowerBI is part of the Power Platform, why does it not understand the roles and security defined in CDS? I cannot recall a single forum or session where the same question was not raised, with the speaker reminding everyone that PowerBI does not follow the users' permissions.

Recently I ran into another scenario, where a customer was using SharePoint for data storage (yes, that is another issue in itself) and connecting it to PowerBI to produce reports. Their requirement was live reporting, and the users were confused by the reporting inconsistency and refresh delays in their system. SharePoint lists do not support DirectQuery, which means you have to rely on a data refresh schedule, and that gets even tighter if you do not have premium capacity, as only 8 data refreshes are allowed in a 24-hour day. Before TDS, DirectQuery was not an option for Dynamics 365 or CDS either, so they shared the same pain point.

We are live and secure

With TDS, connecting to CDS is as simple as bringing in data from SQL Server and putting a report on top of it. The query executes every time, bypassing the data refresh interval issue, and the report shows only the data the logged-in user is authorised to see, which means CDS roles and permissions are fully applied. One more advantage we get with this: option set and lookup values are also available, so there is no need to bring the values and IDs into the dataset (and then keep maintaining them) and use tricky expressions.

If you want to explore it further and see the steps to connect to a CDS data source using a TDS DirectQuery connection, keep reading and follow the screenshots below.

1- Enable the TDS endpoint for your environment through the Power Platform admin portal: Environment >> Settings >> Features

Turn On TDS endpoint for your environment

2- Connect SQL Server Management Studio to CDS to write and prepare your query. In Server name, type your environment URL followed by the port number 5558, which is fixed for every environment (for example, yourorg.crm.dynamics.com,5558). For User name and password, use your system administrator account.

Connect to CDS environment and Database with SSMS

3- Write your query as you want; yes, joins and aggregates are supported. A sample query follows the screenshot below.

SSMS to write the SQL Query we want to use.
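
For reference, here is a minimal illustration of the kind of query TDS handles, assuming the standard account and contact entities (the entity and column names are the out-of-the-box logical names):

SELECT a.name AS AccountName,
       COUNT(c.contactid) AS ContactCount
FROM account AS a
INNER JOIN contact AS c
    ON c.parentcustomerid = a.accountid
GROUP BY a.name
ORDER BY ContactCount DESC;

Keep in mind the TDS endpoint is read-only, so stick to SELECT statements.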

4- In PowerBI Desktop, click Get Data, choose SQL Server, and select DirectQuery as the Data Connectivity mode. In Server, put the same value we used in SQL Server Management Studio. In Database, use the same database name we connected to when executing the query. In SQL statement, you can write the SQL query directly or reuse the query prepared in the step above.

SQL Server DirectQuery mode
Connection credentials

5- With the data loaded in PowerBI, you can build the visualisations the way you want. The last required step is in the dataset settings, after you publish the PowerBI report: the checkbox that says “Report viewers access this data source with their own PowerBI identities in DirectQuery mode.” must be checked.

Report viewers access to the data source.

Currently, only SQL Server Management Studio and PowerBI are supported as clients for connecting over TDS. If you are thinking about other use cases, you still have to use the Web API or OData endpoint and the SDK assemblies.


Power Platform security: Three trusted guards

Power platform security

Security brings trust to the business and is the foremost and most fundamental pillar to ensure before thinking about the rest of the platform capabilities of a solution. The Power Platform is natively integrated with Azure Active Directory for identity and access management, but there are a few steps that are absolutely necessary for your Power Platform security. In this blog, I have summarised a few recommendations in three steps that will help you secure your business and personal productivity applications alike. The first step gives control over who can access an environment and is authorised to perform operations on its resources (apps, flows, connectors and the data). The second step is to log all the activities that users perform after gaining access to the environment. The last step is to put guardrails in place to protect the data from accidental or intentional misuse, without compromising the promise of a citizen application platform.

Environment level access control

At every environment level, we can define a security group, which provides the first layer of security. If no security group is assigned, the environment is open to all users in your tenant. This is completely fine for the default (personal productivity) environment but is not encouraged for other environments, especially those hosting critical business applications (Dynamics 365 or other restricted Power Apps). The next layer of security is controlled by the permissions and roles assigned to users. Even though access is fully configurable at every resource level (entities and fields), it is worth looking into the standard roles that come with a CDS-enabled environment.

System Administrator
  • Complete ability to customize and administer the environment
  • Full read/write access to data in the database
  • The role cannot be modified
  • Care should be taken in assigning this to the right people

System Customizer
  • Full permission to customize the environment
  • Data access is focused only on data owned by the user
  • The role can be modified, but modifying it is not recommended

Environment Maker
  • Create new resources in the environment, including apps, connections, gateways and Power Automate flows
  • No default privileges to data are included
  • The role can be modified, but modifying it is not recommended

Common Data Service User
  • Basic user role, with the ability to run apps and perform common tasks but no ability to customize the system
  • Data access is focused on read access to most Common Data Model core entities, with full access to records owned by the user (i.e. ‘self’ privileges)
  • A good role to copy when making a custom security role for users
Default security roles

Enable Audit log

Microsoft 365 Security and Compliance Centre is the one-stop solution for comprehensive logging across all Microsoft services, and Power Platform activities are also recorded and stored there. The only prerequisite for enabling the audit log is a Microsoft 365 Enterprise E3 or E5 licence. The steps to enable logging and then access the reports are documented here. With auditing enabled, you can have peace of mind even if someone with access performs some naughty activities.

Data Loss Prevention policy

The last leg of security comes from within. With 300+ connectors in the Power Platform and growing, it is extremely important to put guardrails in place to protect the maker community from unintended exposure of data to external services. With the ability to block connectors, restricting all and allowing only known connectors is the safest approach: for example, block social media, Dropbox and the like, and keep only the Microsoft standard connectors unless there is a business case to allow more. The default environment needs further restrictions, and connectors touching critical business data, like Azure-hosted assets and SQL Server, should be blocked there. Lastly, for dedicated environments, a specialised policy can be applied depending on requirements: for example, less restrictive for Marketing, and a combination of selected connectors for Finance.


Power Platform setup: Three small steps to giant improvements

Power Platform Strategy

To establish the Power Platform environment for your kick-ass business applications, following best practices and a finely crafted strategy is very important. Here are the top three steps that go deep enough to checkmate a highly unruly environment. Delivering benefits and keeping confidence in the citizen application platform is highly critical, and as with all great platforms and technologies, it is hard to achieve without putting thought into strategy and architecture. While setting the strategy and working on the architecture there are many aspects to look after, but it always starts with establishing the environment on a solid foundation.

Don’t use the default environment

The default environment comes with every O365/M365 tenant, and it is highly recommended not to use it for critical business applications. First of all, this (default) environment has fewer security controls compared to other environments. Its main use is for personal productivity tools like Office 365, SharePoint Online and Project for the web. All licensed users of Power Apps, Flow, Office 365 and Dynamics 365 Online, including stand-alone, free and trial licences, automatically get the Maker role in this environment. (Yes, it is as scary as it sounds 😦 ) As an admin, the best move is to rename this (default) environment to Personal Productivity, to label it once and for all.

Critical applications deserve respect

Every business has mission-critical operations that deserve special attention. Identifying critical over merely important applications can help create segregated environments without costing a lot of resources. For example, Dynamics 365 applications should have a set of Dev/Test/Prod environments of their own. If compliance or security requirements need ring-fencing of information, multiple isolated environments can be created for D365 apps too. But remember, less is better when resources are limited: use this strategy for mission-critical and important apps, or for business units who need their own dedicated area.

Service accounts for production deployment

This may cause a few raised eyebrows. Yes, under European GDPR and other compliance standards, having a generic account with admin access is an exception. But the benefits of following this approach outweigh the possible issues (which can be managed and mitigated). Under this model, a service account managed by central IT is used to deploy to test and production environments. All other users then have no need for elevated rights; they can operate with end-user permissions and avoid unintended exposure. This can save on licensing too, if premium connectors such as SQL Server are required.


Schedule Gold with Dynamics 365

Effective resource scheduling has been a business problem for a long time. Several tools and strategies have come along, and they keep evolving over time to minimise the complexities around scheduling and maximise profitability through optimum utilisation. Microsoft Dynamics is also rolling out a next-generation schedule board experience, publicly available from April 2020. This is very exciting, as the value my customers are already reaping from the current offering is helping them unblock revenue and drive their customer engagement to the next level.

We leveraged the Universal schedule board at two very different customers. One is in the hospitality industry, providing venues and facilities for events, and is considered an established brand for wedding and corporate events at premium locations. The second is in the construction and real estate industry, developing high-end malls and commercial structures nationwide.

The licensing prerequisites for using Universal Resource Scheduling are quite clearly documented, but a few blogs describing it as “Schedule Anything” make it confusing, and intuition suggests it is something like the Account and Contact entities in the Common Data Service. Users who want to use URS features need one of the following:

  • Customer Service Enterprise
  • Field Service
  • Project Service Automation
  • If you have a Project for the web license, it will be present in the default environment too (which makes it a little less appealing for enterprise-level solutions)

Yes, it is possible to on-board initially only the key stakeholders who will carry out the scheduling activity, as there is no minimum purchase requirement when it is bought as a subsequent qualifying app.

The schedule board works like a charm to give visibility of utilisation. In the hospitality industry, where the occupancy rate drives profitability, this insight can help design marketing campaigns and attract the audience.

The pain point is different if one has to take on requests from 50+ construction sites for different skill sets within the team. Timely resource allocation not only unblocks revenue at the project sites but also makes sure the skilled staff are utilised appropriately according to their roles.

With so many features almost ready to come out of the oven, this is one of the exciting ones that will definitely make adoption of Microsoft business applications faster and add value for businesses. My plan is to cover this area in more detail in the upcoming weeks, so please leave your comments and feedback if something here resonates with you and demands effective scheduling.


A friend in need is Power Platform indeed!

Power Platform
Power Platform is a friend in need

Microsoft Power Automate (yes, formerly known as MS Flow) now allows updating datasets in PowerBI, as of the latest updates. But just like a good friend, PowerBI does not hesitate to speak back to MS Flow, and can send data to open up endless possibilities. In this blog, I cover the steps that will allow you to send data from a PowerBI dataset to MS Flow, and then our all-powerful friend MS Flow can help in thousands of different ways!

Last week, I was working with a customer in the public utility sector that uses SAP as its core technology platform. The requirement was to get some key project-related information from SAP and then use it to update the apps running on the Power Platform. SAP is currently supported through the SAP BW connector released by Microsoft. The connector works like a charm for PowerBI, and we can use the data gateway to securely bring the on-premises data into a PowerBI dataset. The issue was that, to date, there is no SAP BW connector available for PowerApps and Flow, but you can vote for a Flow connector and a PowerApps connector for a new feature release.

In my case, the options were limited, as the customer did not want to invest a lot of time and budget in developing an SAP interface to share the data with the cloud and the Power Platform world. But the solution explained below can also be used in situations where you have already built and loaded the data from heterogeneous sources and want a consistent approach that reuses the data without reworking multiple layers again.

MS Flow works as a Web API too

The first step is to make the MS Flow available as a Web API. It is a very simple step: all we need to do is use the trigger “When a HTTP request is received”. Once you save the Flow, it will show you the URL for your Web API along with the sig (secure key). Copy the entire string for the next step.

HTTP Request received trigger

Update PowerBI dataset to make the web call

The Web API generated by MS Flow can be invoked from the PowerBI dataset. The first step is to connect to our data source as usual and query the data we are interested in. The next step is to make a web call to the URL we copied from MS Flow in the previous step. You can use the Power Query code snippet below in your solution; the only things you have to take care of are preparing the data from your own data source and copying in the correct URL for the Web API (MS Flow).

let
    // Connect to the source database (replace the server and database names with yours)
    Source = Sql.Database("Your DB Server", "DB Name"),
    // Navigate to the table that holds the project data (replace the schema and table names)
    SAP_Project_data = Source{[Schema="dbo", Item="Your Table"]}[Data],
    // Keep only the columns we want to send to the Flow
    Projects = Table.SelectColumns(SAP_Project_data, {"ProjectName", "ProjectStartDate", "ProjectCost", "ProjectActualCost", "ProjectBudgetCost"}),
    // Turn the table into a list of records and serialise it as JSON
    ToArray = Table.ToRecords(Projects),
    ToJsonBinary = Json.FromValue(ToArray),
    // POST the JSON payload to the Flow's HTTP trigger URL (including the sig key)
    WebCall = Web.Contents("Copy the Full URL here including the key", [Content=ToJsonBinary]),
    // Read the response returned by the Flow
    LineFromBinary = Lines.FromBinary(WebCall)
in
    LineFromBinary
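
Since Table.ToRecords turns each row into a record and Json.FromValue serialises the whole list, the Flow receives a JSON array of objects. With the columns selected above, the payload will look something like this (the values are purely illustrative):

[
  {
    "ProjectName": "Project A",
    "ProjectStartDate": "2020-01-01T00:00:00",
    "ProjectCost": 500000,
    "ProjectActualCost": 320000,
    "ProjectBudgetCost": 450000
  }
]

A sample like this is also handy in the next step, as the Parse JSON action can generate its schema from it.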

Making MS Flow ready to catch the data

The data received in MS Flow is in JSON format. We first need to parse this JSON data and then convert it into a table to be saved as a CSV file. It sounds complicated, but it is very simple, as shown in the steps below with the help of a few expressions highlighted in the hint text in the screenshots (a sketch of the wiring follows the screenshots).

Parse JSON data
Convert JSON to CSV table
Save CSV file to One Drive
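
If the hint text in the screenshots is hard to read, the three actions wire up roughly as follows (one way to do it, not the only way): the Parse JSON action takes triggerBody() as its Content, with the schema generated from a sample payload like the one above; the Create CSV table action takes body('Parse_JSON') as its From input; and the OneDrive Create file action writes body('Create_CSV_table') as the file content.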

Endless possibilities!

Using the SAP BW connector, we can load the data into the PowerBI service. Even if you have data scattered across multiple data sources, it is pretty trivial to get it transformed and loaded into datasets. Once you identify your required data, not only can we present it in PowerBI visualisations, we can also send it to MS Flow following the steps above. The data loaded from the PowerBI dataset into MS Flow gives several options depending on how you want to use it: we can send it in CSV format as an email attachment, save the file to Azure Blob storage, or just load it into CDS entities.


Sailing out of castaway island

When thinking of data integration, or even migration, to the Power Platform for your business applications, Dataflows help you get off the castaway island in stranded situations. The Microsoft Dataflows feature became generally available in the Power Platform maker portal in Oct 2019 and can be used both in the PowerApps portal and in PowerBI. Using Power Query, a dataflow provides a very rich set of connectors to interface not only with databases and files but also with open standards like OData, ODBC and RESTful Web APIs. Yes, it is the same no-code Power Query we have been using for years in PowerBI. In summary, it is the most advanced Extract, Transform and Load (ETL) tool available today, providing self-service capabilities over ever-growing volumes of data to convert them into actionable insights.

Power Platform Dataflows

To drive this home, I set up a sample dataflow to migrate contacts data from a SharePoint environment to Power Platform entities, and it took just a few minutes to configure against a fully integrated Common Data Service environment. If you want to bring in any other data from outside your business application, or even from a legacy system, like accounts, products, projects (Project Online) or anything else you can imagine, it can easily be done in the same way.

Navigate to the Power Platform maker portal and click New dataflow in the left navigation pane. In the next screen, I selected the OData data source; you only need credentials with read access to the data. I could have used the SharePoint connector here, but I preferred OData because it is a good example that fits many systems complying with OData standards. And yes, if you have the data in your on-premises environment, you can bring it to the surface using the data gateway

OData data source in PowerQuery
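
For illustration, if the contacts live in a SharePoint list named Contacts, the OData feed URL follows the standard SharePoint REST pattern, something like https://yourtenant.sharepoint.com/sites/yoursite/_api/web/lists/getbytitle('Contacts')/items (the tenant, site and list names here are placeholders).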

In the next step, you will be shown the entire content of the data source. Here, you can select the columns you are interested in, filter rows based on your requirements, or even add calculated columns.

The next screen is a simple mapping of your data source to CDS entity fields. If your data does not map onto any existing entity, you can bring it in as a new entity: you will be asked to select a primary key column, and the entire data source will be replicated in the CDS entity.

Dataflows allow up to 48 automatic data refreshes per day, whereas in PowerBI we have a limit of up to 8 auto-refreshes per day. The dataflow limit is quite generous: it means we can auto-refresh data every 30 minutes. The other option is manual data refresh, which can be handy for a one-time data import or if you are planning to trigger the refresh from certain actions, which can be automated too.

And here comes the data imported into the CDS entity, surfacing in the model-driven app.

Power Platform Dataflows are a vital integration tool

Power Platform dataflows help bring in data from several data sources. Once the data is loaded into storage, we can use it in apps and MS Flows, and produce reports in PowerBI too.


Getting romantic with your business

Spring is here, and how good is it to present flowers to your business!

We have all heard about flower power, and in the Microsoft Power Platform this role is taken by PowerBI. Just as a nicely arranged bouquet brings warmth and a smile to your loved ones, carefully organised PowerBI content can enlighten your business. Business applications leveraging the Power Platform have huge potential for finding hidden gems in the ever-growing business data inside the Common Data Service. The way you develop the data story will open a barrage of realisations and insights about your business.

Art of Executive reporting

The user story holds the utmost importance: the deeper the connection the story makes with the stakeholders through the data, the more value it will generate for the business. Having full access to the CDS data, directly or through the Data Export Service, has a tendency to draw any of us into territory that makes the story hard to understand.

Last week, by coincidence, I faced a similar situation at two of my engagements. The stakeholders interested in the information left the ground open (sounds familiar!) and the analysts were in awe of the availability of data. This leads to over-complicating the story, which needs to be avoided to produce useful insight. I am glad to have very thoughtful customers who work as a team, and eventually we refined the content to make more sense to the executives.

Tech Bits!

For rolling up the numbers cumulatively through the running month, the DAX expression we used for the sample graph below is:

BaselineYTD =
CALCULATE(
    SUM('PortfolioCost'[BaselineCost]),
    FILTER(
        ALLSELECTED('PortfolioCost'),
        'PortfolioCost'[Date] <= MAX('PortfolioCost'[Date])
    )
)

PortfolioCost is the table holding all the financials, BaselineCost is represented as the red line, and Date is the calendar date. CALCULATE re-evaluates the sum under a filter that keeps every selected row with a date up to the latest date in the current context, which is what produces the running total.


About Myself

Hi, this is my digital home, where I share my thoughts and latest learnings to help the community around the Microsoft Power Platform. Thanks for visiting my blog; I hope you will find the information shared here useful.

I am a Microsoft certified and experienced consultant, helping customers solve real-life problems and empowering them with cutting-edge technology to fully realise the potential of their digital transformation journey.

My areas of interest include:

  • Microsoft Business Applications
  • Microsoft Power Platform
  • Microsoft Dynamics 365
  • PowerApps, Flow, Common Data Service and PowerBI

Being an (ex)consultant for Microsoft Project Online, I am focusing on creating success stories for my several PPM customers in their transition to the Power Platform based Project service, which will become the primary project management service over time.

#PowerPlatform #CommonDataService #PowerApps #MSFlows #PowerBI

As you’re not locked into any single technology in this age, one of the wonderful things about this blog is how it constantly evolves as we learn, grow, and interact with one another.

Upgrade Dynamics 365 Apps to Unified Interface

Empty offices are becoming normal as remote productivity tools get richer

Mobility and remote collaboration are vital for any business application while we face social distancing and global disruption today. With the deadline looming to upgrade the classical Dynamics 365 apps to the new Unified Interface, there is a FOMO factor, as the new experience has heaps of great features to offer, including mobility and collaboration across any device of choice, from anywhere.

Oct 2020 is Microsoft's deadline to upgrade your classical-interface apps, like Sales, Customer Service and Project Service, to the new Unified Interface, and the time until then is meant to let you test, review the features with your team, and perform the change management activities before the hard switch.

Recently we upgraded a business application at a public service agency, and below are my recommendations based on that experience.

Step 1: Take stock of your current artefacts

Review your current app, especially any customisation done by you or your implementation partner that is not part of the out-of-the-box Microsoft app.

1- Discuss with the team and stakeholders which of the customisations they would like to move or drop. Remember, this is an opportunity to look at unused functionality, simplify the business process and address poor adoption.

2- Note all custom components, like client-side web resources, plugins etc. Which solutions hold these customisations? Make a list.

Step 2: Run Solution checker

Microsoft provides the solution checker in the PowerApps maker portal. You will need to run it against all the custom solutions (noted in Step 1) in the environment, one by one. It will validate the entire code base, flag whatever needs to be modified or upgraded, and show the results with recommendations.

Solution checker

Review the results and action all the items under the “Upgrade Readiness” category. The solution checker validates the code against the following:

  • plug-ins
  • custom workflow activities
  • web resources (HTML and JavaScript)
  • configurations (example: SDK Message steps)
Action Solution checker issues

Check out what the solution checker can do for you in this blog post too.

Step 3: Create a parallel new app

Start with a new model-driven app with the default site map. Add all the entities and artefacts identified in Step 1 with the team. Identify gaps and issues, and work on the deprecated features. If you have any of the following items in your app, you will have to replace them with the new recommendations.

  • Task Flows – replace them with Business Process Flows
  • Process Dialogs – replace them with Business Process Flows or Canvas apps
  • Service Scheduling in Customer Service – replace it with Universal Resource Scheduling

Step 4: Test and Roll out the new app

Test the new app with the key users, and “move” is the final message, as the old UI is gone (or will be gone very soon). You will have to test the new app and work through any issues and gaps. I also recommend recording short videos and preparing how-to guides for the users to get familiar with the new interface, which is, after all, more friendly and modern for them. Any new enhancements and support should be provided only on the new app in future.

The steps above, and the best of you, will be good enough to drive this home, as many businesses have already upgraded. Don't miss the opportunity to show the users that the same app can now be used from any device of their choice, including all popular phones and tablets, as business continuity these days demands that the business can run from anywhere.

Digital Transformation starts with Why?

We need to upgrade our system! We need to migrate to the cloud! We need to bring in new apps! We need to automate our processes! We need to understand our data! I am sure these statements are very familiar to you.

Inconvenient Truth

Do you know how many digital transformation projects fail or fall short of meeting their objectives?

According to a global survey by McKinsey (read here), only 14 percent of digital transformation projects report that they have made and sustained improvements, and only 3 percent confirm complete success. It is a very low number, but there are surely moves that can put your project in the category of successful ones. Top of the list is to set a ruthless and bold scope, and you will definitely need to ask Why from the beginning.

Bold Scope! Start with Why?

Yes, Simon Sinek covered in his book why “Start with Why?” is the most important question to raise from the beginning, but it looks refreshingly relevant in today's trending digital transformations. Many businesses and teams embark on projects lacking understanding and vision of “why” they are making the investment. A clear vision not only helps leadership define and measure the success of projects, it also sets the basis for the project team's design considerations.

Start with Why? Inside Out!

Now, if we revise the statements we saw at the beginning, they will look much clearer and will increase the chances of focusing on and achieving the objectives of your initiatives.

  • We want to improve our customers’ experience.
  • We want to improve our employees’ experience.
  • We want to increase the efficiency of our operations and improve the quality of our products and services.

Power Platform shows how to do it

To deliver such a bold scope, the next thing you need is adaptive design and flexible technology. The Microsoft Power Platform provides the latest technologies that empower businesses to quickly adapt their solution and design, keeping pace with a rapidly changing landscape.

Gartner places Microsoft Power Platform in the Leaders’ quadrant

Remember, business transformation journeys might demand months, if not years, of investment and performance targets, and adaptable design and technology will play a key role in making the journey sweet, sustainable and successful.

A Star Is Born


Project Management evolution continues

Portfolio and project management practice as we know it today has come a long way from where it started. The KPIs used to determine a project's success traditionally leaned towards cost and schedule management, but now we focus more on stakeholder management. PMI's PMBoK, in its sixth edition, has included the “Agile practice guide” for the first time, which reflects that projects need to be ready for the rapid change and digital disruption of the age we live in. As the saying goes, the best way to ride a horse is in the direction it is going.

Microsoft remains the partner of choice; however

Project Online has used a SharePoint-based architecture since 2000, when it was called Project Server. MS Project Professional remains the scheduling engine, and the rest of the collaboration features are based on SharePoint. Several features are easy to use thanks to SharePoint's collaboration-oriented capabilities, but there are several challenges too. Security, data synchronisation and performance issues start hitting hard as the size of the data grows, due to the platform's constraints. Yes, PowerShell scripts to update the project site registers have been a big issue for many of us. Most importantly, MS Teams and Groups are the new collaboration tools, and they are way ahead of SharePoint's features.

Game changer comes into play!

Microsoft introduced the Power Platform, which started gaining traction with the transformation of business applications like Dynamics CRM (aka Customer Engagement) and its modules onto the new platform. Dynamics Project Service was one of the modules with many features cross-cutting Project Online, but it was not fully available on the Common Data Service, which is the backbone of the Power Platform.

Microsoft announced the reimagined Microsoft Project on 10th Sept 2019, and the new rock star is expected to be available to play its tunes for all existing Project Online subscribers from 7th Oct. From what we know publicly so far, it is going to be the primary project management service from Microsoft. Existing Dynamics PSA and Project Online customers can continue using the available features, but it is time to start experiencing the best solution we have seen yet for project management.

Power Platform Overview

I asked Mark Smith and Steve Mordue on their weekly Power Platform show how they see this disruption, and the response was clear: “If anyone is using anything old, you probably need to think about using the new stuff!!!”

Playback 18th Sept – Jump to 41:48 for the straight advice

Now what to expect?

As the roll-out will happen in a few days, we will have the world's leading scheduling engine shining on top of the most advanced application platform we have today. Being on the Common Data Service (CDS) means it is inherently integrated with the Dynamics Customer Engagement and Dynamics Finance and Operations modules. All the goodness of the citizen application platform, the Power Platform, will be available to extend your project management practice, and yes, there are fewer or no boundaries this time.

In the next few weeks, I will be writing more about the new capabilities and the how-to stuff. I hope it will help you get up to speed with the new technology.