The Data Export Service is an effective solution, but it comes with its own drawbacks, particularly for large customers. These drawbacks include failures and delays in syncing at certain points in time, complexity in troubleshooting, and the inability to copy configuration from one environment to the next, as would be expected in an ALM process.
Last year, Microsoft introduced an alternative to the Data Export Service. This new alternative, called Azure Synapse Link (or, at the time, Export to Data Lake), allows you to easily export data from your Microsoft Dataverse environment to Azure Data Lake Storage.
There is no deprecation notice for the Data Export Service, and a number of Microsoft customers are still using it, but it seems that the direction Microsoft is pushing customers toward is Azure Synapse Link (formerly Export to Data Lake) for syncing data between their Microsoft Dataverse environment and an Azure data store.
Azure Data Lake Storage Gen2 is an Azure offering that provides the ability to store large amounts of data for analytics applications. It is built on Azure Blob Storage, making it cost effective and robust. The next few steps will show you how to configure the Storage V2 account that is required for setting up the link between Microsoft Dataverse and Azure Synapse Link.
The first step is to log in to the Azure Portal by navigating to https://portal.azure.com, then click on the Create a resource link and search for Storage account. Click the Create option for Storage account.
Select a subscription and resource group, enter a storage account name, select the region, and leave the rest of the settings as they are. The screenshot below shows the first tab of the storage account creation.
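One thing that often trips people up at this step is the storage account name itself: Azure requires it to be 3 to 24 characters, lowercase letters and digits only, and globally unique. As a quick sanity check before you hit the portal's validation, the format rule can be expressed like this (the example account name is just a placeholder):

```python
import re

def is_valid_storage_account_name(name: str) -> bool:
    """Azure storage account names must be 3-24 characters and contain
    only lowercase letters and digits. Global uniqueness can only be
    verified against Azure itself, so this checks format only."""
    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None

print(is_valid_storage_account_name("dataversesynapsedemo"))  # True
print(is_valid_storage_account_name("My-Storage"))            # False
```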
Don't click the Review + create button at the bottom of the screen; instead, click the Next: Advanced button to move to the Advanced tab.
You will notice that the Advanced tab has a section for Data Lake Storage Gen2. Check the box to enable hierarchical namespace. The image below shows this tab.
You can skip the rest of the tabs and click Review + create to finalize the creation of the storage account. Once the Azure Storage account is configured, we can go ahead and start configuring Azure Synapse Link in our Dataverse environment.
Navigate back to the maker portal by going to https://make.powerapps.com. Within your environment, expand the Data menu and click the Azure Synapse Link menu item. This opens the New link to data lake panel, where you can specify the subscription, resource group, and storage account that you created for use with Azure Synapse Link. The image below shows this.
You will notice that there is also an option to Connect to your Azure Synapse Analytics workspace, which is currently in preview. This allows us to bring Dataverse data into Azure Synapse with just a few clicks, visualize the data within the Azure Synapse Analytics workspace, and then rapidly start processing the data to discover insights using features like serverless data lake exploration, code-free data integration, data flows for ETL pipelines, and optimized Apache Spark for big data analytics.
Let's go back to the Azure portal and create the Azure Synapse Analytics workspace so that we can try this out at the same time. In the Azure portal, click the Create a resource link again, and this time search for Azure Synapse Analytics.
This opens the Create Synapse workspace page on the Basics tab. Select your subscription and resource group. You will also need to enter a name for the managed resource group. A managed resource group is a container that holds ancillary resources created by Azure Synapse Analytics for your workspace. By default, a managed resource group is created for you when your workspace is created.
Enter the name of the workspace and the region. Then you will need to enter the name of the Data Lake Storage Gen2 account that we previously created and the name of the file system. If you do not have a file system name, click the Create new link under it and provide a name; this will create the file system for you.
Don't click Review + create, but the Next: Security button. You will need to provide a password for your SQL Server admin login.
You can now click Review + create, and then the Create button to create the Synapse workspace. This process takes a little longer than the creation of the storage account, as more resources are being created. Once deployment is done, you can go to your new Azure Synapse Analytics resource by clicking the Go to resource group button.
Let's go back to the maker portal and select Connect to Azure Synapse Link again, but this time we will also provide the information for the workspace.
Check the Connect to your Azure Synapse Analytics workspace (preview) box, and enter your subscription, resource group, workspace name, and storage account. For the resource group, make sure you do not select the managed resource group, as that resource group does not have the workspace and the storage account associated with it.
When you click the Next button, you can select the tables that you want to sync with Synapse Link. For this purpose we will only select a couple of tables (account and contact), but you can select as many tables as you want.
Finally, click the Save button. This will create the connections and allow you to start syncing and reviewing data in Synapse Link and Synapse Analytics. Once completed, the page will refresh and you will see the Linked data lake showing the storage account that you previously created.
Now let's go to Azure and see what happened when we clicked the Save button. In the list of containers you will see various containers, one of them containing the name of your Dataverse organization. Clicking into that container will show you a folder for each of the tables that you selected for synchronization, as well as a model.json file that contains the schema for the entities that you selected.
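The model.json file follows the Common Data Model metadata format, so it can be inspected programmatically. Below is a sketch using a heavily trimmed, hypothetical model.json (the real file carries far more metadata per attribute) to list each exported entity and its columns:

```python
import json

# Trimmed, hypothetical model.json in the Common Data Model format
# that Azure Synapse Link writes to the container root.
model_json = """
{
  "name": "cdm",
  "entities": [
    {"name": "account",
     "attributes": [{"name": "accountid", "dataType": "guid"},
                    {"name": "name", "dataType": "string"}]},
    {"name": "contact",
     "attributes": [{"name": "contactid", "dataType": "guid"},
                    {"name": "fullname", "dataType": "string"}]}
  ]
}
"""

model = json.loads(model_json)
for entity in model["entities"]:
    cols = ", ".join(a["name"] for a in entity["attributes"])
    print(f"{entity['name']}: {cols}")
# account: accountid, name
# contact: contactid, fullname
```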
If you drill down into the selected entities, you will find a single CSV file containing the initial dump from Dataverse. You can view and edit the file directly in Azure Blob Storage, or download it and open it in Excel. The data will contain all the fields that are part of the entity.
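If you would rather process the dump in code than in Excel, the files read as plain CSV. The snippet below parses a tiny, made-up two-row extract of an account file; in the actual export the rows carry every column defined for the entity, and the column order comes from the schema in model.json rather than from a header row:

```python
import csv
import io

# Hypothetical two-row extract of an exported account CSV. The real
# file holds all entity columns; we pretend column 1 is the name.
raw = '"a1b2-placeholder","Contoso Ltd"\n"c3d4-placeholder","Fabrikam Inc"\n'

reader = csv.reader(io.StringIO(raw))
account_names = [row[1] for row in reader]
print(account_names)  # ['Contoso Ltd', 'Fabrikam Inc']
```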
When we add a new record, we will notice that a new file gets created corresponding to the month of the record's creation. If a file for that month already exists in our Azure Blob environment, a new file will not be created; the record is written into the existing file instead.
When modifying existing records, the modified record gets updated in the corresponding file where it currently lives. In our case, depending on the record, the changes will land in either the 2021-06 or the 2021-07 file.
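In other words, with the default monthly partitioning, the target file is a pure function of the record's creation date. A minimal sketch of that naming rule (matching the 2021-06/2021-07 files above):

```python
from datetime import datetime

def partition_file(createdon: datetime) -> str:
    """With Synapse Link's default monthly partition strategy, a record
    lands in a CSV named after the year and month it was created."""
    return f"{createdon.year}-{createdon.month:02d}.csv"

print(partition_file(datetime(2021, 6, 15)))  # 2021-06.csv
print(partition_file(datetime(2021, 7, 2)))   # 2021-07.csv
```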
Now that we see that the files are being created, let's go ahead and see how we can view this data in Power BI. The first thing we need is the endpoint of the Data Lake Storage. Within your new storage account, in the left navigation under Settings, click Endpoints. On the Endpoints page, under the Data Lake Storage section, you will see that there are primary and secondary endpoint URLs for Data Lake Storage. Copy the primary endpoint URL; it will be used within Power BI. This is shown in the image below:
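If you prefer not to hunt through the portal, the primary Data Lake Storage Gen2 endpoint follows a fixed pattern derived from the storage account name (the account name below is a placeholder):

```python
def dfs_endpoint(account_name: str) -> str:
    """Primary Data Lake Storage Gen2 endpoint for a storage account;
    the same URL shown on the account's Endpoints page."""
    return f"https://{account_name}.dfs.core.windows.net/"

print(dfs_endpoint("dataversesynapsedemo"))
# https://dataversesynapsedemo.dfs.core.windows.net/
```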
Next you will need to get the access key for the Data Lake Storage Gen2 account. In the Azure Storage account, under the Security + networking section, click Access keys. This shows the page containing the list of access keys. Click the Show keys button, and then copy the value of the first key. You will need this for configuring Power BI.
In Power BI, click the Get Data button in the ribbon, select Azure as the data source category, and from the available Azure sources select Azure Data Lake Storage Gen2, as shown in the image below:
Click the Connect button, and in the new pop-up window, paste the Data Lake Storage primary endpoint URL that you copied in the previous step, select the CDM Folder View (Beta) option, and then click OK.
In the next window, you have the option to sign in using an organizational account or an account key. Select Account key; this is the access key that you copied in the previous step. After you enter the access key, you will see the following window, with the available data source results.
You will then see the Navigator, which provides you with a view of the model that you have in Azure Data Lake. Expand the storage container, and then expand the cdm hive. This shows you the list of entities that you have available there as tables. See the screenshot below.
Finally, from Power BI you can start adding visualizations and fields and customize your data as needed. In the screenshot below, we add a few fields from the account table in the Azure Storage account.
When we configured Azure Synapse Link, we checked the box for creating a Synapse workspace as well. If we navigate to the Synapse workspace that we created, we can query the data in our Azure Storage account container from within the Synapse Analytics workspace. There are several configuration options available in the Azure Synapse Analytics workspace, such as configuring analytics pools, encryption, firewall settings, and more. These can be further reviewed in the Microsoft documentation, but for our purpose we are going to take a look at Synapse Studio. The image below shows the Azure Synapse workspace overview page, where we can click Open Synapse Studio to start querying our analytics.
When Synapse Analytics Studio opens, there are a number of links available on how to use this Azure product, which might be overwhelming, but we are just going to review the basics of how to retrieve or query data from the data warehouse. There are a few options you can use to create a SQL script against Synapse Analytics. You can click the New button at the top of the page and choose SQL script.
You can also click the Data icon in the left navigation, under the Home icon, which will allow you to expand into the data structure of the data store; from there, click the account table, choose New SQL script, and then choose Select TOP 100 rows. This creates the SQL script for you, and you can start editing it from there.
The last option is clicking the Develop icon in the left navigation, then clicking the + button and selecting SQL script, as shown below:
Once we have chosen the query option that we want to use, we can go ahead and build our query. In our case we are just going to retrieve the list of accounts, as shown in the screenshot below:
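For reference, when a serverless SQL pool reads the CSV partitions straight out of the data lake, the generated script is typically an OPENROWSET query. The sketch below builds that kind of query string; the endpoint, container, and entity names are placeholders, and the exact script Synapse Studio generates for you may differ:

```python
def top100_query(endpoint: str, container: str, entity: str) -> str:
    """Build an OPENROWSET-style T-SQL query of the kind a Synapse
    serverless SQL pool uses to read an entity's CSV partitions.
    All names passed in here are assumed placeholders."""
    return (
        "SELECT TOP 100 *\n"
        "FROM OPENROWSET(\n"
        f"    BULK '{endpoint}{container}/{entity}/*.csv',\n"
        "    FORMAT = 'CSV',\n"
        "    PARSER_VERSION = '2.0'\n"
        ") AS rows"
    )

print(top100_query("https://dataversesynapsedemo.dfs.core.windows.net/",
                   "dataverse-demoorg", "account"))
```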
When we click the Run button, we can see the results as shown below.
There are many more possibilities and options available for how to use Azure Synapse Link and Azure Synapse Analytics and for accessing the data from different sources. This article provides a basic overview of how to configure them and make them work for your Dataverse environment. I hope this was helpful.