There are a number of prerequisite steps that should be completed. You may have already done them, but I would like to review them again in case they haven't been.
In order to call the Azure DevOps REST API we need to have a Personal Access Token. That can be acquired by going to our Personal Account Settings and selecting Personal access tokens, as shown in the image below:
If you already have a Personal access token, clicking on the above link will show you the list of available tokens, but if you don't, you will see a screen saying that you don't have a personal access token yet, and you can click on one of the New Token links to create a New Access Token.
You will need to give your Personal access token a name, and specify the organization and expiration date, as well as the Scope of the token. For the purpose of this demonstration, I have given Full access, but you can also provide a custom defined scope, which lets you decide whether you want to provide Read, Write and Manage access to the different objects that make up the API.
Once you click on the Create button, you will see the Success notification showing you the Personal access token, and an icon that allows you to copy the token to the clipboard. It is very important to copy it and store it in a safe place, as once the window is closed, this token will no longer be accessible.
Now that we have the token, we will need to convert it to Base64, so that it can be used in our HTTP call in the flow, passing it as a Basic Authentication token. This can be done by using Git Bash or other utilities that allow you to convert to Base64. Open Git Bash by clicking on Start -> Git and then selecting Git Bash.
Once Git Bash is opened, you will need to enter the following command in the Bash window:
$ echo -n :[PERSONAL_ACCESS_TOKEN] | base64
You will need to replace the [PERSONAL_ACCESS_TOKEN] with the actual access token that you copied earlier. The following screenshot (with some blurring) shows the Git Bash command.
Copy the result into Notepad++ or another text editor, as you will need it at a later time.
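If you don't have Git Bash handy, the same Base64 value can be produced with a short Python snippet. This is a minimal sketch; the token value here is a placeholder for the Personal Access Token you copied earlier.

```python
import base64

# Placeholder; replace with the Personal Access Token copied earlier.
pat = "PERSONAL_ACCESS_TOKEN"

# Azure DevOps Basic auth uses an empty username, hence the leading colon,
# matching the `echo -n :[PERSONAL_ACCESS_TOKEN] | base64` command above.
encoded = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
print(encoded)
```

The output is the same string Git Bash would print, ready to paste into the Authorization header later on.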
The next thing that we will need to get is the Pipeline Id that we are going to call. Navigate to Azure DevOps and click on Pipelines. This will give you the list of Pipelines as shown in the image below.
Click on the Pipeline that you want to execute from your Cloud flow or Canvas app, and let's examine the Url. You will notice that the end of the url contains a definition Id. That is the Pipeline Id that we will need to use in order to execute the Pipeline.
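To make the definition Id concrete, the snippet below pulls it out of a pipeline URL programmatically. The URL here is a made-up example; the organization and project names are placeholders.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical pipeline URL; "myorg" and "myproject" are placeholders.
url = "https://dev.azure.com/myorg/myproject/_build?definitionId=3"

# The definitionId query parameter at the end of the url is the Pipeline Id.
query = parse_qs(urlparse(url).query)
pipeline_id = int(query["definitionId"][0])
print(pipeline_id)  # → 3
```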
Next, we will be creating a table in Dataverse, so that we can store the executions. Although not required, I like having a log of who executed these processes and when, mostly for historical purposes.
The columns that I added are the following, but additional columns can be added.
Let's go ahead and look at the flow that I added to the Solution. I start the flow using a Power Apps trigger, and initialize three variables containing the Pipeline Name, the Pipeline Id and the Email of the User executing the flow. The image below shows these steps.
You will notice that each of the variables being initialized uses the "Ask in PowerApps" option, so that the value is initialized from my Canvas App. The next step is to call the REST API using an HTTP POST request. The url below is the Microsoft Docs url containing the information on the Pipelines REST API:
Within the document, it details the exact url to be used in the HTTP request, which is:
You will see the HTTP POST request below. The blurred section contains the organization and project from DevOps within the URI parameter, and in the Header, we paste the result that we got from the conversion of our Personal access token to Base64 in Git Bash.
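For reference, the same call can be reproduced outside of Power Automate. The sketch below builds the POST request that the Run Pipeline endpoint expects; the organization, project, pipeline id and token are all placeholders, and the api-version shown is the preview version documented for this endpoint.

```python
import base64

# Placeholders -- substitute your own organization, project, pipeline id and token.
organization = "myorg"
project = "myproject"
pipeline_id = 3
pat = "PERSONAL_ACCESS_TOKEN"

# The Run Pipeline endpoint, as documented in the Azure DevOps REST API reference.
url = (
    f"https://dev.azure.com/{organization}/{project}"
    f"/_apis/pipelines/{pipeline_id}/runs?api-version=6.0-preview.1"
)

# Basic auth with an empty username, i.e. the same ":token" Base64 value from Git Bash.
token = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
headers = {
    "Authorization": f"Basic {token}",
    "Content-Type": "application/json",
}

# Sending the request (requires the third-party `requests` package):
# import requests
# response = requests.post(url, headers=headers, json={})
# print(response.status_code)
```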
At this point, the execution will start. The final steps that I demonstrate below are really optional, but nice to have. If I want to write the results back to my table in my Dataverse environment, I can retrieve the Build Id and Build Url from the result of the HTTP request. I can do that by using a Parse JSON action, which will give me the properties that I need.
The Parse JSON action contains the Body from the HTTP request, and then a copy of the Schema. I can run the flow before adding the last two steps, take the JSON result from the run results of the Cloud flow HTTP action step, and paste it into the Generate from sample window below, which will generate the Schema.
You will find the exact schema needed for this step pasted below.
[Parse JSON Schema]
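What Parse JSON does here is easy to reproduce by hand. The snippet below works through a trimmed, hypothetical example of the body returned by the Run Pipeline call (the real response contains many more properties) and extracts the two values we care about.

```python
import json

# Trimmed, hypothetical example of the Run Pipeline response body.
response_body = """
{
  "id": 42,
  "state": "inProgress",
  "url": "https://dev.azure.com/myorg/myproject/_apis/pipelines/3/runs/42"
}
"""

run = json.loads(response_body)

# Equivalent to body('Parse_JSON')?['id'] and body('Parse_JSON')?['url'] in the flow.
build_id = run.get("id")
build_url = run.get("url")
print(build_id, build_url)
```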
Finally, the last step is to add the record to Dataverse. We have the Pipeline Id, Pipeline Name and User Email being passed from the Canvas App. The Build Id and Build Url are the results from the Parse JSON action, which are basically body('Parse_JSON')?['id'] and body('Parse_JSON')?['url'].
Once the flow is executed, we will see a new record in my Dataverse environment.
We can also see that the Azure DevOps pipeline is being initialized.
Now, in order to execute this, I created a Canvas App that will have all of the Pipelines that are part of my process. These can be temporary pipelines, automated pipelines and scheduled pipelines. The app is shown below.
When you click on the Run Pipeline button under each of the pipelines, it will call the Power Automate Cloud flow by using the following Canvas App command.
InitializeDevOpsPipeline.Run("PublishExportUnpackGit", 3, User().Email)
This can of course be enhanced further, but for an initial execution of the pipelines this is a great help. It is the first step in beginning to have an ALM process in place. As always, I hope this was helpful to some of our community members.
I also want to give a special thanks to Paul Breuler from Microsoft for helping me out with some of these challenges.