Reacting to Azure Activity Log events via Logic Apps

Currently, the Azure Activity Log can only be exported to Azure Event Hubs for processing (the other export targets are ill suited for event processing). Event Hubs is optimized for high-throughput event collection, but it is not a message queue service. The only message queue services Azure provides are Azure Service Bus and Storage queues. Relying on Event Hubs for log processing not only inflates costs (for example, you cannot use filters to limit the number of messages the backend processes), but also makes concurrent access by multiple readers difficult, and it lacks the UI tooling that real message processing services provide.

Azure Event Grid subscriptions, on the other hand, let you subscribe to only the events you need from the Azure Activity Log, filter them, and then rely on a Storage queue for downstream processing.

The step-by-step instructions below configure and tie together several native Azure resources into a single pipeline for Azure Activity Log processing. The flow is as follows: an Azure Event Grid subscription creates a Storage queue message whenever an event fires and passes the filter rules; an Azure Logic App then picks up the messages and processes them through Azure Automation to extract the necessary information.

Create a new storage account to host the storage queue.

Create a queue to hold the Activity Log messages forwarded by Event Grid.

Create a new Event Grid subscription.

The important fields are below; they identify the parameters for capturing only events relevant to successful writes and for sending those events to the storage queue.

Create a filter to further narrow down events, which will also lower costs. Go to the Filters tab and add an Advanced filter for the key data.operationName with the operator String is in and the value Microsoft.Authorization/roleAssignments/write.
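Expressed as a resource fragment, the same filter looks roughly like the sketch below. Property names follow the Event Grid subscription schema, and the included event type is an assumption matching the "successful writes" capture described above:

```json
{
  "filter": {
    "includedEventTypes": ["Microsoft.Resources.ResourceWriteSuccess"],
    "advancedFilters": [
      {
        "key": "data.operationName",
        "operatorType": "StringIn",
        "values": ["Microsoft.Authorization/roleAssignments/write"]
      }
    ]
  }
}
```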

Create the Event Grid subscription once all parameters are set. If you get an error stating that the provider is not registered, wait a couple of minutes and try again. The provider is registered automatically on the first attempt to create the resource, provided your account has sufficient permissions.

Verify that your subscription is working. Go to any resource group and assign any sort of permission to it. Within a minute you should see a message in your storage queue.

You can open the actual message below and see details identifying who initiated the action, their name, etc. The important part is the correlationId field, which makes it possible to look up the actual request and determine which user/group was assigned a role and which role was assigned.

```json
"groups": "32df13c6-6e25-48e2-9b5e-63dd25f308dd,6a06581b-64b4-4a86-a590-5bb0e1f7a63b,52d47dae-80ce-4e3c-88b5-e1d57e39506e,0c6dbf57-8196-460d-b63b-f55647179c03,4cf3860a-5d4b-4f6f-bec1-bc2bbeec5ec9,a1812dba-2d31-4a1c-9cc2-7091672defc2",
"": "",
"ipaddr": "",
"name": "Gregory Suvalian",
"": "6c19805a-8757-42ae-92de-02897cd7ccf9",
"puid": "10037FFE87760F28",
"": "user_impersonation",
"": "w8DgE14xt_N7LiCRKp_5Ib0OQL0G0s1lX6VUEzS95QU",
"": "c0de79f3-23e2-4f18-989e-d173e1d403d6",
"": "",
"uti": "3hGkDgjxAk6KQEOYkUMmAA",
"ver": "1.0",
"wids": "62e90394-69f5-4237-9190-012177145e10",
"correlationId": "552ee42c-d232-465a-a39a-9dd73e5d9d94",
"httpRequest": {
"clientRequestId": "920fb593-d0f1-453f-870e-d0ef99870004",
"clientIpAddress": ""
```

This value ("correlationId": "552ee42c-d232-465a-a39a-9dd73e5d9d94") will be used in the Automation account to pull additional details about the request.
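The extraction itself is a few lines of JSON handling. The sketch below parses a trimmed stand-in for the queue message shown above (the nesting under "data" is an assumption based on the Event Grid resource-event schema; the values come from the excerpt):

```python
import json

# Trimmed stand-in for the queue message shown above; the nesting under
# "data" is an assumption based on the Event Grid resource-event schema.
message_body = json.dumps({
    "eventType": "Microsoft.Resources.ResourceWriteSuccess",
    "data": {
        "claims": {"name": "Gregory Suvalian"},
        "correlationId": "552ee42c-d232-465a-a39a-9dd73e5d9d94",
    },
})

event = json.loads(message_body)
correlation_id = event["data"]["correlationId"]
initiator = event["data"]["claims"]["name"]
print(correlation_id)  # 552ee42c-d232-465a-a39a-9dd73e5d9d94
print(initiator)       # Gregory Suvalian
```

The same two lookups are what the Parse JSON steps in the Logic App later in this post make available as dynamic content.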

Use an existing Automation account or create a new one.

Make sure a Run As account is created as part of the Automation account creation.

Navigate to the Run As account properties of your Automation account and note the Application ID.

Navigate to Azure Active Directory and find this Application ID under App registrations.

Click on it and go to Settings / Required permissions / Add.

Choose Windows Azure Active Directory as the API. Under Enable Access, select Read directory data.

Choose Grant permissions as the last step.

Go to Automation Account/Modules and click Update Azure Modules

Go to Modules and choose Browse Gallery. Search for Az.Monitor and import that module; you may also have to import its dependent modules if it requires them.

Save and import this runbook as a graphical runbook.

Once imported, it should look like the below.

Go to the Test pane, enter the CorrelationId you copied in the previous steps (in my case it's 552ee42c-d232-465a-a39a-9dd73e5d9d94) as the input parameter, and run the test. Your output should be similar to the below.

Create a blank Logic App.

Add the Azure Storage queue connector and "When there are messages in a queue" as the trigger.

Choose your parameters as needed on the next screen.

Add Parse JSON as the next step and paste the body of an existing message in the queue to generate the schema.

Choose Azure Automation / Create job as the next step and enter the relevant information about your Automation account, along with the output from the previous steps, as below.

Add Azure Automation / Get job output to receive the result of the job's completion.

Run your pipeline to verify that you are receiving the correct information from existing messages in the queue.

Add another Parse JSON step to make those values available for output, and paste an existing result to generate the schema.

Create the final JSON that you want to send to an external tool via a Compose step.

The final step is to delete the message from the queue upon successful execution, via Storage queues / Delete message.

Publishing build artifacts from Azure VSTS (DevOps) to OneDrive

The steps below will allow you to publish the contents of your Azure VSTS (DevOps) repo to SharePoint Online, and by proxy to OneDrive as well.

There is no built-in task in either the build or release pipeline to push files to OneDrive, so the solution below relies on an Azure Logic App to perform that function.

The overall flow is:

  1. Azure DevOps completes build which packages code in ZIP file as build artifact
  2. Azure DevOps project calls Azure Logic App webhook 
  3. Azure Logic App retrieves results of build task and extracts to Sharepoint online documents folder

Steps in detail:

Create a build pipeline in Azure DevOps.

The YAML file as well as its UI representation is below. It packages the files in the scripts folder into a ZIP file published as an artifact called Powershell Scripts.

```yaml
pool:
  vmImage: 'Hosted VS2017'

steps:
- task: CopyFiles@2
  displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
  inputs:
    SourceFolder: scripts
    TargetFolder: '$(build.artifactstagingdirectory)'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: Powershell Scripts'
  inputs:
    ArtifactName: 'Powershell Scripts'
```


Create an Azure Logic App.

Define a trigger of the HTTP request type. Use the following request body schema. You can download the schema from here.
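For orientation, a heavily trimmed sketch of the build.complete payload that Azure DevOps posts to the webhook is shown below. The field names are assumptions based on the service-hook event format, and the definition id is what the Condition step further down compares against:

```json
{
  "eventType": "build.complete",
  "resource": {
    "id": 1234,
    "status": "succeeded",
    "definition": {
      "id": 5,
      "name": "MyBuild"
    }
  }
}
```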

Save the trigger, which will give you the HTTP POST URL that you will need later in the Azure DevOps project.

Since you might have more than one build in your Azure DevOps pipeline, you need conditional logic in your Logic App to publish only the results of a specific build definition ID.

The second step in the Logic App is a Condition based on the definition ID number of your build. In my case it's 5.


The next steps initialize two variables. The first one holds the build ID number as well as the authorization information for Azure DevOps.

To create the Authorization token you need to create a PAT token in Azure DevOps and encode :{token} into Base64. For example, if my PAT token is a123, then encode the value :a123, which gives OmExMjM=.
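The encoding step can be reproduced locally; this sketch derives the example value from the text:

```python
import base64

pat = "a123"  # example PAT token from the text
# Basic auth against Azure DevOps uses an empty username, hence the
# leading colon before the PAT.
auth_value = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
print(auth_value)  # OmExMjM=
```

The Logic App then sends this as the header `Authorization: Basic OmExMjM=`.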

Add Send an HTTP request to Azure DevOps as the next step and adjust the parameters for your values. The output of this step provides the URL to download the artifact.
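The request targets the build-artifacts REST endpoint. The URL it issues can be sketched as below; the organization, project, and build ID are placeholder values, and api-version 5.0 is one version this endpoint accepts:

```python
from urllib.parse import quote

def artifact_url(organization: str, project: str,
                 build_id: int, artifact_name: str) -> str:
    """Build the Azure DevOps build-artifacts API URL.

    organization/project here are placeholders; substitute your own.
    """
    return (
        f"https://dev.azure.com/{organization}/{project}/_apis/build/builds/"
        f"{build_id}/artifacts?artifactName={quote(artifact_name)}&api-version=5.0"
    )

# Example with placeholder values and the artifact name from the pipeline above.
url = artifact_url("myorg", "myproject", 42, "Powershell Scripts")
print(url)
```

Note that the artifact name must be URL-encoded, since "Powershell Scripts" contains a space.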

Add two steps to parse the JSON and extract the value of downloadURI. The schema can be downloaded from here.

The last three steps download the artifact from the URL and extract it to the SharePoint site.

Add a webhook to Azure DevOps

Go to the project settings and add a service hook.

Enter the webhook URL you obtained in the previous steps for the Logic App.