Currently, Azure Activity Log can only be exported to Azure Event Hubs for processing (the other export targets are ill suited for event processing). Event Hubs is optimized for high-throughput event collection, but it is not a message queue service; the only message queue services Azure provides are Azure Service Bus and Storage queues. Relying on Event Hubs for log processing not only inflates costs (for example, you cannot use filters to limit the number of messages the backend processes), but also makes concurrent access by multiple readers difficult, and Event Hubs lacks the UI tooling that real message queuing services provide.
Azure Event Grid subscriptions, on the other hand, let you subscribe to only the Activity Log events you need, filter them, and then rely on a Storage queue for downstream processing.
The step-by-step instructions below configure and tie together several native Azure resources into a single pipeline for Azure Activity Log processing. The flow is as follows: an Azure Event Grid subscription creates a Storage queue message whenever an event fires and passes the filter rules; an Azure Logic App then picks up the messages and processes them through Azure Automation to extract the necessary information.
Create a new storage account to host the storage queue.
Create a queue to hold the Activity Log messages forwarded by Event Grid.
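The two storage steps above can be sketched with the Azure CLI; the resource group, account, and queue names below are placeholders of my own, not names from the original setup, and the commands require a signed-in Azure session:

```shell
# Placeholder names -- substitute your own resource group, account, and queue.
az group create --name activitylog-rg --location eastus

# A general-purpose storage account is enough to host a queue.
az storage account create \
  --name activitylogstore123 \
  --resource-group activitylog-rg \
  --sku Standard_LRS

# The queue that Event Grid will deliver Activity Log events into.
az storage queue create \
  --name activitylog-events \
  --account-name activitylogstore123
```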
Create a new Event Grid subscription.
The important fields are below; they identify the parameters for capturing only events relevant to successful writes and for sending those events to the storage queue.
Create a filter to narrow down events further, which also lowers costs. Go to the Filters tab and add an Advanced filter for the key data.operationName with the operator String is in and the value Microsoft.Authorization/roleAssignments/write.
Create the Event Grid subscription once all parameters are set. If you get an error stating that the provider is not registered, wait a couple of minutes and try again; the provider is registered automatically on the first attempt to create the resource if your account has sufficient permissions.
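For reference, the same subscription can be sketched with the Azure CLI instead of the portal; the storage account and queue names are the placeholders from earlier, Microsoft.Resources.ResourceWriteSuccess restricts delivery to successful writes, and the advanced filter matches the role-assignment operation from the previous step:

```shell
# Resource IDs for the subscription scope and the destination queue.
sub_id=$(az account show --query id -o tsv)
storage_id=$(az storage account show \
  --name activitylogstore123 --resource-group activitylog-rg \
  --query id -o tsv)

# Subscribe at the Azure subscription scope; deliver matching events
# to the storage queue, keeping only successful write operations on
# role assignments.
az eventgrid event-subscription create \
  --name rbac-write-events \
  --source-resource-id "/subscriptions/$sub_id" \
  --endpoint-type storagequeue \
  --endpoint "$storage_id/queueservices/default/queues/activitylog-events" \
  --included-event-types Microsoft.Resources.ResourceWriteSuccess \
  --advanced-filter data.operationName StringIn Microsoft.Authorization/roleAssignments/write
```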
Verify that your subscription is working. Go to any resource group and assign any sort of permission to it. Within a minute you should see a message in your storage queue.
You can open the actual message below and see details identifying who initiated the action, the resource name, and so on. The important part is on line 15, which lets you retrieve information about the underlying request: which user/group was assigned a role, and which role was assigned.
This value ("correlationId": "552ee42c-d232-465a-a39a-9dd73e5d9d94") will be used in the Automation account to pull additional details about the request.
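Extracting the correlationId can also be scripted. The sketch below runs locally against a trimmed sample message body; the event shape is an assumption based on the Event Grid resource-event schema, and only the correlationId value comes from the message above:

```shell
# Trimmed sample of an Event Grid message body as it lands in the queue
# (hypothetical except for the correlationId value).
msg='{"eventType":"Microsoft.Resources.ResourceWriteSuccess","data":{"operationName":"Microsoft.Authorization/roleAssignments/write","correlationId":"552ee42c-d232-465a-a39a-9dd73e5d9d94"}}'

# Pull out the correlationId needed by the Automation runbook.
echo "$msg" | python3 -c 'import json,sys; print(json.load(sys.stdin)["data"]["correlationId"])'
# prints 552ee42c-d232-465a-a39a-9dd73e5d9d94
```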
Use an existing Automation account or create a new one.
Make sure a Run As account is created as part of the Automation account creation.
Navigate to the Run As account properties of your Automation account and note the Application ID.
Navigate to Azure Active Directory and find this Application ID under App registrations.
Click it and go to Settings/Required permissions/Add.
Choose Windows Azure Active Directory as the API. Under Enable Access, select Read directory data.
Choose Grant permissions in the last step.
Go to Automation Account/Modules and click Update Azure Modules.
Go to Modules, choose Browse Gallery, search for Az.Monitor, and import that module; you may have to import dependent modules if prompted.
Save and import this runbook as a graphical runbook (https://gist.github.com/artisticcheese/b5af629cdcddb6617953fc7fb2c885ee#file-get-azureactivitylogdetails-graphrunbook).
Once imported, it should look like the screenshot below.
Go to the Test pane, enter the CorrelationId you copied in the previous steps (in my case it's 552ee42c-d232-465a-a39a-9dd73e5d9d94) as the input parameter, and run a test. Your output should be similar to the one below.
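If you want to sanity-check the runbook's result outside Automation, the Azure CLI can pull the same Activity Log records by correlation ID; the --query projection below is my assumption about which fields are of interest:

```shell
# Fetch Activity Log entries tied to one request via its correlation ID.
az monitor activity-log list \
  --correlation-id 552ee42c-d232-465a-a39a-9dd73e5d9d94 \
  --query "[].{caller:caller, operation:operationName.value, status:status.value}" \
  -o table
```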
Create a blank Logic App.
Add Azure Storage Queues as the connector and "When there are messages in a queue" as the trigger.
Choose your parameters as needed on the next screen.
Add a Parse JSON action as the next step and paste the body of an existing queue message to generate the schema.
Choose Azure Automation/Create job as the next step and enter the relevant information about your Automation account, along with the output from the previous steps, as shown below.
Add Azure Automation/Get job output to receive the result of the job's completion.
Run your pipeline to verify that you are receiving the correct information from the existing messages in the queue.
Add another Parse JSON step to make those values available for output, and paste an existing result to generate the schema.
Create the final JSON that you want to send to an external tool via a Compose step.
The final step is to delete the message from the queue upon successful execution via Storage Queues/Delete message.
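Outside of the Logic App, the same dequeue-then-delete handshake can be sketched in the Azure CLI. Note that a message must be retrieved (not merely peeked) so that a pop receipt is available for deletion; the queue/account names are the placeholders used earlier, and the JSON field names are assumptions about the CLI output shape:

```shell
# Retrieve one message; 'get' (unlike 'peek') returns a pop receipt.
msg=$(az storage message get \
  --queue-name activitylog-events \
  --account-name activitylogstore123 -o json)

# Assumed output fields: a JSON array whose first element carries
# 'id' and 'popReceipt'.
msg_id=$(echo "$msg" | python3 -c 'import json,sys; print(json.load(sys.stdin)[0]["id"])')
pop=$(echo "$msg" | python3 -c 'import json,sys; print(json.load(sys.stdin)[0]["popReceipt"])')

# Delete only after the message has been processed successfully.
az storage message delete \
  --queue-name activitylog-events \
  --account-name activitylogstore123 \
  --id "$msg_id" --pop-receipt "$pop"
```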