Trigger Azure Data Factory Pipeline from Event Grid (Using Webhook Endpoint)
Umesh Kumar Dhakar
Posted on April 12, 2023
In this post, I will explain how we can use an Azure Event Grid topic to trigger a Data Factory pipeline. We will use an access key for authentication and a POST request to publish an event to the Event Grid topic. I have also attached a snapshot for each step to make it easier to follow.
Prerequisite:
Microsoft.EventGrid should be registered under the subscription.
To confirm this, go to your subscription and click Resource providers.
Search for Event Grid. If the Microsoft.EventGrid status is Registered, we are good to go; otherwise, select Microsoft.EventGrid and click the Register button. A code alternative is sketched below.
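Here is a minimal Python sketch for checking and registering the provider, assuming the azure-identity and azure-mgmt-resource packages; the subscription ID is a placeholder:

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"  # placeholder

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Check the current registration state of the Event Grid resource provider
provider = client.providers.get("Microsoft.EventGrid")
print(provider.registration_state)

# Register it if needed (registration can take a few minutes)
if provider.registration_state != "Registered":
    client.providers.register("Microsoft.EventGrid")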
We will create two Azure resources: an Event Grid topic and an Azure Data Factory.
Create Event Grid Topic Resource
Here, I created a topic named eg-tp-useast-dev-01.
While creating the resource, make sure that Enable access key and Shared Access Signature (SAS) is enabled. You can keep the remaining configuration as default.
Once the Event Grid topic resource is deployed, go to the resource. Under the Overview tab, you will see that there are no event subscriptions at this moment. Once we configure the ADF trigger, a subscription will appear here.
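For reference, the topic can also be created from code. Here is a minimal sketch assuming the azure-mgmt-eventgrid package; the subscription ID, resource group name, and region are placeholders, and access key authentication is enabled by default on a new topic:

from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import Topic

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-eventgrid-demo"         # assumed resource group name

client = EventGridManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create (or update) the custom topic used in this post
poller = client.topics.begin_create_or_update(
    RESOURCE_GROUP,
    "eg-tp-useast-dev-01",
    Topic(location="eastus"),
)
topic = poller.result()
print(topic.endpoint)  # this is the Topic Endpoint we will POST events to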
Create Azure Data Factory resource and launch the Studio
I created a Data Factory resource df-useast-dev-01.
Go to the Author tab and create a sample pipeline. For simplicity, I just added a Wait activity inside a sample pipeline PL_Sample.
Once you have created the pipeline, publish it.
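If you would rather define the same pipeline from code, here is a minimal sketch assuming the azure-mgmt-datafactory package; the subscription ID and resource group name are placeholders:

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import PipelineResource, WaitActivity

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-datafactory-demo"       # assumed resource group name

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# PL_Sample with a single Wait activity, matching the pipeline built in the Studio;
# when the factory is not Git-enabled, the management API writes directly to the live factory
pipeline = PipelineResource(
    activities=[WaitActivity(name="Wait1", wait_time_in_seconds=10)]
)
client.pipelines.create_or_update(RESOURCE_GROUP, "df-useast-dev-01", "PL_Sample", pipeline)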
Create a custom trigger
Create a custom trigger for the pipeline PL_Sample which we just created. Click on Add Trigger and create a new trigger with the details below:
- Give the trigger a name
- Select type as Custom events
- Select the Subscription
- You should see all available Event Grid topics; select the topic we just created (eg-tp-useast-dev-01)
- Type the subject: tp-subject-01
- Type the event type: event-type-01
- Check Start trigger on creation
- Click OK
- Publish the trigger
The subject and event type values are used to filter the incoming events.
NOTE: If the Event Grid topic is not visible in the Event Grid topic name drop-down, follow these 4 steps:
- Open the Event Grid topic resource in a new tab, go to the Overview tab, and click JSON View.
- Copy the Resource ID.
- Switch back to the ADF trigger page, set the Account selection method to Enter manually, and paste the Resource ID in the scope field.
- Click OK and publish the trigger.
Once the trigger is published successfully, go to the Event Grid topic; you should see 1 event subscription.
So far, we have connected an ADF pipeline to the Event Grid topic.
Publish an event to the Event Grid Topic
Now we will publish an event to the Event Grid topic by sending a POST request. I am using Postman to send the request; a Python alternative is sketched after the field descriptions below.
To send the request, we need the topic's endpoint and an access key.
- Go to the Overview tab and copy the Topic Endpoint value
- Go to Access Keys and copy a key
Create a POST request
URL: Use the Topic Endpoint as the request URL
Header: Add a header with key aeg-sas-key and the access key as its value.
Body: Add the below body in JSON format
[
    {
        "id": "I01",
        "eventType": "event-type-01",
        "subject": "tp-subject-01",
        "data": {
            "key1": "val1",
            "key2": "val2"
        },
        "eventTime": "2023-04-06T11:26:07+05:30",
        "dataVersion": "1.0"
    }
]
id: ID of the event (must be unique for each request)
eventType: same as the event type mentioned while creating the ADF trigger
subject: same as the subject mentioned while creating the ADF trigger
data: parameters that can be passed to the ADF trigger (explained at the end)
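As mentioned above, here is a minimal Python sketch for publishing the same event without Postman, assuming the requests package; the endpoint and key are placeholders you copy from the portal:

import json
import requests

# Placeholders: copy these from the topic's Overview and Access Keys pages
TOPIC_ENDPOINT = "https://<your-topic>.<region>-1.eventgrid.azure.net/api/events"
ACCESS_KEY = "<your-access-key>"

event = [
    {
        "id": "I01",                    # must be unique for each request
        "eventType": "event-type-01",   # must match the ADF trigger's event type
        "subject": "tp-subject-01",     # must match the ADF trigger's subject
        "data": {"key1": "val1", "key2": "val2"},
        "eventTime": "2023-04-06T11:26:07+05:30",
        "dataVersion": "1.0",
    }
]

response = requests.post(
    TOPIC_ENDPOINT,
    headers={"aeg-sas-key": ACCESS_KEY, "Content-Type": "application/json"},
    data=json.dumps(event),
)
print(response.status_code)  # 200 means Event Grid accepted the event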
Click Send. If the status is 200 OK, the ADF pipeline should be triggered.
Go to the ADF Monitor tab > Pipeline runs > Triggered.
You should see that the pipeline has been triggered by the custom event trigger.
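If you want to confirm the run from code as well, here is a minimal sketch assuming the azure-mgmt-datafactory package; it queries the factory's pipeline runs from the last hour (the subscription ID and resource group name are placeholders):

from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-datafactory-demo"       # assumed resource group name

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# List pipeline runs updated in the last hour
now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    RESOURCE_GROUP,
    "df-useast-dev-01",
    RunFilterParameters(last_updated_after=now - timedelta(hours=1), last_updated_before=now),
)
for run in runs.value:
    print(run.pipeline_name, run.status, run.run_id)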
As I mentioned earlier, we can pass parameters from the event to the ADF pipeline. For that, follow the steps below:
- Create a pipeline parameter (here I created par_input_file)
- Edit the trigger and pass @triggerBody().event.data.key1 in Trigger Run Parameters
When the pipeline gets triggered, it should initialize the pipeline parameter with the event.data.key1 value.
Note: Resources should be deleted after testing in order to avoid costs.
Thank you for reading this post. I hope it helped you understand and implement this event-driven pipeline execution. I could not find all of these steps in a single post, so I thought I would consolidate them into one and share them with other developers. Please let me know if you get stuck while implementing this; I would be happy to help. Happy learning.