

Integrating Pega with Azure Event Hub using native Kafka
Many organizations use Azure Event Hub to stream events to multiple applications within their enterprise architecture. To fit into such an architecture, events often need to be consumed from and published to Azure Event Hub by the Pega Business Process Management (BPM) platform. The easiest way to achieve this is to use the native Kafka endpoint provided by Azure Event Hub. This integration allows seamless communication between Pega and Azure Event Hub for streaming real-time events, enabling enhanced data exchange and event-driven processing.
Note: Before creating the connection between the systems, whitelist the Pega hostname and IP range in Azure Event Hub, and whitelist the Azure Event Hub hostname and IP range and open the required ports within Pega Cloud.
Step 1: Set Up Azure Event Hub
Create an Event Hub Namespace:
Go to the Azure portal and navigate to the Event Hubs service.
Click on + Add, then provide a name for your Event Hub Namespace, select a region, and click Review + Create.
Verify that the Kafka surface is enabled. By default, the Kafka surface is enabled when a namespace is created.
A helpful link with more instructions on how to create a namespace and event hub is below:
https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-create
(Screenshot: namespace overview with the Kafka surface enabled, shown highlighted)
Create an Event Hub:
In the newly created Event Hub Namespace, select + Event Hub and create an Event Hub within it.
Give the Event Hub a name (e.g., pega-events) and configure any required partitions based on your load.
An event hub is the Event Hub equivalent of a Kafka topic.
(Screenshot: event hub shown highlighted)
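If you prefer scripting over the portal, the namespace and event hub can also be created from the Azure CLI. The sketch below is illustrative only; the resource group (pega-rg), namespace (pega-ns), and region are placeholder values, not names from this article:
az eventhubs namespace create --resource-group pega-rg --name pega-ns --location eastus
az eventhubs eventhub create --resource-group pega-rg --namespace-name pega-ns --name pega-events --partition-count 4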
Get the Event Hub Connection String:
Go to Shared access policies and create a new policy with Send and Listen permissions.
Copy the Connection string - primary key, which will be used for Kafka configuration.
See the Microsoft documentation for instructions on how to get the connection string for the namespace or event hub.
(Screenshot: connection string example)
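The connection string can also be retrieved from the Azure CLI. A minimal sketch, assuming the placeholder names above and a hypothetical policy named pega-policy:
az eventhubs namespace authorization-rule keys list --resource-group pega-rg --namespace-name pega-ns --name pega-policy
The primaryConnectionString value in the output is what goes into the Kafka configuration below.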
Step 2: Kafka Configuration Prerequisites
Create a Kafka properties file to use for connecting to Azure Event Hub:
Create a properties file, for example azurekafka.properties, with the settings needed to connect to Azure Event Hub.
Because Azure Event Hub exposes a native Kafka endpoint, any standard Kafka client, including Pega's, can connect to it with the SASL_SSL settings below; no separate connector is required.
Note: Gather the host name, port, and connection string from the Azure Event Hub namespace and event hub for the connection.
Example azurekafka.properties:
Copy the code below into the properties file you created:
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="$ConnectionString" \
  password="<your-eventhub-connection-string>";
Replace <your-eventhub-connection-string> with the actual connection string from your Azure Event Hub.
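Before wiring this into Pega, you can sanity-check the properties file with a standalone Kafka client. The Java sketch below is a minimal, hypothetical example, assuming the Apache Kafka client library is on the classpath; pega-ns and pega-events are placeholder namespace and topic names:
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.io.FileInputStream;
import java.util.Properties;

public class EventHubSmokeTest {
    public static void main(String[] args) throws Exception {
        // Load the same SASL_SSL settings that Pega will use
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("azurekafka.properties")) {
            props.load(in);
        }
        // The Event Hub Kafka endpoint is <namespace>.servicebus.windows.net:9093
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "pega-ns.servicebus.windows.net:9093");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "pega-events" is the event hub (topic) created in Step 1
            producer.send(new ProducerRecord<>("pega-events", "{\"eventType\":\"connectivity-test\"}")).get();
            System.out.println("Event published to pega-events");
        }
    }
}
If this publishes successfully, the same properties file should work when uploaded to the Pega Kafka record in Step 3.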
Step 3: Pega Configuration for Kafka Integration
Configure Kafka in Pega:
In Pega, go to Records → SysAdmin → Kafka.
Create a new Kafka record for integration.
Provide the Kafka broker details (hostname and port) and other necessary configurations to enable communication.
Upload the properties file created for authorization and test connectivity.
Provide your Azure Event Hub host name (<namespace>.servicebus.windows.net) and port 9093. The Event Hub Kafka endpoint listens only on 9093 over SASL_SSL; the plaintext Kafka port 9092 is not supported.
Configure Pega to send or receive events using a Kafka topic:
Create a Data Set rule of type Kafka in Pega to send events to and read events from Kafka.
Configure the data set with the Kafka instance created above and the topic the system listens to and sends to in Kafka.
Set the event message format to JSON or Avro based on your preference, and either map fields automatically or use a data transform to map the message to the clipboard; a sample payload follows.
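As an illustration only, a hypothetical JSON event of the kind a Pega data set could map to clipboard properties (the field names are invented for this example):
{
  "eventType": "CaseCreated",
  "caseId": "C-1001",
  "timestamp": "2024-05-01T10:15:30Z",
  "customerId": "CUST-42"
}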
Test the Integration:
Test event publishing by triggering an event in Pega: write an activity that saves the event message into the data set.
Use the Dataset-Execute method with the Save operation to publish an event to the Azure Event Hub topic.
Use Dataset-Execute with the Browse operation to consume events published to the Kafka topic from Azure Event Hub.
Many proofs of concept for the above have been published by the Pega community.
Example: https://mypegapoc.com/2022/04/integration-of-pega-with-kafka/
Thanks to the author for publishing the step-by-step test of data set browse and save.
You should see the events being transmitted from Pega to the Kafka topic and appearing in Azure Event Hub.
From Azure Event Hub, send events by creating an event within the event hub from the Data Explorer. The link below gives step-by-step guidance for testing publishing and reading events.
https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-data-explorer
(Screenshots: sending events and viewing all events from the Data Explorer in Azure Event Hub)
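After sending test events from the Data Explorer, you can also verify they are readable over the Kafka endpoint outside of Pega. A minimal Java consumer sketch, under the same assumptions and placeholder names as the producer example above:
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.io.FileInputStream;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class EventHubConsumeTest {
    public static void main(String[] args) throws Exception {
        // Reuse the SASL_SSL settings from Step 2
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("azurekafka.properties")) {
            props.load(in);
        }
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "pega-ns.servicebus.windows.net:9093");
        // The Kafka group id maps onto an Event Hub consumer group
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "pega-test-group");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("pega-events"));
            // Poll a few times to allow the consumer group rebalance to complete
            for (int i = 0; i < 5; i++) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("Received: " + record.value());
                }
            }
        }
    }
}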
Step 4: Consuming Events in Real Time
Create a data flow:
Create a data flow with a data set browse step as the source, followed by the subsequent steps needed to process the events.
Create a real-time data flow run and leave the data flow work object up and running.
Whenever an event is published to the topic from Azure Event Hub, the real-time data flow picks up the event and runs the steps defined within the data flow.
Monitor Kafka:
Check the Kafka client logs to ensure data is being sent to Event Hub correctly.
Ensure there are no issues with the configuration or network connectivity.
Monitor Pega Event Stream:
Verify Pega is successfully publishing events by checking the Pega logs and validating that events are appearing in Kafka.
Step 5: Optional - Scaling and Optimization
Scale Kafka:
If you expect a high volume of events, consider scaling Kafka by adding more partitions.
Azure Event Hub Optimization:
Optimize Event Hub throughput by adjusting the partition count and configuring Auto-Inflate based on your volume.
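As a sketch with the same placeholder names as above, Auto-Inflate can be enabled from the Azure CLI:
az eventhubs namespace update --resource-group pega-rg --name pega-ns --enable-auto-inflate true --maximum-throughput-units 10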
Pega Event Optimization:
Adjust Pega’s event publishing rate and ensure there is minimal impact on performance.
Add queue processing within the data flow to process the events for better throughput if event volumes are very high.
#pegazureeventhubintegration #pegaintegration #pegakafkaintegration #kafkarealtime #pegarealtimeeventsprocessing #pegadataflow #pegadataset
