Ready for School


Integrating Pega with Azure Event Hubs Using Native Kafka

Dec 16, 2024

4 min read


Many organizations use Azure Event Hubs to stream events to multiple applications within their enterprise architecture. To fit into such an architecture, events must be consumed from, or published to, Azure Event Hubs by the Pega Business Process Management (BPM) platform. The easiest way to achieve this is to use the native Kafka endpoint that Azure Event Hubs provides. The integration allows seamless communication between Pega and Azure Event Hubs for streaming real-time events, enabling enhanced data exchange and event-driven processing.


Note: Before creating the connection between the systems, whitelist the Pega hostname and IP range in Azure Event Hubs, and whitelist the Azure Event Hubs hostname and IP range and open the required ports within Pega Cloud.


Step 1: Set Up Azure Event Hub

  1. Create an Event Hub Namespace:

    • Go to the Azure portal and navigate to the Event Hubs service.

    • Click on + Add, then provide a name for your Event Hub Namespace, select a region, and click Review + Create.

    • Verify that the Kafka surface is enabled. By default, the Kafka surface is enabled when a namespace is created (Standard tier or higher).

    • A helpful link with more instructions on creating a namespace and an event hub:

      https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-create

      Namespace with the Kafka surface enabled, shown highlighted

  2. Create an Event Hub:

    • In the newly created Event Hub Namespace, select + Event Hub and create an Event Hub within it.

    • Give the Event Hub a name (e.g., pega-events) and configure any required partitions based on your load.

    • An event hub is the Event Hubs equivalent of a Kafka topic.

      Event hub shown highlighted
  3. Get the Event Hub Connection String:

    • In the namespace, go to Settings → Shared access policies and select a policy (for example, RootManageSharedAccessKey).

    • Copy the Connection string-primary key; it is used as the Kafka password in the next step.
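The connection string also encodes the Kafka bootstrap server: the Endpoint host plus port 9093. A small Python sketch of deriving it (the connection-string value below is a made-up example, not a real namespace):

```python
def bootstrap_server(connection_string: str, port: int = 9093) -> str:
    """Derive the Kafka bootstrap server from an Event Hubs connection string.

    The Endpoint segment looks like sb://<namespace>.servicebus.windows.net/;
    the Kafka endpoint is that host on port 9093.
    """
    for part in connection_string.split(";"):
        if part.startswith("Endpoint="):
            host = part[len("Endpoint="):].removeprefix("sb://").rstrip("/")
            return f"{host}:{port}"
    raise ValueError("no Endpoint segment in connection string")

server = bootstrap_server(
    "Endpoint=sb://mynamespace.servicebus.windows.net/;"
    "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=abc123="
)
print(server)  # → mynamespace.servicebus.windows.net:9093
```

This is the host:port pair that goes into the Pega Kafka instance configuration in Step 3.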

Step 2: Kafka configuration prerequisite

  1. Create a Kafka properties file to use for connecting to Azure Event Hubs:


  2. Example azurekafka.properties:


    Copy the code below into the properties file you created:

    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
        username="$ConnectionString" \
        password="<your-eventhub-connection-string>";


    Replace <your-eventhub-connection-string> with the actual connection string from your Azure Event Hubs namespace. The username is the literal string $ConnectionString, not a placeholder.
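For repeatable setups, the properties content can be generated from the connection string. A minimal Python sketch (build_kafka_properties and the sample connection string are illustrative, not part of Pega or Azure tooling):

```python
def build_kafka_properties(connection_string: str) -> str:
    """Build azurekafka.properties content for the Event Hubs Kafka endpoint.

    Event Hubs requires the literal username "$ConnectionString"; the
    namespace connection string itself is the SASL password.
    """
    jaas = (
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="$ConnectionString" '
        f'password="{connection_string}";'
    )
    return "\n".join([
        "security.protocol=SASL_SSL",
        "sasl.mechanism=PLAIN",
        f"sasl.jaas.config={jaas}",
    ])

props = build_kafka_properties(
    "Endpoint=sb://mynamespace.servicebus.windows.net/;"
    "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=abc123="
)
print(props)
```

Writing the returned string to azurekafka.properties yields the same three settings shown above.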


Step 3: Pega Configuration for Kafka Integration

  1. Configure Kafka in Pega:

    • In Pega, go to Records → SysAdmin → Kafka.

    • Create a new Kafka record for integration.

    • Provide the Kafka broker details (hostname and port) and other necessary configurations to enable communication.

    • Upload the properties file created for authorization and test connectivity.


    • Provide your Azure Event Hubs host name and the Kafka endpoint port (9093).
  2. Configure Pega to send or receive events using the Kafka topic:

    • Create a Data Set rule of type Kafka in Pega to send events to and read events from Kafka.

    • Configure the data set with the Kafka instance created above and the topic (event hub) the system listens to or publishes to.

    • Set the event message format to either JSON or Avro based on your preference, and map fields automatically or use a data transform to map the message to the clipboard.
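If JSON is chosen, each record on the topic is a JSON document that Pega maps to and from the clipboard. A minimal sketch of that round trip (the field names below are hypothetical illustrations, not a Pega schema):

```python
import json

# Hypothetical case event; field names are illustrative only.
event = {
    "eventType": "CaseCreated",
    "caseId": "C-1001",
    "timestamp": "2024-12-16T10:00:00Z",
    "payload": {"customer": "ACME", "amount": 250.0},
}

message = json.dumps(event)    # what the producer would publish to the topic
decoded = json.loads(message)  # what a consumer maps back to the clipboard
print(decoded["caseId"])       # → C-1001
```

The automatic field mapping handles flat and nested properties; a data transform gives finer control over where each field lands on the clipboard.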


  3. Test the Integration:

    • Test event publishing by triggering an event in Pega: write an activity that saves the event message into the data set.

    • Use the DataSet-Execute method with the Save operation to publish an event to the Azure Event Hubs topic.

    • Use DataSet-Execute with the Browse operation to consume events published from Azure Event Hubs onto the Kafka topic.

    • Many proof-of-concept write-ups covering these steps have been published by the Pega community.

      Example: https://mypegapoc.com/2022/04/integration-of-pega-with-kafka/

      Thanks for publishing the step-by-step test on data set browse and save, my friend.

    • You should see events being transmitted from Pega to the Kafka topic in Azure Event Hubs, and consumed back from it.

    • From Azure Event Hubs, you can send events by creating an event within the event hub from the Data Explorer. The link below walks through publishing and reading events step by step.

      https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-data-explorer

      Send events or view events from the event hub within the Data Explorer
      View all the events from the Data Explorer in Azure Event Hubs

Step 4: Consuming the events in real time

  1. Create a data flow:

    • Create a data flow with the Kafka data set Browse step as the source, followed by the subsequent steps needed to process the events.

    • Create a real-time data flow run and leave the data flow run up and running.

    • Whenever an event is published to the topic from Azure Event Hubs, the real-time data flow picks it up and runs the steps defined within the data flow.
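Conceptually, the real-time run is a continuously polling consumer that applies the data flow's steps to each event. A minimal Python sketch of that loop, with the event source and the per-event step injected so the wiring is clear (all names here are illustrative; in Pega the platform supplies both):

```python
from typing import Callable, Iterable, Optional

def run_dataflow(events: Iterable[dict], step: Callable[[dict], None],
                 max_events: Optional[int] = None) -> int:
    """Apply the data-flow step to each incoming event; return the count processed."""
    processed = 0
    for event in events:
        step(event)  # e.g. data transform, case creation, downstream save
        processed += 1
        if max_events is not None and processed >= max_events:
            break    # a real run keeps polling indefinitely
    return processed

# Stub "topic" standing in for the Kafka data set Browse step.
topic = [{"caseId": "C-1"}, {"caseId": "C-2"}]
seen: list = []
count = run_dataflow(topic, seen.append)
print(count)  # → 2
```

Injecting the source also makes the processing logic easy to test without a live broker, which is exactly what the stub list does here.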

  2. Monitor Kafka:

    • Check the Kafka client logs to ensure data is being sent to Event Hubs correctly.

    • Ensure there are no issues with the configuration or network connectivity.

  3. Monitor Pega Event Stream:

    • Verify Pega is successfully publishing events by checking the Pega logs and validating that events are appearing in Kafka.


Step 5: Optional - Scaling and Optimization

  1. Scale Kafka:

    • If you expect a high volume of events, consider scaling Kafka by adding more partitions.

  2. Azure Event Hub Optimization:

    • Optimize Event Hub throughput by adjusting the partition count and configuring Auto-Inflate based on your volume.

  3. Pega Event Optimization:

    • Adjust Pega’s event publishing rate and ensure there is minimal impact on performance.

    • Add queue processing within the data flow to process the events, for better throughput when event volumes are high.
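Queueing inside the data flow decouples fast consumption from slower processing. A rough Python sketch of that pattern, with a bounded in-memory queue standing in for Pega's queue processor (all names illustrative):

```python
from queue import Queue
from threading import Thread

def consume(events, q: Queue) -> None:
    """Fast path: enqueue events as they arrive instead of processing inline."""
    for event in events:
        q.put(event)
    q.put(None)  # sentinel: no more events

def process(q: Queue, results: list) -> None:
    """Slow path: drain the queue on a worker, as a queue processor would."""
    while (event := q.get()) is not None:
        results.append(event["caseId"].upper())

q: Queue = Queue(maxsize=100)   # bounded, so consumption backpressures
results: list = []
worker = Thread(target=process, args=(q, results))
worker.start()
consume([{"caseId": "c-1"}, {"caseId": "c-2"}], q)
worker.join()
print(results)  # → ['C-1', 'C-2']
```

The bounded queue gives natural backpressure: when processing falls behind, consumption slows rather than exhausting memory.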


#pegazureeventhubintegration #pegaintegration #pegakafkaintegration #kafkarealtime #pegarealtimeeventsprocessing #pegadataflow #pegadataset
