Understanding Kafka Service Externalization in the Pega Application
In this post, let's understand Kafka service externalization in the Pega application. Kafka is an open-source streaming platform that collects, processes, and stores streaming data. Pega products ship with Kafka as an embedded service, which supports Queue Processor and Job Scheduler execution in the platform. From Pega 8.6 onwards, the product recommendation is to externalize this service to improve the application's performance, scalability, and maintainability.
Externalizing the Kafka service supports a microservices architecture and cloud-native readiness. Clients can manage their own Kafka clusters or use managed publish/subscribe and streaming services from AWS, Google, and Azure, such as Azure Event Hubs or Amazon Kinesis. Beyond Pega, other client applications, such as Salesforce or workloads in AWS VPCs, can also make effective use of the centralized Kafka cluster.
Here I would like to explain, in simple terms, how Kafka topics store event data when the client manages the Kafka cluster. Most companies run the service in Docker containers, often orchestrated with Kubernetes.
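Before setting anything up, it helps to picture the storage model: a topic is split into partitions, each partition is an append-only log, and keyed records are routed to a partition by hashing the key. The following is a minimal in-memory sketch of that idea, not Pega's or Kafka's actual implementation (real Kafka uses the murmur2 hash and persists logs to disk; the class and topic names here are illustrative assumptions):

```python
from hashlib import md5

class Topic:
    """Illustrative in-memory model of a Kafka topic: one append-only
    log per partition, with records routed to partitions by key hash.
    (Real Kafka uses murmur2 hashing and durable on-disk segments.)"""

    def __init__(self, name, num_partitions=3):
        self.name = name
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # All events sharing a key hash to the same partition, so they
        # stay in order relative to each other.
        p = int(md5(key.encode()).hexdigest(), 16) % len(self.partitions)
        self.partitions[p].append((key, value))
        offset = len(self.partitions[p]) - 1  # offsets are per-partition
        return p, offset

# Hypothetical topic name for a Pega queue processor's events.
topic = Topic("pega-queue-processor-events")
p1, o1 = topic.produce("case-101", {"event": "created"})
p2, o2 = topic.produce("case-101", {"event": "resolved"})
# Same key -> same partition; the offset grows within that partition.
```

This is why consumers that need ordered processing of a case's events (as Pega's queue processors do) rely on a stable partition key.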
Step 1: Install Docker Desktop
Follow the installation instructions for macOS or Windows.
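Once Docker Desktop is running, you can bring up a single-node Kafka broker locally to experiment with topics. This is a sketch using the official `apache/kafka` image in KRaft mode; the container name, topic name, and port mapping are assumptions you should adapt to your environment:

```shell
# Pull and start a single-node Kafka broker (KRaft mode, no ZooKeeper).
docker pull apache/kafka:latest
docker run -d --name kafka -p 9092:9092 apache/kafka:latest

# Create a topic (name is illustrative) and list topics to verify
# the broker is reachable.
docker exec kafka /opt/kafka/bin/kafka-topics.sh \
  --bootstrap-server localhost:9092 --create --topic pega-events
docker exec kafka /opt/kafka/bin/kafka-topics.sh \
  --bootstrap-server localhost:9092 --list
```

For a production externalized cluster you would instead point Pega's stream service configuration at your managed brokers, but a local container like this is enough to observe how topics and partitions behave.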