Simple Kafka Connector

Content Type: Module
Categories: Connectors


Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. This module provides a simpler way to consume and produce Kafka topics, and includes examples that show the basic flow. The currently supported operations are onConsume and onProduce. Scheduled events keep the consumer listening for topic changes and new messages on the broker.



This module offers a simpler way to work with Apache Kafka messages in your app. You only need to set up the URL and port constants to make it work.
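The URL and port constants combine into the standard Kafka `bootstrap.servers` client setting. A minimal stdlib-only sketch of that wiring, assuming hypothetical constant values named KAFKA_URL and KAFKA_PORT (the actual constant names in the module's constants folder may differ):

```java
import java.util.Properties;

public class KafkaConfigSketch {
    // Hypothetical values that would come from the module's constants folder
    static final String KAFKA_URL = "localhost"; // your broker host
    static final String KAFKA_PORT = "9092";     // Kafka's default port

    // Builds a Kafka client configuration from the URL and port constants
    static Properties buildConfig() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", KAFKA_URL + ":" + KAFKA_PORT);
        props.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        // prints localhost:9092
        System.out.println(buildConfig().getProperty("bootstrap.servers"));
    }
}
```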

Typical usage scenario

The user needs to connect to an Apache Kafka instance in order to use a message broker.


  • onConsume: retrieves data from the topic
  • onProduce: inserts data on the topic


  • There are no functional limitations.
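Conceptually, onProduce appends a message to the topic and onConsume retrieves the pending messages. A stdlib-only sketch of that flow, using an in-memory queue as a stand-in for the topic (the module's real operations go through the Kafka broker instead):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

public class TopicSketch {
    // In-memory stand-in for a Kafka topic
    private final ConcurrentLinkedQueue<String> topic = new ConcurrentLinkedQueue<>();

    // onProduce: insert data on the topic
    public void onProduce(String message) {
        topic.add(message);
    }

    // onConsume: retrieve all pending data from the topic, in order
    public List<String> onConsume() {
        List<String> messages = new ArrayList<>();
        String m;
        while ((m = topic.poll()) != null) {
            messages.add(m);
        }
        return messages;
    }
}
```

With a real broker, each consumed record would typically be handed to a microflow for processing rather than collected into a list.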


When exporting the module, be sure to check the following libraries: jaxb-api-2.3.1.jar, kafka-clients-2.8.0.jar and slf4j-api-1.7.30.jar.


Before the first run, check the constants folder and set the values for your Kafka environment.

When creating your consumer, make sure the parameter names match those in your Java action, or your app will not work correctly.

There are two scheduled events, StartConsumer and StartProducer; only StartConsumer is enabled by default. Use StartProducer to verify that everything is working on the producer side.
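The StartConsumer event works by repeatedly polling the broker so new messages are picked up as they arrive. A stdlib-only sketch of that pattern, with a counter standing in for the consumer's poll logic (interval and duration values are illustrative, not the module's actual settings):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class StartConsumerSketch {
    // Polls at a fixed interval for the given duration and returns how many
    // polls ran. A real StartConsumer body would invoke the consumer's poll
    // logic instead of incrementing a counter.
    static int runFor(long millis) {
        AtomicInteger polls = new AtomicInteger();
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(
                polls::incrementAndGet, 0, 50, TimeUnit.MILLISECONDS);
        try {
            Thread.sleep(millis);
            scheduler.shutdown();
            scheduler.awaitTermination(1, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return polls.get();
    }
}
```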

To use the JA_StartConsumer, you will need to provide the microflow parameter, passed in the 'Module.Microflow' format.
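That value is a module-qualified name: the module name, a dot, then the microflow name. A small sketch of how such a string splits into its two parts (the names used are hypothetical, and the actual Java action's handling may differ):

```java
public class MicroflowNameSketch {
    // Splits a qualified name like "Module.Microflow" into
    // { module name, microflow name }
    static String[] split(String qualifiedName) {
        int dot = qualifiedName.indexOf('.');
        if (dot < 0) {
            throw new IllegalArgumentException(
                    "Expected the 'Module.Microflow' format: " + qualifiedName);
        }
        return new String[] {
                qualifiedName.substring(0, dot),     // module name
                qualifiedName.substring(dot + 1) };  // microflow name
    }
}
```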

Both the producer and the consumer can also be triggered by a REST call or a button on a page; use whichever fits your needs.


Version: 2.0.0
Framework Version: 9.6.0
Release Notes: Added new notes to readme.
Version: 1.0.0
Framework Version: 9.6.0
Release Notes: Initial version.