Kafka TCP Integration Guide
Review the following guide for setting up an integration between Litmus Edge and a Kafka broker.
You will need access to a Kafka TCP broker.
You will need to set up the Kafka server so that it is ready to accept connections from Litmus Edge.
If you are using a docker-compose YAML file, do the following:
- Connect and log in to the machine hosting the Kafka TCP broker.
- From the command prompt, enter cat docker-compose.yml to review the services defined in the file.
- From the command prompt, enter docker-compose up. The TCP_Zookeeper and TCP_Kafka containers start.
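The contents of docker-compose.yml depend on your environment. The following is a minimal sketch of such a file, assuming the Confluent Zookeeper and Kafka images; the container names TCP_Zookeeper and TCP_Kafka come from the step above, while the images, tags, ports, and advertised listener address are assumptions you should adjust to your setup.

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0      # image and tag are assumptions
    container_name: TCP_Zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
  kafka:
    image: confluentinc/cp-kafka:7.4.0          # image and tag are assumptions
    container_name: TCP_Kafka
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Advertise an address that Litmus Edge can reach from its network;
      # 192.168.56.1 matches the example broker address used later in this guide.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://192.168.56.1:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```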
Follow the steps to Add a Connector and select the Kafka TCP provider.
Configure the following parameters.
- Name: Enter a name for the connector.
- Brokers list separated by comma: Enter the Kafka broker address, or a comma-separated list of broker addresses. For example: 192.168.56.1.
- SASL mechanism: Select an option: NONE, SASL_PLAINTEXT, SCRAM-SHA-256, SCRAM-SHA-512.
- Username (Optional): If needed, enter the username to access the broker.
- Password (Optional): If needed, enter the password associated with the username. (A sketch for checking these credentials against the broker follows this list.)
- Topic: Enter the name of the default topic used for publishing data.
- Throttling Limit: The maximum number of messages per second to be processed. The default value is zero, which means that there is no limit.
- Persistent storage: When enabled, messages undergo a store-and-forward procedure: if the connection to the broker is lost, messages are stored within Litmus Edge and forwarded once the connection is restored.
- Queue Mode: Select lifo (last in, first out) or fifo (first in, first out). With lifo, the most recent data entry is processed first; with fifo, the oldest data entry is processed first.
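Before entering the broker address and credentials in the connector, it can help to confirm them with the standard Kafka console tools. The following is a sketch only: the broker address 192.168.56.1:9092, the SCRAM-SHA-256 mechanism, the edge-user/edge-password credentials, the edge-test topic, and the client.properties path are assumptions for illustration; substitute your own values and the mechanism you selected.

```bash
# Standard Kafka client settings equivalent to the connector's SASL parameters
# (SCRAM-SHA-256 shown; adjust security.protocol and sasl.mechanism as needed).
cat > client.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="edge-user" password="edge-password";
EOF

# Publish a test message with the same credentials the connector will use.
echo "connection test" | kafka-console-producer.sh \
  --bootstrap-server 192.168.56.1:9092 \
  --topic edge-test \
  --producer.config client.properties
```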
After adding the connector, click the toggle in the connector tile to enable it.
If you see a Failed status, review the Connector Logs for relevant error messages.
You will now need to import the tags you added in Step 3 to the connector as topics.
Topics must already exist on the Kafka server. Publishing to non-existent Kafka topics results in errors in the connector logs, a failed connection, and/or messages accumulating in persistent storage.
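Topics can be created ahead of time with the standard Kafka CLI so the connector never publishes to a missing topic. A minimal sketch, assuming a broker at 192.168.56.1:9092 and a hypothetical topic name edge-tag-topic; if Kafka runs in Docker as above, run these commands inside the Kafka container (for example via docker exec), and add --command-config client.properties if the broker requires SASL.

```bash
# Create a topic before importing the matching tag into the connector
# (partition and replication counts are assumptions for a single-broker setup).
kafka-topics.sh --bootstrap-server 192.168.56.1:9092 \
  --create --topic edge-tag-topic \
  --partitions 1 --replication-factor 1

# Confirm the topic exists.
kafka-topics.sh --bootstrap-server 192.168.56.1:9092 --list
```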
After adding all required topics, navigate to the Integration overview page and ensure the connector is enabled and still shows a CONNECTED status.
Access the Kafka broker and verify that Litmus Edge is successfully sending data.
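One way to verify this is to read the topic back with the Kafka console consumer. A sketch under the same assumptions as above (example broker address, hypothetical topic name); add --consumer.config client.properties if the broker requires SASL.

```bash
# Messages published by Litmus Edge should appear here as they arrive.
kafka-console-consumer.sh \
  --bootstrap-server 192.168.56.1:9092 \
  --topic edge-tag-topic \
  --from-beginning
```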