# Kafka TCP Integration Guide
Review the following guide to set up an integration between Litmus Edge and a Kafka broker.

## Before You Begin

You will need access to a Kafka TCP broker. Refer to the following links to learn more:

- Apache Kafka download
- Confluent Platform docs

## Step 1: Set Up the Kafka TCP Server

Set up the Kafka server so that it is ready to connect with Litmus Edge. If you are using a docker-compose YAML file, do the following (a minimal example compose file is included at the end of this guide):

1. Connect and log in to the Kafka TCP broker.
2. At the command prompt, enter `cat docker-compose.yml`. The services list appears.
3. At the prompt, enter `docker-compose up`. The tcp-zookeeper and tcp-kafka services start up.

## Step 2: Add Device

Follow the steps in Connect a Device (docid: 3eyafppweuvmblcey17sq). The device will be used to store tags that will eventually be used to create outbound topics in the connector. Make sure to select the **Enable Data Store** checkbox.

## Step 3: Add Tags

After connecting the device in Litmus Edge, you can Add Tags (docid: 8se7z3pmrfwl1nmzcwalx) to the device. Create the tags that you want to use for the connector's outbound topics.

## Step 4: Add the Kafka TCP Connector

Follow the steps in Add a Connector (docid: ogw7fkqbwidbabn4wl5rr) and select the **Kafka TCP** provider. Configure the following parameters:

- **Name**: Enter a name for the connector.
- **Brokers List (separated by comma)**: Enter the Kafka broker URL. For example, `192.168.56.1`.
- **SASL Mechanism**: Select an option: None, SASL Plaintext, SCRAM-SHA-256, or SCRAM-SHA-512.
- **Username** (optional): If needed, enter the username used to access the broker.
- **Password** (optional): If needed, enter the password associated with the username.
- **Topic**: Enter the name of the default topic used for publishing data.
- **Throttling Limit**: The maximum number of messages per second to be processed. The default value is zero, which means there is no limit.
- **Persistent Storage**: When enabled, messages undergo a store-and-forward procedure: messages are stored within Litmus Edge while the cloud provider is offline and forwarded once it is back online.
- **Queue Mode**: Select LIFO (last in, first out) or FIFO (first in, first out). Selecting LIFO means the last data entry is processed first; selecting FIFO means the first data entry is processed first.

## Step 5: Enable the Connector

After adding the connector, click the toggle in the connector tile to enable it. If you see a Failed status, review Manage Connectors (docid: 3u7jzldinehy8shvifd d) and the relevant error messages.

## Step 6: Create Topics for the Connector

You will now import the tags you added in Step 3 into the connector as topics. Topics must already exist on the Kafka server; using non-existent Kafka topics will result in errors in the logs, a failed connection, and/or persisted storage. (A command-line sketch for creating topics follows this step.)

To create outbound topics:

1. Click the connector tile. The connector dashboard appears.
2. Click the **Topics** tab.
3. Click the **Add Tags** icon. The **Add Subscription** dropdown appears.
4. Click **+Add a new subscription** in the dropdown menu. The **Create Subscription** dialog appears.
5. Define the following:
   - **Data Direction**: Local to Remote (outbound).
   - **Local Topic**: Any topic in Litmus Edge.
   - **Remote Topic**: The name of the topic on the Kafka broker to publish to.
   - **Description** (optional): A description for the subscription.
6. Click **OK** to add the subscription.

After adding all required topics, navigate to the integration overview page and ensure the connector is not disabled and still shows a Connected status.
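Because the connector will fail against non-existent topics, you can create them on the broker before enabling anything in Litmus Edge. The following is a minimal sketch, assuming the example compose file at the end of this guide; the topic name `edge-data`, the service name `tcp-kafka`, and the internal listener port `29092` are illustrative assumptions:

```sh
# Create the topic the connector will publish to (name is illustrative)
docker-compose exec tcp-kafka kafka-topics \
  --bootstrap-server tcp-kafka:29092 \
  --create --topic edge-data --partitions 1 --replication-factor 1

# List topics to confirm it exists before enabling it in Litmus Edge
docker-compose exec tcp-kafka kafka-topics \
  --bootstrap-server tcp-kafka:29092 --list
```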
## Step 7: Enable Topics

Ensure the topics you imported are enabled by returning to the **Topics** tab and clicking the **Enable All Topics** icon.

## Step 8: Verify Connection

Access the Kafka broker and verify that Litmus Edge is successfully sending data.
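One way to verify from the broker side is to run a console consumer against the connector's topic and watch for incoming tag data. A minimal sketch, again assuming the example compose file below and the hypothetical `edge-data` topic:

```sh
# Tail the topic; new messages should appear as Litmus Edge publishes tag data
docker-compose exec tcp-kafka kafka-console-consumer \
  --bootstrap-server tcp-kafka:29092 \
  --topic edge-data --from-beginning
```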
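## Example docker-compose.yml

For reference, the docker-compose.yml mentioned in Step 1 might look like the following minimal sketch. It uses the standard dual-listener pattern for Confluent images; the image versions, ports, and the advertised `192.168.56.1` address (taken from the Step 4 example) are assumptions to adjust for your environment:

```yaml
version: "3"
services:
  tcp-zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0   # image and version are assumptions
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  tcp-kafka:
    image: confluentinc/cp-kafka:7.4.0       # image and version are assumptions
    depends_on:
      - tcp-zookeeper
    ports:
      - "9092:9092"                          # TCP port Litmus Edge connects to
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: tcp-zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      KAFKA_LISTENERS: INTERNAL://0.0.0.0:29092,EXTERNAL://0.0.0.0:9092
      # Advertise an address that Litmus Edge can reach (matches the Step 4 example)
      KAFKA_ADVERTISED_LISTENERS: INTERNAL://tcp-kafka:29092,EXTERNAL://192.168.56.1:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```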