
Litmus Edge to Confluent using Kafka over SSL

How to prepare Confluent for a Litmus Kafka over SSL integration

Overview

This section provides instructions on how to prepare a Confluent cluster to accept a connection from the Litmus Edge Kafka over SSL integration.

Requirements

- A Confluent cluster has been deployed.

Step-by-step guide to prepare the Confluent cluster

This guide covers the minimum configuration required to connect to the Confluent cluster using the Litmus Edge Kafka over SSL integration. It does not cover how to manage a Confluent cluster or any actions unrelated to connecting to the cluster via the Litmus integration.

Step 1: Open the cluster you want to use. In this example, cluster_0 will be used.

Step 2: To add a topic to which the Litmus integration will send data, or from which it will read data, select the Topics option from the left panel.

Step 3: Add a topic using the Create topic button.

Step 4: Configure the new topic. (A scripted alternative using the Confluent Python client is sketched after this guide.)
Step 4.1: For a quick start, the new topic setup can be finished with the Create with defaults button.
Step 4.2: The topic configuration can also be adjusted using the Show advanced settings option, which is documented in the Confluent documentation. After configuring the topic according to your needs, create it with the Save & create button.

Step 5: A new topic is now available.

Step 6: Add an API key so that the Litmus Kafka over SSL integration can authenticate against Confluent, using the API keys sub-option under the Data integration option.

Step 7: Create a new API key with the Create key button.

Step 8: Select the Global access option as the key type and press the Next button.

Step 9: Provide a description if needed. To create the key, press the Download and continue button. The key will be created and a text file containing the key and secret will be downloaded to your default download folder.

Step 10: A new API key is now available.
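The steps above create the topic through the Confluent Cloud UI. If you prefer to script this part, the sketch below shows one possible way to do it with the confluent-kafka Python client. It is not part of the Litmus documentation; the bootstrap server, topic name, partition/replication values and API key placeholders are assumptions that you would replace with the values from your own cluster and downloaded key file.

```python
# Sketch: create a Confluent Cloud topic with the confluent-kafka Python client.
# All connection values below are placeholders; use your own bootstrap server,
# API key/secret and topic name.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({
    "bootstrap.servers": "pkc-xxxxx.eu-central-1.aws.confluent.cloud:9092",  # from Cluster settings
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",     # from the downloaded key file
    "sasl.password": "<API_SECRET>",  # from the downloaded key file
})

# Confluent Cloud topics typically use a replication factor of 3.
futures = admin.create_topics([NewTopic("litmus_edge_data", num_partitions=6, replication_factor=3)])
for topic, future in futures.items():
    try:
        future.result()  # raises if the creation failed
        print(f"Topic '{topic}' created")
    except Exception as exc:
        print(f"Topic '{topic}' could not be created: {exc}")
```

The API key used by such a script needs the same permissions as the one created for the integration; an existing global-access key from the steps above can be reused.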
How to ingest data from Litmus Edge using Confluent

This section provides instructions on how to set up the Litmus Kafka over SSL integration to publish data to Confluent.

Requirements

To publish data from Litmus Edge to Confluent using the Litmus Kafka over SSL integration, four requirements have to be met beforehand:
- A Confluent cluster has been configured.
- At least one topic has been configured on the Confluent cluster.
- An API key with global access has been created.
- The data to be published are collected by Litmus Edge. These can be data collected through DeviceHub devices, data created by Litmus Edge itself (for example in Flows), or data created by other sources such as containers.

Step-by-step guide to publish data from Litmus Edge to Confluent

This guide provides instructions on how to create a Litmus Kafka over SSL integration to publish data from Litmus Edge. It does not cover how to build a Confluent stream, as users are expected to be familiar with this process; Confluent also provides well-documented instructions on this aspect.

Step 1: Log on to your Litmus Edge device and open the Integration menu.

Step 2: Press the Add a connector button.

Step 3: Use the drop-down menu to select the Kafka SSL integration.

Step 4: To be able to connect to Confluent, the connector requires six settings. Steps 4.1 to 4.6 look at each configuration item.
Step 4.1: Name. This is the name the integration displays inside the Litmus Edge system. Note: the name has to be unique across all integrations.
Step 4.2: Broker. This is the bootstrap server, which can be acquired either from the Cluster settings sub-option under the Cluster overview option or from the file that was downloaded when the API key was created.
Step 4.3: SASL mechanism. This is the security protocol and mechanism used by the Confluent Kafka broker. For Confluent this is by default SASL/PLAIN.
Step 4.4: Username. This is the SASL user / API key from the file that was downloaded when the API key was created.
Step 4.5: Password. This is the SASL password / API secret from the file that was downloaded when the API key was created. Note: to see the password, the eye symbol can be toggled to turn off the masking.
Step 4.6: Topic. This is the topic that was created on the cluster; it can be acquired from the Topics option.

Step 5: Add the new integration through the Add button after providing all the required settings as shown above.

Step 6: The new Kafka over SSL integration is created in the disabled state by default.

Step 7: The next step is to add topics as outbound (local > remote) to the integration. This can be done either via import from DeviceHub or on a tag-by-tag or topic-wildcard basis, as described in the documentation.

Step 8: After adding topics to the integration, they have to be enabled. This can be done either on a topic-by-topic basis or for all topics at once.

Step 9: To finalize the setup, enable the integration through the switch at the top right (disabled > enabled).

Step 10: Data are now published to Confluent and can be monitored through the Messages tab under the Topics option. (A small producer sketch that publishes a comparable test message outside of Litmus Edge is shown after this guide.)

With this, the data are now available in Confluent, and users can stream the data to Kafka consumers using Confluent Stream Lineage. This guide cannot provide instructions on these topics; Confluent provides instructions in their own documentation.
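To sanity-check the broker address, API key/secret and topic outside of Litmus Edge, a minimal producer along the following lines can be run from any workstation. It only mimics the connection the integration makes; the connection values are placeholders, and the payload fields are an assumption loosely modeled on a Litmus tag message, not the exact format Litmus Edge publishes.

```python
# Sketch: publish a single test message to the Confluent topic used by the integration.
# Connection values are placeholders; the payload fields are illustrative only.
import json
import time

from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.eu-central-1.aws.confluent.cloud:9092",  # Broker setting
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",     # Username setting
    "sasl.password": "<API_SECRET>",  # Password setting
})

def delivery_report(err, msg):
    # Called once per message to confirm delivery or report the error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}] at offset {msg.offset()}")

payload = {
    "deviceID": "demo-device",   # assumed field names, for illustration only
    "tagName": "temperature",
    "value": 21.5,
    "timestamp": int(time.time() * 1000),
}
producer.produce("litmus_edge_data", value=json.dumps(payload).encode("utf-8"), callback=delivery_report)
producer.flush()
```

If the test message shows up under the Messages tab of the topic, the same Broker, Username, Password and Topic values should work in the integration settings.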
How to send data from Confluent to Litmus Edge

This section provides instructions on how to set up the Litmus Kafka over SSL integration to read data from Confluent.

Requirements

To read data from Confluent into Litmus Edge using the Litmus Kafka over SSL integration, three requirements have to be met beforehand:
- A Confluent cluster has been configured.
- At least one topic has been configured on the Confluent cluster.
- An API key with global access has been created.

Step-by-step guide to read data from Confluent into Litmus Edge

This guide provides instructions on how to create a Litmus Kafka over SSL integration to read data from Confluent. It does not cover how to build a Confluent stream, as users are expected to be familiar with this process; Confluent also provides well-documented instructions on this aspect.

Step 1: Log on to your Litmus Edge device and open the Integration menu.

Step 2: Press the Add a connector button.

Step 3: Use the drop-down menu to select the Kafka SSL integration.

Step 4: To be able to connect to Confluent, the connector requires six settings. Steps 4.1 to 4.6 look at each configuration item.
Step 4.1: Name. This is the name the integration displays inside the Litmus Edge system. Note: the name has to be unique across all integrations.
Step 4.2: Broker. This is the bootstrap server, which can be acquired either from the Cluster settings sub-option under the Cluster overview option or from the file that was downloaded when the API key was created.
Step 4.3: SASL mechanism. This is the security protocol and mechanism used by the Confluent Kafka broker. For Confluent this is by default SASL/PLAIN.
Step 4.4: Username. This is the SASL user / API key from the file that was downloaded when the API key was created.
Step 4.5: Password. This is the SASL password / API secret from the file that was downloaded when the API key was created. Note: to see the password, the eye symbol can be toggled to turn off the masking.
Step 4.6: Topic. This is the topic that was created on the cluster; it can be acquired from the Topics option.

Step 5: Add the new integration through the Add button after providing all the required settings as shown above.

Step 6: The new Kafka over SSL integration is created in the disabled state by default.

Step 7: The next step is to add topics as inbound (remote > local) to the integration. This is described in the Litmus documentation.

Step 8: After adding topics to the integration, they have to be enabled. This can be done either on a topic-by-topic basis or for all topics at once.

Step 9: To finalize the setup, enable the integration through the switch at the top right (disabled > enabled).

Step 10: Data are now read from Confluent into Litmus Edge. To verify the data stream, open the integration, go to Topics and copy the local topic by clicking on it.

Step 11: Open a flow and add a DataHub subscribe node and a debug node.

Step 12: Add the topic copied in step 10 into the DataHub subscribe node and save.

Step 13: After deploying the flow, the data read from Confluent will be shown in the debug window. (A small consumer sketch that reads the same Confluent topic from a workstation is shown after this guide; it can help confirm that messages are arriving on the topic at all.)
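Independently of the Litmus Edge flow above, a quick way to confirm that messages are actually arriving on the Confluent topic is to read it with a small consumer from a workstation. This is a sketch, not part of the Litmus product; the connection values, topic name and consumer group id are assumptions.

```python
# Sketch: read messages from the Confluent topic that Litmus Edge subscribes to.
# Connection values, topic name and group id are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "pkc-xxxxx.eu-central-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "confluent-to-litmus-check",  # any unused consumer group id
    "auto.offset.reset": "earliest",          # start from the oldest retained message
})
consumer.subscribe(["litmus_edge_data"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to one second for a message
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        print(msg.value().decode("utf-8"))
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```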
(Optional) How to deploy the initial integration via template

This section provides instructions on how to deploy and configure the Litmus Kafka over SSL integration via a Litmus template, both via the Litmus Edge UI and via Litmus Edge Manager.

Requirements

To deploy the integration via template, four requirements have to be met beforehand:
- A Confluent cluster has been configured.
- At least one topic has been configured on the Confluent cluster.
- An API key with global access has been created.
- The Litmus Edge device needs to have a license.
- (Optional) To deploy the integration via Litmus Edge Manager, the Litmus Edge device has to be attached to Litmus Edge Manager.

Knowledge prerequisites

To deploy the initial integration via template, the user is expected to be familiar with either deploying a template through the Litmus Edge UI, or uploading a template to Litmus Edge Manager and deploying a template from Litmus Edge Manager.

Step-by-step guide to deploy the initial integration via template

This guide provides instructions on how to deploy the initial integration via template and how to then edit it to adjust the configuration. It is expected that the user knows how to deploy a template and how to edit an existing integration. Download the tar file, which includes both the template to be used via the Litmus Edge UI and the template for Litmus Edge Manager.

Deploy the template via the Litmus Edge UI

Step 1: Open the Litmus Edge UI.

Step 2: Open the Template option from System > Backup/Restore.

Step 3: Use the Upload Template button.

Step 4: Select the file "le template confluent connector.json". After the import, a success message appears.

Step 5: To modify the configuration, which is required, open Integrations.

Step 6: Use the edit button.

Step 7: Modify the four entries for Broker, Username, Password and Topic. Note: follow steps 4.2, 4.4, 4.5 and 4.6 of the chapter "Step-by-step guide to publish data from Litmus Edge to Confluent".

Step 8: Save the changes with the Update button.

Step 9: Enable the integration and add the topics to be written to Confluent according to the chapter "Step-by-step guide to publish data from Litmus Edge to Confluent", or to be read into Litmus Edge according to the chapter "Step-by-step guide to read data from Confluent into Litmus Edge". (A small connectivity check for the four edited settings is sketched at the end of this section.)

Deploy the template via Litmus Edge Manager

Step 1: Open the Litmus Edge Manager user UI.

Step 2: Open the Template option under the company or project you want the template to be available in; in this example, on project level.

Step 3: Use the Upload template button.

Step 4: Select the Simple template option, select the file "lem template confluent connector.json", provide a name and upload the template. After the import, the template appears in the list.

Step 5: Apply the template to a Litmus Edge device using the Apply template option.

Step 6: Select the Litmus Edge device you want the template applied to and apply it with the Apply button.

Step 7: Log on to the Litmus Edge device(s) to modify the integration with the right settings for your Confluent connection.

Step 8: To modify the configuration, which is required, open Integrations.

Step 9: Use the edit button.

Step 10: Modify the four entries for Broker, Username, Password and Topic. Note: follow steps 4.2, 4.4, 4.5 and 4.6 of the chapter "Step-by-step guide to publish data from Litmus Edge to Confluent".

Step 11: Save the changes with the Update button.

Step 12: Enable the integration and add the topics to be written to Confluent according to the chapter "Step-by-step guide to publish data from Litmus Edge to Confluent", or to be read into Litmus Edge according to the chapter "Step-by-step guide to read data from Confluent into Litmus Edge".
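After editing the four template entries (Broker, Username, Password, Topic), a short metadata query can confirm that the values are valid before the integration is enabled. The sketch below uses the confluent-kafka Python client and assumes the cluster is reachable from the machine running the check; all connection values are placeholders.

```python
# Sketch: verify the Broker/Username/Password values and check that the Topic exists.
# All values are placeholders taken from the template entries you edited.
from confluent_kafka.admin import AdminClient

admin = AdminClient({
    "bootstrap.servers": "pkc-xxxxx.eu-central-1.aws.confluent.cloud:9092",  # Broker
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",     # Username
    "sasl.password": "<API_SECRET>",  # Password
})

metadata = admin.list_topics(timeout=10)  # fails if the broker or credentials are wrong
topic = "litmus_edge_data"                # Topic entry from the template
if topic in metadata.topics:
    print(f"Connection OK, topic '{topic}' is visible to this API key")
else:
    print(f"Connection OK, but topic '{topic}' was not found; check the Topic entry")
```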