Litmus Edge to Confluent using Kafka over SSL

Overview

Document image


How to prepare Confluent for a Litmus Kafka over SSL Integration

This section provides instructions on how to prepare a Confluent cluster to accept a connection from the Litmus Edge Kafka over SSL integration.

Requirements

A Confluent cluster has been deployed.

Step by Step guide to prepare the Confluent cluster

This guide provides instructions on the minimum configuration required to connect to the Confluent cluster using the Litmus Edge Kafka over SSL integration.

This guide does not cover how to manage a Confluent cluster or actions unrelated to connecting to the cluster via the Litmus integration.

Step 1: Open the cluster to work with. In this example, cluster_0 is used.

Document image


Step 2: To add a topic to which the Litmus integration will send data or from which it will read data, select the Topics option from the left panel.

Document image


Step 3: Add a topic by using the Create topic button.

Document image


Step 4.1: For a quick start, the new topic setup can be finished with the Create with defaults button.

Document image


Step 4.2: The topic configuration can also be adjusted using the Show advanced settings option; these settings are documented in the Confluent documentation.

Document image


After configuring the topic according to your needs, create the topic with the Save & create button.

Document image


Step 5: A new topic is now available.

Document image
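
If you prefer to script topic creation rather than use the UI, the same result can be achieved through Kafka's Admin API. Below is a minimal, illustrative sketch using the confluent-kafka Python client; it assumes you already have an API key and secret (created in the following steps), the bootstrap server and credentials are placeholders for your own values, and topic_0 is just an example name.

    # Minimal sketch: create a Confluent topic programmatically.
    # <bootstrap-server>, <api-key>, <api-secret> are placeholders.
    from confluent_kafka.admin import AdminClient, NewTopic

    admin = AdminClient({
        "bootstrap.servers": "<bootstrap-server>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",
        "sasl.password": "<api-secret>",
    })

    # Confluent Cloud typically expects a replication factor of 3.
    futures = admin.create_topics(
        [NewTopic("topic_0", num_partitions=6, replication_factor=3)]
    )
    for topic, future in futures.items():
        try:
            future.result()  # raises if creation failed
            print(f"Topic {topic} created")
        except Exception as exc:
            print(f"Failed to create topic {topic}: {exc}")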


Step 6: Add an API key so that the Litmus Kafka over SSL integration can authenticate against Confluent. Open the API keys sub-option under the Data integration option.

Document image


Step 7: Create a new API key with the Create key button.

Document image


Step 8: For the key type, select the Global access option and press the Next button.

Document image


Step 9: Provide a Description if needed. To create the key, press the Download and continue button.

Document image


The key will be created and a text file with the key and secret will be downloaded to your default download folder.

Document image


Step 10: A new API key is now available.

Document image
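
Before moving on, the new key can be sanity-checked outside of Litmus Edge. The following is a minimal sketch using the confluent-kafka Python client, with the key and secret taken from the downloaded file and the bootstrap server as a placeholder:

    # Sanity check: authenticate with the downloaded API key/secret
    # and list the cluster's topics.
    from confluent_kafka.admin import AdminClient

    admin = AdminClient({
        "bootstrap.servers": "<bootstrap-server>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",      # key value from the downloaded file
        "sasl.password": "<api-secret>",   # secret value from the downloaded file
    })

    metadata = admin.list_topics(timeout=10)  # fails if the credentials are rejected
    print("Topics:", list(metadata.topics))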


How to ingest data from Litmus Edge using Confluent

This section provides instructions on how to set up the Litmus Kafka over SSL integration to publish data to Confluent.

Requirements

Before data can be published from Litmus Edge to Confluent using the Litmus Kafka over SSL integration, four requirements have to be met:

  1. A Confluent cluster has been configured
  2. At least one topic has been configured on the Confluent cluster
  3. An API key with Global access has been created
  4. Data to be published are collected by Litmus Edge. These can either be:
    1. Data collected through a DeviceHub device
    2. Data created by Litmus Edge Analytics
    3. Data created through a Flow
    4. Other sources like containers.

Step by Step guide to publish data from Litmus Edge to Confluent

This guide provides instructions on how to create a Litmus Kafka over SSL integration to publish data from Litmus Edge. It does not cover how to build a Confluent stream, as users are expected to be familiar with this process; Confluent also provides well-documented instructions on this aspect.

Step 1: Log on to your Litmus Edge device and open the Integration menu.

Document image


Step 2: Press the Add a connector button.

Document image


Step 3: Use the drop-down menu.

Document image


Then select the Kafka SSL integration.

Document image


Step 4: To be able to connect to Confluent, the connector requires six settings.

Document image


Steps 4.1 to 4.6 look at each configuration item.
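
As a quick orientation, the six settings can be summarized as key-value pairs. The sketch below is purely illustrative (it is not the connector's actual configuration format); the values are placeholders, and the name Confluent_Kafka_SSL is just an example.

    # Illustrative summary of the six connector settings (placeholders only).
    connector_settings = {
        "Name":           "Confluent_Kafka_SSL",  # unique across all integrations
        "Broker":         "<bootstrap-server>",   # from Cluster settings or the API-key file
        "SASL mechanism": "SASL_PLAINTEXT",       # see Step 4.3
        "Username":       "<api-key>",            # from the downloaded API-key file
        "Password":       "<api-secret>",         # from the downloaded API-key file
        "Topic":          "topic_0",              # topic created on the cluster
    }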

Step 4.1: Name. This is the name the integration displays inside the Litmus Edge system. Note: the name has to be unique across all integrations. Example:

Document image


Step 4.2: Broker. This is the bootstrap server, which can be acquired from the Cluster settings sub-option under the Cluster overview option.

Document image


It can also be taken from the file that was downloaded when the API key was created.

Document image


Example:

Document image


Step 4.3: SASL mechanism. This is the security protocol and mechanism used by the Confluent Kafka broker. For Confluent, this is SASL_PLAINTEXT by default.

Document image
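
For reference, in standard Kafka client terms a Confluent Cloud connection is typically made with the PLAIN SASL mechanism over a TLS-encrypted (SSL) connection. The equivalent client-side settings would look roughly like the sketch below; this is an assumption for orientation, so match whatever your cluster actually requires.

    # Reference only: equivalent settings in standard Kafka client terms.
    conf = {
        "security.protocol": "SASL_SSL",  # TLS-encrypted connection
        "sasl.mechanisms": "PLAIN",       # credentials are the API key/secret
    }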


Step 4.4: Username. This is the SASL user / API key from the file that was downloaded when the API key was created.

Document image


Example:

Document image


Step 4.5: Password. This is the SASL password / API secret from the file that was downloaded when the API key was created.

Document image


Example:

Document image


Note: To see the password, click the eye symbol to disable masking.

Document image


Step 4.6: Topic. This is the topic that was created on the cluster; it can be acquired from the Topics option.

Document image


Example:

Document image


Step 5: Add the new integration through the Add button after providing all the required settings as shown above.

Document image


Step 6: The new Kafka over SSL integration will be created by default in the disabled state.

Document image


Step 7: The next step is to add topics as outbound (Local -> Remote) to the integration. This can be done either via import from DeviceHub, or on a tag-by-tag or topic-wildcard basis as described in the documentation.

Example:

Document image


Step 8: After adding topics to the integration, they have to be enabled. This can be done either on a topic-by-topic basis or for all topics at once.

Example:

Document image


Step 9: To finalize the setup, enable the integration through the switch at the top right.

Disabled:

Document image


Enabled:

Document image


Step 10: Data are now published to Confluent and can be monitored through the Messages tab under the Topics option.

Document image
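
As an alternative to the Messages tab, the stream can be verified with a small stand-alone consumer. Here is a minimal sketch using the confluent-kafka Python client, with the same placeholder credentials as before; the group id litmus-verify is an arbitrary example.

    # Minimal sketch: read back a few messages that Litmus Edge published.
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "<bootstrap-server>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",
        "sasl.password": "<api-secret>",
        "group.id": "litmus-verify",       # arbitrary consumer group name
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["topic_0"])

    try:
        for _ in range(10):                # read up to ten messages, then stop
            msg = consumer.poll(timeout=5.0)
            if msg is None:
                continue
            if msg.error():
                print("Consumer error:", msg.error())
                continue
            print(msg.topic(), msg.value().decode("utf-8", errors="replace"))
    finally:
        consumer.close()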


With this, the data are now available in Confluent, with Litmus Edge acting as a Kafka producer, and users can stream the data to Kafka consumers using Confluent Stream Lineage.

This guide does not cover these topics; Confluent provides instructions in its own documentation.

How to send data from Confluent to Litmus Edge

This section provides instructions on how to set up the Litmus Kafka over SSL integration to read data from Confluent.

Requirements

Before data can be read by Litmus Edge from Confluent using the Litmus Kafka over SSL integration, three requirements have to be met:

  1. A Confluent cluster has been configured
  2. At least one topic has been configured on the Confluent cluster
  3. An API key with Global access has been created

Step by Step guide to read data from Confluent into Litmus Edge

This guide provides instructions on how to create a Litmus Kafka over SSL integration to read data from Confluent. It does not cover how to build a Confluent stream, as users are expected to be familiar with this process; Confluent also provides well-documented instructions on this aspect.

Step 1: Log on to your Litmus Edge device and open the Integration menu.

Document image


Step 2: Press the Add a connector button.

Document image


Step 3: Use the drop-down menu.

Document image


Then select the Kafka SSL integration.

Document image


Step 4: To be able to connect to Confluent, the connector requires six settings.

Document image


Steps 4.1 to 4.6 look at each configuration item.

Step 4.1: Name. This is the name the integration displays inside the Litmus Edge system. Note: the name has to be unique across all integrations. Example:

Document image


Step 4.2: Broker. This is the bootstrap server, which can be acquired from the Cluster settings sub-option under the Cluster overview option.

Document image


It can also be taken from the file that was downloaded when the API key was created.

Document image


Example:

Document image


Step 4.3: SASL mechanism. This is the security protocol and mechanism used by the Confluent Kafka broker. For Confluent, this is SASL_PLAINTEXT by default.

Document image


Step 4.4: Username. This is the SASL user / API key from the file that was downloaded when the API key was created.

Document image


Example:

Document image


Step 4.5: Password. This is the SASL password / API secret from the file that was downloaded when the API key was created.

Document image


Example:

Document image


Note: To see the password, click the eye symbol to disable masking.

Document image


Step 4.6: Topic. This is the topic that was created on the cluster; it can be acquired from the Topics option.

Document image


Example:

Document image


Step 5: Add the new integration through the Add button after providing all the required settings as shown above.

Document image


Step 6: The new Kafka over SSL integration will be created by default in the disabled state.

Document image


Step 7: The next step is to add topics as inbound (Remote -> Local) to the integration. This is described in the Litmus documentation.

Document image

Document image


Step 8: After adding topics to the integration, they have to be enabled. This can be done either on a topic-by-topic basis or for all topics at once.

Document image


Step 9: To finalize the setup, enable the integration through the switch at the top right.

Disabled:

Document image


Enabled:

Document image


Step 10: Data are now read from Confluent into Litmus Edge. To verify the data stream, open the integration, go to Topics, and copy the Local Topic by clicking on it.

Document image


Step 11: Open a flow and add a Datahub Subscribe node and a Debug node.

Document image


Step 12: Add the topic copied in Step 10 into the Datahub Subscribe node and save.

Document image


Step 13: After deploying the flow, the data read from Confluent will be shown in the debug window.

Document image
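
To generate test data on the Confluent side without building a full stream, a message can be published directly to the topic; it should then show up in the debug window of the flow above. Below is a minimal producer sketch using the confluent-kafka Python client, with the same placeholder credentials as before and an arbitrary test payload.

    # Minimal sketch: publish a test message to the topic the integration reads.
    import json
    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "<bootstrap-server>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<api-key>",
        "sasl.password": "<api-secret>",
    })

    payload = json.dumps({"test": "hello from Confluent"})  # arbitrary test payload
    producer.produce("topic_0", value=payload.encode("utf-8"))
    producer.flush(10)  # wait up to 10 seconds for delivery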


(Optional) How to deploy the initial Integration via template

This section provides instructions on how to deploy and configure the Litmus Kafka over SSL integration via a Litmus template, both through the Litmus Edge UI and through Litmus Edge Manager.

Requirements

To deploy the Litmus Kafka over SSL integration via template, the following requirements have to be met beforehand:

  1. A Confluent cluster has been configured
  2. At least one topic has been configured on the Confluent cluster
  3. An API key with Global access has been created
  4. The Litmus Edge device needs to have a license
  5. (Optional) To deploy the integration via Litmus Edge Manager, the Litmus Edge device has to be attached to Litmus Edge Manager

Knowledge Prerequisites

To deploy the initial integration via template, the user is expected to be familiar with

Either:

  • Deploying a template through the Litmus Edge UI

OR

  • Uploading a template to Litmus Edge Manager
  • Deploying a template from Litmus Edge Manager

Step by Step guide to deploy the initial Integration via template

This guide provides instructions on how to deploy the initial integration via template and how to then edit it to adjust the configuration. It is expected that the user knows how to deploy a template and edit an existing integration.

Download the tar file, which includes the templates for both the Litmus Edge UI and Litmus Edge Manager.
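
The archive can be unpacked with any tar tool; as one option, here is a short Python sketch, where the archive file name is a placeholder for the actual download.

    # Unpack the downloaded template archive into ./templates.
    import tarfile

    with tarfile.open("confluent_connector_templates.tar") as archive:  # placeholder name
        archive.extractall(path="templates")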

Deploy the Template via Litmus Edge UI

Step 1: Open the Litmus Edge UI

Step 2: Open the Template option from System->Backup/Restore

Document image


Step 3: Use the Upload Template button.

Document image


Step 4: Select the file "LE_Template_Confluent_Connector.json".

After the import, a success message appears.

Document image


Step 5: The configuration now has to be modified. To do so, open Integrations.

Document image


Step 6: Use the Edit button.

Document image


Step 7: Modify the four entries:

  • Broker
  • Username
  • Password
  • Topic

Document image


Note: Follow Steps 4.2, 4.4, 4.5, and 4.6 of the chapter Step by Step guide to publish data from Litmus Edge to Confluent.

Step 8: Save the changes with the Update button.

Document image


Step 9: Enable the integration and add the topics, either to be written to Confluent according to the chapter "Step by Step guide to publish data from Litmus Edge to Confluent", or to be read into Litmus Edge according to the chapter "Step by Step guide to read data from Confluent into Litmus Edge".

Deploy the Template via Litmus Edge Manager

Step 1: Open the Litmus Edge Manager UI.

Step 2: Open the Template option under the company or project you want the template to be available in.

Example: On Project Level

Document image


Step 3: Use the Upload Template button.

Document image


Step 4: Select the Simple Template option. Select the file "LEM_Template_Confluent_Connector.json", provide a name, and upload the template.

Document image


After the import, the template appears in the list.

Document image


Step 5: Apply the template to a Litmus Edge device using the Apply Template option.

Document image




Document image


Step 6: Select the Litmus Edge device(s) you want the template applied to and apply it with the Apply button.

Document image


Step 7: Log on to the Litmus Edge device(s) to modify the integration with the right settings for your Confluent connection.

Step 8: The configuration now has to be modified. To do so, open Integrations.

Document image


Step 9: Use the Edit button.

Document image


Step 10: Modify the four entries:

  • Broker
  • Username
  • Password
  • Topic

Document image


Note: Follow Steps 4.2, 4.4, 4.5, and 4.6 of the chapter Step by Step guide to publish data from Litmus Edge to Confluent.

Step 11: Save the changes with the Update button.

Document image


Step 12: Enable the integration and add the topics, either to be written to Confluent according to the chapter "Step by Step guide to publish data from Litmus Edge to Confluent", or to be read into Litmus Edge according to the chapter "Step by Step guide to read data from Confluent into Litmus Edge".