Litmus Edge to Confluent using Kafka over SSL
This section provides instructions on how to prepare a Confluent cluster to accept a connection from the Litmus Edge Kafka over SSL integration.
Prerequisite: A Confluent cluster has been deployed.
This guide covers the minimum configuration required to connect to the Confluent cluster using the Litmus Edge Kafka over SSL integration.
It does not cover how to manage a Confluent cluster or any actions unrelated to connecting to the cluster via the Litmus integration.
Step 1: Open the cluster to be used for the connection. Example: cluster_0 will be used.
Step 2: To add a topic to which the Litmus integration will send data or from which it will read data, select the Topics option from the left panel.
Step 3: Add a topic by using the Create topic button.
Step 4.1: For a quick start, the new topic setup can be finished with the Create with defaults button.
Step 4.2: The topic configuration can also be adjusted using the Show advanced settings option, which is documented in the Confluent documentation.
After configuring the topic according to your needs, create the topic with the Save & create button.
Step 5: A new topic is now available.
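For users who prefer to script topic creation instead of using the Confluent UI, the sketch below shows roughly how the same result can be achieved with the confluent-kafka Python client. It assumes an API key already exists (see steps 6-10 below); the bootstrap server, credentials, topic name, and partition count are placeholders, not values prescribed by this guide.

```python
# Minimal sketch: create a Confluent Cloud topic programmatically with the
# confluent-kafka Python client (pip install confluent-kafka).
# All connection values and the topic name are placeholders.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

# create_topics() returns a dict mapping topic name -> future.
futures = admin.create_topics([NewTopic("litmus_topic", num_partitions=6)])
for topic, future in futures.items():
    try:
        future.result()  # raises if creation failed (e.g. topic exists)
        print(f"Topic {topic} created")
    except Exception as exc:
        print(f"Failed to create topic {topic}: {exc}")
```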
Step 6: To allow the Litmus Kafka over SSL integration to authenticate against Confluent, add an API key using the API keys sub-option under the Data integration option.
Step 7: Create a new API key with the Create key button.
Step 8: For the key type, select the Global access option and press the Next button.
Step 9: Provide a Description if needed. To create the key, press the Download and continue button.
The key will be created and a text file with the key and secret will be downloaded to your default download folder.
Step 10: A new API key is now available.
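To sanity-check the new API key outside of Litmus Edge, a small script such as the following can be used; it only fetches the cluster metadata and lists the topics. This is an optional verification sketch; the placeholders must be replaced with the key, secret, and bootstrap server from the downloaded file.

```python
# Minimal sketch: verify a Confluent API key by fetching cluster metadata.
# Replace the placeholders with the values from the downloaded key file.
from confluent_kafka.admin import AdminClient

admin = AdminClient({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

# With wrong credentials this call times out and raises a KafkaException.
metadata = admin.list_topics(timeout=10)
print("Visible topics:", list(metadata.topics))
```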
This section provides instructions on how to set up the Litmus Kafka over SSL integration to publish data to Confluent.
Before data can be published from Litmus Edge to Confluent using the Litmus Kafka over SSL integration, four requirements have to be met:
- A Confluent cluster has been configured
- At least one topic has been configured on the Confluent cluster
- An API key with Global access has been created
- The data to be published is available in Litmus Edge. This can be:
  - Data collected through a DeviceHub device
  - Data created by Litmus Edge itself
  - Data created through a Flow
  - Data from other sources, such as containers
This guide provides instructions on how to create a Litmus Kafka over SSL integration to publish data from Litmus Edge. It does not cover how to build a Confluent stream, as users are expected to be familiar with this process; Confluent also provides well-documented instructions on this topic.
Step 1: Log on to your Litmus Edge device and open the Integration menu.
Step 2: Press the Add a connector button.
Step 3: Use the drop-down menu to select the Kafka SSL integration.
Step 4: To connect to Confluent, the connector requires six settings. Steps 4.1 to 4.6 describe each of them.
Step 4.1: Name. This is the name under which the integration is displayed in Litmus Edge. Note: it has to be unique across all integrations. Example:
Step 4.2: Broker. This is the bootstrap server, which can be acquired either from the Cluster settings sub-option under the Cluster overview option, or from the file that was downloaded when the API key was created.
Example:
Step 4.3: SASL mechanism. This is the security mechanism used to authenticate against the Confluent Kafka broker. For Confluent Cloud this is PLAIN, used over the SASL_SSL security protocol.
Step 4.4: Username. This is the SASL user / API key from the file that was downloaded when the API key was created.
Example:
Step 4.5: Password. This is the SASL password / API secret from the file that was downloaded when the API key was created.
Example:
Note: To show the password in plain text, click the eye symbol to turn off the masking.
Step 4.6: Topic. This is the topic that was created on the cluster; its name can be found under the Topics option.
Example:
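For reference, the six settings above map one-to-one onto a standard Kafka client configuration; only the Name (step 4.1) is Litmus-internal and has no Kafka-side equivalent. The sketch below illustrates the mapping with the confluent-kafka Python client; it is not required for the integration, and all values are placeholders.

```python
# Minimal sketch: how the Litmus integration settings map onto a standard
# Kafka producer configuration. All values are placeholders.
from confluent_kafka import Producer

producer = Producer({
    # Step 4.2: Broker (the bootstrap server)
    "bootstrap.servers": "<BOOTSTRAP_SERVER>:9092",
    # Step 4.3: SASL mechanism and security protocol
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    # Step 4.4: Username (the API key)
    "sasl.username": "<API_KEY>",
    # Step 4.5: Password (the API secret)
    "sasl.password": "<API_SECRET>",
})

# Step 4.6: Topic (the Confluent topic created earlier)
producer.produce("<TOPIC>", value=b'{"test": true}')
producer.flush()
```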
Step 5: After providing all the required settings as shown above, add the new integration with the Add button.
Step 6: The new Kafka over SSL integration will be created by default in the disabled state.
Step 7: The next step is to add topics as outbound (Local -> Remote) to the integration. This can be done either via import from DeviceHub, or on a tag-by-tag or topic-wildcard basis as described in the documentation.
Example:
Step 8: After adding topics to the integration, they have to be enabled. This can be done either topic by topic or for all topics at once.
Example:
Step 9: To finalize the setup, enable the integration through the switch at the top right.
Disabled:
Enabled:
Step 10: Data is now published to Confluent and can be monitored through the Messages tab under the Topics option.
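Besides the Messages tab, the published data can also be checked with any external Kafka consumer. The following minimal sketch uses the confluent-kafka Python client; the credentials and topic are the same placeholders as before, and the consumer group id "litmus-verify" is an arbitrary example.

```python
# Minimal sketch: read back data that Litmus Edge published to Confluent.
# Connection values and topic are placeholders; the group id is arbitrary.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "litmus-verify",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["<TOPIC>"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to one second per iteration
        if msg is None:
            continue
        if msg.error():
            print("Consumer error:", msg.error())
            continue
        print(msg.topic(), msg.value().decode("utf-8", errors="replace"))
finally:
    consumer.close()
```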
With this, the data is now available in Confluent from a Kafka producer, and users can stream it to Kafka consumers using Confluent Stream Lineage. This guide does not cover these topics; Confluent provides instructions in its own documentation.
This section provides instructions on how to set up the Litmus Kafka over SSL integration to read data from Confluent.
Before data can be read by Litmus Edge from Confluent using the Litmus Kafka over SSL integration, three requirements have to be met:
- A Confluent cluster has been configured
- At least one topic has been configured on the Confluent cluster
- An API key with Global access has been created
This guide provides instructions on how to create a Litmus Kafka over SSL integration to read data from Confluent. It does not cover how to build a Confluent stream, as users are expected to be familiar with this process; Confluent also provides well-documented instructions on this topic.
Step 1: Log on to your Litmus Edge device and open the Integration menu.
Step 2: Press the Add a connector button.
Step 3: Use the drop-down menu to select the Kafka SSL integration.
Step 4: To connect to Confluent, the connector requires six settings. Steps 4.1 to 4.6 describe each of them.
Step 4.1: Name. This is the name under which the integration is displayed in Litmus Edge. Note: it has to be unique across all integrations. Example:
Step 4.2: Broker. This is the bootstrap server, which can be acquired either from the Cluster settings sub-option under the Cluster overview option, or from the file that was downloaded when the API key was created.
Example:
Step 4.3: SASL mechanism. This is the security mechanism used to authenticate against the Confluent Kafka broker. For Confluent Cloud this is PLAIN, used over the SASL_SSL security protocol.
Step 4.4: Username. This is the SASL user / API key from the file that was downloaded when the API key was created.
Example:
Step 4.5: Password. This is the SASL password / API secret from the file that was downloaded when the API key was created.
Example:
Note: To show the password in plain text, click the eye symbol to turn off the masking.
Step 4.6: Topic. This is the topic that was created on the cluster; its name can be found under the Topics option.
Example:
Step 5: After providing all the required settings as shown above, add the new integration with the Add button.
Step 6: The new Kafka over SSL integration will be created by default in the disabled state.
Step 7: The next step is to add topics as inbound (Remote -> Local) to the integration. This is described in the Litmus documentation.
Step 8: After adding topics to the integration, they have to be enabled. This can be done either topic by topic or for all topics at once.
Step 9: To finalize the setup, enable the integration through the switch at the top right.
Disabled:
Enabled:
Step 10: Data is now read from Confluent into Litmus Edge. To verify the data stream, open the integration, go to Topics, and copy the Local Topic by clicking on it.
Step 11: Open a flow and add a DataHub Subscribe node and a debug node.
Step 12: Add the topic copied in step 10 into the DataHub Subscribe node and save.
Step 13: After deploying the flow, the data read from Confluent is shown in the debug window.
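If no producer is writing to the Confluent topic yet, a test message can be sent from the Confluent side so that something appears in the debug window. A minimal sketch, again with placeholder connection values and an arbitrary example payload:

```python
# Minimal sketch: publish a test message to the Confluent topic so the
# Litmus Edge integration (and the flow's debug node) has data to show.
# All connection values are placeholders; the payload is an example.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

payload = json.dumps({"hello": "litmus-edge", "value": 42}).encode()
producer.produce("<TOPIC>", value=payload)
producer.flush()  # block until the message has been delivered
print("Test message sent")
```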
This section provides instructions on how to deploy and configure the Litmus Kafka over SSL integration from a Litmus template, both via the Litmus Edge UI and via Litmus Edge Manager.
Before the Litmus Kafka over SSL integration can be deployed via template, four requirements have to be met:
- A Confluent cluster has been configured
- At least one topic has been configured on the Confluent cluster
- An API key with Global access has been created
- The Litmus Edge device needs to have a license
- (Optional) To deploy the integration via Litmus Edge Manager, the Litmus Edge device has to be attached to Litmus Edge Manager
To deploy the initial integration via template, the user is expected to be familiar with either deploying templates via the Litmus Edge UI or deploying templates via Litmus Edge Manager.
This guide provides instructions on how to deploy the initial integration via template and how to then edit it to adjust the configuration. It is expected that the user knows how to deploy a template and how to edit an existing integration.
Download the tar file, which includes both the template to be used via the Litmus Edge UI and the one for Litmus Edge Manager.
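The archive can be unpacked with any tar tool. As a small convenience sketch in Python (the archive file name is an assumption; use the actual name of the downloaded file):

```python
# Minimal sketch: unpack the downloaded template archive. The file name
# "Confluent_Connector_Templates.tar" is an assumed placeholder.
import tarfile

with tarfile.open("Confluent_Connector_Templates.tar") as archive:
    archive.extractall("confluent_templates")
    # Expect the LE and LEM template JSON files among the members.
    print(archive.getnames())
```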
Step 1: Open the Litmus Edge UI
Step 2: Open the Template option from System->Backup/Restore
Step 3: Use the Upload Template button.
Step 4: Select the file "LE_Template_Confluent_Connector.json".
After the import, a success message appears.
Step 5: Open Integrations to modify the configuration; this step is required.
Step 6: Use the Edit button.
Step 7: Modify the four entries Broker, Username, Password, and Topic.
Note: Follow steps 4.2, 4.4, 4.5, and 4.6 of the chapter "Step by Step guide to publish data from Litmus Edge to Confluent".
Step 8: Save the changes with the Update button.
Step 9: Enable the integration and add the topics to be written to Confluent according to the chapter "Step by Step guide to publish data from Litmus Edge to Confluent", or to be read into Litmus Edge according to the chapter on reading data from Confluent into Litmus Edge.
Step 1: Open the Litmus Edge Manager user UI
Step 2: Open the Template option under the company or project in which you want the template to be available.
Example: On Project Level
Step 3: Use the Upload Template button.
Step 4: Select the Simple Template option, select the file "LEM_Template_Confluent_Connector.json", provide a name, and upload the template.
After the import, the template appears in the list.
Step 5: Apply the template to a Litmus Edge device using the Apply Template option.
Step 6: Select the Litmus Edge device(s) you want the template applied to and apply it with the Apply button.
Step 7: Log on to the Litmus Edge device(s) to modify the integration with the correct settings for your Confluent connection.
Step 8: Open Integrations to modify the configuration; this step is required.
Step 9: Use the Edit button.
Step 10: Modify the four entries Broker, Username, Password, and Topic.
Note: Follow steps 4.2, 4.4, 4.5, and 4.6 of the chapter "Step by Step guide to publish data from Litmus Edge to Confluent".
Step 11: Save the changes with the Update button.
Step 12: Enable the integration and add the topics to be written to Confluent according to the chapter "Step by Step guide to publish data from Litmus Edge to Confluent", or to be read into Litmus Edge according to the chapter on reading data from Confluent into Litmus Edge.