Create a Custom Table in a MySQL Database
You can use the Show Mapping option in SQL connector integrations to create a custom table.

User scenario: In your manufacturing plant, one PLC tracks temperature. You want to store the temperature value in a database with a timestamp, but you don't need the additional data provided in the payload of DeviceHub devices. You also want to convert the values, which are calculated in Celsius, to Fahrenheit as integer values.

For this guide, you will do the following:

1. Simulate temperature values coming from a PLC.
2. Convert the values to Fahrenheit.
3. Deploy a MySQL container in Litmus Edge.
4. Create a custom table in the MySQL database.
5. Configure a connection to the database through the MySQL connector.
6. Send the data through the connector into the custom table.

Step 1: Add a Device

Follow the steps in Connect a Device (docid: ish7bqhzxswtdx8vbnszb) and configure the following parameters:

- Device Type: Simulator
- Driver Name: Generator
- Enable Alias Topics: select the checkbox

Step 2: Add a Tag

After connecting the device, add tags (docid: xgwokqbtpevii7or82ll0) to the device.

Tag 1:

- Name: select Random Value Generator
- Value Type: select Int64
- Polling Interval: enter 5
- Tag Name: enter tag1
- Min Value: enter 100
- Max Value: enter 129

Step 3: Create the Analytics Flow

You can now create the analytics flow that converts the temperature values from Celsius to Fahrenheit.

To create the analytics flow:

1. Navigate to Analytics > Instances.
2. Click Add Flow. The Create Flow dialog displays.
3. For Processor Input, select DataHub Subscribe. Click the search icon and select the device and tag you previously created. Click Next.
4. For Processor Function, select Conversion. Then, select the Celsius > Fahrenheit conversion. Click Next.
5. For Processor Output, select DataHub Publish. Then, copy the topic name and store it somewhere securely for reference later.
6. Click Create Flow.
7. Click Save to save the analytics flow on the canvas.

Step 4: Create the Flow

Since the only data needed from the device is the converted temperature value and the timestamp, you will create a flow that parses the data and keeps only the required fields in the payload.

To create the flow:

1. Navigate to Flows Manager and create a new flow. See Create a Flow (docid: veyyte7xdcangglzbqya0) to learn more.
2. Connect the following nodes together: datahub subscribe > json > change > json > datahub publish, with a debug node connected to the second json node.
3. Configure the nodes as described in the sections below.

DataHub Subscribe

The datahub subscribe node subscribes to the topic that publishes the converted temperature values from the analytics flow.

To configure the datahub subscribe node:

1. Double-click the datahub subscribe node.
2. In the Topic field, paste the topic name you copied from the DataHub Publish processor in the analytics flow.
3. If needed, configure the datahub subscribe connection. See the "Step 3: Configure Connector Nodes" section in Create a Flow (docid: veyyte7xdcangglzbqya0) to learn more.
4. Optionally, enter a name for the node.
5. Click Done.

JSON Nodes

The two json nodes handle the necessary parsing between JSON string and JSON object. No configuration is needed for these nodes.

Change

The change node deletes the unnecessary fields in the payload from the subscribed topic. A sketch of the payload before and after these deletions follows this section.

To configure the change node:

1. Double-click the change node.
2. Update Set to Delete.
3. In the msg field, enter payload.metadata. Click +add.
4. Continue adding the following deletions:
   - payload.registerId
   - payload.tagName
   - payload.success
   - payload.description
   - payload.datatype
   - payload.deviceName
   - payload.deviceID
5. Click Done.
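For reference, here is a minimal sketch of what a message might look like before and after the change node runs. The field names follow the deletion list above; the sample values and exact field casing are illustrative and can vary by Litmus Edge version. Before the deletions:

```json
{
  "deviceID": "abc123",
  "deviceName": "Simulator",
  "tagName": "tag1",
  "registerId": "tag1",
  "success": true,
  "datatype": "Int64",
  "description": "",
  "metadata": {},
  "timestamp": 1718035200000,
  "value": 219
}
```

After the deletions, only the two fields needed for the database table remain:

```json
{
  "timestamp": 1718035200000,
  "value": 219
}
```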
DataHub Publish

The datahub publish node publishes the trimmed payload to the topic that the MySQL connector will use to write data into the database table.

To configure the datahub publish node:

1. Double-click the datahub publish node.
2. In the Topic field, enter a topic name. You will refer to this name later.
3. Click Done.

Debug

The debug node is used to review the updated payload and ensure the output is correct.

To review the payload:

1. On the flow canvas, click Deploy to save the changes to the flow.
2. Expand the message window beneath the flow.
3. Click the debug icon to view the data output.
4. Confirm that the output includes only two items in the payload: value and timestamp.
5. Click Deploy to save the flow.

Step 5: Add the MySQL Marketplace Application

You can now add the MySQL application to Litmus Edge. You can customize the parameters to your own preferred configurations.

Note: The MySQL version used in this guide is 8.0. See the MySQL 8.0 Reference Manual for more information.

To add the MySQL marketplace application:

1. In Litmus Edge, navigate to Applications > Marketplace.
2. Click Marketplace List and select Default Marketplace Catalog.
3. Click the MySQL application tile.
4. From the Installation Script Version drop-down list, select latest.
5. Configure the following parameters:
   - Name: enter mysql
   - Database: enter sample
   - User: enter user
   - Password: enter the user password
   - MySQL Password: enter the same value as the Password parameter
6. Click Launch.
7. Navigate to Applications > Catalog Apps. The MySQL application appears, pulls the image, and then starts. The MySQL application shows as Running.
8. Navigate to Applications > Containers. The MySQL application container is running.
9. Copy the IP address for the MySQL application.

Step 6: Create the Custom Database Table

You will need to create the custom table in the database before data can be written into it. Update the credentials, database name, and table name to your own specific configurations.

To create the custom table:

1. In Litmus Edge, navigate to Applications > Containers.
2. Click the terminal icon next to the MySQL container. The MySQL shell opens.
3. From the MySQL container terminal, enter mysql -u user -p and press Enter. Enter password: appears.
4. Enter your user password, and then press Enter. Welcome to the MySQL monitor appears.
5. Enter SHOW DATABASES; and then press Enter. The database names appear.
6. Enter USE sample; and then press Enter. Reading table information for completion of table and column names appears in the console. This allows you to use the database.
7. Enter the following statement, and press Enter:

```sql
CREATE TABLE conversion_values(
  timestamp BIGINT NULL,
  value INT NULL
);
```

The custom table is created with the following:

- The table name: conversion_values
- The column names: timestamp and value
- The column timestamp has the data type BIGINT, and the column value has the data type INT. Both columns allow NULL values.

Step 7: Configure the MySQL Connector

Follow the steps in Add a Connector (docid: ogw7fkqbwidbabn4wl5rr) and select the DB - MySQL provider.

Configure the following required parameters:

- Name: enter a name for the connector.
- Hostname: paste the IP address you copied in Step 5.
- Port: confirm the MySQL server port. The default value is 3306.
- Username: enter user, or the username configured in Step 5.
- Password (optional): enter the password configured in Step 5.
- Database: enter sample, or the database configured in Step 5.
- Table: enter conversion_values, or the table name configured in Step 6.
- Show Mapping: enter the following key/value pairs and click +Add (see the sketch after this list):
  - Key 1: timestamp; Value 1: {{ timestamp }}
  - Key 2: value; Value 2: {{ value }}
- Create Table: make sure the checkbox is not selected.
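With this mapping, each message on the subscribed topic becomes one row: the {{ timestamp }} and {{ value }} templates pull the matching fields from the JSON payload. Conceptually, for the trimmed payload shown in Step 4, the connector issues the equivalent of the following statement (a sketch for illustration; the connector builds the actual statement internally):

```sql
INSERT INTO conversion_values (timestamp, value)
VALUES (1718035200000, 219);
```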
Configure the following optional parameters as needed:

- Commit Timeout: enter the transaction commit timeout in milliseconds (ms).
- Max Transaction Size: enter the maximum number of messages before a transaction is committed, regardless of the timeout parameter.
- Bulk Insert Count: to enable this option, enter the number of messages to group together and send as one bulk insert statement. Enabling this option can improve how quickly data is processed when dealing with high volumes of tags.
- Throttling Limit: the maximum number of messages per second to be processed. The default value is zero, which means that there is no limit.
- Persistent Storage: when enabled, messages undergo a store-and-forward procedure. Messages are stored within Litmus Edge until cloud providers are online.
- Queue Mode: select the queue mode as LIFO (last in, first out) or FIFO (first in, first out). Selecting LIFO means that the last data entry is processed first; selecting FIFO means that the first data entry is processed first.

Step 8: Add an Outbound Topic

You will need to add the datahub publish topic you configured on the flow canvas as an outbound topic in the MySQL connector.

To add the outbound topic:

1. Navigate to Integration.
2. Click the MySQL connector tile.
3. Click the Topics tab.
4. Click the add icon and select Add a new subscription.
5. Configure the parameters for the topic:
   - Data Direction: select Local to Remote (Outbound).
   - Local Data Topic: paste or enter the datahub publish topic name configured in the flow canvas in Step 4. In our example, enter conversion_values.
   - Remote Data Topic: optionally, enter your desired topic name that will be used to write the data into the database table.
   - Description: optionally, enter a description.
   - Enable: click the toggle to enable the topic.
6. Click OK.

The new topic appears on the Topics pane and is enabled. This topic will write data to the MySQL database table.

Step 9: Verify the Connection

Use the command line in the MySQL application to confirm data is successfully being written into the database. Update the credentials, database name, and table name to your own specific configurations.

To verify the connection:

1. In Litmus Edge, navigate to Applications > Containers.
2. Click the terminal icon next to the MySQL container. The MySQL shell opens.
3. From the MySQL container terminal, enter mysql -u user -p and press Enter. Enter password: appears.
4. Enter your user password, and then press Enter. Welcome to the MySQL monitor appears.
5. Enter USE sample; and then press Enter. Reading table information for completion of table and column names appears in the console.
6. Enter SELECT * FROM conversion_values; and press Enter. The data in the custom table displays.
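If the pipeline is working, a new row should arrive roughly every polling interval (5 seconds in this guide). The query output should resemble the following (the timestamps and values here are illustrative; with the 100-129 Celsius range configured above, the stored Fahrenheit values fall roughly between 212 and 264):

```
+---------------+-------+
| timestamp     | value |
+---------------+-------+
| 1718035200000 |   219 |
| 1718035205000 |   246 |
+---------------+-------+
```

If the table stays empty, recheck that the flow is deployed, that the outbound topic added in Step 8 matches the datahub publish topic from Step 4, and that the connector connection is enabled and healthy.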