Configuring the Litmus Production Record Event Processing Flow
To allow the flow to monitor tag data recorded by Litmus Edge for event triggers and publish the data to the Litmus Production Record Database, the user has to make three mandatory configurations.
The flow requires an event monitoring configuration in JSON format to process the raw data correctly.
Litmus offers a solution for creating such a configuration, the Litmus Production Record External Event Configurator, which can be purchased through the Litmus Central Portal.
The flow offers two options as the source of the configuration:
- The entire JSON string copied from the file and pasted into the flow -> Default option
- The file itself
To set the source, the user has to configure the blue inject node under the section "1. Select Origin of Event Configuration".
By default, the source is set to true, which means the event configuration is provided as a JSON string.
To change the source, use the drop-down to select the desired value.
The available values are:
- true = the event configuration is provided as a JSON string
- false = the event configuration is provided as a file
Note: To be able to use the file option, the respective JSON file has to be made available through one of two options:
- The Litmus Edge internal FTP folder "/var/lib/customer/ftp-data/" -> requires FTP setup and transfer first
- A folder linked to Litmus Edge as external storage
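For the FTP option, any standard FTP client can be used for the transfer. A minimal sketch using curl, assuming FTP access to Litmus Edge has already been set up and using placeholder address and credentials:

```
# Upload the event configuration file to Litmus Edge via FTP.
# <litmus-edge-ip>, <user> and <password> are placeholders.
curl -T LE_Production_Record_EventConfig.json ftp://<litmus-edge-ip>/ --user <user>:<password>
```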
After setting the right value, press the Done button.
Deploy the flow.
The flow will also show a color-coded signal indicating which choice has been made.
In this case, the flow expects the event configuration as a JSON string:
In this case, the flow expects the event configuration as a file:
Note: The colors also indicate where the user has to perform the next step.
The next step is to provide the event configuration according to the setting made in the section "1. Select Origin of Event Configuration".
To provide either the JSON string or the path and file, the user goes to "2. Provide JSON or Path and File Name".
The two available options are color-coded and correspond to the color shown after the first configuration step.
If the JSON string option (true) was chosen in the first step, the user has to copy the event configuration from its source and paste it into the flow.
It is recommended to use the Litmus solution Litmus Production Record External Event Configurator, which creates the JSON file (default name LE_Production_Record_EventConfig.json, which can be changed by the user upon creation). The user can then open this file and copy its content.
After copying the content, the user selects the yellow highlighted template node and opens it for editing.
Replace the two braces "{}" with the event configuration by selecting them and pasting the JSON string, then finish by pressing the Done button.
Deploy the flow.
If the file option (false) was chosen in the first step, the user has to provide the path and file name.
It is recommended to use the Litmus solution Litmus Production Record External Event Configurator, which creates the JSON file (default name LE_Production_Record_EventConfig.json, which can be changed by the user upon creation). The user can then either place this file into a folder that is linked to Litmus Edge as external storage or transfer it to Litmus Edge via FTP.
Once the file is placed correctly, the user selects the green highlighted read file node, opens it for editing, provides the path and file name, and presses the Done button to finish the setup.
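For example, if the file was transferred to the internal FTP folder and the default file name was kept, the path and file name would be:

```
/var/lib/customer/ftp-data/LE_Production_Record_EventConfig.json
```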
Deploy the flow.
The purpose of the flow is to ingest tag data collected by Litmus Edge as DeviceHub tags and monitor the stream for the events of interest. The flow therefore has to subscribe to the topics of the tags that are relevant to the event configuration.
The user provides these topics through the section "3. Add the Topics for the Tags used for Traceability".
To provide a topic, users edit the datahub subscribe node and replace the placeholder "Please enter topic" with the topic of a tag to be used by the event configuration, then press the Done button.
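As a purely hypothetical illustration, assuming tag topics follow a dot-delimited <device>.<tag> layout (the exact topic string for a tag should be looked up in Litmus Edge itself), the topic for the tag PartNo on the device Tooling_Shop from the sample message further below could look like:

```
Tooling_Shop.PartNo
```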
Deploy the flow.
As more than one tag is typically required, users have several options for providing all the tags they need.
The flow comes with a single datahub subscribe node in the section, but users are able to add as many nodes as they need and wire them all into the output.
This option works well if a single DeviceHub device is to be monitored, but can become challenging to support if the flow is used to monitor multiple devices.
Litmus Edge allows users to make use of wildcards when providing a topic, so that one datahub subscribe node can be used to read more than one tag.
The flow is able to handle tags being published that are not part of the configuration and will simply drop them, which makes it possible to use wildcards as well.
This option has to be chosen carefully: if several thousand tags are read, the flow can experience a slowdown.
Example for a wildcard on a device level (see the sketch below):
This option is very well suited if multiple devices are to be monitored, but care has to be taken with the number of tags the flow may subscribe to.
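A sketch, again assuming the hypothetical dot-delimited <device>.<tag> layout above and NATS-style wildcards, where * matches exactly one topic token (check the Litmus Edge Documentation for the exact topic and wildcard syntax):

```
Tooling_Shop.PartNo   # a single tag on the device Tooling_Shop
Tooling_Shop.*        # device-level wildcard: every tag of Tooling_Shop
```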
To make sure that, ideally, only the tags used by the event configuration are read, the best approach is to combine both options.
This makes use of the fact that in Litmus Edge, different DeviceHub devices can use the same tag name for the same information. For example, the tag Temperature or Downtime can exist on several devices.
This means that the wildcards are used to read the same tag from all devices that have it.
As each tag still needs its own datahub subscribe node, the user adds additional nodes accordingly (see the sketch below).
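Under the same hypothetical topic layout and wildcard assumptions as above, the combined approach could look like this, with one datahub subscribe node per tag name:

```
*.Temperature   # first datahub subscribe node: Temperature from every device
*.Downtime      # second datahub subscribe node: Downtime from every device
```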
Once the flow is set up, it will automatically monitor the tags against the event monitoring configuration and publish a message for each item whenever a trigger fires.
The message structure is based on the DeviceHub message object, to stay compliant with older forms of Litmus Integrations, and will look like this example:
{"tagName":"PartNo","deviceID":"Tooling_Shop","success":false,"datatype":"PN7458,BN87564","timestamp":1684854704011,"value":"PN7458","registerId":"PartNo,SerialNo"}
The topic it will be published on is a combination of the Litmus Edge DeviceHub device, the event, and the item:
MSSQL.SerialNo_Tracking_Asset.Data_Asset1.ToolCtr_PartNo_String_Item1
After all configurations have been made and deployed, it is recommended to set up a Litmus Integration using the DB - Microsoft SQL Server connector.
How to set up an integration is described in the Litmus Edge Documentation.
After the integration is established, the user needs to subscribe to the topic which has been set up in the event configuration.
How a subscription for an integration is added can be found in the Litmus Edge Documentation.
Add the value used in your event configuration as the Local Data Topic.
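For example, based on the sample topic above, the Local Data Topic could be the full topic of a single item or, assuming the broker's NATS-style > wildcard (which matches all remaining topic tokens) is accepted here, a wildcard covering every item of the event:

```
MSSQL.SerialNo_Tracking_Asset.Data_Asset1.ToolCtr_PartNo_String_Item1
MSSQL.SerialNo_Tracking_Asset.>
```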
Now, every time the flow reacts to a trigger event and publishes the data to this topic, the integration will send it to the Litmus Production Record Database.
The flow allows users to take advantage of the trigger mechanism to power additional functionality they may seek.
In this example, the trigger signal is used to determine whether a product has gone into rework, as well as to update another tag used for visualization.
It is also possible to add code in front of the processing flow to manipulate tags before they are processed. This can, for example, include changing the values of a tag, such as translating a numeric value like a reason code into a human-readable phrase (see the sketch below).
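A minimal sketch of such a pre-processing step as a Node-RED function node, wired between the datahub subscribe node(s) and the processing flow. The tag name ReasonCode and the code table are purely hypothetical; the payload fields match the DeviceHub message object shown above:

```javascript
// Hypothetical pre-processing function node: translates a numeric
// reason code into a human-readable phrase before the tag reaches
// the event processing flow. Tag name and code table are examples.
const reasonCodes = {
    1: "Material jam",
    2: "Tool change",
    3: "Operator break"
};

if (msg.payload && msg.payload.tagName === "ReasonCode") {
    const phrase = reasonCodes[msg.payload.value];
    if (phrase !== undefined) {
        msg.payload.value = phrase; // replace the numeric code with text
    }
}

return msg; // pass the (possibly modified) message on to the flow
```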