Configuring the Litmus Production Record Event Processing Flow
To allow the flow to monitor tag data recorded by Litmus Edge for event triggers and publish the data to the Litmus Production Record database, there are three mandatory configurations the user has to make.

Configure the Source of the Event Configuration

The flow requires an event monitoring configuration in JSON format to process the raw data correctly. Litmus offers a solution called Litmus Production Record External Event Configurator for creating such a configuration, which can be purchased through the Litmus Central portal.

The flow offers two options as the source of the configuration:

- The entire JSON string copied from the file and pasted into the flow (default option)
- The file itself

To set the source, the user configures the blue inject node under the section "1. Select origin of event configuration". By default, the source is set to true, which means the event configuration is provided as a JSON string. To change the source, use the drop-down to select the right value. The available values are:

- true = the event configuration is provided as a JSON string
- false = the event configuration is provided as a file

Note: To be able to use the file option, the respective JSON file has to be made available in one of two ways:

- The Litmus Edge internal FTP folder "/var/lib/customer/ftp data/" (requires FTP setup and transfer first)
- External storage

After having set the right value, press the Done button and deploy the flow. The flow will also show a color-coded signal indicating which choice has been made: in one case the flow expects the event configuration as a JSON string, in the other case as a file.

Note: The colors also indicate where the user has to perform the next step.

Configure the Event Configuration

The next step is to provide the event configuration according to the setting made in the section "1. Select origin of event configuration". To provide either the JSON string or the path and file name, the user goes to "2. Provide JSON or path and file name". The two available options are color-coded and correspond to the color shown after the first configuration step.

Option 1: JSON string

If in the first step the JSON string option (true) was chosen, the user has to copy the event configuration from its source and paste it into the flow. It is recommended to use the Litmus solution Litmus Production Record External Event Configurator, which creates the JSON file (default name le production record eventconfig.json, but this can be changed by the user upon creation). The user can then open this file and copy its content.

After having copied the content, the user selects the yellow highlighted template node and opens it for editing. Replace the two "{}" with the event configuration by selecting them and then pasting the JSON string, and finish by pressing the Done button. Deploy the flow.

Option 2: File

If in the first step the file option (false) was chosen, the user has to provide the path and file name. It is recommended to use the Litmus solution Litmus Production Record External Event Configurator, which creates the JSON file (default name le production record eventconfig.json, but this can be changed by the user upon creation). The user can then either place this file into a folder which is linked as external storage to Litmus Edge or FTP the file to Litmus Edge.

Once the file is placed correctly, the user selects the green highlighted read file node, opens it for editing, provides the path and file name, and presses the Done button to finish the setup. Deploy the flow.
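The following is a minimal, hypothetical sketch (JavaScript, in the style of a flow function node) of what the two origin settings amount to at runtime: in both cases the flow needs the event configuration as one parsed JSON object before it can evaluate triggers. The property names msg.origin, msg.pastedJson and msg.fileContent are assumptions for illustration only and are not the names used inside the delivered flow.

// Hypothetical function-node sketch, not the shipped flow logic.
// Both origins (JSON string pasted into the template node, or file
// contents delivered by the read file node) end up as one parsed
// event configuration object. Property names are assumptions.
let rawConfig;

if (msg.origin === true) {
    // Origin "true": the event configuration was pasted into the
    // template node as a JSON string.
    rawConfig = msg.pastedJson;
} else {
    // Origin "false": the read file node delivered the file contents,
    // e.g. from the internal FTP folder or external storage.
    rawConfig = msg.fileContent;
}

try {
    // Either way, the flow needs a valid JSON document to evaluate triggers.
    msg.eventConfig = JSON.parse(rawConfig);
} catch (err) {
    node.error("Event configuration is not valid JSON: " + err.message, msg);
    return null; // stop processing this message
}

return msg;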
Configure the Topics for the Tags to Be Used

The purpose of the flow is to ingest tag data collected by Litmus Edge as DeviceHub tags and monitor them in stream for the events of interest. The flow therefore has to subscribe to the topics of the tags which are of interest for the event configuration. The user provides these topics through the section "3. Add the topics for the tags used for traceability".

To provide a topic, users edit the DataHub subscribe node, replace the placeholder "Please enter topic" with the topic of a tag to be used by the event configuration, and press the Done button. Deploy the flow.

As typically more than one tag is required, users have several options for providing all the tags they need.

Option 1: Add more DataHub subscribe nodes

The flow comes with a single DataHub subscribe node in this section, but users are able to add more nodes as needed and wire them all into the output. This option works well if a single DeviceHub device is to be monitored, but can become challenging to support if the flow is used to monitor multiple devices.

Option 2: Wildcards

Litmus Edge allows users to make use of wildcards when providing a topic, so that one DataHub subscribe node can be used to read more than one tag. The flow is able to handle tags being published which are not part of the configuration and will simply drop them, which makes it possible to use wildcards as well. This option has to be chosen carefully: if several thousand tags are read, the flow can experience a slowdown.

Example for a wildcard on a device level.

This option is very well suited if multiple devices are to be monitored, but care has to be taken regarding the number of tags the flow may subscribe to.

Option 3: Combination

To make sure that ideally only those tags used by the event configuration are read, the best approach is to combine both options. This makes use of the ability of Litmus Edge that different DeviceHub devices can have the same tag name for the same information; for example, the tag temperature or downtime can exist on several devices. This means that wildcards are used to read the same tag from all devices which have it, but as each tag needs a DataHub subscribe node, the user would add additional nodes accordingly.

Result

After setting up the flow, it will automatically monitor the tags against the event monitoring configuration and publish a message for each item if a trigger fires. The message structure is based on the DeviceHub message object, to be compliant with older forms of Litmus integrations, and will look like this example:

{"tagname": "partno", "deviceid": "tooling shop", "success": false, "datatype": "pn7458,bn87564", "timestamp": 1684854704011, "value": "pn7458", "registerid": "partno,serialno"}

The topic it will be published to is a combination of the Litmus Edge DeviceHub device, the event, and the item, for example:

mssql serialno tracking asset data asset1 toolctr partno string item1

Final Step

After all configurations have been made and deployed, it is recommended to set up a Litmus integration using the DB Microsoft SQL Server connector. How to set up an integration can be read up on in the Litmus Edge documentation.

After the integration is established, the user needs to subscribe to the topic which has been set up in the event configuration. How a subscription for an integration is added can be found in the Litmus Edge documentation. Add as local data topic the value used in your event configuration.

Now, every time the flow reacts to a trigger event and publishes the data to this topic, the integration will send it to the Litmus Production Record database.
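For orientation, the hypothetical function-node sketch below shows how additional flow logic could unpack a published record message of the shape shown in the Result example above. Reading registerid and datatype as parallel lists of item names and item values is an inference from that example; the helper itself is illustrative only and not part of the delivered flow.

// Hypothetical helper, not part of the delivered flow: unpack a published
// production record message of the shape shown in the example above.
const record = msg.payload;

// In the example message, "registerid" appears to carry the item names and
// "datatype" the matching values, both as comma-separated lists (inference).
const itemNames  = String(record.registerid).split(",");
const itemValues = String(record.datatype).split(",");

// Build a simple name -> value map for easier handling downstream.
const items = {};
itemNames.forEach((name, index) => {
    items[name] = itemValues[index];
});

msg.payload = {
    device:    record.deviceid,             // e.g. "tooling shop"
    trigger:   record.tagname,              // tag that fired the event
    timestamp: new Date(record.timestamp),  // epoch milliseconds in the example
    items:     items                        // e.g. { partno: "pn7458", serialno: "bn87564" }
};

return msg;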
Additional Options

The flow allows users to take advantage of the trigger mechanism to power additional functionality they may seek. In this example, the trigger signal is used to determine whether a product has gone into rework, as well as to update another tag used for visualization.

It is also possible to add code in front of the processing flow to manipulate tags before they are processed. This can, for example, include changing the values of a tag, such as translating a numeric value, for example a reason code, into a human-readable phrase, as illustrated in the sketch below.
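As a concrete illustration of that last point, the sketch below shows what such a pre-processing step could look like as a function node placed in front of the processing flow. The reason-code table and the msg.payload.value field are example assumptions, not values taken from the delivered flow or any specific event configuration.

// Hypothetical pre-processing function node, placed in front of the
// processing flow: translate a numeric reason code carried in a tag's
// value into a human-readable phrase before the event logic sees it.
// The code table and field names are example assumptions only.
const REASON_CODES = {
    1: "Machine blocked",
    2: "Material shortage",
    3: "Tool change",
    4: "Unplanned maintenance"
};

const code = msg.payload.value;

// Only translate when the incoming value is a known reason code;
// otherwise pass the tag through unchanged.
if (REASON_CODES.hasOwnProperty(code)) {
    msg.payload.value = REASON_CODES[code];
}

return msg;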