# Snowflake Snowpipe Stream Integration Guide
> **Note:** This feature is available starting with Litmus Edge 4.1.x.

This guide shows you how to set up an integration between Litmus Edge and Snowflake Snowpipe Streaming. Once the integration is set up, you can use it to publish data from Litmus Edge directly into Snowflake tables using the Snowpipe Streaming high-performance REST API, with low latency and no virtual warehouse required.

> **Note:** The Snowflake Snowpipe Stream connector only supports outbound (publish) data flows: data travels from Litmus Edge to Snowflake. Inbound subscriptions are not supported by this connector.

## Before You Begin

Before configuring the connector in Litmus Edge, complete the following setup steps in Snowflake.

> **Important:** Snowpipe Streaming requires a Snowflake account on Enterprise Edition or higher. Confirm your account type before proceeding.

Refer to the following Snowflake resources to learn more about Snowpipe Streaming and configuring your account:

- [Snowpipe Streaming Overview](https://docs.snowflake.com/en/user-guide/snowpipe-streaming/data-load-snowpipe-streaming-overview)
- [Snowpipe Streaming High-Performance REST API](https://docs.snowflake.com/en/user-guide/snowpipe-streaming/snowpipe-streaming-high-performance-rest-api)
- [Key-Pair Authentication and Key-Pair Rotation](https://docs.snowflake.com/en/user-guide/key-pair-auth)

### Step 1: Generate an RSA Key Pair

> **Note:** This guide uses DeviceHub tag data as an example for the Snowflake table schema and outbound topic configuration. Litmus Edge also supports streaming custom JSON structures produced by Analytics or Digital Twins to Snowflake. The steps are the same, but you will need to define your Snowflake table columns to match your custom payload structure.

Litmus Edge uses RSA key-pair (JWT) authentication to connect to Snowflake. Run the following commands to generate a private key and extract the corresponding public key. If you are on Windows or prefer a different tool, refer to Snowflake's [Key-Pair Authentication and Key-Pair Rotation](https://docs.snowflake.com/en/user-guide/key-pair-auth) guide for platform-specific instructions.

```bash
# Generate a private key in PKCS#8 format
openssl genrsa -out rsa_key.pem 2048
openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in rsa_key.pem -out rsa_key_pkcs8.pem

# Extract the public key
openssl rsa -in rsa_key.pem -pubout -out rsa_key.pub
```

After running these commands, you will have:

- `rsa_key_pkcs8.pem`: the private key to enter in the Litmus Edge connector configuration.
- `rsa_key.pub`: the public key to register with your Snowflake user in the next step.

> **Important:** Store the private key securely. It authenticates Litmus Edge with your Snowflake account.

### Step 2: Register the Public Key with Snowflake

Register the public key from Step 1 with the Snowflake user that Litmus Edge will authenticate as. Run the following SQL in your Snowflake worksheet, replacing:

- `<username>`: your Snowflake username.
- `<public key>`: the contents of `rsa_key.pub`. Exclude the `BEGIN PUBLIC KEY` and `END PUBLIC KEY` header and footer lines.

```sql
ALTER USER <username> SET RSA_PUBLIC_KEY='<public key>';
```

After running this command, note the following values. You will need them when configuring the connector in Litmus Edge:

- **Account**: your Snowflake account identifier, in the format `<orgname>-<account_locator>` (for example, `myorg-ml890087`). Find this under **Admin > Accounts** in the Snowflake web interface.
- **Username**: the Snowflake username you registered the public key on (for example, `litmus_svc_user`). Find this under **Admin > Users and Roles**.
- **Private Key**: the full contents of `rsa_key_pkcs8.pem`, including the `BEGIN PRIVATE KEY` and `END PRIVATE KEY` lines.

### Step 3: Create a Database

In your Snowflake account, create a database to store the data streamed from Litmus Edge, or use an existing database. Replace `<database>` with the name you want to use.

```sql
CREATE DATABASE IF NOT EXISTS <database>;
```

### Step 4: Create a Schema

Within the database, create a schema or use an existing schema. Replace `<schema>` with the name you want to use.

```sql
CREATE SCHEMA IF NOT EXISTS <database>.<schema>;
```
### Step 5: Create a Table

Create a table in the schema with columns that match the data fields Litmus Edge will publish. The following example creates a table compatible with the Litmus Edge DeviceHub tag payload. Replace `<database>`, `<schema>`, and `<table>` with the names you chose in Steps 3 and 4.

```sql
CREATE TABLE IF NOT EXISTS <database>.<schema>.<table> (
    success BOOLEAN,
    datatype STRING,
    timestamp TIMESTAMP_NTZ,
    registerid STRING,
    value NUMBER,
    deviceid STRING,
    tagname STRING,
    devicename STRING,
    description STRING,
    metadata VARIANT
);
```

> **Note:** Column names are case-sensitive and must match the DeviceHub payload field names exactly as shown above. Additional columns may be added to the table; they will receive `NULL` for rows inserted by this connector.

For more information on table creation syntax, see [CREATE TABLE](https://docs.snowflake.com/en/sql-reference/sql/create-table).

## Set Up the Outbound Connection (Publish to Snowflake)

Follow the steps below to enable Litmus Edge to publish data to Snowflake.

### Step 1: Add the Snowflake Snowpipe Stream Connector

Follow the steps to add a connector and select the **Snowflake Snowpipe Stream** connector provider. Configure the following parameters:

- **Name**: the connector name, which is also used internally as the Snowpipe identifier.
- **Account**: your Snowflake account identifier.
- **Username**: the Snowflake username with the RSA public key registered.
- **Private Key**: the RSA private key in PEM format (PKCS#8). Paste the full contents of `rsa_key_pkcs8.pem`, or upload the file, including the header and footer lines.
- **Private Key Passphrase** (optional): the passphrase used when generating the private key. Leave empty if the key was generated without encryption.
- **Database**: the name of the target Snowflake database.
- **Schema**: the name of the target schema within the database.
- **Default Table**: the name of the default target table. This table is used by all outbound topics that do not specify a remote data topic.

> **Important:** Store the private key securely. It is displayed as `<protected>` after saving and must be entered again if you need to update the connector credentials.

### Step 2: Enable the Connector

After adding the connector, use the toggle in the connector tile to enable it. The connector authenticates with Snowflake, discovers the Snowpipe Streaming ingest endpoint, and opens a streaming channel. The status transitions to **Connected** on success.

If you see a **Failed** status, review the Manage Connectors page and relevant error messages. Common issues include:

- The RSA public key is not registered on the Snowflake user. Run `ALTER USER <username> SET RSA_PUBLIC_KEY='<public key>';` in Snowflake.
- The account identifier is incorrect. Verify the format is `<orgname>-<account_locator>`.
- The database, schema, or table does not exist. Confirm the resources were created in the **Before You Begin** steps.

### Step 3: Add an Outbound Topic

After the connector is enabled, add one or more outbound topics to define which Litmus Edge tag data to publish to Snowflake.

To add an outbound topic:

1. Click the connector tile. The connector dashboard appears.
2. Click the **Topics** tab.
3. Click the **Add a new topic** icon. The **Data Integration** dialog box appears.
4. Configure the following parameters:
   - **Data Direction**: select **Local to Remote** (outbound).
   - **Local Data Topic**: select the DeviceHub tag or topic from Litmus Edge that you want to publish.
   - **Remote Data Topic** (optional): enter the name of a Snowflake table to publish this topic's data to. If left empty, data is published to the default table defined at the connector level. The table must exist in the configured database and schema.
   - **Enable**: select the toggle to enable the topic.
5. Click **OK** to add the topic.

From the connector tile, verify that the connector shows a **Connected** status and the topic shows an **Enabled** status.

> **Note:** You can route different device tags to separate Snowflake tables by specifying a different remote data topic per outbound topic. Each unique target table gets its own dedicated streaming channel that is managed automatically by the connector.

### Step 4: Verify Data in Snowflake

To confirm that data is flowing from Litmus Edge into Snowflake:

1. In your Snowflake account, open a worksheet and navigate to the target database and schema.
2. Run a query against the table to view incoming rows:

   ```sql
   SELECT * FROM <database>.<schema>.<table>
   ORDER BY timestamp DESC
   LIMIT 100;
   ```

3. Confirm that rows appear with values from your Litmus Edge tags.

> **Tip:** You can also monitor ingestion from the Snowflake **Query History** view or by checking the table's row count after a short interval.
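Since column-to-field matching is case-sensitive, a quick pre-flight check of a captured payload against your table schema can catch silent mismatches before they show up as missing data. The following sketch uses the example DeviceHub table columns from this guide; the `unmapped_fields` helper and the sample payload are illustrative, so check your own DeviceHub output for the exact field names and casing:

```python
# Columns from the example DeviceHub table in "Step 5: Create a Table".
TABLE_COLUMNS = {
    "success", "datatype", "timestamp", "registerid", "value",
    "deviceid", "tagname", "devicename", "description", "metadata",
}

def unmapped_fields(payload: dict) -> set:
    """Return payload fields that have no same-named table column.

    Hypothetical pre-flight helper: matching is case-sensitive, so a field
    listed here would not land in any column. (Extra table columns with no
    matching field simply receive NULL.)
    """
    return {field for field in payload if field not in TABLE_COLUMNS}

# Illustrative payload shaped like the example schema, not captured from a
# real device.
sample = {
    "success": True,
    "datatype": "float",
    "timestamp": "2024-01-01T00:00:00Z",
    "registerid": "40001",
    "value": 21.5,
    "deviceid": "device-01",
    "tagname": "temperature",
    "devicename": "plc-1",
    "description": "boiler temperature",
    "metadata": {},
}

print(unmapped_fields(sample))  # → set(): every field maps to a column
```

Any field the helper reports (for example, `Success` instead of `success`) indicates either the payload or the table definition needs adjusting before enabling the topic.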