Snowflake

Datafly Signal delivers events to Snowflake for cloud data warehousing, analytics, data sharing, and data marketplace integrations.

⚠️

This integration is currently in beta. Configuration and behaviour may change.

Prerequisites

Before configuring Snowflake in Signal, you need a Snowflake account with a warehouse, database, schema, and a user with appropriate permissions.

Create a Snowflake Account

  1. Sign up at snowflake.com for a free trial or use an existing account.
  2. Choose your cloud provider (AWS, Azure, or GCP) and region.
  3. Note your Account identifier — this includes the account locator and region (e.g. xy12345.us-east-1). Find it in the Snowflake URL or under Admin > Accounts.
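If you already have access to a Snowflake worksheet, the session context functions can also report these values (useful for double-checking the identifier you copied from the URL):

```sql
-- Returns the account locator and the cloud region for the current session
SELECT CURRENT_ACCOUNT(), CURRENT_REGION();
```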

Create a Warehouse

  1. In the Snowflake console (Snowsight), go to Admin > Warehouses.
  2. Click + Warehouse.
  3. Enter a name (e.g. DATAFLY_WH).
  4. Set the Size (X-Small is sufficient for low to moderate event volume).
  5. Enable Auto-suspend (e.g. 5 minutes) and Auto-resume to manage costs.
  6. Click Create Warehouse.

Warehouses are billed per second while running. Auto-suspend and auto-resume ensure you only pay for compute when events are being loaded.
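If the warehouse already exists, the same cost controls can be applied in SQL. Note that AUTO_SUSPEND is specified in seconds, not minutes:

```sql
-- 300 seconds = 5 minutes of idle time before the warehouse suspends
ALTER WAREHOUSE DATAFLY_WH SET
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;
```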

Create a Database and Schema

CREATE DATABASE IF NOT EXISTS ANALYTICS;
CREATE SCHEMA IF NOT EXISTS ANALYTICS.DATAFLY;

Create a Table

CREATE TABLE ANALYTICS.DATAFLY.EVENTS (
  event_id STRING NOT NULL,
  type STRING,
  event STRING,
  anonymous_id STRING,
  user_id STRING,
  timestamp TIMESTAMP_NTZ,
  received_at TIMESTAMP_NTZ,
  sent_at TIMESTAMP_NTZ,
  context VARIANT,
  properties VARIANT,
  traits VARIANT,
  source_id STRING,
  integration_id STRING
);

The VARIANT type stores semi-structured data (JSON) natively. Use Snowflake’s dot notation or GET_PATH() to query nested fields (e.g. properties:total::FLOAT).
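As a sketch of both access styles, assuming your Order Completed events carry a `total` field and an `items` array in `properties` (these keys are illustrative; the actual shape depends on the events your sources send):

```sql
-- Colon syntax for top-level keys, GET_PATH for deeper paths;
-- ::FLOAT and ::STRING cast the VARIANT values to typed columns.
SELECT
  event_id,
  properties:total::FLOAT AS total,
  GET_PATH(properties, 'items[0].sku')::STRING AS first_sku
FROM ANALYTICS.DATAFLY.EVENTS
WHERE event = 'Order Completed';
```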

Create a User and Role for Signal

-- Create a role for Signal
CREATE ROLE DATAFLY_SIGNAL_ROLE;
 
-- Grant warehouse usage
GRANT USAGE ON WAREHOUSE DATAFLY_WH TO ROLE DATAFLY_SIGNAL_ROLE;
 
-- Grant database and schema usage
GRANT USAGE ON DATABASE ANALYTICS TO ROLE DATAFLY_SIGNAL_ROLE;
GRANT USAGE ON SCHEMA ANALYTICS.DATAFLY TO ROLE DATAFLY_SIGNAL_ROLE;
 
-- Grant table write permission
GRANT INSERT ON TABLE ANALYTICS.DATAFLY.EVENTS TO ROLE DATAFLY_SIGNAL_ROLE;
 
-- Create user
CREATE USER DATAFLY_SIGNAL
  PASSWORD = 'your_secure_password'
  DEFAULT_ROLE = DATAFLY_SIGNAL_ROLE
  DEFAULT_WAREHOUSE = DATAFLY_WH;
 
-- Assign role to user
GRANT ROLE DATAFLY_SIGNAL_ROLE TO USER DATAFLY_SIGNAL;
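Before wiring up Signal, you can confirm the grants took effect by inspecting the role and user:

```sql
-- Lists every privilege currently granted to the Signal role
SHOW GRANTS TO ROLE DATAFLY_SIGNAL_ROLE;

-- Confirms the role is attached to the Signal user
SHOW GRANTS TO USER DATAFLY_SIGNAL;
```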

Configuration

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| account_identifier | string | Yes | Your Snowflake account identifier, including the region suffix (e.g. xy12345.us-east-1). |
| warehouse | string | Yes | The Snowflake virtual warehouse to use for loading data. |
| database | string | Yes | The Snowflake database containing the target table. |
| schema | string | Yes | The Snowflake schema containing the target table. |
| username | string | Yes | The Snowflake username for authentication. |
| password | secret | Yes | The password for the Snowflake user account. |
| role | string | No | Optional Snowflake role to assume when connecting. Defaults to the user’s default role. |

Signal Setup

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Snowflake or filter by Cloud Storage.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default blueprint.

API Setup

curl -X POST http://localhost:8084/v1/admin/integration-catalog/snowflake/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Snowflake",
    "variant": "default",
    "config": {
      "account_identifier": "xy12345.us-east-1",
      "warehouse": "DATAFLY_WH",
      "database": "ANALYTICS",
      "schema": "DATAFLY",
      "username": "DATAFLY_SIGNAL",
      "password": "your_secure_password",
      "role": "DATAFLY_SIGNAL_ROLE"
    },
    "delivery_mode": "server_side"
  }'

Testing

  1. Enable the integration in Signal and trigger a test event on your website.
  2. Open the Snowflake console and run a query:
SELECT * FROM ANALYTICS.DATAFLY.EVENTS
ORDER BY timestamp DESC
LIMIT 10;
  3. Verify that event rows are appearing with correct data.
  4. Test VARIANT querying:
SELECT event_id, properties:total::FLOAT AS total, properties:currency::STRING AS currency
FROM ANALYTICS.DATAFLY.EVENTS
WHERE event = 'Order Completed'
ORDER BY timestamp DESC
LIMIT 5;
  5. In Signal, check the Live Events view to confirm the delivery status shows as successful.
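Once events are flowing, a quick aggregate is a useful sanity check that loaded volume matches what you expect. A query sketch (adjust the time window to taste):

```sql
-- Daily event counts for the last 7 days, newest first
SELECT
  DATE_TRUNC('day', received_at) AS day,
  COUNT(*)                       AS events
FROM ANALYTICS.DATAFLY.EVENTS
WHERE received_at >= DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY 1
ORDER BY 1 DESC;
```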

Troubleshooting

| Problem | Solution |
| --- | --- |
| Events not appearing in Snowflake | Verify the account identifier, warehouse, database, schema, and table are correct. |
| Incorrect username or password | Verify the credentials. Snowflake usernames are case-insensitive, but passwords are case-sensitive. |
| Insufficient privileges | The role lacks the required permissions. Verify it has USAGE on the warehouse, database, and schema, plus INSERT on the table. |
| Warehouse suspended | The warehouse auto-suspended. It will auto-resume on the next query, but the first request may be slower. Ensure auto-resume is enabled. |
| Object does not exist | The database, schema, or table does not exist. Verify the names (Snowflake identifiers are case-sensitive when quoted). |
| Account identifier format | The account identifier must include the region suffix. Check under Admin > Accounts in Snowsight. |
| Connection timeout | Ensure Signal can reach account_identifier.snowflakecomputing.com on port 443. Check network policies and firewall rules. |

Visit the Snowflake documentation for the full SQL reference, VARIANT query syntax, and cost management guides.