Snowflake
Datafly Signal delivers events to Snowflake for cloud data warehousing, analytics, data sharing, and data marketplace integrations.
This integration is currently in beta. Configuration and behaviour may change.
Prerequisites
Before configuring Snowflake in Signal, you need a Snowflake account with a warehouse, database, schema, and a user with appropriate permissions.
Create a Snowflake Account
- Sign up at snowflake.com for a free trial or use an existing account.
- Choose your cloud provider (AWS, Azure, or GCP) and region.
- Note your Account identifier, which includes the account locator and region (e.g. xy12345.us-east-1). Find it in the Snowflake URL or under Admin > Accounts.
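If you are already connected in a Snowflake worksheet, you can confirm these values with the built-in context functions (a quick sanity check, not required for setup):

```sql
-- Returns the account locator and the region of the current account
SELECT CURRENT_ACCOUNT(), CURRENT_REGION();
```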
Create a Warehouse
- In the Snowflake console (Snowsight), go to Admin > Warehouses.
- Click + Warehouse.
- Enter a name (e.g. DATAFLY_WH).
- Set the Size (X-Small is sufficient for low to moderate event volume).
- Enable Auto-suspend (e.g. 5 minutes) and Auto-resume to manage costs.
- Click Create Warehouse.
Warehouses are billed per second while running. Auto-suspend and auto-resume ensure you only pay for compute when events are being loaded.
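If you prefer SQL over the Snowsight UI, the same warehouse can be created in a single statement (a sketch using the names from this guide; note that AUTO_SUSPEND is specified in seconds):

```sql
CREATE WAREHOUSE IF NOT EXISTS DATAFLY_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300      -- suspend after 5 minutes of inactivity
  AUTO_RESUME = TRUE;     -- resume automatically on the next query
```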
Create a Database and Schema
CREATE DATABASE IF NOT EXISTS ANALYTICS;
CREATE SCHEMA IF NOT EXISTS ANALYTICS.DATAFLY;
Create a Table
CREATE TABLE ANALYTICS.DATAFLY.EVENTS (
event_id STRING NOT NULL,
type STRING,
event STRING,
anonymous_id STRING,
user_id STRING,
timestamp TIMESTAMP_NTZ,
received_at TIMESTAMP_NTZ,
sent_at TIMESTAMP_NTZ,
context VARIANT,
properties VARIANT,
traits VARIANT,
source_id STRING,
integration_id STRING
);
The VARIANT type stores semi-structured data (JSON) natively. Use Snowflake’s dot notation or GET_PATH() to query nested fields (e.g. properties:total::FLOAT).
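When you don’t know a payload’s shape in advance, LATERAL FLATTEN expands a VARIANT column into key/value rows. An illustrative query against the EVENTS table above:

```sql
-- List top-level key/value pairs stored in the properties column
SELECT e.event_id, f.key, f.value
FROM ANALYTICS.DATAFLY.EVENTS e,
     LATERAL FLATTEN(input => e.properties) f
LIMIT 20;
```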
Create a User and Role for Signal
-- Create a role for Signal
CREATE ROLE DATAFLY_SIGNAL_ROLE;
-- Grant warehouse usage
GRANT USAGE ON WAREHOUSE DATAFLY_WH TO ROLE DATAFLY_SIGNAL_ROLE;
-- Grant database and schema usage
GRANT USAGE ON DATABASE ANALYTICS TO ROLE DATAFLY_SIGNAL_ROLE;
GRANT USAGE ON SCHEMA ANALYTICS.DATAFLY TO ROLE DATAFLY_SIGNAL_ROLE;
-- Grant table write permission
GRANT INSERT ON TABLE ANALYTICS.DATAFLY.EVENTS TO ROLE DATAFLY_SIGNAL_ROLE;
-- Create user
CREATE USER DATAFLY_SIGNAL
PASSWORD = 'your_secure_password'
DEFAULT_ROLE = DATAFLY_SIGNAL_ROLE
DEFAULT_WAREHOUSE = DATAFLY_WH;
-- Assign role to user
GRANT ROLE DATAFLY_SIGNAL_ROLE TO USER DATAFLY_SIGNAL;
Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| account_identifier | string | Yes | Your Snowflake account identifier, including the region suffix (e.g. xy12345.us-east-1). |
| warehouse | string | Yes | The Snowflake virtual warehouse to use for loading data. |
| database | string | Yes | The Snowflake database containing the target table. |
| schema | string | Yes | The Snowflake schema containing the target table. |
| username | string | Yes | The Snowflake username for authentication. |
| password | secret | Yes | The password for the Snowflake user account. |
| role | string | No | Optional Snowflake role to assume when connecting. Defaults to the user’s default role. |
Signal Setup
Quick Setup
- Navigate to Integrations in the sidebar.
- Open the Integration Library tab.
- Find Snowflake or filter by Cloud Storage.
- Click Install, select a variant if available, and fill in the required fields.
- Click Install Integration to create the integration with a ready-to-use default blueprint.
API Setup
curl -X POST http://localhost:8084/v1/admin/integration-catalog/snowflake/install \
-H "Authorization: Bearer YOUR_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "Snowflake",
"variant": "default",
"config": {
"account_identifier": "xy12345.us-east-1",
"warehouse": "DATAFLY_WH",
"database": "ANALYTICS",
"schema": "DATAFLY",
"username": "DATAFLY_SIGNAL",
"password": "your_secure_password",
"role": "DATAFLY_SIGNAL_ROLE"
},
"delivery_mode": "server_side"
}'
Testing
- Enable the integration in Signal and trigger a test event on your website.
- Open the Snowflake console and run a query:
SELECT * FROM ANALYTICS.DATAFLY.EVENTS
ORDER BY timestamp DESC
LIMIT 10;
- Verify that event rows are appearing with correct data.
- Test VARIANT querying:
SELECT event_id, properties:total::FLOAT AS total, properties:currency::STRING AS currency
FROM ANALYTICS.DATAFLY.EVENTS
WHERE event = 'Order Completed'
ORDER BY timestamp DESC
LIMIT 5;
- In Signal, check the Live Events view to confirm delivery status shows as successful.
Troubleshooting
| Problem | Solution |
|---|---|
| Events not appearing in Snowflake | Verify the account identifier, warehouse, database, schema, and table are correct. |
| Incorrect username or password | Verify the credentials. Snowflake usernames are case-insensitive, but passwords are case-sensitive. |
| Insufficient privileges | Verify the role has USAGE on the warehouse, database, and schema, plus INSERT on the table. |
| Warehouse suspended | The warehouse auto-suspended. It will auto-resume on the next query but the first request may be slower. Ensure auto-resume is enabled. |
| Object does not exist | The database, schema, or table does not exist. Verify the names (Snowflake identifiers are case-sensitive when quoted). |
| Account identifier format | The account identifier must include the region suffix. Check under Admin > Accounts in Snowsight. |
| Connection timeout | Ensure Signal can reach account_identifier.snowflakecomputing.com on port 443. Check network policies and firewall rules. |
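To rule out a suspended or misconfigured warehouse, you can inspect it directly in a worksheet (a diagnostic sketch using the warehouse name from this guide):

```sql
-- The STATE column shows STARTED or SUSPENDED;
-- also confirm the AUTO_RESUME column is set to true
SHOW WAREHOUSES LIKE 'DATAFLY_WH';
```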
Visit Snowflake documentation for full SQL reference, VARIANT query syntax, and cost management guides.