Databricks
Load event data into Databricks for unified analytics, data engineering, and machine learning on a lakehouse platform.
Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| workspace_url | string | Yes | The Databricks workspace URL, e.g. https://xxx.cloud.databricks.com. |
| warehouse_id | string | Yes | The SQL warehouse ID to execute queries against. |
| catalog | string | Yes | The Unity Catalog name. |
| schema_name | string | Yes | The schema name within the catalog. |
| table_name | string | Yes | The target table name to insert rows into. |
| access_token | secret | Yes | Personal access token or service principal token for authentication. |
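Before installing, it can help to check that a config object covers every required field from the table above. A minimal sketch in Python; the `validate_config` helper is illustrative and not part of any Datafly or Databricks API:

```python
# Required fields for the Databricks integration, per the table above.
REQUIRED_FIELDS = (
    "workspace_url", "warehouse_id", "catalog",
    "schema_name", "table_name", "access_token",
)

def validate_config(config: dict) -> list:
    """Return the names of required fields that are missing or empty."""
    return [field for field in REQUIRED_FIELDS if not config.get(field)]

config = {
    "workspace_url": "https://xxx.cloud.databricks.com",
    "warehouse_id": "abc123def456",
    "catalog": "main",
    "schema_name": "datafly",
    "table_name": "events",
}

missing = validate_config(config)  # access_token has not been set yet
```

Here `validate_config(config)` reports `["access_token"]` until the secret is supplied, catching incomplete configs before an install attempt.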
Quick Setup
- Navigate to Integrations in the sidebar.
- Open the Integration Library tab.
- Find Databricks or filter by Cloud Storage.
- Click Install, select a variant if available, and fill in the required fields.
- Click Install Integration to create the integration with a ready-to-use default configuration.
API Setup
```bash
curl -X POST http://localhost:8084/v1/admin/integration-catalog/databricks/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Databricks",
    "variant": "default",
    "config": {
      "workspace_url": "https://xxx.cloud.databricks.com",
      "warehouse_id": "abc123def456",
      "catalog": "main",
      "schema_name": "datafly",
      "table_name": "events"
    },
    "delivery_mode": "server_side"
  }'
```
Delivery
Events are delivered server-side from your Datafly Signal infrastructure directly to Databricks. No client-side scripts are loaded for this integration.
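Conceptually, server-side delivery comes down to running INSERT statements against the configured SQL warehouse. A hedged sketch of what one such request could look like, using the Databricks SQL Statement Execution API (`POST /api/2.0/sql/statements`); the event shape and the `build_insert_request` helper are illustrative assumptions, not the actual Datafly delivery code:

```python
import json

def build_insert_request(config: dict, event: dict) -> dict:
    """Build a Databricks SQL Statement Execution API payload that inserts
    one event row into the configured target table.
    Illustrative sketch only; the real delivery path may batch or differ."""
    # Fully qualified Unity Catalog table: catalog.schema.table
    table = f"{config['catalog']}.{config['schema_name']}.{config['table_name']}"
    return {
        "warehouse_id": config["warehouse_id"],
        "statement": (
            f"INSERT INTO {table} (event_name, payload) "
            "VALUES (:event_name, :payload)"
        ),
        # Named parameters keep event values out of the SQL string itself.
        "parameters": [
            {"name": "event_name", "value": event["name"], "type": "STRING"},
            {"name": "payload", "value": json.dumps(event["data"]), "type": "STRING"},
        ],
    }

config = {
    "workspace_url": "https://xxx.cloud.databricks.com",
    "warehouse_id": "abc123def456",
    "catalog": "main",
    "schema_name": "datafly",
    "table_name": "events",
}
req = build_insert_request(config, {"name": "page_view", "data": {"path": "/home"}})
# POST this JSON to {workspace_url}/api/2.0/sql/statements,
# authenticating with the integration's access_token as a Bearer token.
```

The assumed `events` table here has `event_name` and `payload` columns; your actual schema is whatever you created in Databricks.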
See the Databricks documentation for full API details and credential setup instructions.