Databricks

Load event data into Databricks for unified analytics, data engineering, and machine learning on a lakehouse platform.

Configuration

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| workspace_url | string | Yes | The Databricks workspace URL, e.g. https://xxx.cloud.databricks.com. |
| warehouse_id | string | Yes | The SQL warehouse ID to execute queries against. |
| catalog | string | Yes | The Unity Catalog name. |
| schema_name | string | Yes | The schema name within the catalog. |
| table_name | string | Yes | The target table name to insert rows into. |
| access_token | secret | Yes | Personal access token or service principal token for authentication. |
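Before installing, one way to sanity-check the access token and warehouse ID is to call the Databricks SQL Warehouses REST API directly. This is a sketch: the workspace URL, warehouse ID, and token below are placeholders matching the sample values used elsewhere on this page.

```shell
# Verify the token and warehouse ID against the Databricks REST API.
# Replace the workspace URL, warehouse ID, and token with your own values.
curl -s https://xxx.cloud.databricks.com/api/2.0/sql/warehouses/abc123def456 \
  -H "Authorization: Bearer YOUR_DATABRICKS_TOKEN"
```

A 200 response containing the warehouse's name and state confirms the credentials and warehouse ID are valid; a 401 or 403 points to a token problem.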

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Databricks or filter by Cloud Storage.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default configuration.

API Setup

curl -X POST http://localhost:8084/v1/admin/integration-catalog/databricks/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Databricks",
    "variant": "default",
    "config": {
      "workspace_url": "https://xxx.cloud.databricks.com",
      "warehouse_id": "abc123def456",
      "catalog": "main",
      "schema_name": "datafly",
      "table_name": "events"
    },
    "delivery_mode": "server_side"
  }'
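Once the integration is installed and events are flowing, you can confirm that rows are landing in the target table with the Databricks SQL Statement Execution API. This is a sketch: the catalog, schema, and table names match the sample config above, and the workspace URL, warehouse ID, and token are placeholders.

```shell
# Count rows delivered to the target table via the SQL Statement Execution API.
curl -s -X POST https://xxx.cloud.databricks.com/api/2.0/sql/statements \
  -H "Authorization: Bearer YOUR_DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "warehouse_id": "abc123def456",
    "statement": "SELECT COUNT(*) FROM main.datafly.events",
    "wait_timeout": "30s"
  }'
```

The response includes a result payload with the row count once the statement finishes within the wait timeout.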

Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Databricks. No client-side scripts are loaded for this integration.

See the Databricks documentation for full API details and credential setup instructions.