
Quick Start

Get Datafly Signal running locally and send your first event in under five minutes.

Prerequisites

Make sure you have Docker (24+) with Docker Compose v2 installed. You will also need Go 1.25, Node.js 20+, golang-migrate, and psql (PostgreSQL client).

Step 1: Clone the Repository

git clone https://github.com/datafly/signal.git
cd signal/application

All commands in this guide are run from the application/ directory.

Step 2: Start Infrastructure

Start Kafka, Redis, PostgreSQL, and Kafka UI with a single command:

make docker-up

This launches the following containers:

  • PostgreSQL: localhost:5432
  • Redis: localhost:6379
  • Kafka: localhost:9092
  • Kafka UI: http://localhost:8090

Wait roughly 30 seconds for Kafka to finish its health check before proceeding. You can verify with docker compose -f deployments/docker-compose.yml ps — all services should show healthy.
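If you would rather script the wait than watch it, a small polling loop over the compose status output works. This is a sketch that assumes the compose file path used by make docker-up and matches the "(health: starting)" / "(unhealthy)" markers in the ps output:

```shell
# Block until no container reports "starting" or "unhealthy".
# Assumes the compose file at deployments/docker-compose.yml (as used by make docker-up).
while docker compose -f deployments/docker-compose.yml ps | grep -Eq 'starting|unhealthy'; do
  echo "waiting for services to become healthy..."
  sleep 2
done
echo "all services healthy"
```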

Step 3: Run Database Migrations

Create the schema in PostgreSQL:

make migrate-up

This uses golang-migrate to apply all migration files from database/migrations/.
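If you prefer to invoke golang-migrate directly (for example, against a non-default database), the equivalent command looks like this. The DB_USER, DB_PASS, and DB_NAME values below are placeholders, not values from this guide; substitute whatever deployments/docker-compose.yml configures:

```shell
# Direct golang-migrate invocation. DB_USER/DB_PASS/DB_NAME are placeholders;
# use the credentials from your docker-compose configuration.
migrate -path database/migrations \
  -database "postgres://DB_USER:DB_PASS@localhost:5432/DB_NAME?sslmode=disable" up
```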

Step 4: Seed Development Data

Load a sample organisation, user, source, and integrations:

make seed

The seed inserts a development organisation with a pre-configured source and pipeline key you can use immediately.
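To confirm the seed landed, you can inspect the database with psql. Again, the user and database names are placeholders for your compose configuration:

```shell
# List the tables created by the migrations and seed.
# DB_USER/DB_NAME are placeholders; substitute your docker-compose values.
psql -h localhost -p 5432 -U DB_USER -d DB_NAME -c '\dt'
```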

Step 5: Start the Services

Open a separate terminal tab for each service (all from the application/ directory):

# Terminal 1 -- Ingestion Gateway (port 8080)
make run-ingestion-gateway
 
# Terminal 2 -- Event Processor (port 8081)
make run-event-processor
 
# Terminal 3 -- Delivery Workers (port 8082)
make run-delivery-workers
 
# Terminal 4 -- Identity Hub (port 8083)
make run-identity-hub
 
# Terminal 5 -- Management API (port 8084)
make run-management-api

Then start the Management UI:

# Terminal 6 -- Management UI (port 3000)
cd management-ui && npm install && npm run dev

Each Go service uses sensible defaults for local development. No environment variables are required when running against the Docker Compose infrastructure.
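If you do need to point a service at non-default infrastructure, settings can be overridden per invocation. The variable name below is purely illustrative, not necessarily one the services read; the Configuration page lists the real names:

```shell
# Hypothetical override -- KAFKA_BROKERS is an illustrative variable name only;
# see the Configuration docs for the variables each service actually reads.
KAFKA_BROKERS=localhost:9092 make run-ingestion-gateway
```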

Step 6: Open the Management UI

Navigate to http://localhost:3000 in your browser. Log in with the seeded development credentials displayed by the make seed command.

From the dashboard you can:

  • View your organisation and sources
  • Configure integrations (vendor destinations)
  • Define transformation pipelines — each pipeline can define parameters (e.g. measurement IDs, API secrets) that are injected into integration configs at processing time
  • Open the real-time event debugger

Step 7: Add Datafly.js to a Test Page

Create a simple HTML file to test event collection. Replace YOUR_PIPELINE_KEY with the pipeline key shown in the Management UI under your source settings.

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <title>Datafly Test</title>
</head>
<body>
  <h1>Datafly Signal Test Page</h1>
  <button id="buy">Buy Now</button>
 
  <!-- Load Datafly.js from the Ingestion Gateway -->
  <script>
    !function(){var d=window._df=window._df||[];if(!d.initialize){
    d.methods=["page","track","identify","group","reset"];
    d.factory=function(m){return function(){var a=Array.prototype.slice.call(arguments);
    a.unshift(m);d.push(a);return d}};
    for(var i=0;i<d.methods.length;i++){var m=d.methods[i];d[m]=d.factory(m)}
    d.load=function(k,o){d._pipelineKey=k;d._options=o||{};
    var s=document.createElement("script");s.type="text/javascript";s.async=true;
    s.src=d._options.endpoint?d._options.endpoint+"/d.js":"http://localhost:8080/d.js";
    var f=document.getElementsByTagName("script")[0];f.parentNode.insertBefore(s,f)};
    d.initialize=true}}();
 
    // Initialise with your source pipeline key
    _df.load("YOUR_PIPELINE_KEY", { endpoint: "http://localhost:8080" });
 
    // Track a page view
    _df.page();
 
    // Track a button click
    document.getElementById("buy").addEventListener("click", function () {
      _df.track("Product Purchased", {
        product_id: "SKU-123",
        price: 49.99,
        currency: "USD"
      });
    });
  </script>
</body>
</html>

Serve this file over a local HTTP server (npx serve . or any equivalent) and open it in your browser. Serving over HTTP rather than opening it via file:// avoids browser restrictions on cross-origin requests from local files.

Step 8: See Events Flowing

  1. Go back to the Management UI at http://localhost:3000.
  2. Open the Real-time Debugger from the sidebar.
  3. You should see the page event from your test page appear immediately.
  4. Click the Buy Now button on your test page and watch the Product Purchased event flow through.

You can also inspect events at each stage:

  • Kafka UI at http://localhost:8090 — browse the raw-events and delivery-* topics.
  • Management API — query GET http://localhost:8084/v1/admin/events for recent events.
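For example, to pull recent events from the Management API on the command line and pretty-print them (whether the endpoint takes query parameters or requires auth headers is not covered here, so check the Management API reference):

```shell
# Fetch recent events from the Management API and pretty-print with jq.
# Auth requirements and query parameters are assumptions; adjust as needed.
curl -s "http://localhost:8084/v1/admin/events" | jq '.'
```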

Events flow through the full pipeline:

  1. The Ingestion Gateway receives the event and publishes it to the raw-events Kafka topic.
  2. The Event Processor applies Org Data Layer and Pipeline transformations, then publishes to the per-integration delivery-* topics.
  3. The Delivery Workers send the transformed events to the configured vendor APIs.
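You can also tail the raw-events topic directly from inside the Kafka container. Both the compose service name (kafka) and the console-consumer script path are assumptions about the compose setup; check deployments/docker-compose.yml for the actual names:

```shell
# Tail a few raw events from inside the Kafka container.
# "kafka" as the service name and the script name are assumptions about this setup.
docker compose -f deployments/docker-compose.yml exec kafka \
  kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic raw-events --from-beginning --max-messages 5
```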

What’s Next?

  • Installation — Build from source, understand the module structure.
  • Configuration — Customise environment variables for your setup.
  • Architecture — Deep dive into the data flow and component design.
  • Datafly.js SDK — Full SDK reference with all methods and options.
  • Integrations — Configure vendor destinations (GA4, Meta, TikTok, and more).