InfoSum

Datafly Signal delivers hashed first-party identifiers to InfoSum, the privacy-first data clean room platform from WPP. Once your data is in an InfoSum dataset, you can collaborate with advertisers, publishers, and other partners using deterministic identity matching — without raw PII ever leaving your environment.

⚠️ Alpha integration. The catalog entry, blueprint field mappings, and configuration UI are in place, but the production batch-export delivery path is still being built. Current alpha behaviour delivers per-event hits to InfoSum’s import API, which works for low-volume validation but is not the long-term architecture for clean-room workflows. See Architecture & roadmap below.

What InfoSum does

InfoSum’s core idea is a decentralised Bunker model: every party (you, the advertiser, the publisher) keeps their own data in their own InfoSum-managed environment. To collaborate, parties run matched analyses against each other’s bunkers using Private Set Intersection (PSI) — the platform computes overlaps and aggregates without anyone seeing the other side’s raw rows.
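As a toy illustration of what this kind of collaboration returns, an overlap analysis yields an aggregate (here, a count of matching hashed identifiers), never the underlying rows. The snippet below is not how PSI works internally; real PSI computes the intersection cryptographically without either party revealing its set:

// Toy illustration only: the output of an overlap analysis is an aggregate,
// not the raw records. Real PSI computes this without either side exposing
// its set; the naive intersection below just shows the shape of the result.
const yourHashes = new Set(["hash_a", "hash_b", "hash_c", "hash_d"]);
const partnerHashes = new Set(["hash_b", "hash_d", "hash_e"]);

const overlap = [...yourHashes].filter((h) => partnerHashes.has(h)).length;
console.log(overlap); // 2 matched identities out of 4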

Common use cases:

  • Audience overlap analysis with media partners
  • Closed-loop measurement (campaign exposure ↔ outcomes)
  • Audience activation to walled-garden ad platforms via InfoSum destinations (The Trade Desk, Amazon Data Manager, Google, etc.)
  • Insight generation across first-party datasets

Prerequisites

Before configuring InfoSum in Signal, you need:

An InfoSum account and Bunker

InfoSum is enterprise-only — contact InfoSum for access. Your Bunker is provisioned in a region (EU, UK, or US) that must match the residency requirements of the data you’re loading.

A dataset in your Bunker

Create a dataset via the InfoSum UI under Datasets → New Dataset. The dataset defines the schema of the records you’ll publish (typically: hashed email, hashed phone, customer ID, plus any non-PII attributes you want to make available for matching).

An importer

Importers map incoming columns from Signal to your dataset’s schema. Create one in the InfoSum UI under Imports → New Importer and configure the column mapping for the fields Signal will send (see Identity signals below).

A V2 API key

Generate one in the InfoSum platform under your account settings. Signal uses V2 API keys exclusively — V1 keys are not supported.

Configure in Signal

Configuration fields

| Field | Required | Description |
| --- | --- | --- |
| api_key | Yes | Your InfoSum V2 API key. |
| dataset_id | Yes | The dataset to publish to. Must already exist in your Bunker. |
| importer_id | Yes | The importer that maps incoming columns to the dataset schema. |
| region | Yes | InfoSum residency region (eu, uk, or us). Must match your Bunker. |
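
For illustration, here are the four values side by side. Everything below is a placeholder; the ID formats shown are not InfoSum’s real formats, and in practice you enter these in the management UI rather than in code:

// Placeholder values for illustration only.
const infosumConfig = {
  api_key: "YOUR_V2_API_KEY",   // V2 key generated in InfoSum account settings
  dataset_id: "ds_example",     // must already exist in your Bunker
  importer_id: "imp_example",   // must map the columns Signal sends
  region: "eu"                  // eu, uk, or us; must match the Bunker's region
};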

Management UI setup

  1. Go to Integrations → Add Integration → InfoSum.
  2. Enter the four fields above.
  3. Select consent categories appropriate to your use case (typically marketing or advertising).
  4. Click Save.

Identity signals

InfoSum matches on deterministic identifiers — typically a SHA-256 hash of a normalised email or phone number. Signal hashes these server-side before they leave your infrastructure.

The default blueprint maps the following Datafly fields to InfoSum columns:

| Datafly source | InfoSum column | Hashing |
| --- | --- | --- |
| user_id | customer_id | Plain text |
| $traits.email | email | SHA-256, lowercase, trimmed |
| $traits.first_name | first_name | Plain text |
| $traits.last_name | last_name | Plain text |
| $traits.phone | phone | SHA-256, normalised |

Your importer column names must match the targets shown above (or you can override them in the blueprint editor). Hashing rules in your InfoSum normalisation config must agree with Signal’s — both sides need to lowercase and trim before hashing for matches to work.
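
As a reference point for the hashing rules above, here is a minimal sketch of the lowercase-trim-then-SHA-256 step using Node's built-in crypto module. The phone normalisation shown (keep digits and a leading +, drop everything else) is an assumption, not Signal's documented rule; confirm both against your InfoSum normalisation config:

const { createHash } = require("crypto");

// Email: trim whitespace, lowercase, then SHA-256 hex digest. Signal and
// your InfoSum normalisation config must apply the same steps, or
// deterministic matching will fail.
function hashEmail(email) {
  const normalised = email.trim().toLowerCase();
  return createHash("sha256").update(normalised).digest("hex");
}

// Phone: the exact normalisation is an assumption here (keep digits and a
// leading +, strip spaces and punctuation). Check your InfoSum settings.
function hashPhone(phone) {
  const normalised = phone.replace(/[^+0-9]/g, "");
  return createHash("sha256").update(normalised).digest("hex");
}

// "  [email protected] " and "[email protected]" produce the same digest;
// skipping the trim/lowercase step would not.
console.log(hashEmail("  [email protected] ") === hashEmail("[email protected]")); // true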

How to send user data

Call _df.identify() when a user logs in, registers, or submits a form:

_df.identify("user-123", {
  email: "[email protected]",
  phone: "+44 7700 900123",
  firstName: "Jane",
  lastName: "Doe"
});

Signal normalises and hashes these fields automatically before sending to InfoSum.

Architecture & roadmap

InfoSum is not a per-event HTTP destination like GA4 or Meta CAPI. It’s a batch / file-based platform: the canonical flow is stage records into cloud storage → InfoSum imports them → InfoSum normalises and publishes to your dataset. Per-event API calls work for low-volume validation but are wasteful and don’t match the platform’s design.

Signal’s roadmap delivers this in two phases:

| Phase | Scope | Status |
| --- | --- | --- |
| Alpha (current) | Catalog entry, blueprint field mappings, configuration UI, per-event delivery via POST /api/v2/import/executions/execute. Suitable for low-volume validation and demos. | Available |
| Beta | Object-storage staging (S3/GCS/Azure Blob) in your environment, scheduled flush to an InfoSum cloud vault, batched importer execution. Production-grade. Cloud vault stays in your cloud — InfoSum pulls from it. | In design |
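
To make the alpha row concrete, here is a rough sketch of what a single per-event delivery amounts to. Only the endpoint path comes from the table above; the base URL, auth header, and payload shape are assumptions for illustration. Signal performs this call server-side and builds the real request from your blueprint mappings:

// Rough sketch of the alpha per-event path. Only the endpoint path is taken
// from the roadmap table; the base URL, auth header, and payload shape are
// assumptions. Signal performs this delivery for you, server-side.
const INFOSUM_API_BASE = process.env.INFOSUM_API_BASE; // region-dependent placeholder

async function deliverRecord(row, config) {
  const res = await fetch(`${INFOSUM_API_BASE}/api/v2/import/executions/execute`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${config.api_key}`, // assumed auth scheme
      "Content-Type": "application/json"
    },
    // Assumed body shape: one already-hashed row routed to your importer.
    body: JSON.stringify({
      importer_id: config.importer_id,
      dataset_id: config.dataset_id,
      rows: [row]
    })
  });
  if (!res.ok) {
    throw new Error(`InfoSum import execution failed: ${res.status}`);
  }
}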

The Beta architecture is shared with other batch destinations (Google Customer Match, LiveRamp, Snowflake share, Adobe AEP). Once it lands, InfoSum becomes the reference implementation.

Troubleshooting

| Problem | Solution |
| --- | --- |
| Records not appearing in dataset | Verify the importer is configured to expect the column names Signal sends (see Identity signals). Check the importer’s run history in the InfoSum UI for normalisation errors. |
| Low match rate with partners | Confirm your normalisation config and the partner’s both lowercase + trim before hashing. Mismatched salt or casing produces zero matches even with identical underlying values. |
| 401 Unauthorized errors | Confirm you generated a V2 API key, not a V1 key. Check the key has access to the target dataset and importer. |
| Region errors | The region config field must match the residency region of your Bunker exactly. |

Resources