Transformer Reference

Overview

Transformers are categorized by their primary function:

  • Format Conversion Transformers: Convert data between different formats (XML, Protobuf, JSON, GeoJSON)

  • Sink Transformers: Write data to external systems (databases, APIs)

  • Source Transformers: Ingest data from external systems into SDL

  • Data Processing Transformers: Filter, enrich, or otherwise process data streams

  • Dynamic Transformers: User-defined transformation logic (see Dynamic Transformers)

Format Conversion Transformers

These transformers convert data between different formats, enabling interoperability between systems that use different data representations.

CoT XML to GeoJSON

Converts Cursor-on-Target (CoT) XML messages to GeoJSON features.

Property   Value
UID        urn:rdp:transformer:cotxml-to-geojson
Input      Kafka topic (CoT XML messages)
Output     Kafka topic (GeoJSON features)
Labels     cot, xml, geojson

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Converting CoT tracking data for visualization in GIS applications

  • Enabling web-based mapping applications to display CoT entities
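As a rough sketch of the mapping, a CoT event reduces to a GeoJSON Point feature whose coordinates come from the CoT <point> element. The snippet below is illustrative only; the output property names are assumptions, not taken from this transformer's implementation:

const cotToFeature = (event) => ({
    type: 'Feature',
    id: event.uid,                                 // CoT event uid
    geometry: {
        type: 'Point',
        // GeoJSON coordinate order is [longitude, latitude, elevation]
        coordinates: [event.point.lon, event.point.lat, event.point.hae],
    },
    properties: {
        cotType: event.type,                       // e.g. 'a-f-G-U-C'
        callsign: event.detail?.contact?.callsign, // if present in <detail>
        time: event.time,
    },
});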

CoT XML to CoT Protobuf

Converts Cursor-on-Target (CoT) XML messages to their protobuf equivalent.

Property   Value
UID        urn:rdp:transformer:cotxml-to-cotproto
Input      Kafka topic (CoT XML messages)
Output     Kafka topic (CoT Protobuf messages)
Labels     cot, xml, protobuf

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Reducing message size for bandwidth-constrained networks

  • Improving serialization/deserialization performance

  • Preparing CoT data for systems that require protobuf format

CoT XML to ETF Protobuf

Converts Cursor-on-Target (CoT) XML messages to Entity Tracking Format (ETF) protobuf.

Property   Value
UID        urn:rdp:transformer:cotxml-to-etfproto
Input      Kafka topic (CoT XML messages)
Output     Kafka topic (ETF Protobuf messages)
Labels     cot, xml, etf, protobuf

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Converting legacy CoT XML data to modern ETF format

  • Integrating CoT sources with ETF-based systems

CoT XML to Lattice Entities

Converts Cursor-on-Target (CoT) XML messages to Anduril Lattice entity JSON.

Property   Value
UID        urn:rdp:transformer:cotxml-to-lattice
Input      Kafka topic (CoT XML messages)
Output     Kafka topic (Lattice JSON entities)
Labels     cot, xml, lattice

Configuration Parameters:

Parameter                  Description                                                            Default          Required
LATTICE_INTEGRATION_NAME   Integration name to set in Lattice entity provenance                   SOF Data Layer   No
LATTICE_SIMULATED          Mark entities as simulated (sets indicators.simulated flag)            false            No
LOG_LEVEL                  Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info             No

Use Cases:

  • Publishing CoT tracks to Anduril Lattice for C2 visualization

  • Converting CoT data for consumption by Lattice-based applications
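The two Lattice-specific parameters surface in the generated entity roughly as shown below. This fragment is for orientation only; a real Lattice entity carries many more fields:

// Illustrative fragment of a generated Lattice entity.
const entityFragment = {
    provenance: {
        integrationName: 'SOF Data Layer', // from LATTICE_INTEGRATION_NAME
    },
    indicators: {
        simulated: false,                  // from LATTICE_SIMULATED
    },
};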

CoT Protobuf to CoT XML

Converts Cursor-on-Target (CoT) protobuf messages to their XML equivalent.

Property   Value
UID        urn:rdp:transformer:cotproto-to-cotxml
Input      Kafka topic (CoT Protobuf messages)
Output     Kafka topic (CoT XML messages)
Labels     cot, protobuf, xml

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Converting protobuf data for legacy systems that require XML

  • Human-readable debugging of CoT protobuf messages

CoT Protobuf to Lattice Entities

Converts Cursor-on-Target (CoT) Protobuf messages to Anduril Lattice entity JSON.

Property   Value
UID        urn:rdp:transformer:cotproto-to-lattice
Input      Kafka topic (CoT Protobuf messages)
Output     Kafka topic (Lattice JSON entities)
Labels     cot, protobuf, lattice

Configuration Parameters:

Parameter                  Description                                                            Default          Required
LATTICE_INTEGRATION_NAME   Integration name to set in Lattice entity provenance                   SOF Data Layer   No
LATTICE_SIMULATED          Mark entities as simulated (sets indicators.simulated flag)            false            No
LOG_LEVEL                  Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info             No

Use Cases:

  • Publishing CoT protobuf tracks to Anduril Lattice

  • Integrating protobuf-based CoT sources with Lattice

ETF Protobuf to Lattice Entities

Converts Entity Tracking Format (ETF) protobuf messages to Anduril Lattice entity JSON.

Property   Value
UID        urn:rdp:transformer:etfproto-to-lattice
Input      Kafka topic (ETF Protobuf messages)
Output     Kafka topic (Lattice JSON entities)
Labels     etf, protobuf, lattice

Configuration Parameters:

Parameter                  Description                                                            Default          Required
LATTICE_INTEGRATION_NAME   Integration name to set in Lattice entity provenance                   SOF Data Layer   No
LATTICE_SIMULATED          Mark entities as simulated (sets indicators.simulated flag)            false            No
LOG_LEVEL                  Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info             No

Use Cases:

  • Publishing ETF tracks to Anduril Lattice

  • Converting GCCS-J or other ETF sources for Lattice consumption

ETF Protobuf to CoT Protobuf

Converts Entity Tracking Format (ETF) protobuf messages to Cursor-on-Target (CoT) protobuf.

Property   Value
UID        urn:rdp:transformer:etfproto-to-cotproto
Input      Kafka topic (ETF Protobuf messages)
Output     Kafka topic (CoT Protobuf messages)
Labels     etf, protobuf, cot

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Converting modern ETF data to CoT format for legacy systems

  • Bridging ETF-based and CoT-based tracking systems

ETF Protobuf to CoT XML

Converts Entity Tracking Format (ETF) protobuf messages to Cursor-on-Target (CoT) XML.

Property   Value
UID        urn:rdp:transformer:etfproto-to-cotxml
Input      Kafka topic (ETF Protobuf messages)
Output     Kafka topic (CoT XML messages)
Labels     etf, protobuf, cot, xml

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Converting ETF data to CoT XML for legacy systems

  • Human-readable debugging of ETF protobuf messages

ETF Protobuf to GeoJSON

Converts Entity Tracking Format (ETF) protobuf messages to GeoJSON features.

Property   Value
UID        urn:rdp:transformer:etfproto-to-geojson
Input      Kafka topic (ETF Protobuf messages)
Output     Kafka topic (GeoJSON features)
Labels     etf, protobuf, geojson

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Visualizing ETF tracks in web-based mapping applications

  • Converting ETF data for GIS tools

ETF Protobuf to JSON

Converts Entity Tracking Format (ETF) protobuf messages to human-readable JSON for debugging.

Property   Value
UID        urn:rdp:transformer:etfproto-to-json
Input      Kafka topic (ETF Protobuf messages)
Output     Kafka topic (JSON messages)
Labels     etf, protobuf, json

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Debugging ETF protobuf messages

  • Inspecting ETF data structure during development

  • Creating human-readable logs of ETF data streams

This transformer is intended for debugging and should not be used in production pipelines due to the large message sizes generated.

Kafka Passthrough

Passes all messages from the inbound topic to the outbound topic without modification.

Property   Value
UID        urn:rdp:transformer:passthrough
Input      Kafka topic (any format)
Output     Kafka topic (same format as input)
Labels     passthrough, kafka

Configuration Parameters:

Parameter   Description                                                            Default   Required
LOG_LEVEL   Log level (trace, debug, info, warn, error, fatal, panic, disabled)   info      No

Use Cases:

  • Testing data pipeline connectivity

  • Copying data between topics

  • Creating topic mirrors for separate consumption paths

  • Adding monitoring points in data pipelines

Source Transformers

Source transformers ingest data from external systems into SDL. They consume no Kafka input topic; they only produce an output topic.

UDL Ingester

Ingests data from Unified Data Library (UDL).

Property   Value
UID        urn:rdp:source:udl
Input      None (polls UDL API)
Output     Kafka topic
Labels     sdl.catalog.datasource=f39f9db2-21cb-4f0a-a0a6-ca0befbafd1b
Type       Source

Configuration Parameters:

Parameter                 Description                                        Default                      Required
CATALOG_DATASOURCE_PATH   Path to the datasource in the catalog              (empty)                      Yes
CATALOG_DATASET_PATH      Path to the dataset in the catalog                 (empty)                      Yes
CLASSIFICATION_LEVEL      Network classification                             (empty)                      Yes
OAUTH_CLIENT_ID           Client ID for internal OAuth2 communications       (empty)                      Yes
OAUTH_CLIENT_SECRET       Client secret for internal OAuth2 communications   (empty)                      Yes
UDL_URL                   URL of the Unified Data Library instance           (Internet-facing instance)   No
UDL_USER                  Username for authentication                        (empty)                      Yes
UDL_PASS                  Password for authentication                        (empty)                      Yes
CATALOG_URL               URL of the SDL Catalog API                         (empty)                      No
DF_CLASSIFICATION_URL     URL of the SDL Classification service              (empty)                      No

Use Cases:

  • Ingesting data from DoD’s Unified Data Library

  • Pulling reference data into SDL

  • Synchronizing external datasets

REST JSON API Ingester

Ingests data from a generic JSON REST API.

Property   Value
UID        urn:rdp:source:rest
Input      None (polls JSON API)
Output     Kafka topic (JSON messages)
Labels     sdl.catalog.datasource=REST
Type       Source

Configuration Parameters:

Parameter                 Description                                        Default   Required
CATALOG_DATASOURCE_PATH   Path to the datasource in the catalog              (empty)   Yes
CATALOG_DATASET_PATH      Path to the dataset in the catalog                 (empty)   Yes
CLASSIFICATION_LEVEL      Network classification                             (empty)   Yes
OAUTH_CLIENT_ID           Client ID for internal OAuth2 communications       (empty)   Yes
OAUTH_CLIENT_SECRET       Client secret for internal OAuth2 communications   (empty)   Yes
HTTP_JSON_CONFIG          JSON configuration for the HTTP client             (empty)   No
CATALOG_URL               URL of the SDL Catalog API                         (empty)   No
DF_CLASSIFICATION_URL     URL of the SDL Classification service              (empty)   No

Use Cases:

  • Ingesting data from custom REST APIs

  • Polling external services for data

  • Integrating third-party data sources
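The schema of HTTP_JSON_CONFIG is not documented here. A purely hypothetical value, with every key an assumption rather than a documented option, might describe the poll target like so:

// Hypothetical HTTP_JSON_CONFIG value; none of these keys are confirmed
// by this reference. Consult the transformer's own documentation.
const httpJsonConfig = {
    url: 'https://api.example.com/v1/items',      // endpoint to poll
    method: 'GET',
    headers: { Authorization: 'Bearer <token>' }, // auth header, if needed
    pollIntervalMs: 60000,                        // polling cadence
};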

GCCS-J ETF Generator

Generates realistic mock ETF (Entity Tracking Format) protobuf messages using ShadowTraffic for testing and development.

Property   Value
UID        urn:rdp:source:gccsj-etf-mock
Input      None (generates mock data)
Output     Kafka topic (ETF Protobuf messages)
Labels     gccs-j, etf, mock
Type       Source (mock data generator)

Configuration Parameters:

Parameter     Description                                                  Default   Required
LOG_LEVEL     ShadowTraffic log level (TRACE, DEBUG, INFO, WARN, ERROR)    INFO      No
THROTTLE_MS   Milliseconds between message generation cycles               5000      No

Generated Data:

  • Military tracking data with MIL-2525 codes

  • AIS (Automatic Identification System) data

  • Realistic callsigns and unit identifiers

  • Geospatial positioning data

Use Cases:

  • Testing ETF processing pipelines

  • Development and debugging without live data sources

  • Load testing and performance evaluation

  • Training and demonstration environments

This transformer requires ShadowTraffic to be enabled in the platform configuration and is conditionally included based on the global.shadowtraffic.enabled setting.

Sink Transformers

Sink transformers write data to external systems, databases, or APIs. They consume from Kafka topics but do not produce output topics.

Publish to Lattice

Publishes entities to Anduril Lattice via the Lattice API.

Property   Value
UID        urn:rdp:sink:lattice
Input      Kafka topic (Lattice JSON entities)
Output     None (writes to Lattice API)
Labels     lattice
Type       Sink

Configuration Parameters:

Parameter                              Description                                                                           Default            Required
LATTICE_ENDPOINT                       Lattice API endpoint URL                                                              (empty)            Yes
LATTICE_TOKEN                          Lattice API authentication token                                                      (empty)            Yes
LATTICE_SANDBOX_TOKEN                  Lattice sandbox authorization token (sent as anduril-sandbox-authorization header)   (empty)            No
LATTICE_MAX_RETRIES                    Maximum number of retries for transient failures                                      3                  No
LATTICE_TIMEOUT_MS                     Request timeout in milliseconds                                                       30000              No
LATTICE_RESET_INVALID_EXPIRY           Reset any invalid expiryTime values to LATTICE_DEFAULT_EXPIRY_SECONDS                 false              No
LATTICE_DEFAULT_EXPIRY_SECONDS         Default expiry period (from now) in seconds when resetting invalid expiryTime        86400 (24 hours)   No
LATTICE_OVERWRITE_SOURCE_UPDATE_TIME   Overwrite provenance.sourceUpdateTime to current time before publishing              false              No
LATTICE_INTEGRATION_NAME               Integration name to set in Lattice entity provenance                                  SOF Data Layer     No
LOG_LEVEL                              Log level (trace, debug, info, warn, error, fatal, panic, disabled)                  info               No

Use Cases:

  • Publishing real-time tracks to Lattice C2 system

  • Integrating SDL data with Anduril’s ecosystem

  • Feeding processed sensor data to Lattice
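The interaction of LATTICE_RESET_INVALID_EXPIRY and LATTICE_DEFAULT_EXPIRY_SECONDS can be pictured as follows. This is a minimal sketch of the documented behavior, not the transformer's actual code, and it assumes "invalid" means unparseable or already in the past:

// Sketch: reset a bad expiryTime to now + defaultSeconds.
const fixExpiry = (entity, resetInvalid, defaultSeconds) => {
    const parsed = Date.parse(entity.expiryTime);
    const invalid = Number.isNaN(parsed) || parsed <= Date.now();
    if (resetInvalid && invalid) {
        entity.expiryTime =
            new Date(Date.now() + defaultSeconds * 1000).toISOString();
    }
    return entity;
};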

GeoJSON to GeoServer

Writes GeoJSON payloads as features to a PostGIS-enabled PostgreSQL database backing GeoServer.

Property   Value
UID        urn:rdp:sink:geojson-to-postgis
Input      Kafka topic (GeoJSON features)
Output     None (writes to PostgreSQL/PostGIS)
Labels     gis, postgres, geojson
Type       Sink

Configuration Parameters:

Parameter                  Description                                                              Default            Required
POSTGRES_TABLE_NAME        Base table/view name (the history table is named {table}_history)       geojson_features   No
POSTGRES_SCHEMA            Schema name                                                              public             No
POSTGRES_BATCH_SIZE        Number of messages to buffer before writing a batch                     100                No
POSTGRES_FLUSH_INTERVAL    Maximum milliseconds to buffer messages before flushing                 50                 No
GEOJSON_FEATURE_ID_FIELD   Property field used as the feature ID when the GeoJSON id is missing    uid                No
LOG_LEVEL                  Log level (trace, debug, info, warn, error, fatal, panic, disabled)     info               No

Database Schema:

  • View: geojson_features (or configured table name) - Latest view of all unique features

  • History Table: geojson_features_history - Full history of all feature updates

Use Cases:

  • Serving GeoJSON data via WFS/WMS from GeoServer

  • Maintaining historical track data in PostGIS

  • Enabling spatial queries on streaming geospatial data

  • Visualizing real-time tracks in GIS applications
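The GEOJSON_FEATURE_ID_FIELD fallback amounts to the logic below (a sketch of the documented behavior, assuming the configured field is read from the feature's properties):

// Resolve the ID that keys a feature in the latest view.
// Falls back to the configured property (default 'uid') when the
// GeoJSON 'id' member is absent.
const featureId = (feature, idField = 'uid') =>
    feature.id ?? feature.properties?.[idField];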

PostgreSQL connection details are automatically configured from the sdl-geoserver-db-app secret and are not user-configurable.

Iceberg Sink

Sinks data into an Apache Iceberg table for advanced analytics.

Property   Value
UID        urn:rdp:sink:iceberg
Input      Kafka topic (JSON messages)
Output     Iceberg table in object storage
Labels     DataOnboardingSink, IcebergEnabled
Type       Sink

Configuration Parameters:

Parameter                   Description                                                              Default                Required
CATALOG_DATASET_PATH        Path to the dataset in the catalog                                       (empty)                Yes
OAUTH_CLIENT_ID             Client ID of the OAuth2 client to use for internal communications       (empty)                Yes
OAUTH_CLIENT_SECRET         Client secret of the OAuth2 client to use for internal communications   (empty)                Yes
LOG_LEVEL                   Log level (DEBUG, INFO, WARN, ERROR)                                     INFO                   No
ICEBERG_CATALOG_REST_HOST   Iceberg catalog REST host                                                http://df-lakekeeper   No

Use Cases:

  • Building a data lakehouse for historical analytics

  • Enabling time-travel queries on streaming data

  • Creating queryable archives of data pipeline outputs

  • Supporting BI tools and data science workflows

Data Processing Transformers

These transformers filter, enrich, or otherwise process data streams.

Kafka JSON KSQL Filter

Filters JSON messages against a KSQL query, supporting data-quality checks and content-based routing.

Property   Value
UID        urn:rdp:transformer:json-ksql-filter
Input      Kafka topic (JSON messages)
Output     Kafka topic (filtered JSON messages)
Labels     DataProcessing, StreamProcessing

Configuration Parameters:

Parameter              Description                                     Default                                        Required
PROCESSOR_CONFIG_SQL   The KSQL query to use for filtering messages    SELECT * FROM d WHERE d.title LIKE '%Demo%'    Yes

Use Cases:

  • Filtering messages based on field values

  • Routing messages to different topics based on content

  • Data quality checks and validation

  • Removing unwanted or malformed messages

Example Queries:

  • SELECT * FROM d WHERE d.priority > 5 - Filter by priority

  • SELECT * FROM d WHERE d.type = 'alert' - Filter by type

  • SELECT d.id, d.timestamp FROM d WHERE d.status = 'active' - Project specific fields
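For intuition, the default query behaves like the predicate below, rendered here in JavaScript purely to show the semantics (the transformer itself executes KSQL, not JavaScript):

// JavaScript rendering of: SELECT * FROM d WHERE d.title LIKE '%Demo%'
const passes = (d) => typeof d.title === 'string' && d.title.includes('Demo');

passes({ title: 'Demo flight 7' }); // true  -> message forwarded
passes({ title: 'Live mission' });  // false -> message dropped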

Dynamic Transformers

Dynamic transformers allow users to define custom transformation logic using JavaScript, Python, or Rhai scripting languages.

Custom JavaScript Function

Runs a dynamic JavaScript snippet on each message from Kafka.

Property   Value
UID        urn:rdp:transformer:dynamic-js
Input      Kafka topic (JSON messages)
Output     Kafka topic (transformed JSON messages)
Labels     Dynamic

Configuration Parameters:

Parameter       Description                                       Default                      Required
script (file)   JavaScript file containing the handler function   Default passthrough script   Yes

Default Script:

const handler = (obj) => {
    const out = {};      // declare locally rather than leaking a global
    out.input_msg = obj; // wrap the original message
    out.version = '1.0';
    return out;
};

Use Cases:

  • Custom data transformations

  • Field renaming and restructuring

  • Enrichment with computed values

  • Lightweight processing logic
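Building on the default script, a handler that restructures fields and adds a computed value might look like the sketch below. The lat/lon field names are assumptions about the inbound payload, not part of the transformer's contract:

// Illustrative handler: nests 'lat'/'lon' under 'position' and stamps
// the processing time.
const handler = (obj) => {
    const out = { ...obj };
    out.position = { latitude: obj.lat, longitude: obj.lon };
    delete out.lat;
    delete out.lon;
    out.processedAt = new Date().toISOString();
    return out;
};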

For more information on dynamic transformers, including Python and Rhai variants, see Dynamic Transformers.

See Also