Transformer Reference
This reference documents all transformers available in SDL.
Transformers are processing units in a pipeline that do one of the following:

- Ingress data from a source
- Transform or filter data
- Egress data to a sink
For information on how to create and register custom transformers, see Transformer Guide.
For information on dynamic transformers (JavaScript, Python, etc.), see Dynamic Transformers.
- Overview
- Format Conversion Transformers
  - CoT XML to GeoJSON
  - CoT XML to CoT Protobuf
  - CoT XML to ETF Protobuf
  - CoT XML to Lattice Entities
  - CoT Protobuf to CoT XML
  - CoT Protobuf to Lattice Entities
  - ETF Protobuf to Lattice Entities
  - ETF Protobuf to CoT Protobuf
  - ETF Protobuf to CoT XML
  - ETF Protobuf to GeoJSON
  - ETF Protobuf to JSON
  - Kafka Passthrough
- Source Transformers
- Sink Transformers
- Data Processing Transformers
- Dynamic Transformers
- See Also
Overview
Transformers are categorized by their primary function:

- Format Conversion Transformers: Convert data between different formats (XML, Protobuf, JSON, GeoJSON)
- Source Transformers: Ingest data from external systems into SDL
- Sink Transformers: Write data to external systems (databases, APIs)
- Data Processing Transformers: Filter, enrich, or otherwise process data streams
- Dynamic Transformers: User-defined transformation logic (see Dynamic Transformers)
Format Conversion Transformers
These transformers convert data between different formats, enabling interoperability between systems that use different data representations.
CoT XML to GeoJSON
Converts Cursor-on-Target (CoT) XML messages to GeoJSON features.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (CoT XML messages) |
| Output | Kafka topic (GeoJSON features) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Converting CoT tracking data for visualization in GIS applications
- Enabling web-based mapping applications to display CoT entities
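
As a rough illustration of what this conversion produces, the sketch below maps a parsed CoT event to a GeoJSON feature. The transformer's exact field mapping is not documented here; the shapes follow the public CoT and GeoJSON conventions, and all field names are illustrative.

```javascript
// Illustrative only: not the transformer's actual implementation.
// A CoT <event> element, parsed from XML (attribute names per the CoT schema).
const cotEvent = {
  uid: 'TRACK-123',                // unique track identifier
  type: 'a-f-G-U-C',               // CoT type code
  time: '2024-01-01T12:00:00Z',
  point: { lat: 34.05, lon: -118.25, hae: 120.0 }, // hae = height above ellipsoid
};

// The corresponding GeoJSON feature. Note that GeoJSON orders
// coordinates as [longitude, latitude, altitude].
const feature = {
  type: 'Feature',
  id: cotEvent.uid,
  geometry: {
    type: 'Point',
    coordinates: [cotEvent.point.lon, cotEvent.point.lat, cotEvent.point.hae],
  },
  properties: { cotType: cotEvent.type, time: cotEvent.time },
};
```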
CoT XML to CoT Protobuf
Converts Cursor-on-Target (CoT) XML messages to their protobuf equivalent.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (CoT XML messages) |
| Output | Kafka topic (CoT Protobuf messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Reducing message size for bandwidth-constrained networks
- Improving serialization/deserialization performance
- Preparing CoT data for systems that require protobuf format
CoT XML to ETF Protobuf
Converts Cursor-on-Target (CoT) XML messages to Entity Tracking Format (ETF) protobuf.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (CoT XML messages) |
| Output | Kafka topic (ETF Protobuf messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Converting legacy CoT XML data to modern ETF format
- Integrating CoT sources with ETF-based systems
CoT XML to Lattice Entities
Converts Cursor-on-Target (CoT) XML messages to Anduril Lattice entity JSON.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (CoT XML messages) |
| Output | Kafka topic (Lattice JSON entities) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Integration name to set in Lattice entity provenance | | No |
| | Mark entities as simulated (sets the `indicators.simulated` flag) | | No |
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Publishing CoT tracks to Anduril Lattice for C2 visualization
- Converting CoT data for consumption by Lattice-based applications
CoT Protobuf to CoT XML
Converts Cursor-on-Target (CoT) protobuf messages to their XML equivalent.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (CoT Protobuf messages) |
| Output | Kafka topic (CoT XML messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Converting protobuf data for legacy systems that require XML
- Human-readable debugging of CoT protobuf messages
CoT Protobuf to Lattice Entities
Converts Cursor-on-Target (CoT) Protobuf messages to Anduril Lattice entity JSON.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (CoT Protobuf messages) |
| Output | Kafka topic (Lattice JSON entities) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Integration name to set in Lattice entity provenance | | No |
| | Mark entities as simulated (sets the `indicators.simulated` flag) | | No |
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Publishing CoT protobuf tracks to Anduril Lattice
- Integrating protobuf-based CoT sources with Lattice
ETF Protobuf to Lattice Entities
Converts Entity Tracking Format (ETF) protobuf messages to Anduril Lattice entity JSON.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (ETF Protobuf messages) |
| Output | Kafka topic (Lattice JSON entities) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Integration name to set in Lattice entity provenance | | No |
| | Mark entities as simulated (sets the `indicators.simulated` flag) | | No |
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Publishing ETF tracks to Anduril Lattice
- Converting GCCS-J or other ETF sources for Lattice consumption
ETF Protobuf to CoT Protobuf
Converts Entity Tracking Format (ETF) protobuf messages to Cursor-on-Target (CoT) protobuf.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (ETF Protobuf messages) |
| Output | Kafka topic (CoT Protobuf messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Converting modern ETF data to CoT format for legacy systems
- Bridging ETF-based and CoT-based tracking systems
ETF Protobuf to CoT XML
Converts Entity Tracking Format (ETF) protobuf messages to Cursor-on-Target (CoT) XML.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (ETF Protobuf messages) |
| Output | Kafka topic (CoT XML messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Converting ETF data to CoT XML for legacy systems
- Human-readable debugging of ETF protobuf messages
ETF Protobuf to GeoJSON
Converts Entity Tracking Format (ETF) protobuf messages to GeoJSON features.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (ETF Protobuf messages) |
| Output | Kafka topic (GeoJSON features) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Visualizing ETF tracks in web-based mapping applications
- Converting ETF data for GIS tools
ETF Protobuf to JSON
Converts Entity Tracking Format (ETF) protobuf messages to human-readable JSON for debugging.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (ETF Protobuf messages) |
| Output | Kafka topic (JSON messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Debugging ETF protobuf messages
- Inspecting ETF data structure during development
- Creating human-readable logs of ETF data streams
> This transformer is intended for debugging and should not be used in production pipelines due to the large message sizes generated.
Kafka Passthrough
Passes all messages from the inbound topic to the outbound topic without modification.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (any format) |
| Output | Kafka topic (same format as input) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Testing data pipeline connectivity
- Copying data between topics
- Creating topic mirrors for separate consumption paths
- Adding monitoring points in data pipelines
Source Transformers
Source transformers ingest data from external systems into SDL. They do not consume from Kafka topics; they only produce output topics.
UDL Ingester
Ingests data from the Unified Data Library (UDL).
| Property | Value |
|---|---|
| UID | |
| Input | None (polls UDL API) |
| Output | Kafka topic |
| Labels | |
| Type | Source |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Path to the datasource in the catalog | (empty) | Yes |
| | Path to the dataset in the catalog | (empty) | Yes |
| | Network classification | (empty) | Yes |
| | Client ID for internal OAuth2 communications | (empty) | Yes |
| | Client secret for internal OAuth2 communications | (empty) | Yes |
| | URL of the Unified Data Library instance | (Internet-facing instance) | No |
| | Username for authentication | (empty) | Yes |
| | Password for authentication | (empty) | Yes |
| | URL of the SDL Catalog API | (empty) | No |
| | URL of the SDL Classification service | (empty) | No |
Use Cases:

- Ingesting data from DoD’s Unified Data Library
- Pulling reference data into SDL
- Synchronizing external datasets
REST JSON API Ingester
Ingests data from a generic JSON REST API.
| Property | Value |
|---|---|
| UID | |
| Input | None (polls JSON API) |
| Output | Kafka topic (JSON messages) |
| Labels | |
| Type | Source |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Path to the datasource in the catalog | (empty) | Yes |
| | Path to the dataset in the catalog | (empty) | Yes |
| | Network classification | (empty) | Yes |
| | Client ID for internal OAuth2 communications | (empty) | Yes |
| | Client secret for internal OAuth2 communications | (empty) | Yes |
| | JSON configuration for the HTTP client | (empty) | No |
| | URL of the SDL Catalog API | (empty) | No |
| | URL of the SDL Classification service | (empty) | No |
Use Cases:

- Ingesting data from custom REST APIs
- Polling external services for data
- Integrating third-party data sources
GCCS-J ETF Generator
Generates realistic mock ETF (Entity Tracking Format) protobuf messages using ShadowTraffic for testing and development.
| Property | Value |
|---|---|
| UID | |
| Input | None (generates mock data) |
| Output | Kafka topic (ETF Protobuf messages) |
| Labels | |
| Type | Source (mock data generator) |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | ShadowTraffic log level (TRACE, DEBUG, INFO, WARN, ERROR) | | No |
| | Milliseconds between message generation cycles | | No |
Generated Data:

- Military tracking data with MIL-STD-2525 codes
- AIS (Automatic Identification System) data
- Realistic callsigns and unit identifiers
- Geospatial positioning data
Use Cases:

- Testing ETF processing pipelines
- Development and debugging without live data sources
- Load testing and performance evaluation
- Training and demonstration environments
> This transformer requires ShadowTraffic to be enabled in the platform configuration and is conditionally included based on the `global.shadowtraffic.enabled` setting.
Sink Transformers
Sink transformers write data to external systems, databases, or APIs. They consume from Kafka topics but do not produce output topics.
Publish to Lattice
Publishes entities to Anduril Lattice via the Lattice API.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (Lattice JSON entities) |
| Output | None (writes to Lattice API) |
| Labels | |
| Type | Sink |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Lattice API endpoint URL | (empty) | Yes |
| | Lattice API authentication token | (empty) | Yes |
| | Lattice sandbox authorization token (sent as the `anduril-sandbox-authorization` header) | (empty) | No |
| | Maximum number of retries for transient failures | | No |
| | Request timeout in milliseconds | | No |
| | Reset any invalid `expiryTime` values to `LATTICE_DEFAULT_EXPIRY_SECONDS` | | No |
| | Default expiry period (from now) in seconds when resetting an invalid `expiryTime` | | No |
| | Overwrite `provenance.sourceUpdateTime` to the current time before publishing | | No |
| | Integration name to set in Lattice entity provenance | | No |
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Use Cases:

- Publishing real-time tracks to the Lattice C2 system
- Integrating SDL data with Anduril’s ecosystem
- Feeding processed sensor data to Lattice
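
For reference, the sketch below shows the kind of Lattice entity JSON this sink consumes, focusing on the fields the configuration parameters above act on (`expiryTime`, `provenance`, `indicators.simulated`). The surrounding structure is an assumption for illustration, not the authoritative Lattice schema.

```javascript
// Minimal sketch of an input entity; only the commented fields are
// tied to the configuration parameters above. All other names are
// illustrative assumptions.
const entity = {
  entityId: 'track-123',                      // assumed identifier field
  expiryTime: '2024-01-01T12:05:00Z',         // reset when invalid, per the reset parameter
  provenance: {
    integrationName: 'sdl-pipeline',          // set from the integration-name parameter
    sourceUpdateTime: '2024-01-01T12:00:00Z', // optionally overwritten to the current time
  },
  indicators: { simulated: false },           // set by the upstream converters
};
```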
GeoJSON to GeoServer
Writes GeoJSON payloads as features to a PostGIS-enabled PostgreSQL database backing GeoServer.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (GeoJSON features) |
| Output | None (writes to PostgreSQL/PostGIS) |
| Labels | |
| Type | Sink |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Table name (view name; the history table will be `{table}_history`) | | No |
| | Schema name | | No |
| | Buffer this many messages before writing | | No |
| | Max milliseconds to buffer messages before flush | | No |
| | Property field to use as the feature ID if the GeoJSON `id` is missing | | No |
| | Log level (trace, debug, info, warn, error, fatal, panic, disabled) | | No |
Database Schema:

- View: `geojson_features` (or the configured table name) - Latest view of all unique features
- History Table: `geojson_features_history` - Full history of all feature updates
Use Cases:

- Serving GeoJSON data via WFS/WMS from GeoServer
- Maintaining historical track data in PostGIS
- Enabling spatial queries on streaming geospatial data
- Visualizing real-time tracks in GIS applications
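
A hypothetical read-side sketch, assuming the default table names documented above and that PostgreSQL connection details are supplied via the standard PG* environment variables (inside the platform they come from the `sdl-geoserver-db-app` secret):

```javascript
// Queries the view this sink maintains; the full update history lives in
// geojson_features_history. Requires the "pg" client library.
const { Client } = require('pg');

async function latestFeatures() {
  const client = new Client(); // reads PGHOST, PGUSER, PGDATABASE, etc.
  await client.connect();
  const { rows } = await client.query(
    'SELECT * FROM geojson_features LIMIT 10' // latest version of each feature
  );
  await client.end();
  return rows;
}
```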
> PostgreSQL connection details are automatically configured from the `sdl-geoserver-db-app` secret and are not user-configurable.
Iceberg Sink
Sinks data into an Apache Iceberg table for advanced analytics.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (JSON messages) |
| Output | Iceberg table in object storage |
| Labels | |
| Type | Sink |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | Path to the dataset in the catalog | (empty) | Yes |
| | Client ID of the OAuth2 client to use for internal communications | (empty) | Yes |
| | Client secret of the OAuth2 client to use for internal communications | (empty) | Yes |
| | Log level (DEBUG, INFO, WARN, ERROR) | | No |
| | Iceberg catalog REST host | | No |
Use Cases:

- Building a data lakehouse for historical analytics
- Enabling time-travel queries on streaming data
- Creating queryable archives of data pipeline outputs
- Supporting BI tools and data science workflows
Data Processing Transformers
These transformers filter, enrich, or otherwise process data streams.
Kafka JSON KSQL Filter
Filters JSON messages based on a KSQL query for data quality and routing.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (JSON messages) |
| Output | Kafka topic (filtered JSON messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | The KSQL query to use for filtering messages | | Yes |
Use Cases:

- Filtering messages based on field values
- Routing messages to different topics based on content
- Data quality checks and validation
- Removing unwanted or malformed messages
Example Queries:

- `SELECT * FROM d WHERE d.priority > 5` - Filter by priority
- `SELECT * FROM d WHERE d.type = 'alert'` - Filter by type
- `SELECT d.id, d.timestamp FROM d WHERE d.status = 'active'` - Project specific fields
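
The example queries suggest that incoming messages are exposed to KSQL as a stream named `d`. Under that assumption, the first query above would route messages like these (field names illustrative):

```javascript
// Hypothetical inputs for SELECT * FROM d WHERE d.priority > 5
const forwarded = { id: 't-1', priority: 7, type: 'alert' };  // passes the filter
const dropped   = { id: 't-2', priority: 3, type: 'status' }; // filtered out
```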
Dynamic Transformers
Dynamic transformers allow users to define custom transformation logic using JavaScript, Python, or Rhai scripting languages.
Custom JavaScript Function
Runs a dynamic JavaScript snippet on each message from Kafka.
| Property | Value |
|---|---|
| UID | |
| Input | Kafka topic (JSON messages) |
| Output | Kafka topic (transformed JSON messages) |
| Labels | |
Configuration Parameters:
| Parameter | Description | Default | Required |
|---|---|---|---|
| | JavaScript file containing the handler function | Default passthrough script | Yes |
Default Script:

```javascript
// Default passthrough handler: wraps the incoming message and tags it
// with a version before republishing.
const handler = (obj) => {
  const out = {};
  out.input_msg = obj;
  out.version = '1.0';
  return out;
};
```
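
Building on that contract (receive the parsed message, return the object to publish), a custom handler might rename fields, restructure them, and enrich the output with a computed value. This is a sketch; the input field names are illustrative.

```javascript
// Hypothetical custom handler: rename, restructure, and enrich.
const handler = (obj) => {
  const out = {};
  out.trackId = obj.id;                          // rename a field
  out.position = { lat: obj.lat, lon: obj.lon }; // restructure nested data
  out.receivedAt = new Date().toISOString();     // enrich with a computed value
  return out;
};
```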
Use Cases:

- Custom data transformations
- Field renaming and restructuring
- Enrichment with computed values
- Lightweight processing logic
For more information on dynamic transformers, including Python and Rhai variants, see Dynamic Transformers.
See Also
- Transformer Guide - How to create and register transformers
- Dynamic Transformers - Dynamic transformer details
- Transformer configuration - Transformer configuration guide
- Inputs and Outputs - Input/output connection types