Event Streaming
The Event Stream Producer (ESP) is a component of the Jade Platform that enables real‑time data capture and transmission for downstream processing. It uses a journal‑based Change Data Capture (CDC) mechanism to record state changes from the database and publish events to an event streaming platform such as Kafka or Azure Event Hubs.
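As a rough illustration of the event shape, the sketch below builds and JSON‑serializes a hypothetical CDC event. The field names are illustrative assumptions, not the ESP's actual schema; only the identifier format `<database‑unique‑ID>-<object‑OID‑and‑edition>` comes from this documentation.

```python
import json

# Hypothetical CDC event; field names are illustrative only.
event = {
    "eventId": "db01-1234.5.7",   # <database-unique-ID>-<object-OID-and-edition>
    "operation": "update",        # illustrative: create / update / delete
    "className": "Customer",      # illustrative class name
    "state": {"name": "Acme Ltd", "creditLimit": 5000},
}

payload = json.dumps(event)       # JSON-serialized for publishing to a topic
restored = json.loads(payload)    # a consumer parses it back to a structure
print(restored["eventId"])
```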
The key features are listed in the following table.
Feature | Description |
---|---|
Journal-based CDC | Captures state changes from database journals, starting from the latest log sequence number (LSN) by default. |
Event serialization | Converts captured state changes into JSON‑serialized events following a well‑defined JSON schema for consistent data representation. |
Event publishing | Publishes events to one or more topics in an event stream, using Kafka or Azure Event Hubs. |
Configurable operation | The ESP is enabled via configuration settings and supports customization, including the ability to set or reset the starting journal offset. |
At-least-once delivery | The ESP provides "at least once" message delivery semantics, retransmitting messages on failure. |
Globally unique event identifiers | Each event has a unique identifier in the format <database‑unique‑ID>-<object‑OID‑and‑edition>, enabling consumers to detect and manage duplicate events. |
Flexible deployment | The ESP can run on primary or secondary (SDS) databases, supporting a variety of deployment environments. |
Large data handling | Supports large binary (blob) and string (slob) data via external storage. External data is referenced in events using URIs. |
Schema registry support | The producer is designed to support the Confluent Schema Registry protocol for managing and validating schemas, ensuring compatibility with Azure Event Hubs. |
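Because delivery is at least once, consumers may receive the same event more than once; the globally unique event identifier lets them discard duplicates. The following is a minimal deduplication sketch, assuming events carry their identifier in a field (named `eventId` here for illustration); a production consumer would persist the seen‑ID set rather than hold it in memory.

```python
def deduplicate(events):
    """Drop retransmitted duplicates, keyed on the unique event identifier."""
    seen = set()
    unique = []
    for event in events:
        event_id = event["eventId"]  # "eventId" is an illustrative field name
        if event_id in seen:
            continue  # duplicate retransmission; skip it
        seen.add(event_id)
        unique.append(event)
    return unique

incoming = [
    {"eventId": "db01-1000.3.1", "operation": "create"},
    {"eventId": "db01-1000.3.1", "operation": "create"},  # retransmitted
    {"eventId": "db01-1000.3.2", "operation": "update"},
]
print(len(deduplicate(incoming)))  # -> 2
```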
You can download the ESP .zip file from the Extensions page on the Jade World Developer Centre, at https://www.jadeplatform.tech/developer-centre/extensions.
Integration and Extensibility
Integration and extensibility are provided by the components listed in the following table.
Component | Description |
---|---|
Kafka producer | A tested Kafka producer integrates with Azure Event Hubs, using the Event Hubs Kafka interface client API. |
Schema evolution | Events are described using JSON schemas, enabling data consistency across system versions and straightforward validation. |
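To make the Event Hubs integration concrete, the sketch below shows librdkafka‑style producer settings for connecting a Kafka client to Azure Event Hubs over its Kafka interface. The namespace and connection string are placeholders, and the `client.id` is illustrative; the SASL_SSL/PLAIN pattern with username `$ConnectionString` is the standard Event Hubs Kafka configuration, not an ESP‑specific setting.

```python
# Placeholders -- substitute your own Event Hubs namespace and connection string.
namespace = "my-namespace"
connection_string = "Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."

# librdkafka-style settings for the Event Hubs Kafka endpoint (port 9093).
producer_config = {
    "bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "$ConnectionString",   # literal string, per Event Hubs convention
    "sasl.password": connection_string,     # the full connection string is the password
    "client.id": "jade-esp-producer",       # illustrative client ID
}
print(producer_config["bootstrap.servers"])
```

This dictionary would typically be passed to a Kafka producer constructor (for example, `confluent_kafka.Producer(producer_config)`), after which events are published to topics as with any Kafka broker.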