Chapter 1    Event Streaming Overview

Event streaming has gained popularity due to increasing demands for real-time interactivity and data-driven decision-making. Organizations leverage event streaming to power core business processes and deliver exceptional customer experiences.

The Jade Event Stream Producer (ESP) delivers a reliable and performant Event Streaming capability for the Jade Platform, extending the value of the platform for Jade customers and partners. It is a crucial component of the Jade Platform's data processing strategy that enables real‑time data capture and transmission, allowing for immediate analysis and action on events as they occur.

Download the ESP .zip file from the Extensions page on the Jade World Developer Centre; that is, from https://www.jadeplatform.tech/developer-centre/extensions.

The ESP is designed to be configurable, allowing you to adjust settings based on your specific needs. This flexibility ensures optimal performance across various use cases.

The Jade ESP supports both the Apache Kafka application programming interface (API) and Azure Event Hubs, offering flexibility in deployment options. Its key features are listed in the following table.

| Feature | Description |
| --- | --- |
| Journal-based CDC | Captures state changes from database journals, starting from the latest Log Sequence Number (LSN) by default. |
| Event serialization | Converts captured state changes into JSON-serialized events following a well-defined JSON schema for consistent data representation. |
| Event publishing | Events are published to one or more topics in an event stream, using Kafka or Azure Event Hubs. |
| Configurable operation | The ESP is enabled via configuration settings and supports customization, including the ability to set or reset the starting journal offset. |
| At-least-once delivery | The ESP ensures at-least-once message delivery semantics, retransmitting messages in case of failure. |
| Globally unique event identifiers | Each event has a unique identifier in the format <database-unique-ID>-<object-OID-and-edition>, enabling consumers to detect and manage duplicate events. |
| Flexible deployment | The ESP can run on primary or secondary (SDS) databases, supporting a variety of deployment environments. |
| Large data handling | Large binary (blob) or string (slob) data is handled via external storage, with the external data referenced in events using URIs. |
| Schema registry support | The producer is designed to support the Confluent Schema Registry protocol for managing and validating schemas, ensuring compatibility with Azure Event Hubs. |
| Schema evolution | Events are described using JSON schemas, to enable data consistency across system versions and to enable easy validation. |

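Because the ESP provides at-least-once delivery, a consumer may receive the same event more than once after a retransmission, and the globally unique event identifier (<database-unique-ID>-<object-OID-and-edition>) is what lets it discard the repeats. The following is a minimal consumer-side sketch of that idea; the class, helper names, and sample identifiers are illustrative assumptions, not part of the ESP API.

```python
# Sketch of consumer-side de-duplication under at-least-once delivery.
# Event IDs follow the ESP format <database-unique-ID>-<object-OID-and-edition>;
# the sample IDs below are invented for illustration.

class DuplicateFilter:
    """Remembers event IDs already processed and rejects repeats."""

    def __init__(self):
        self._seen = set()

    def accept(self, event_id: str) -> bool:
        """Return True the first time an ID is seen, False on a duplicate."""
        if event_id in self._seen:
            return False
        self._seen.add(event_id)
        return True


dedup = DuplicateFilter()
incoming = [
    "db01-1234.5",   # first delivery
    "db01-1234.5",   # retransmission after a failure: dropped as a duplicate
    "db01-1236.1",
]
processed = [eid for eid in incoming if dedup.accept(eid)]
```

A production consumer would persist the seen-ID set (or a bounded window of it) so de-duplication survives consumer restarts.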
The ESP processes captured state changes, serializes them into events, and publishes those events to one or more topics on an event stream.
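The capture-serialize-publish pipeline can be sketched for the serialization step, including the large-data rule from the features table: oversized blob or slob values are stored externally and referenced in the event by URI rather than embedded. The field names, JSON layout, size threshold, and URI scheme below are assumptions for illustration only; the actual event structure is defined by the ESP's published JSON schema.

```python
import json

# Hypothetical sketch of ESP-style event serialization: a captured state
# change becomes a JSON event, and large binary/string values are replaced
# by a URI reference to external storage (field names and the threshold
# are assumptions, not the real ESP schema).

BLOB_THRESHOLD = 64  # bytes; illustrative cut-off for external storage

def serialize_event(event_id: str, entity: str, change: dict) -> str:
    payload = {}
    for field, value in change.items():
        if isinstance(value, (bytes, str)) and len(value) > BLOB_THRESHOLD:
            # Large data: reference it by URI instead of embedding it.
            payload[field] = {"uri": f"https://blob.example.invalid/{event_id}/{field}"}
        elif isinstance(value, bytes):
            payload[field] = value.decode("latin-1")  # small binary inlined
        else:
            payload[field] = value
    return json.dumps({"id": event_id, "entity": entity, "data": payload})


event = serialize_event("db01-1234.5", "Customer",
                        {"name": "Ada", "photo": b"\x00" * 1000})
```

The resulting JSON string would then be published to the configured Kafka or Azure Event Hubs topic.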