Chapter 1 Event Streaming Overview
Event streaming has gained popularity due to increasing demands for real-time interactivity and data-driven decision-making. Organizations leverage event streaming to power core business processes and deliver exceptional customer experiences, gaining many benefits, including:
- Faster insights and actions: You can react to changes in the environment, customer behavior, or market conditions in milliseconds rather than hours or days. This can lead to improved customer experience, operational efficiency, and competitive advantage.
- Modernization and integration: You can break down data silos and connect heritage systems with modern applications, using a common event streaming platform that acts as a central nervous system for an organization.
- Greater agility and innovation: You can develop and deploy new applications and features faster, by making use of an architecture that is based on event-driven communication. You can also experiment with new ideas and test them in real time, without affecting your existing systems.
- Single source of truth: You can store all your events in a durable, scalable, and secure event log that preserves the order and context of the events. You can use this event log as a single source of truth for your data, and replay or rewind the events for different purposes; for example, auditing, debugging, or analytics.
The Jade Event Stream Producer (ESP) delivers a reliable and performant Event Streaming capability for the Jade Platform, extending the value of the platform for Jade customers and partners. It is a crucial component of the Jade Platform's data processing strategy that enables real‑time data capture and transmission, allowing for immediate analysis and action on events as they occur.
Download the ESP .zip file from the Extensions page on the Jade World Developer Centre; that is, from https://www.jadeplatform.tech/developer-centre/extensions.
The ESP is designed to be configurable, allowing you to adjust settings based on your specific needs. This flexibility ensures optimal performance across various use cases.
The Jade ESP supports both the Apache Kafka application programming interface (API) and Azure Event Hubs, offering flexibility in deployment options. Key features include:
- Seamless integration with a Jade Platform database
- Support for incremental replication to multiple target databases
The key features are listed in the following table.
| Feature | Description |
|---|---|
| Journal-based CDC | Captures state changes from database journals, starting from the latest Log Sequence Number (LSN) by default. |
| Event serialization | Converts captured state changes into JSON-serialized events that follow a well-defined JSON schema, for consistent data representation. |
| Event publishing | Publishes events to one or more topics in an event stream, using Kafka or Azure Event Hubs. |
| Configurable operation | Enabled via configuration settings and supports customization, including the ability to set or reset the starting journal offset. |
| At-least-once delivery | Ensures at-least-once message delivery semantics, retransmitting messages in case of failure. |
| Globally unique event identifiers | Gives each event a unique identifier in the format <database-unique-ID>-<object-OID-and-edition>, enabling consumers to detect and manage duplicate events. |
| Flexible deployment | Runs on primary or secondary (SDS) databases, supporting a variety of deployment environments. |
| Large data handling | Handles large binary (blob) or string (slob) data via external storage. External data is referenced in events using URIs. |
| Schema registry support | Supports the Confluent Schema Registry protocol for managing and validating schemas, ensuring compatibility with Azure Event Hubs. |
| Schema evolution | Describes events using JSON schemas, enabling data consistency across system versions and easy validation. |
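To make the serialization and publishing steps concrete, the following sketch builds and round-trips an event in JSON. Only the eventId format (<database-unique-ID>-<object-OID-and-edition>) comes from this documentation; the other field names (operation, entityType, properties) are illustrative assumptions, not the ESP's actual schema.

```python
import json

# Hypothetical shape of a serialized event. Only the eventId format is
# documented; the remaining field names are illustrative assumptions.
event = {
    "eventId": "2b317b93-b2bf-ee11-90bd-005056b64821.2429.1.2",
    "operation": "update",        # assumed: create | update | delete
    "entityType": "Customer",     # assumed: the class of the changed object
    "properties": {               # assumed: the serialized state change
        "name": "Acme Ltd",
        "creditLimit": 50000,
    },
}

payload = json.dumps(event)      # what a producer would publish to a topic
restored = json.loads(payload)   # what a consumer would deserialize

print(restored["eventId"])
```

Because the payload is plain JSON described by a schema, any consumer that can parse JSON and validate against the published schema can process the stream, regardless of whether it was delivered via Kafka or Azure Event Hubs.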
The ESP processes captured state changes, serializes them into events, and publishes the events to one or more topics on an event stream.
- The ESP is enabled by configuration. (See "Event Stream Producer Configuration", in Chapter 3.)
- You can set or reset the starting offset in a journal, if required.
- Events are published in JSON format, described using a JSON schema (https://json-schema.org).
- The ESP can run on a primary or SDS secondary database. The capture process can run on a native SDS secondary database.
- Blob and slob properties are included in parent object create, update, or delete events.
- The JournalCloseAction parameter with a value of Move in the [PersistentDb] section of the Jade initialization file is supported; that is, journal capture looks for journals in both the current and the archive journal directories.
- At-least-once message delivery semantics means that if a failure leads to message loss, or recovery from a failure takes too long, messages are retransmitted to ensure they are delivered at least once.
- Every event has a globally unique identifier that enables consumers to detect duplicate events. The format is <database-unique-ID>-<object-OID-and-edition>; for example, "eventId": "2b317b93-b2bf-ee11-90bd-005056b64821.2429.1.2".
- The ESP supports representing large data (for example, blobs and slobs) in external storage, with a URI in the main event that links to the data.
- The ESP supports Azure Schema Registry for Event Hubs.
- The ESP handles errors for asynchronous requests.
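A consequence of at-least-once delivery is that consumers may receive the same event more than once, and the globally unique event identifiers exist precisely so that consumers can de-duplicate. The following is a minimal consumer-side sketch, not part of the ESP itself: the event dictionaries and the in-memory seen-ID set are assumptions, and a production consumer would persist its processed IDs.

```python
def process_once(events, handle, seen=None):
    """Apply handle() to each event exactly once, skipping redeliveries
    by tracking the globally unique eventId of each processed event."""
    seen = set() if seen is None else seen
    for event in events:
        event_id = event["eventId"]
        if event_id in seen:
            continue            # duplicate delivery: already processed
        handle(event)
        seen.add(event_id)
    return seen

# Simulated at-least-once delivery: the first event is redelivered.
events = [
    {"eventId": "2b317b93-b2bf-ee11-90bd-005056b64821.2429.1.2", "op": "update"},
    {"eventId": "2b317b93-b2bf-ee11-90bd-005056b64821.2430.1.1", "op": "create"},
    {"eventId": "2b317b93-b2bf-ee11-90bd-005056b64821.2429.1.2", "op": "update"},
]
processed = []
process_once(events, processed.append)
print(len(processed))   # 2: the redelivered event is skipped
```

Keeping the de-duplication keyed on eventId rather than on message payloads means the consumer remains correct even if two distinct events happen to carry identical property values.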