Data Change Streaming Concepts
Concepts for creating and maintaining data change events in FairCom DB and RTG servers
The Data Change Streaming API and its engine run on FairCom MQ and connect to FairCom DB and FairCom RTG. They are optimized for the fastest possible change streaming inside an enterprise network, which may include internal network access to cloud services. The engine uses parallel data replication as its underlying technology and centralizes the management of data change streaming in FairCom MQ. Its actions and properties are named to reflect the fact that the API and engine run on the FairCom MQ server, which configures FairCom DB and RTG servers to replicate data change events to it.
In contrast, the DB Notify API and engine run on FairCom DB and FairCom RTG and connect to FairCom MQ. They are optimized to run locally and push data changes across the Internet to FairCom MQ in the cloud or in another data center. If the target FairCom MQ server or the network is unavailable, the engine caches data changes locally and automatically resumes pushing them when the connection is restored.
FairCom's Data Change Streaming API tracks data change events in FairCom DB and RTG servers, and it stores these data changes in FairCom MQ for delivery to other databases and applications. FairCom MQ hosts the data change streaming service and API.
Once tables are synchronized, FairCom MQ contains a log of their data changes that applications can consume in bulk through SQL and the JSON DB API. FairCom MQ also uses MQTT to deliver JSON change messages in real time to subscribers. You can use FairCom MQ as a clearinghouse to synchronize data from FairCom servers to all types of databases and software systems.
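The following is a minimal sketch of a real-time consumer written in Python with the paho-mqtt 2.x library. The broker address, port, and topic name (custmast_changes) are illustrative assumptions, not fixed FairCom MQ values; substitute the connection details and topic configured for your change stream.

# Minimal MQTT subscriber sketch (paho-mqtt 2.x).
# Broker address and topic name are illustrative assumptions.
import json
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe to the change stream topic once connected.
    client.subscribe("custmast_changes")        # hypothetical topic name

def on_message(client, userdata, msg):
    # Each message payload is a JSON data change event.
    event = json.loads(msg.payload)
    print(event["operation"], event["sourceTableName"])

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect("mq.example.internal", 1883)     # hypothetical broker address
client.loop_forever()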
A data change stream is a series of data change events that occur in one table of a database. A data change event is an insert, update, or delete of one record in the table. Each data change event is a JSON document containing a record's data before and after a change. You configure one or more tables to stream data change events to FairCom MQ. Each data change stream can be configured to stream both current and future events or only future events.
You can create filtered and unfiltered data change streams. An unfiltered stream delivers all data changes to all records and fields of a table. A filtered stream includes data changes only for records that match specified filter criteria. You can filter records by specific field values and by the type of data change event (insert, update, or delete). You can also limit which fields are included in data change events.
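As a concept-level illustration only, the following Python sketch shows the kind of information a filtered stream definition carries compared with an unfiltered one. The key names and structure are hypothetical and do not represent the actual FairCom API payload.

# Hypothetical stream definitions; key names are illustrative only,
# not the actual FairCom API payload format.
filtered_stream = {
    "tableName": "custmast",
    "operations": ["insert", "update"],          # exclude delete events
    "recordFilter": "cm_custcity = 'Columbia'",  # only records matching this condition
    "includeFields": ["cm_custcity"],            # limit which fields appear in events
}

unfiltered_stream = {
    "tableName": "custmast"   # no filters: all operations, records, and fields
}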
Once you create a data change stream, the server assigns a unique ID to it. You cannot modify a data change stream; instead, you must delete it and create a new one. You can list data change streams to retrieve their IDs, descriptions, and properties, and you can use those IDs to start, pause, and delete data change streams.
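The sketch below is a self-contained Python stand-in that mimics this lifecycle; it is not a FairCom client, and its method names are placeholders chosen to illustrate the create, list, pause, start, and delete pattern.

# Stand-in registry illustrating the stream lifecycle; not a FairCom API.
import uuid

class StreamRegistry:
    def __init__(self):
        self.streams = {}

    def create(self, table, description=""):
        stream_id = str(uuid.uuid4())        # the server assigns a unique ID
        self.streams[stream_id] = {"id": stream_id, "table": table,
                                   "description": description, "state": "running"}
        return stream_id

    def list(self):
        return list(self.streams.values())   # IDs, descriptions, and properties

    def pause(self, stream_id):
        self.streams[stream_id]["state"] = "paused"

    def start(self, stream_id):
        self.streams[stream_id]["state"] = "running"

    def delete(self, stream_id):
        del self.streams[stream_id]

# Streams cannot be modified in place: delete the old one and create a replacement.
registry = StreamRegistry()
old_id = registry.create("custmast", "all current and future changes")
registry.delete(old_id)
new_id = registry.create("custmast", "future changes only")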
FairCom MQ provides the following techniques to scale data change streaming:
You can consolidate data changes from multiple source servers into a single FairCom MQ server to make full use of its capacity.
You can spread data changes from a single source server across many FairCom MQ servers to handle high-velocity transaction loads.
You can duplicate data changes from one source server to many FairCom MQ servers to scale the delivery of changes to many MQTT clients.
Consolidating data changes from multiple source servers
A single instance of FairCom MQ can capture data change events from multiple FairCom DB and RTG servers. This is useful for consolidating data changes and maximizing the use of FairCom MQ. For example, FairCom MQ can connect to FairCom servers at different customer locations and capture their changes.
Spreading data changes from a source server across multiple FairCom MQ servers
When a FairCom DB or RTG server processes more data changes than a single FairCom MQ server can handle, you can configure multiple FairCom MQ servers to handle the load by synchronizing some tables to one FairCom MQ server and other tables to another. This technique spreads data changes across multiple FairCom MQ servers to scale your solution as needed.
Duplicating data changes from a source server to multiple FairCom MQ servers
When data changes must be delivered to more subscribers than a single FairCom MQ server can handle, you can configure multiple FairCom MQ servers to share the load by synchronizing the same tables to each of them. This technique duplicates data changes across multiple FairCom MQ servers to scale delivery.
Below is an example of a data change event that FairCom MQ stores and delivers. Applications can subscribe to these messages using MQTT, or retrieve them in bulk on demand using SQL, FairCom's JSON MQ API, or the JSON DB API.
{
  "operation": "update",
  "transactionTimestamp": "2023-09-26T12:37:25.490Z",
  "serverName": "FAIRCOMS",
  "sourceDatabaseName": "ctreeSQL",
  "sourceOwnerName": "admin",
  "sourceTableName": "custmast",
  "fields": [
    {
      "fieldName": "cm_custcity",
      "beforeValue": "Harford",
      "afterValue": "Columbia",
      "pk": 1,
      "changed": true
    }
  ]
}
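To show how such an event maps to downstream processing, here is a short Python sketch that walks the example above and reports the changed fields; the event dictionary mirrors the example, and only the handling logic is illustrative.

# Walk a data change event with the structure shown above and report
# the before and after values of each changed field.
event = {
    "operation": "update",
    "transactionTimestamp": "2023-09-26T12:37:25.490Z",
    "serverName": "FAIRCOMS",
    "sourceDatabaseName": "ctreeSQL",
    "sourceOwnerName": "admin",
    "sourceTableName": "custmast",
    "fields": [
        {"fieldName": "cm_custcity", "beforeValue": "Harford",
         "afterValue": "Columbia", "pk": 1, "changed": True}
    ]
}

for field in event["fields"]:
    if field["changed"]:
        print(f'{event["operation"]} on {event["sourceTableName"]}.{field["fieldName"]}: '
              f'{field["beforeValue"]} -> {field["afterValue"]}')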