Prerequisites
IBM Informix Change Data Capture required
The Connector uses Informix's built-in CDC API, which is exposed through the syscdcv1 system database. syscdcv1 ships with Informix and must be present on the server. CDC is supported on Informix Enterprise Edition.
- Informix version ≥ 12.10 (14.10 or 15 recommended)
- The source database must be created with logging (ANSI-logged or buffered-log). Non-logged databases cannot be captured.
- A database user with privileges to:
  - Connect to the source database and syscdcv1
  - SELECT on tables you want to capture
  - SELECT, INSERT, UPDATE on the Streamkap signal table
Informix Setup
The Informix CDC API reads the logical log and streams row-level change events back to the Connector over a JDBC session against syscdcv1. The Connector opens a capture session, enables Full Row Logging on the tables you've selected, and then consumes change records as they are written to the log.
Streamkap keeps Full Row Logging enabled on captured tables across Connector restarts (e.g. pod reschedules, deployment updates) so that DML occurring while the Connector is disconnected is still written to the log and can be replayed on reconnect. Without this, a restart would create a silent data gap.
1. Grant Database Access
- Configure one of the Connection Options to ensure Streamkap can reach your database.
2. Create Database User
It's recommended to create a separate user for Streamkap. Below is an example script that does that.
3. Enable Change Data Capture
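A sketch of such a setup script for the Create Database User step, run via dbaccess (the user name streamkap_user, database name mydb, and table names are assumptions; an Informix database user must also exist as an OS account on the host):

```sql
-- Run while connected to the source database (mydb)
GRANT CONNECT TO streamkap_user;

-- Read access on each table you want to capture
GRANT SELECT ON customers TO streamkap_user;

-- Access to the Streamkap signal table (see Enable Snapshots)
GRANT SELECT, INSERT, UPDATE ON streamkap_signal TO streamkap_user;

-- Then run while connected to syscdcv1
GRANT CONNECT TO streamkap_user;
```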
CDC itself is activated at runtime by the Connector — you do not need to manually enable full row logging on each table. However, the syscdcv1 database must be available on the server and the source database must be a logged database.
Verify the syscdcv1 database exists:
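One way to check is to query the sysmaster catalog (a sketch, run via dbaccess):

```sql
-- Returns a row if the syscdcv1 database exists on this server
SELECT name FROM sysmaster:sysdatabases WHERE name = 'syscdcv1';
```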
If syscdcv1 is missing, create it by running the installer script shipped with Informix:
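The installer script lives under $INFORMIXDIR/etc and is run as the informix OS user on the database server:

```shell
# Run as the informix user on the Informix host
dbaccess sysadmin $INFORMIXDIR/etc/syscdcv1.sql
```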
If is_logging is 0, enable logging on the database:
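A sketch of the check and the change (the database name mydb is an assumption; note that Informix requires a level-0 archive before a logging-mode change, as covered under Troubleshooting):

```shell
# Check the logging flag (0 = unlogged)
echo "SELECT name, is_logging FROM sysmaster:sysdatabases WHERE name = 'mydb';" | dbaccess sysmaster

# Take the required level-0 archive first
ontape -s -L 0

# Switch the database to buffered logging
ondblog buf mydb
```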
4. Enable Snapshots
To backfill your data, the Connector needs to be able to perform snapshots (see Snapshots & Backfilling for more information). To enable this process, a table must be created for the Connector to use.
5. Heartbeats
Connectors use "offsets"—like bookmarks—to track their position in the database's log or change stream. When no changes occur for long periods, these offsets may become outdated, and the Connector might lose its place or stop capturing changes. Heartbeats ensure the Connector stays active and continues capturing changes. There are two layers of heartbeat protection:
Layer 1: Connector heartbeats (enabled by default)
The Connector periodically emits heartbeat messages to an internal topic, even when no actual data changes are detected. This keeps offsets fresh and prevents staleness. No configuration is necessary for this layer; it is automatically enabled. We recommend keeping this layer enabled for all deployments.
Layer 2: Source database heartbeats (recommended)
Why we recommend configuring Layer 2
While Layer 2 is crucial for low-traffic or intermittent databases, we recommend configuring it for all deployments. It provides additional resilience and helps prevent issues during periods of inactivity.
- Read-write connections (when Read only is No during Streamkap Setup): The Connector updates the heartbeat table directly.
- Read-only connections (when Read only is Yes during Streamkap Setup): A scheduled job on the primary database updates the heartbeat table, and these changes replicate to the read replica for the Connector to consume.
This layer uses a heartbeat table on your source database, kept up to date either by the Connector directly or by a scheduled job using your database's scheduler (e.g. pg_cron for PostgreSQL, event_scheduler for MySQL).
- Read-write connections
For read-write connections, the Connector writes to the heartbeat table directly.
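A sketch of what such a heartbeat table and its periodic update might look like (table and column names are assumptions):

```sql
-- Hypothetical heartbeat table owned by streamkap_user
CREATE TABLE streamkap_heartbeat (
    id             INTEGER NOT NULL,
    last_heartbeat DATETIME YEAR TO FRACTION(3)
);
INSERT INTO streamkap_heartbeat (id, last_heartbeat) VALUES (1, CURRENT);

-- Executed periodically: by the Connector (read-write) or a scheduled job (read-only)
UPDATE streamkap_heartbeat SET last_heartbeat = CURRENT WHERE id = 1;
```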
Streamkap Setup
Follow these steps to configure your new connector:
1. Create the Source
- Navigate to Add Connectors.
- Choose Informix.
2. Connection Settings
- Name: Enter a name for your connector.
- Hostname: IP address or hostname of the Informix database server.
- Port: Default is 9088 (the Informix SQLI listener port).
- Connect via SSH Tunnel: The Connector will connect to an SSH server in your network which has access to your database. This is necessary if the Connector cannot connect directly to your database.
  - See SSH Tunnel for setup instructions.
- Username: Username to access the database. By default, Streamkap scripts use streamkap_user.
- Password: Password to access the database.
- Source Database: The name of the Informix database from which to stream the changes.
- Heartbeats:
  - Heartbeat Table Schema: Streamkap will use a table in this schema to manage heartbeats. For Informix this is typically the owner of the streamkap_heartbeat table (e.g. streamkap_user). See Heartbeats for setup instructions.
3. Snapshot Settings
- Signal Table (Schema.Table): The Connector will use this table to manage snapshots. You can specify either just the schema/owner name (e.g., streamkap_user) or the full path in schema.table format (e.g., streamkap_user.streamkap_signal). See Enable Snapshots for setup instructions.
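A sketch of a signal table the Connector could use (column names and sizes are assumptions, following the common Debezium-style signal-table layout):

```sql
CREATE TABLE streamkap_signal (
    id   VARCHAR(42) NOT NULL,   -- unique id for each signal
    type VARCHAR(32) NOT NULL,   -- signal type, e.g. a snapshot request
    data LVARCHAR(2048)          -- JSON payload describing the signal
);
```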
4. Advanced Parameters
- Represent binary data as: Specifies how the data for binary columns (e.g. byte, blob) should be interpreted. Your destination for this data can impact which option you choose. Default is bytes.
- CDC Engine Buffer Size (bytes) (Default 65536) — Size of the read buffer used by the CDC engine. Increase only if you see CDC-side back-pressure on very high-volume tables.
- CDC Engine Timeout (seconds) (Default 5) — How long the CDC engine will wait on a blocking read before breaking out to check for shutdown signals. The default is correct for most deployments.
- Capture Only Captured Databases DDL (Default false) — Whether the Connector records schema structures for all databases on the server or only the one you've configured. Enabling this can improve performance and reduce startup time when you have many databases. See Schema History Optimization for details.
- Capture Only Captured Tables DDL (Default false) — Whether the Connector records schema structures for all tables in the configured database or only the ones it is capturing. Enabling this can improve performance when the database has many tables. See Schema History Optimization for details.
5. Schema and Table Capture
- Add Schemas/Tables: Specify the schema(s) and table(s) for capture. Enter each entry as schema.table (for example informix.customers). Streamkap automatically qualifies each entry with the Source Database you configured above, so you do not need to repeat the database name in the UI.
  - You can bulk upload here. The format is a simple list of schemas and tables, with each entry on a new row. Save as a .csv file without a header.
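For example, a bulk-upload file for two tables (names beyond the documented informix.customers example are assumptions) would contain only:

```
informix.customers
informix.orders
```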
Troubleshooting
The Connector is running but no data is arriving
There can be a number of reasons. The most common are misconfiguration of CDC and missing privileges.
1. Confirm syscdcv1 is present and reachable. If syscdcv1 is missing, re-run dbaccess sysadmin $INFORMIXDIR/etc/syscdcv1.sql on the Informix host as the informix user.
2. Confirm the source database is logged. If is_logging is 0, the database is unlogged and no change events can be captured. Enable logging (see Enable Change Data Capture).
3. Confirm the Streamkap user has access. The user must have C (Connect) access to both the source database and syscdcv1, plus SELECT on every table you want to capture.
If you're still stuck after these checks, please reach out to us.
Enabling logging fails because no level-0 archive exists
Informix refuses to change a database's logging mode until a full archive has been taken. Run a level-0 archive first and then retry the ondblog command. If you don't need to retain the archive, you can target /dev/null (or Windows equivalent) for the tape device in your $INFORMIXDIR/etc/onconfig (TAPEDEV) before running ontape -s -L 0.
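A sketch of that sequence (the database name mydb is an assumption, and onmode -wf is one way to update TAPEDEV without editing onconfig by hand):

```shell
# Point the tape device at /dev/null so the archive is discarded
onmode -wf TAPEDEV=/dev/null

# Take the level-0 archive Informix requires before a logging-mode change
ontape -s -L 0

# Retry the logging change
ondblog buf mydb
```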
Schema changes on captured tables
Informix's CDC API emits a metadata record to the change stream whenever a captured table's structure changes (for example, after an ALTER TABLE ... ADD COLUMN). The Connector processes these records inline — it re-reads the current table definition from the Informix catalog, writes the updated structure to the schema-history topic, and subsequent change events for that table use the new schema automatically. No Connector restart is required.
If you still don't see new columns after a DDL change:
- Confirm the DDL ran against the captured base table (not a view or synonym — CDC only fires for base tables).
- Confirm the Connector is streaming (not paused or in error). The metadata record is only processed during active streaming.
- Wait for at least one DML event on the altered table after the DDL. The metadata record is emitted as part of the ongoing CDC stream; if the table is completely idle, downstream consumers may not immediately observe the updated schema.
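For instance, after a change like the following (table and column names are assumptions), the new column shows up in change events once the next DML on the table is processed:

```sql
ALTER TABLE customers ADD loyalty_tier VARCHAR(16);

-- Any subsequent DML lets the metadata record flow through the stream
UPDATE customers SET loyalty_tier = 'basic' WHERE id = 1;
```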