The Streamkap CLI lets you manage your Change Data Capture (CDC) infrastructure from the terminal. It covers the full Streamkap API — pipelines, sources, destinations, transforms, topics, and more.
The CLI is designed to work well in both interactive and agentic workflows. Output defaults to human-readable text in a terminal and switches to JSON when piped to another process or AI agent.
Prerequisites
- Node.js 20+
- An API token (Client ID and Client Secret) — see API Tokens for how to create one
Check your Node.js version:
node -v # should be v20.x or higher
If you need to install or update Node.js, visit nodejs.org or use nvm.
Installation
npm install -g @streamkap/tools
Authentication
The CLI supports three authentication methods. Pick whichever fits your workflow — when more than one is set, command-line flags win, then environment variables, then a saved profile.
Environment Variables

Recommended for CI/CD and scripts:

export STREAMKAP_CLIENT_ID="your-client-id"
export STREAMKAP_CLIENT_SECRET="your-client-secret"

Login Command

Saves credentials to ~/.config/streamkap/config.json:

streamkap auth login --client-id your-client-id --client-secret your-client-secret
streamkap auth status # Check what's configured
streamkap auth logout # Clear saved tokens

Per-Command Flags

Pass credentials inline:

streamkap pipelines list --client-id your-client-id --client-secret your-client-secret
Verify Your Setup
After authenticating, validate your install end-to-end:
streamkap doctor
The doctor command checks your credentials, API connectivity, infrastructure counts, and optionally Kafka and Schema Registry. It exits non-zero if any check fails.
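Because doctor exits non-zero on failure, it works as a CI gate. A minimal sketch; the shell function below is a stand-in for the installed CLI (here simulating a passing run) so the snippet executes standalone:

```shell
# Stub standing in for the real CLI so this sketch runs on its own;
# remove it once `streamkap` is on PATH and authenticated.
streamkap() { return 0; }

if streamkap doctor; then
  echo "environment OK"
else
  rc=$?
  echo "doctor failed (exit code $rc)" >&2
  exit "$rc"
fi
```

Under `set -e`, a bare `streamkap doctor` line achieves the same gate; the explicit branch above just keeps the exit code for logging.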
Usage
streamkap --help # List all commands
streamkap pipelines list # List pipelines
streamkap pipelines get <id> # Get pipeline details
streamkap sources list # List sources
streamkap sources metrics <id> # Source metrics
streamkap destinations list # List destinations
streamkap dashboard stats # Organisation overview
streamkap doctor # Validate setup & connectivity
By default, output is human-readable text in a terminal and JSON when piped. Override with:
streamkap pipelines list --json # Force JSON output
streamkap pipelines list --format text # Force text output
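Forcing JSON makes the output easy to post-process in scripts. The sketch below parses a hypothetical sample of `pipelines list --json` output; the field names are illustrative stand-ins, not the documented response schema:

```shell
# Illustrative stand-in for `streamkap pipelines list --json` output;
# real field names may differ.
sample='[{"id":"p1","name":"orders"},{"id":"p2","name":"users"}]'

# Extract just the pipeline names (python3 used to avoid a jq dependency)
echo "$sample" | python3 -c '
import json, sys
for p in json.load(sys.stdin):
    print(p["name"])
'
```

In a real script, the `sample` variable would be replaced by the piped CLI output.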
Destructive Commands
Commands that modify or delete resources (delete, stop, reset) require confirmation in interactive mode. Each destructive subcommand accepts --yes to skip the prompt and --dry-run to preview the call without executing it:
streamkap sources delete <id> --dry-run # Preview what would happen
streamkap sources delete <id> --yes # Skip confirmation
When output is piped (non-TTY), destructive commands run without confirmation so they work in scripts and agent pipelines. Be careful when piping commands like streamkap sources delete <id> into other tools — they will execute immediately.
Named Profiles
Use --profile to switch between multiple accounts without re-exporting env vars:
streamkap auth login --profile prod --client-id <id> --client-secret <secret>
streamkap pipelines list --profile prod
Commands
| Group | Description |
|---|---|
| pipelines | Create, update, delete, monitor metrics and logs, bulk operations |
| sources | Manage CDC connectors, deploy, pause, resume, stop, restart, snapshots |
| destinations | Manage sinks, deploy, pause, resume, stop, restart |
| transforms | Manage stream processors, deploy to preview/production, unit tests, clone |
| topics | List, inspect, create Kafka topics, read sample messages |
| tags | Organise resources with tags |
| schema-registry | Browse subjects and schemas |
| consumer-groups | Inspect lag, reset offsets |
| kafka-access | Manage direct Kafka cluster users |
| kafka | Direct Kafka produce, consume, and real-time subscribe |
| dashboard | Organisation statistics, log search and aggregation |
| alerts | Manage subscribers and notification preferences |
| usage | Usage metrics and export |
| cluster-scaling | Inspect and scale your Kafka cluster |
| admin | List and switch services |
| auth | Login, logout, status, token |
| doctor | Validate setup, credentials, and API connectivity |
| completions | Generate shell completion scripts |
Direct Kafka Access
Most users don’t need this on day one. The CLI’s REST-based commands (like topics messages) let you inspect topic data without Kafka credentials.
The kafka command group connects directly to the Kafka brokers for produce, consume, and real-time subscribe operations. These commands do not require API credentials — only Kafka credentials.
First, create a Kafka user from the Streamkap dashboard (Kafka Access page) or via streamkap kafka-access create. This gives you the bootstrap servers, username, and password. See Kafka Access for details.
Then set all three Kafka environment variables before running any kafka command. They must be set together — partial config will fail at startup with a clear error:
export KAFKA_BOOTSTRAP_SERVERS="host1:9092,host2:9092"
export KAFKA_API_KEY="your-kafka-username"
export KAFKA_API_SECRET="your-kafka-password"
# Optional — enables Avro / JSON Schema / Protobuf encode and decode.
# Schema Registry reuses the same Kafka user credentials.
export SCHEMA_REGISTRY_URL="https://sr.streamkap.net:8081"
export SCHEMA_REGISTRY_USERNAME="$KAFKA_API_KEY"
export SCHEMA_REGISTRY_PASSWORD="$KAFKA_API_SECRET"
# Produce a single message (optionally encoded via Schema Registry)
streamkap kafka produce my-topic \
--value '{"id":"123","event":"create"}' \
--key user-123 \
--schema-registry
# Consume a batch of messages with a timeout
streamkap kafka consume my-topic \
--max-messages 10 \
--timeout 10000 \
--decode
# Subscribe in real time to one or more topics (Ctrl+C to exit)
streamkap kafka subscribe topic1,topic2 --timeout 30000 --decode
# Subscribe by regex pattern
streamkap kafka subscribe --pattern 'source_.*' --decode
Global Options
| Flag | Description |
|---|---|
| -j, --json | Force JSON output |
| -f, --format <fmt> | Output format: json, text, or auto (default: auto) |
| -p, --profile <name> | Named credential profile |
| --api-url <url> | Override API URL |
| --client-id <id> | Override client ID |
| --client-secret <secret> | Override client secret (prefer env vars to avoid shell history exposure) |
| -v, --verbose | Show request method, path, and timing on stderr |
| -q, --quiet | Suppress all non-data output |
| -y, --yes | Skip confirmation for destructive commands |
| --no-color | Disable ANSI colors |
| -V, --version | Show CLI version |
Exit Codes
The CLI returns typed exit codes so scripts and agents can branch on the failure mode without parsing error messages. When a command fails, the JSON error output also includes code and exitCode fields matching the table below.
| Code | Name | Meaning |
|---|---|---|
| 0 | SUCCESS | Command completed successfully |
| 1 | GENERAL_ERROR | Unspecified error |
| 2 | USAGE_ERROR | Invalid arguments or missing confirmation on a destructive command |
| 3 | AUTH_ERROR | Missing or invalid credentials (HTTP 401) |
| 4 | PERMISSION_DENIED | Authenticated but not authorised (HTTP 403) |
| 5 | NOT_FOUND | Resource not found (HTTP 404) |
| 6 | CONFLICT | Resource state conflict (HTTP 409) |
| 7 | RATE_LIMITED | API rate limit exceeded (HTTP 429) |
| 8 | SERVER_ERROR | Streamkap API returned 5xx |
| 9 | TIMEOUT | Request timed out |
| 10 | CONFIG_ERROR | Local configuration problem |
| 11 | NETWORK_ERROR | DNS/connection failure |
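Because the codes are typed, scripts and retry loops can branch on them directly instead of parsing stderr. A minimal sketch; the `streamkap` shell function below is a stand-in that simulates a rate-limited call (exit code 7) so the snippet runs without the CLI installed:

```shell
# Stub simulating a rate-limited API call (exit code 7);
# replace with the real `streamkap` binary in practice.
streamkap() { return 7; }

streamkap pipelines list
rc=$?
case $rc in
  0) echo "ok" ;;
  3) echo "auth error: refresh credentials" ;;
  7) echo "rate limited: back off and retry" ;;
  *) echo "failed with exit code $rc" ;;
esac
```

Capturing `$?` into `rc` immediately after the call matters: the case statement itself resets the special variable.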
Shell Completions
Generate shell completions for your shell:
streamkap completions bash >> ~/.bashrc
streamkap completions zsh >> ~/.zshrc
streamkap completions fish > ~/.config/fish/completions/streamkap.fish
Using with AI Agents
When output is piped (non-TTY), the CLI automatically switches to JSON and skips confirmation prompts. For a richer agent experience with natural language, use the MCP Server instead. See Agents for all integration paths.
- Agents — using the CLI and other tools in agent workflows
- MCP Server — connect AI agents to Streamkap via natural language
- API Reference — full REST API documentation