The Streamkap CLI lets you manage your Change Data Capture (CDC) infrastructure from the terminal. It covers the full Streamkap API — pipelines, sources, destinations, transforms, topics, and more. The CLI is designed to work well in both interactive and agentic workflows. Output defaults to human-readable text in a terminal and switches to JSON when piped to another process or AI agent.

Prerequisites

  • Node.js 20+
  • An API token (Client ID and Client Secret) — see API Tokens for how to create one
Check your Node.js version:
node -v   # should be v20.x or higher
If you need to install or update Node.js, visit nodejs.org or use nvm.

Installation

npm install -g @streamkap/tools

Authentication

The CLI supports three authentication methods. Pick whichever fits your workflow — when more than one is set, command-line flags win, then environment variables, then a saved profile.
Recommended for CI/CD and scripts:
export STREAMKAP_CLIENT_ID="your-client-id"
export STREAMKAP_CLIENT_SECRET="your-client-secret"

Verify Your Setup

After authenticating, validate your install end-to-end:
streamkap doctor
The doctor command checks your credentials, API connectivity, and infrastructure counts, and can optionally check Kafka and Schema Registry. It exits non-zero if any check fails.
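Because doctor exits non-zero when any check fails, it can gate a CI deploy step. A minimal sketch, assuming a POSIX shell; require_healthy is our wrapper name, not a CLI command:

```shell
#!/bin/sh
# Gate a deploy step on a healthy Streamkap setup.
# require_healthy is an illustrative wrapper, not part of the CLI.
require_healthy() {
  if streamkap doctor --quiet; then
    echo "streamkap: all checks passed"
  else
    echo "streamkap: doctor reported a failure" >&2
    return 1
  fi
}
```

Invoked as, for example, require_healthy && ./deploy.sh, the deploy only runs when every check passes.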

Usage

streamkap --help                 # List all commands
streamkap pipelines list         # List pipelines
streamkap pipelines get <id>     # Get pipeline details
streamkap sources list           # List sources
streamkap sources metrics <id>   # Source metrics
streamkap destinations list      # List destinations
streamkap dashboard stats        # Organisation overview
streamkap doctor                 # Validate setup & connectivity

Output Formats

By default, output is human-readable text in a terminal and JSON when piped. Override with:
streamkap pipelines list --json          # Force JSON output
streamkap pipelines list --format text   # Force text output
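Since piped output is already JSON, it composes naturally with tools like jq. A hedged sketch; the .[].id path assumes each pipeline object carries an id field, so check it against your actual response payload:

```shell
# Print one pipeline ID per line. The JSON path is an assumption
# about the response shape; adjust it to the real payload.
list_pipeline_ids() {
  streamkap pipelines list --json | jq -r '.[].id'
}
```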

Destructive Commands

Commands that modify or delete resources (delete, stop, reset) require confirmation in interactive mode. Each destructive subcommand accepts --yes to skip the prompt and --dry-run to preview the call without executing it:
streamkap sources delete <id> --dry-run    # Preview what would happen
streamkap sources delete <id> --yes        # Skip confirmation
When output is piped (non-TTY), destructive commands run without confirmation so they work in scripts and agent pipelines. Be careful when piping commands like streamkap sources delete <id> into other tools — they will execute immediately.
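In scripts, a preview-then-confirm wrapper keeps automated deletes deliberate. A sketch; safe_delete_source and the CONFIRM_DELETE variable are our conventions, not CLI features:

```shell
# Preview the delete, then execute only when CONFIRM_DELETE=1 is set.
safe_delete_source() {
  id="$1"
  streamkap sources delete "$id" --dry-run || return 1
  if [ "$CONFIRM_DELETE" = "1" ]; then
    streamkap sources delete "$id" --yes
  else
    echo "Set CONFIRM_DELETE=1 to actually delete source $id" >&2
    return 2
  fi
}
```

Invoke as CONFIRM_DELETE=1 safe_delete_source <id>; without the variable, the function stops after the dry-run preview.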

Named Profiles

Use --profile to switch between multiple accounts without re-exporting env vars:
streamkap auth login --profile prod --client-id <id> --client-secret <secret>
streamkap pipelines list --profile prod

Commands

Group            Description
pipelines        Create, update, delete, monitor metrics and logs, bulk operations
sources          Manage CDC connectors, deploy, pause, resume, stop, restart, snapshots
destinations     Manage sinks, deploy, pause, resume, stop, restart
transforms       Manage stream processors, deploy to preview/production, unit tests, clone
topics           List, inspect, create Kafka topics, read sample messages
tags             Organise resources with tags
schema-registry  Browse subjects and schemas
consumer-groups  Inspect lag, reset offsets
kafka-access     Manage direct Kafka cluster users
kafka            Direct Kafka produce, consume, and real-time subscribe
dashboard        Organisation statistics, log search and aggregation
alerts           Manage subscribers and notification preferences
usage            Usage metrics and export
cluster-scaling  Inspect and scale your Kafka cluster
admin            List and switch services
auth             Login, logout, status, token
doctor           Validate setup, credentials, and API connectivity
completions      Generate shell completion scripts

Direct Kafka Access

Most users don’t need this on day one. The CLI’s REST-based commands (like topics messages) let you inspect topic data without Kafka credentials.
The kafka command group connects directly to the Kafka brokers for produce, consume, and real-time subscribe operations. These commands require Kafka credentials rather than API credentials.
First, create a Kafka user from the Streamkap dashboard (Kafka Access page) or via streamkap kafka-access create. This gives you the bootstrap servers, username, and password. See Kafka Access for details.
Then set all three Kafka environment variables before running any kafka command. They must be set together; partial config fails at startup with a clear error:
export KAFKA_BOOTSTRAP_SERVERS="host1:9092,host2:9092"
export KAFKA_API_KEY="your-kafka-username"
export KAFKA_API_SECRET="your-kafka-password"
# Optional — enables Avro / JSON Schema / Protobuf encode and decode.
# Schema Registry reuses the same Kafka user credentials.
export SCHEMA_REGISTRY_URL="https://sr.streamkap.net:8081"
export SCHEMA_REGISTRY_USERNAME="$KAFKA_API_KEY"
export SCHEMA_REGISTRY_PASSWORD="$KAFKA_API_SECRET"
# Produce a single message (optionally encoded via Schema Registry)
streamkap kafka produce my-topic \
  --value '{"id":"123","event":"create"}' \
  --key user-123 \
  --schema-registry
# Consume a batch of messages with a timeout
streamkap kafka consume my-topic \
  --max-messages 10 \
  --timeout 10000 \
  --decode
# Subscribe in real time to one or more topics (Ctrl+C to exit)
streamkap kafka subscribe topic1,topic2 --timeout 30000 --decode

# Subscribe by regex pattern
streamkap kafka subscribe --pattern 'source_.*' --decode

Global Options

Flag                       Description
-j, --json                 Force JSON output
-f, --format <fmt>         Output format: json, text, or auto (default: auto)
-p, --profile <name>       Named credential profile
--api-url <url>            Override API URL
--client-id <id>           Override client ID
--client-secret <secret>   Override client secret (prefer env vars to avoid shell history exposure)
-v, --verbose              Show request method, path, and timing on stderr
-q, --quiet                Suppress all non-data output
-y, --yes                  Skip confirmation for destructive commands
--no-color                 Disable ANSI colors
-V, --version              Show CLI version

Exit Codes

The CLI returns typed exit codes so scripts and agents can branch on the failure mode without parsing error messages. When a command fails, the JSON error output also includes code and exitCode fields matching the table below.
Code  Name               Meaning
0     SUCCESS            Command completed successfully
1     GENERAL_ERROR      Unspecified error
2     USAGE_ERROR        Invalid arguments or missing confirmation on a destructive command
3     AUTH_ERROR         Missing or invalid credentials (HTTP 401)
4     PERMISSION_DENIED  Authenticated but not authorised (HTTP 403)
5     NOT_FOUND          Resource not found (HTTP 404)
6     CONFLICT           Resource state conflict (HTTP 409)
7     RATE_LIMITED       API rate limit exceeded (HTTP 429)
8     SERVER_ERROR       Streamkap API returned 5xx
9     TIMEOUT            Request timed out
10    CONFIG_ERROR       Local configuration problem
11    NETWORK_ERROR      DNS/connection failure
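These codes make it easy to branch on the failure mode without parsing stderr. A sketch; the handler below is illustrative, while the codes come from the table above:

```shell
# Map an exit code from a streamkap command to a follow-up action.
handle_exit() {
  case "$1" in
    0)   echo "ok" ;;
    5)   echo "not found" ;;                 # e.g. create the resource
    7)   echo "rate limited, retry later" ;;
    3|4) echo "credentials problem" >&2; return 1 ;;
    *)   echo "unexpected error ($1)" >&2; return 1 ;;
  esac
}

# Typical usage:
#   streamkap pipelines get "$PIPELINE_ID" --json
#   handle_exit $?
```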

Shell Completions

Generate completions for your shell:
streamkap completions bash >> ~/.bashrc
streamkap completions zsh >> ~/.zshrc
streamkap completions fish > ~/.config/fish/completions/streamkap.fish

Using with AI Agents

When output is piped (non-TTY), the CLI automatically switches to JSON and skips confirmation prompts. For a richer agent experience with natural language, use the MCP Server instead. See Agents for all integration paths.
  • Agents — using the CLI and other tools in agent workflows
  • MCP Server — connect AI agents to Streamkap via natural language
  • API Reference — full REST API documentation