The Streamkap Terraform Provider lets you manage data integration infrastructure using Terraform’s declarative configuration language. Define sources, destinations, pipelines, and transforms as code for version control, reproducibility, and automation.

What is Terraform?

Terraform is an infrastructure-as-code (IaC) tool that allows you to define and provision infrastructure using a declarative configuration language. Instead of manually configuring resources through a UI, you write configuration files that describe your desired state, and Terraform handles creating, updating, and deleting resources to match.
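As a sketch of what "declarative" means here, a configuration file describes a resource's desired state rather than the steps to create it (the resource type and attributes below are illustrative, not a real provider schema):

```hcl
# Illustrative only: you declare WHAT should exist, not HOW to create it.
# Terraform diffs this against real infrastructure and plans the changes
# (create, update, or delete) needed to converge on this state.
resource "example_server" "web" {
  name = "web-01"
  size = "small"
}
```

Running `terraform plan` shows the pending changes, and `terraform apply` executes them.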

Why Use Terraform with Streamkap?

Version Control

Track changes to your data infrastructure in Git alongside your application code.

Reproducibility

Easily replicate your Streamkap setup across development, staging, and production environments.

Automation

Integrate with CI/CD pipelines for automated infrastructure deployment and updates.

Documentation

Your Terraform configuration serves as living documentation of your data infrastructure.
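The reproducibility and automation points above can be sketched concretely: keep one configuration and vary it per environment with variable files. The variable name and file names here are illustrative, not part of the Streamkap provider:

```hcl
# variables.tf (illustrative)
variable "environment" {
  description = "Deployment environment (dev, staging, prod)"
  type        = string
}

# Resource names can then embed the environment, e.g.:
#   name = "orders-database-${var.environment}"
#
# Each environment gets its own variable file, applied manually or from CI:
#   terraform apply -var-file="dev.tfvars"
#   terraform apply -var-file="prod.tfvars"
```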

Available Resources

The Streamkap provider supports managing sources, destinations, pipelines, topics, and transforms. The table below lists the supported source types:

Resource       Terraform Type                  Description
PostgreSQL     streamkap_source_postgresql     CDC from PostgreSQL databases
MySQL          streamkap_source_mysql          CDC from MySQL databases
MongoDB        streamkap_source_mongodb        CDC from MongoDB databases
DynamoDB       streamkap_source_dynamodb       CDC from AWS DynamoDB tables
SQL Server     streamkap_source_sqlserver      CDC from Microsoft SQL Server
Kafka Direct   streamkap_source_kafkadirect    Consume from external Kafka topics
See the Resource Reference for complete configuration examples for each resource type.

Quick Example

terraform {
  required_providers {
    streamkap = {
      source  = "streamkap-com/streamkap"
      version = ">= 2.1.18"
    }
  }
}

provider "streamkap" {}

# Define a PostgreSQL source
resource "streamkap_source_postgresql" "orders_db" {
  name              = "orders-database"
  database_hostname = "db.example.com"
  database_port     = 5432
  database_dbname   = "orders"
  database_user     = var.db_username
  database_password = var.db_password
  database_sslmode  = "require"

  schema_include_list = "public"
  table_include_list  = "public.orders,public.customers"
  
  snapshot_read_only                           = "No"
  signal_data_collection_schema_or_database    = "streamkap"
  heartbeat_enabled                            = true
  heartbeat_data_collection_schema_or_database = "streamkap"

  slot_name        = "streamkap_slot"
  publication_name = "streamkap_pub"
}

# Define a Snowflake destination
resource "streamkap_destination_snowflake" "warehouse" {
  name                    = "analytics-warehouse"
  snowflake_url_name      = var.snowflake_url
  snowflake_user_name     = var.snowflake_user
  snowflake_private_key   = var.snowflake_private_key
  snowflake_database_name = "ANALYTICS"
  snowflake_schema_name   = "PUBLIC"
  snowflake_role_name     = "STREAMKAP_ROLE"
  sfwarehouse             = "COMPUTE_WH"
}

# Create a pipeline connecting them
resource "streamkap_pipeline" "orders_to_snowflake" {
  name                = "orders-pipeline"
  snapshot_new_tables = true

  source = {
    id        = streamkap_source_postgresql.orders_db.id
    name      = streamkap_source_postgresql.orders_db.name
    connector = streamkap_source_postgresql.orders_db.connector
    topics    = ["public.orders", "public.customers"]
  }

  destination = {
    id        = streamkap_destination_snowflake.warehouse.id
    name      = streamkap_destination_snowflake.warehouse.name
    connector = streamkap_destination_snowflake.warehouse.connector
  }
}
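The example above references var.db_username, var.db_password, and the Snowflake variables without declaring them. A matching declarations file might look like this (a sketch — the names simply mirror the example, and marking credentials as sensitive keeps them out of plan/apply output):

```hcl
# variables.tf — declarations matching the variables used in the example
variable "db_username" {
  type = string
}

variable "db_password" {
  type      = string
  sensitive = true # redacted in plan/apply output
}

variable "snowflake_url" {
  type = string
}

variable "snowflake_user" {
  type = string
}

variable "snowflake_private_key" {
  type      = string
  sensitive = true
}
```

Values can then be supplied via a .tfvars file or TF_VAR_* environment variables rather than hard-coded in configuration.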
