Multi-Environment Support¶
streamt supports managing multiple environments (dev, staging, prod) with isolated configurations, protected environment safeguards, and per-environment secrets.
Overview¶
In multi-environment mode, each environment has its own:
- Runtime configuration (Kafka clusters, Flink clusters, Schema Registry)
- Safety settings (protected environments, destructive operation controls)
- Environment variables (via .env.{env} files)
Setup¶
Directory Structure¶
Create an environments/ directory in your project root:
my-project/
├── stream_project.yml    # Project definition (no runtime section)
├── environments/
│   ├── dev.yml           # Development environment
│   ├── staging.yml       # Staging environment
│   └── prod.yml          # Production environment
├── .env                  # Base environment variables
├── .env.dev              # Dev-specific variables
├── .env.staging          # Staging-specific variables
├── .env.prod             # Prod-specific variables
├── models/
└── sources/
Mode Detection
streamt automatically detects the mode:
- Single-env mode: no environments/ directory → runtime: is required in stream_project.yml
- Multi-env mode: environments/ directory exists → runtime: comes from the environment files
Environment File Format¶
Each environment file defines runtime configuration and safety settings:
# environments/dev.yml
environment:
  name: dev
  description: Local development environment

runtime:
  kafka:
    bootstrap_servers: localhost:9092
  schema_registry:
    url: http://localhost:8081
  flink:
    default: local
    clusters:
      local:
        rest_url: http://localhost:8082
        sql_gateway_url: http://localhost:8084

# environments/prod.yml
environment:
  name: prod
  description: Production environment
  protected: true  # Requires confirmation for apply

runtime:
  kafka:
    bootstrap_servers: ${PROD_KAFKA_SERVERS}
  schema_registry:
    url: ${PROD_SR_URL}
    username: ${PROD_SR_USER}
    password: ${PROD_SR_PASS}
  flink:
    default: prod-cluster
    clusters:
      prod-cluster:
        rest_url: ${PROD_FLINK_URL}

safety:
  confirm_apply: true       # Require --confirm flag in CI
  allow_destructive: false  # Block topic deletions, etc.
CLI Usage¶
Targeting Environments¶
Use the --env flag to target a specific environment:
# Validate specific environment
streamt validate --env dev
streamt validate --env prod
# Plan deployment
streamt plan --env staging
# Apply changes
streamt apply --env dev
Environment Variable¶
Set STREAMT_ENV to avoid repeating --env:
export STREAMT_ENV=dev
streamt validate # Uses dev environment
streamt plan # Uses dev environment
streamt apply # Uses dev environment
CLI Flag Priority
The --env flag always overrides STREAMT_ENV:
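For example, with both set, the flag wins:

```shell
export STREAMT_ENV=dev
streamt plan --env prod   # targets prod, not dev
```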
Validate All Environments¶
Validate all environments at once with --all-envs:
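```shell
streamt validate --all-envs
```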
This validates each environment sequentially and fails if any environment is invalid.
List Environments¶
View available environments:
Output:
dev Local development environment
staging Staging environment
prod Production environment [protected]
Show Environment Config¶
View resolved configuration (secrets masked):
Output:
environment:
name: prod
description: Production environment
protected: true
runtime:
kafka:
bootstrap_servers: prod-kafka.example.com:9092
schema_registry:
url: https://prod-sr.example.com
username: admin
password: '****'
Protected Environments¶
Mark critical environments as protected to prevent accidental deployments:
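Set the protected flag in the environment file:

```yaml
# environments/prod.yml
environment:
  name: prod
  protected: true
```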
Behavior¶
Interactive mode (terminal):
$ streamt apply --env prod
WARNING: Deploying to protected environment 'prod'
WARNING: 'prod' is a protected environment.
Type 'prod' to confirm: _
Non-interactive mode (CI/CD):
# Without --confirm: fails
$ streamt apply --env prod
ERROR: 'prod' is a protected environment. Use --confirm flag in non-interactive mode.
# With --confirm: proceeds
$ streamt apply --env prod --confirm
WARNING: Deploying to protected environment 'prod'
Applying changes...
Destructive Safety¶
Block destructive operations (topic deletions, connector removals) in critical environments:
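Disable destructive operations in the environment's safety settings:

```yaml
# environments/prod.yml
safety:
  allow_destructive: false
```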
Behavior¶
# Blocked by default
$ streamt apply --env prod --confirm
ERROR: Destructive operations blocked for 'prod' environment. Use --force flag to override.
# Override with --force
$ streamt apply --env prod --confirm --force
WARNING: --force flag used, allowing destructive operations on 'prod'
Applying changes...
Environment Variables¶
.env File Precedence¶
Environment variables are loaded with this precedence (later wins):
1. .env — base variables (always loaded)
2. .env.{environment} — environment-specific overrides
3. Actual environment variables — highest priority
Example¶
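Suppose a base value in .env and a production override in .env.prod (the base value here is illustrative):

```shell
# .env
KAFKA_SERVERS=localhost:9092

# .env.prod
KAFKA_SERVERS=prod-kafka.example.com:9092
```

Running with --env prod loads both files, and .env.prod wins: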
runtime:
  kafka:
    bootstrap_servers: ${KAFKA_SERVERS}  # Resolves to prod-kafka.example.com:9092
Override in CI/CD¶
# .env.prod has KAFKA_SERVERS=prod-kafka.example.com:9092
# But actual env var takes precedence
export KAFKA_SERVERS=custom-kafka.example.com:9092
streamt apply --env prod --confirm # Uses custom-kafka.example.com:9092
CI/CD Integration¶
GitHub Actions Example¶
name: Deploy
on:
  push:
    branches: [main]
jobs:
  deploy-staging:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Deploy to Staging
        env:
          STAGING_KAFKA: ${{ secrets.STAGING_KAFKA }}
          STAGING_SR_URL: ${{ secrets.STAGING_SR_URL }}
        run: |
          streamt validate --env staging
          streamt plan --env staging
          streamt apply --env staging
  deploy-prod:
    needs: deploy-staging
    runs-on: ubuntu-latest
    environment: production  # GitHub environment protection
    steps:
      - uses: actions/checkout@v4
      - name: Deploy to Production
        env:
          PROD_KAFKA_SERVERS: ${{ secrets.PROD_KAFKA }}
          PROD_SR_URL: ${{ secrets.PROD_SR_URL }}
        run: |
          streamt validate --env prod
          streamt plan --env prod
          # --confirm required for protected environments in CI
          streamt apply --env prod --confirm
Best Practices¶
1. Use Protected Environments for Production¶
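Mark any environment where an accidental apply would be costly:

```yaml
# environments/prod.yml
environment:
  name: prod
  protected: true
```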
2. Keep Secrets in .env Files¶
Never commit secrets to environment YAML files. Use variable references:
# Good: Reference variables
runtime:
  kafka:
    bootstrap_servers: ${KAFKA_SERVERS}
  schema_registry:
    password: ${SR_PASSWORD}
3. Validate All Environments in CI¶
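Run validation across every environment in CI so a broken staging or prod file fails the build before any deploy:

```shell
streamt validate --all-envs
```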
4. Use Descriptive Environment Names¶
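Keep names short for the filename and --env flag, and put the human-readable detail in the description field (the description text here is illustrative):

```yaml
# environments/staging.yml
environment:
  name: staging
  description: Pre-production environment for integration testing
```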
Migration from Single-Env Mode¶
To migrate an existing single-env project:
1. Create an environments/ directory
2. Move runtime: from stream_project.yml to environments/dev.yml
3. Create additional environment files as needed
4. Update CI/CD to use the --env flag
Before:
# stream_project.yml
project:
  name: my-project
  version: "1.0.0"

runtime:
  kafka:
    bootstrap_servers: localhost:9092

sources:
  - name: events
    topic: events.raw.v1
After:
# stream_project.yml
project:
  name: my-project
  version: "1.0.0"

sources:
  - name: events
    topic: events.raw.v1

# environments/dev.yml
environment:
  name: dev
  description: Development environment

runtime:
  kafka:
    bootstrap_servers: localhost:9092
Troubleshooting¶
"No environments configured"¶
You're using --env in single-env mode. Either:
- Create an environments/ directory with environment files
- Remove the --env flag
"Multiple environments found. Specify with --env"¶
In multi-env mode, you must specify which environment to use:
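```shell
streamt plan --env dev
```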
Or set the environment variable:
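```shell
export STREAMT_ENV=dev
```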
"Environment 'xyz' not found"¶
Check that the environment file exists: environments/xyz.yml
"Environment name mismatch"¶
ERROR: Environment name mismatch: file is 'dev.yml' but environment.name is 'development'. They must match.
The name in the YAML must match the filename:
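```yaml
# environments/dev.yml
environment:
  name: dev   # must be "dev" to match the filename
```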