This repository contains setup scripts and Terraform modules for integrating your GCP project or organization with Better Stack.
| Path | Description |
|---|---|
| `setup.sh` | Bash setup script for quick deployment |
| `terraform/` | Terraform module for infrastructure-as-code deployment |
To get started with Better Stack on GCP, create a new GCP Source in Better Stack and note the ingesting host and source token from the source configuration.
For deployment, choose either the setup script or the Terraform module.
When you deploy the integration, you get:
- Automatic ingestion of all GCP Cloud Logging entries into Better Stack.
- Automatic ingestion of Cloud Monitoring metrics (compute, networking, storage, and more).
- Support for org-level deployment covering all projects, including future ones.
- Remote log filter management — configure which logs are forwarded without redeploying.
The fastest way to deploy. Requires the `gcloud` CLI.

```bash
# Org mode (all projects, including future ones):
./setup.sh \
  --project=my-project \
  --org-id=123456789 \
  --source-token=YOUR_SOURCE_TOKEN \
  --ingesting-host=s4191.g1.betterstackdata.com

# Project mode (single project only):
./setup.sh \
  --project=my-project \
  --source-token=YOUR_SOURCE_TOKEN \
  --ingesting-host=s4191.g1.betterstackdata.com
```

| Argument | Default | Description |
|---|---|---|
| `--region` | `europe-west1` | GCP region for the Dataflow job |
| `--batch-count` | `100` | Log entries per batch sent to Better Stack |
To remove the integration, run the script with `--teardown`:

```bash
./setup.sh --teardown \
  --project=my-project \
  --ingesting-host=s4191.g1.betterstackdata.com \
  --org-id=123456789
```

See the Terraform module README for full usage and variable reference.
```hcl
module "betterstack" {
  source         = "github.com/betterstack/gcp-integration//terraform"
  project_id     = "my-project"
  org_id         = "123456789"
  source_token   = var.source_token
  ingesting_host = var.ingesting_host
}
```

- Service accounts — `betterstack-integration` (metrics and log sink management, impersonated by Better Stack via WIF) and `betterstack-dataflow` (runs the log forwarding pipeline).
- Workload Identity Federation — cross-project authentication using short-lived tokens. No static keys.
- Log sink — captures Cloud Logging entries and routes them to Pub/Sub. Supports org-level or project-level scope.
- Pub/Sub topic + subscription — buffered log delivery between the sink and the forwarding pipeline.
- Dataflow job — batches log entries from Pub/Sub and forwards them to Better Stack.
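To make the last step concrete, here is a minimal Python sketch of what the forwarding stage does conceptually: group log entries into batches (mirroring the `--batch-count` setting) and POST each batch to the ingesting host. This is an illustration, not the actual Dataflow pipeline code; the endpoint path, payload shape, and `Bearer` header are assumptions.

```python
import json
from urllib import request

def batch(entries, batch_count=100):
    """Group log entries into chunks of at most `batch_count`,
    mirroring the setup script's --batch-count argument."""
    for i in range(0, len(entries), batch_count):
        yield entries[i:i + batch_count]

def forward(entries, ingesting_host, source_token, batch_count=100):
    """Send each batch to Better Stack as a JSON array (illustrative only;
    the real pipeline reads from Pub/Sub and handles retries)."""
    for chunk in batch(entries, batch_count):
        req = request.Request(
            f"https://{ingesting_host}/",
            data=json.dumps(chunk).encode(),
            headers={
                "Authorization": f"Bearer {source_token}",
                "Content-Type": "application/json",
            },
        )
        request.urlopen(req)
```

Batching trades a little delivery latency for far fewer HTTP requests, which is why `--batch-count` is exposed as a tuning knob.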
The user running the setup needs:
- Project-level: `roles/owner` or `roles/editor` on the GCP project
- Org-level (only if deploying with `--org-id`): `roles/resourcemanager.organizationAdmin` on the organization
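If the deploying user is missing one of these roles, an administrator can grant it with `gcloud`. The project ID, org ID, and email below are placeholders:

```shell
# Project-level: grant Editor to the deploying user
gcloud projects add-iam-policy-binding my-project \
  --member="user:deployer@example.com" \
  --role="roles/editor"

# Org-level (only needed for --org-id deployments)
gcloud organizations add-iam-policy-binding 123456789 \
  --member="user:deployer@example.com" \
  --role="roles/resourcemanager.organizationAdmin"
```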
After deployment, Better Stack can remotely configure which logs are forwarded by updating the Logging filter from your source's settings page.
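The filter uses standard Cloud Logging query syntax. As a hypothetical example, a filter like the following would forward only warnings and above from Compute Engine instances:

```
resource.type = "gce_instance" AND severity >= WARNING
```

Because the filter lives on the sink, tightening it also reduces Pub/Sub and Dataflow traffic, not just what reaches Better Stack.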