gcp-security-alerts

Real-time GCP security alerting via Cloud Logging → Pub/Sub → Cloud Function → Slack (or any webhook).

GCP's native Security Command Center alerting can take several minutes to surface findings. This module wires up a Cloud Logging sink that captures high-signal audit events and forwards them to a webhook within seconds.

Architecture

GCP Audit Logs (Admin Activity)
        │
        ▼
Cloud Logging sink  (filter: IAM, firewall, SA keys, GCS ACL, KMS, org policy)
        │
        ▼
Pub/Sub topic  ──────────────────────────────► dead-letter topic
        │
        ▼ (CloudEvent trigger)
Cloud Function (Gen 2, Python 3.12)
        │
        ▼
POST → Slack webhook (or any HTTP endpoint)

Events that trigger alerts (default filter)

Category | What's watched
--- | ---
IAM | SetIamPolicy on any resource
Service Accounts | Create, delete, disable, enable, undelete, key create/delete
Firewall | Rule insert, patch, delete
Network | VPC create/delete, peering, subnetworks, routers
VPN / Interconnect | Tunnel and attachment create/delete (traffic exfiltration risk)
Compute Instances | Service account swap, metadata injection, IAM change
GCS | Bucket/object ACL and policy changes
KMS | Key create, destroy, disable, enable, import, update
Org Policy | Create, update, delete
Secret Manager | Secret and version lifecycle, IAM changes
Cloud SQL | Data export, authorized network changes
GKE | Cluster create, delete, update
Cloud Run | IAM changes (public exposure), service deletion
Cloud Functions | IAM changes (public exposure), function deletion
Logging Sinks | Sink delete/update, exclusion create/update/delete (pipeline integrity)
Access Context Manager | VPC Service Controls perimeter create/update/delete
BigQuery | Dataset IAM and sharing changes
Cloud Armor | WAF security policy create, patch, delete
Artifact Registry | Repository and image create/delete/update
Binary Authorization | Policy update or delete
Cloud Scheduler / Tasks | Job and queue create/update/delete
Billing | Account updates, resource association changes

Override with your own filter by setting log_filter in terraform.tfvars.
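
Cloud Logging filters are boolean expressions over LogEntry fields. As a hedged sketch of what a custom log_filter could look like, the snippet below assembles a much narrower filter than the module's curated default (the method names are real audit-log methods, but the module's actual default filter in its variables file may differ):

```python
# Illustrative only: build a Cloud Logging filter matching IAM policy changes
# and service-account key creation on Admin Activity audit logs.
watched_methods = [
    "SetIamPolicy",
    "google.iam.admin.v1.CreateServiceAccountKey",
]

# ":" is Cloud Logging's substring operator; "=" would require an exact match.
method_clause = " OR ".join(
    f'protoPayload.methodName:"{m}"' for m in watched_methods
)
log_filter = (
    'logName:"cloudaudit.googleapis.com%2Factivity" AND (' + method_clause + ")"
)
print(log_filter)
```

The resulting string can be pasted into terraform.tfvars as the log_filter value, or tested interactively in the Logs Explorer before deploying.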

Prerequisites

GCP APIs

Enable the following APIs in your project before applying:

gcloud services enable \
  cloudfunctions.googleapis.com \
  cloudbuild.googleapis.com \
  run.googleapis.com \
  pubsub.googleapis.com \
  logging.googleapis.com \
  storage.googleapis.com \
  artifactregistry.googleapis.com \
  --project YOUR_PROJECT_ID

Deployer permissions

The identity running terraform apply needs the following roles (or equivalent):

  • roles/cloudfunctions.admin
  • roles/iam.serviceAccountAdmin
  • roles/iam.serviceAccountUser
  • roles/pubsub.admin
  • roles/logging.admin
  • roles/storage.admin
  • roles/resourcemanager.projectIamAdmin

Terraform and gcloud CLI

  • Terraform >= 1.5.0
  • gcloud authenticated: gcloud auth application-default login

Usage

Standalone deployment

Create terraform.tfvars in this directory:

project_id  = "my-secure-project"
region      = "us-central1"
webhook_url = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

Then apply:

terraform init
terraform apply

As a module

module "security_alerts" {
  source      = "./gcp-security-alerts"
  project_id  = "my-secure-project"
  region      = "us-central1"
  webhook_url = var.slack_webhook_url
}

Optional variables

Variable | Default | Description
--- | --- | ---
name_prefix | "security-alerts" | Prefix for all resource names. Change to deploy multiple instances.
log_filter | (curated security filter) | Custom Cloud Logging filter. Leave unset to use the curated default, or provide your own.
function_memory_mb | 256 | Cloud Function memory in MB.
function_timeout_seconds | 60 | Cloud Function timeout in seconds.

Testing

1. Verify the pipeline is wired up

After terraform apply, check that the log sink and function exist:

PROJECT=your-project-id

# List log sinks
gcloud logging sinks list --project=$PROJECT

# Describe the function
gcloud functions describe security-alerts-fn --region=us-central1 --project=$PROJECT

2. Trigger a real event

The fastest way to generate an event matching the default filter is to create and immediately delete an IAM service account:

gcloud iam service-accounts create test-alert-sa \
  --description="Temporary SA for alert testing" \
  --project=$PROJECT

gcloud iam service-accounts delete \
  test-alert-sa@${PROJECT}.iam.gserviceaccount.com \
  --project=$PROJECT --quiet

You should see a Slack alert within ~10–30 seconds.

3. Inject a test message directly into Pub/Sub

This bypasses the log sink and tests the function in isolation. It is the fastest feedback loop.

First, build a realistic audit log payload and base64-encode it:

TOPIC=$(terraform output -raw pubsub_topic_name)

PAYLOAD=$(python3 -c "
import base64, json
entry = {
  'protoPayload': {
    '@type': 'type.googleapis.com/google.cloud.audit.AuditLog',
    'methodName': 'google.iam.admin.v1.CreateServiceAccountKey',
    'resourceName': 'projects/test-project/serviceAccounts/test-sa@test-project.iam.gserviceaccount.com',
    'serviceName': 'iam.googleapis.com',
    'authenticationInfo': {'principalEmail': 'attacker@example.com'},
    'requestMetadata': {'callerIp': '203.0.113.1'}
  },
  'severity': 'NOTICE',
  'timestamp': '2026-03-25T12:00:00Z',
  'resource': {'labels': {'project_id': 'test-project'}}
}
print(base64.b64encode(json.dumps(entry).encode()).decode())
")

gcloud pubsub topics publish $TOPIC \
  --message="$PAYLOAD" \
  --project=$PROJECT
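
To sanity-check the payload shape without deploying anything, you can reproduce the decoding step locally. This is a sketch assuming the function receives the standard Pub/Sub envelope, where message data is the base64-encoded LogEntry JSON (field names match the test payload above):

```python
import base64
import json

# Re-create an abbreviated version of the step-3 test payload.
entry = {
    "protoPayload": {
        "methodName": "google.iam.admin.v1.CreateServiceAccountKey",
        "authenticationInfo": {"principalEmail": "attacker@example.com"},
        "requestMetadata": {"callerIp": "203.0.113.1"},
    },
    "severity": "NOTICE",
}
data = base64.b64encode(json.dumps(entry).encode()).decode()

# Decode it the way a Pub/Sub-triggered function would: base64 -> JSON,
# then pull out the fields an alert message is built from.
decoded = json.loads(base64.b64decode(data))
proto = decoded["protoPayload"]
summary = (
    f"{proto['methodName']} by {proto['authenticationInfo']['principalEmail']} "
    f"from {proto['requestMetadata']['callerIp']}"
)
print(summary)
```

If this round-trip produces the summary you expect, any remaining failure is in the deployed pipeline (sink, topic, trigger, or webhook), not in the payload format.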

4. Check function logs

gcloud functions logs read security-alerts-fn \
  --region=us-central1 \
  --project=$PROJECT \
  --limit=50

Or via Cloud Logging:

gcloud logging read \
  'resource.type="cloud_run_revision" AND resource.labels.service_name="security-alerts-fn"' \
  --project=$PROJECT \
  --limit=20 \
  --format=json

5. Inspect the dead-letter topic

If alerts are not arriving, check whether messages landed in the dead-letter topic:

DEAD_LETTER=$(terraform output -raw dead_letter_topic_name)

gcloud pubsub subscriptions create tmp-dl-check \
  --topic=$DEAD_LETTER \
  --project=$PROJECT

gcloud pubsub subscriptions pull tmp-dl-check \
  --auto-ack \
  --project=$PROJECT

# Clean up
gcloud pubsub subscriptions delete tmp-dl-check --project=$PROJECT

Teardown

terraform destroy

The GCS bucket has force_destroy = true so it will be deleted along with its contents.

Notes

  • The webhook URL is marked sensitive in Terraform so it will not appear in plan/apply output.
  • The function runs under a dedicated least-privilege service account (roles/logging.logWriter only).
  • The function uses Cloud Functions Gen 2 (backed by Cloud Run), which supports longer timeouts and better concurrency than Gen 1.
  • 4xx responses from the webhook are treated as permanent failures (not retried). 5xx and network errors are retried automatically by Cloud Functions.
  • Unprocessable messages (permanently malformed) are ACKed so they do not cause infinite retries.
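
The ACK/retry contract described in the last two notes can be sketched as follows. This is a simplified model, not the function's actual source: `deliver` and its injected `post` callable are hypothetical names, and the real handler lives in the function's Python code:

```python
# Sketch of the retry semantics above, assuming a Gen 2 Pub/Sub-triggered
# function: returning normally ACKs the event; raising makes Pub/Sub
# redeliver it (and eventually route it to the dead-letter topic).

def deliver(post, url, body):
    """post(url, body) -> HTTP status code (hypothetical injection point)."""
    status = post(url, body)
    if 200 <= status < 300:
        return "delivered"
    if 400 <= status < 500:
        # Permanent failure (bad webhook URL, malformed body):
        # retrying will not help, so log-and-ACK instead of looping forever.
        return "dropped"
    # 5xx or transport error: raise so the platform retries the event.
    raise RuntimeError(f"webhook returned {status}, will retry")
```

For example, a 404 from a revoked Slack webhook would be dropped after one attempt, while a 503 would be retried until it succeeds or dead-letters.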
