
What is this integration?

Google Cloud Platform (GCP) provides fully managed cloud services including relational databases (Cloud SQL), NoSQL document databases (Firestore), serverless data warehouses (BigQuery), in-memory caches (Memorystore for Redis), and object storage (Cloud Storage).

What Monk manages

  • Cloud SQL instances, databases, and users
  • Firestore databases with PITR and backup support
  • BigQuery datasets and table snapshots
  • Memorystore for Redis instances with export/import support
  • Cloud Storage buckets
  • Cloud Storage HMAC keys for S3-compatible access
  • Service accounts and IAM bindings
  • API enablement via Service Usage

What the Agent can do and how to use it

  • Database Creation: Provision Cloud SQL, Firestore, BigQuery, and Memorystore for Redis
  • Backup & Recovery: Automated backups, on-demand snapshots, export/import, and restore operations
  • Scaling: Modify instance tiers, storage, and enable high availability
  • Security: Configure authorized networks, SSL, and IAM permissions
  • Monitoring: Access instance status and connection information
Steps:
  1. Ensure the GCP provider is added: monk cluster provider add -p gcp
  2. Load your stack and run: monk update <namespace>/<name>

Auth

  • Uses GCP provider credentials configured via monk cluster provider add -p gcp
  • GCP credentials are automatically injected into the GCP client

Getting Started

  1. Ensure GCP provider is added:
monk cluster provider add -p gcp
  2. Define a Cloud SQL instance (save as gcp-stack.yaml):
namespace: my-app

enable-apis:
  defines: gcp/service-usage
  apis:
    - sqladmin.googleapis.com

my-postgres:
  defines: gcp/cloud-sql-instance
  name: my-app-db
  database_version: POSTGRES_14
  tier: db-f1-micro
  region: us-central1
  backup_start_time: "03:00"              # Enable automated backups
  point_in_time_recovery_enabled: true    # Enable PITR
  depends:
    wait-for:
      runnables:
        - my-app/enable-apis
      timeout: 300
  3. Create/update:
monk load gcp-stack.yaml
monk update my-app/my-postgres
monk describe my-app/my-postgres

S3-Compatible Cloud Storage Access (HMAC)

Create HMAC keys to access Cloud Storage using S3-compatible clients. Make sure storage.googleapis.com is enabled via gcp/service-usage, and use a service account from gcp/service-account:
storage-hmac-keys:
  defines: gcp/cloud-storage-hmac-keys
  service_account_email: <- connection-target("sa") entity-state get-member("email")
  access_key_secret_ref: gcs-hmac-access-key
  secret_key_secret_ref: gcs-hmac-secret-key
  permitted-secrets:
    gcs-hmac-access-key: true
    gcs-hmac-secret-key: true
  connections:
    sa:
      runnable: gcp/service-account/my-sa
      service: service-account
Use https://storage.googleapis.com as the S3 endpoint and the secrets gcs-hmac-access-key / gcs-hmac-secret-key as credentials.
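For example, any S3-compatible tool such as the AWS CLI can be pointed at Cloud Storage with the HMAC pair. A minimal sketch, assuming you have retrieved the two secret values from Monk's secret store (the profile name `gcs` is illustrative):

```ini
# ~/.aws/credentials — values come from the gcs-hmac-access-key /
# gcs-hmac-secret-key secrets created by the entity above
[gcs]
aws_access_key_id     = <value of gcs-hmac-access-key>
aws_secret_access_key = <value of gcs-hmac-secret-key>
```

With that profile in place, listing buckets looks like: aws s3 ls --profile gcs --endpoint-url https://storage.googleapis.com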

Cloud SQL Backup & Restore Actions

| Action | Description |
| --- | --- |
| get-backup-info | View backup configuration and PITR status |
| create-backup | Create an on-demand backup |
| list-backups | List available backups (automated and on-demand) |
| describe-backup | Get detailed information about a specific backup |
| delete-backup | Delete a backup |
| restore | Restore from backup (overwrites instance) |
| get-restore-status | Check status of restore operation |
# View backup configuration
monk do my-app/my-postgres/get-backup-info

# Create a backup before maintenance
monk do my-app/my-postgres/create-backup description="Pre-upgrade backup"

# List available backups
monk do my-app/my-postgres/list-backups

# Restore from backup (WARNING: overwrites instance!)
monk do my-app/my-postgres/restore backup_id="1765968494026"

# Check restore progress
monk do my-app/my-postgres/get-restore-status operation_name="operations/abc123"

Firestore Backup & Restore Actions

| Action | Description |
| --- | --- |
| get-backup-info | View PITR status and configuration |
| export-documents | Export database to Cloud Storage |
| import-documents | Import from Cloud Storage export |
| list-backups | List scheduled backups in a location |
| describe-backup | Get backup details |
| delete-backup | Delete a scheduled backup |
| restore | Restore to a new database from backup |
| get-restore-status | Check restore operation progress |
# View backup configuration
monk do my-app/my-firestore/get-backup-info

# Export database to Cloud Storage
monk do my-app/my-firestore/export-documents output_uri_prefix="gs://my-bucket/backup"

# Import from Cloud Storage
monk do my-app/my-firestore/import-documents input_uri_prefix="gs://my-bucket/backup"

# List scheduled backups
monk do my-app/my-firestore/list-backups location="us-central1"

# Restore to a new database
monk do my-app/my-firestore/restore backup_name="projects/.../backups/..." target_database="restored-db"
Note: Firestore PITR enables reading historical document versions (7 days), not database-level restore. Use export-documents for full database backups.

BigQuery Backup & Restore Actions

| Action | Description |
| --- | --- |
| get-backup-info | View time travel settings and storage model |
| create-snapshot | Create a table snapshot |
| list-snapshots | List tables/snapshots in dataset |
| describe-snapshot | Get table/snapshot details |
| delete-snapshot | Delete a snapshot table |
| restore | Create new table from snapshot |
# View backup configuration
monk do my-app/my-dataset/get-backup-info

# Create a snapshot of a table
monk do my-app/my-dataset/create-snapshot source_table="events"

# Create snapshot at a specific point in time (time travel)
monk do my-app/my-dataset/create-snapshot source_table="events" snapshot_time="2024-12-16T10:00:00Z"

# List all tables/snapshots
monk do my-app/my-dataset/list-snapshots

# Restore by creating a new table from snapshot
monk do my-app/my-dataset/restore snapshot_table="events_backup" target_table="events_restored"
Time Travel: BigQuery provides built-in time travel (2-7 days) for querying historical data without creating snapshots:
SELECT * FROM `project.dataset.table`
FOR SYSTEM_TIME AS OF TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
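If you prefer to work in SQL directly, BigQuery's native DDL exposes the same snapshot capability as the create-snapshot action; the snapshot_time variant above corresponds roughly to the following (table names are illustrative):

```sql
-- Snapshot the events table as it existed one day ago, via time travel
CREATE SNAPSHOT TABLE `project.dataset.events_backup`
CLONE `project.dataset.events`
FOR SYSTEM_TIME AS OF TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY);
```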

Restore Behavior Summary

| Database | Restore Target | Warning |
| --- | --- | --- |
| Cloud SQL | Same instance | ⚠️ OVERWRITES existing instance |
| Firestore | New database | ✅ Safe - creates new database |
| BigQuery | New table | ✅ Safe - creates new table (clone) |