diff --git a/.agent/skills/gcp-migration/SKILL.md b/.agent/skills/gcp-migration/SKILL.md new file mode 100644 index 0000000..c112af9 --- /dev/null +++ b/.agent/skills/gcp-migration/SKILL.md @@ -0,0 +1,345 @@ +--- +name: GCP Project Migration +description: Migration between GCP Projects (DB, Storage, Container Images, Terraform) +--- + +# GCP Project Migration Skill + +This skill guides you through the process of fully migrating from one GCP project to another. + +## Prerequisites + +1. Owner/Editor permissions for **both GCP projects** +2. **gcloud CLI** installed and authenticated +3. **Docker** installed (for container image migration) +4. **Terraform infrastructure must be deployed** to the new project first + +## Migration Steps + +### Phase 1: Enable APIs + +Enable required APIs in the new project: + +```bash +gcloud services enable \ + sqladmin.googleapis.com \ + run.googleapis.com \ + artifactregistry.googleapis.com \ + iamcredentials.googleapis.com \ + cloudtasks.googleapis.com \ + pubsub.googleapis.com \ + storage.googleapis.com \ + compute.googleapis.com \ + vpcaccess.googleapis.com \ + aiplatform.googleapis.com \ + --project=NEW_PROJECT_ID +``` + +### Phase 2: Database Migration + +1. **Export from old project** + ```bash + # Switch to old project account + gcloud config set account OLD_ACCOUNT@gmail.com + + # Grant bucket permissions to Cloud SQL service account + OLD_SA=$(gcloud sql instances describe OLD_INSTANCE --project=OLD_PROJECT --format="value(serviceAccountEmailAddress)") + gcloud storage buckets add-iam-policy-binding gs://MIGRATION_BUCKET \ + --member="serviceAccount:$OLD_SA" \ + --role="roles/storage.objectAdmin" + + # Dump DB + gcloud sql export sql OLD_INSTANCE gs://MIGRATION_BUCKET/dump.sql \ + --database=database_name --project=OLD_PROJECT + ``` + +2. 
**Import to new project** + ```bash + # Switch to new project account + gcloud config set account NEW_ACCOUNT@gmail.com + + # Grant permissions and Import + NEW_SA=$(gcloud sql instances describe NEW_INSTANCE --project=NEW_PROJECT --format="value(serviceAccountEmailAddress)") + gcloud storage buckets add-iam-policy-binding gs://MIGRATION_BUCKET \ + --member="serviceAccount:$NEW_SA" \ + --role="roles/storage.objectViewer" + + gcloud sql import sql NEW_INSTANCE gs://MIGRATION_BUCKET/dump.sql \ + --database=database_name --user=db_username --project=NEW_PROJECT + ``` + +### Phase 3: GCS Bucket Migration + +```bash +# Grant read permission to new account from old account +gcloud storage buckets add-iam-policy-binding gs://OLD_BUCKET \ + --member="user:NEW_ACCOUNT@gmail.com" \ + --role="roles/storage.objectViewer" + +# Create and sync bucket in new account +gcloud storage buckets create gs://NEW_BUCKET --location=REGION --project=NEW_PROJECT +gcloud storage rsync -r gs://OLD_BUCKET gs://NEW_BUCKET +``` + +### Phase 4: Container Image Migration + +```bash +# Docker Authentication +gcloud auth configure-docker OLD_REGION-docker.pkg.dev,NEW_REGION-docker.pkg.dev + +# Image Pull -> Tag -> Push +docker pull OLD_REGION-docker.pkg.dev/OLD_PROJECT/REPO/IMAGE:TAG +docker tag OLD_REGION-docker.pkg.dev/OLD_PROJECT/REPO/IMAGE:TAG \ + NEW_REGION-docker.pkg.dev/NEW_PROJECT/REPO/IMAGE:TAG +docker push NEW_REGION-docker.pkg.dev/NEW_PROJECT/REPO/IMAGE:TAG +``` + +### Phase 5: Modify Terraform Variables + +Modify only the variables at the top of the `apps/infra/variables.tf` file: + +```hcl +variable "project_id" { + default = "NEW_PROJECT_ID" # ← Change +} + +variable "region" { + default = "your_region" # ← Change if necessary +} +``` + +**Manual Changes Required**: +- `backend.bucket` in `provider.tf` (Terraform limitation) + +### Phase 6: Update GitHub Secrets + +In Repository Settings > Secrets and variables > Actions: + +| Secret | Value | +| -------------------------------- | 
-------------------------------------------------------------------------------------------------------- | +| `GCP_WORKLOAD_IDENTITY_PROVIDER` | `projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/your-pool/providers/github-provider` | +| `GCP_SERVICE_ACCOUNT` | `your-deployer@NEW_PROJECT.iam.gserviceaccount.com` | + +Retrieve Values: +```bash +# Project Number +gcloud projects describe NEW_PROJECT --format="value(projectNumber)" + +# WIF Provider +gcloud iam workload-identity-pools providers describe github-provider \ + --workload-identity-pool=your-pool --location=global --project=NEW_PROJECT --format="value(name)" +``` + +### Phase 7: Code Changes and Deployment + +1. Change `gcs_bucket_name` default value in `apps/api/src/lib/config.py` +2. Commit and Push +3. GitHub Actions will auto-deploy + +## Automation Script + +Run full data migration at once: + +```bash +./.agent/skills/gcp-migration/scripts/migrate-gcp-project.sh \ + --old-project OLD_PROJECT_ID \ + --new-project NEW_PROJECT_ID \ + --old-region your_old_region \ + --new-region your_new_region +``` + +## References + +- [Migration Guide](references/gcp-migration-guide.md) +- [Migration Script](scripts/migrate-gcp-project.sh) + +## Troubleshooting + +### Cloud Run Job fails with TCP Connection Error + +**Symptoms**: +``` +Is the server running on that host and accepting TCP/IP connections? +``` + +**Cause**: A VPC Connector is required for Cloud Run Jobs to access Cloud SQL Private IP. + +**Resolution**: +```bash +# 1. Enable VPC Access API +gcloud services enable vpcaccess.googleapis.com --project=NEW_PROJECT_ID + +# 2. Check VPC Connector +gcloud compute networks vpc-access connectors list \ + --region=REGION \ + --project=NEW_PROJECT_ID + +# 3. Create VPC Connector (if missing, use same network as Cloud SQL) +gcloud compute networks vpc-access connectors create CONNECTOR_NAME \ + --region=REGION \ + --network=NETWORK_NAME \ + --range=10.9.0.0/28 \ + --project=NEW_PROJECT_ID + +# 4. 
Attach VPC Connector to Cloud Run Job +gcloud run jobs update JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --vpc-connector=CONNECTOR_NAME \ + --vpc-egress=private-ranges-only +``` + +### Direct VPC and VPC Connector Conflict + +**Symptoms**: +``` +VPC connector and direct VPC can not be used together +``` + +**Resolution**: Remove Direct VPC first, then add VPC Connector: +```bash +# Remove Direct VPC +gcloud run jobs update JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --clear-network + +# Add VPC Connector +gcloud run jobs update JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --vpc-connector=CONNECTOR_NAME \ + --vpc-egress=private-ranges-only +``` + +### Password Authentication Failed + +**Symptoms**: +``` +FATAL: password authentication failed for user "username" +``` + +**Resolution**: +```bash +# Check Cloud SQL user list +gcloud sql users list \ + --instance=INSTANCE_NAME \ + --project=NEW_PROJECT_ID + +# Reset password (if needed) +gcloud sql users set-password USERNAME \ + --instance=INSTANCE_NAME \ + --project=NEW_PROJECT_ID \ + --password="NEW_PASSWORD" +``` + +### Missing Required Environment Variables + +**Symptoms**: +``` +pydantic_core._pydantic_core.ValidationError: Field required +jwt_secret_key +``` + +**Resolution**: Environment variables required for Migration Job: +- `DATABASE_URL` - PostgreSQL connection string +- `JWT_SECRET_KEY` - JWT signing secret + +**IMPORTANT**: Use `--update-env-vars` to preserve existing environment variables: +```bash +# WRONG: Overwrites all existing environment variables +gcloud run jobs update JOB_NAME --set-env-vars="KEY=value" + +# CORRECT: Adds/Updates while keeping existing environment variables +gcloud run jobs update JOB_NAME --update-env-vars="KEY=value" +``` + +### Manual Migration Job Fix (Full Steps) + +Full process for manually fixing the Migration Job: + +```bash +# 1. 
Enable VPC Access API +gcloud services enable vpcaccess.googleapis.com --project=NEW_PROJECT_ID + +# 2. Check Cloud SQL Private IP +DB_IP=$(gcloud sql instances describe INSTANCE_NAME \ + --project=NEW_PROJECT_ID \ + --format="value(ipAddresses[0].ipAddress)") +echo "Cloud SQL Private IP: $DB_IP" + +# 3. Create VPC Connector (same network as Cloud SQL) +gcloud compute networks vpc-access connectors create default-connector \ + --region=REGION \ + --network=default \ + --range=10.9.0.0/28 \ + --project=NEW_PROJECT_ID + +# 4. Remove Direct VPC (if exists) +gcloud run jobs update JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --clear-network + +# 5. Connect VPC Connector + Set Env Vars +gcloud run jobs update JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --vpc-connector=default-connector \ + --vpc-egress=private-ranges-only \ + --update-env-vars="DATABASE_URL=postgresql+asyncpg://USER:PASS@${DB_IP}:5432/DBNAME,JWT_SECRET_KEY=YOUR_JWT_SECRET" + +# 6. Execute Migration +gcloud run jobs execute JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID + +# 7. 
Check Status +EXECUTION_ID=$(gcloud run jobs executions list \ + --job=JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --limit=1 \ + --format="value(name)") + +gcloud run jobs executions describe $EXECUTION_ID \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --format="value(status.conditions[0].status,status.conditions[0].message)" +``` + +### Verify Cloud Run Job Configuration + +```bash +# Check current Job configuration +gcloud run jobs describe JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --format="yaml(spec.template.spec.template.spec)" + +# Check only VPC configuration +gcloud run jobs describe JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --format="yaml(spec.template.metadata.annotations)" + +# Check environment variables +gcloud run jobs describe JOB_NAME \ + --region=REGION \ + --project=NEW_PROJECT_ID \ + --format="yaml(spec.template.spec.template.spec.containers[0].env)" +``` + +## Checklist + +- [ ] APIs Enabled +- [ ] Database Migration Completed +- [ ] GCS Bucket Migration Completed +- [ ] Container Image Migration Completed +- [ ] Terraform variables.tf Modified +- [ ] GitHub Secrets Updated +- [ ] config.py Modified and Pushed +- [ ] VPC Connector Created and Attached +- [ ] Migration Job Env Vars Set (DATABASE_URL, JWT_SECRET_KEY) +- [ ] Migration Job Executed Successfully +- [ ] Deployment Verified and Tested diff --git a/.agent/skills/gcp-migration/references/gcp-migration-guide.md b/.agent/skills/gcp-migration/references/gcp-migration-guide.md new file mode 100644 index 0000000..e251d72 --- /dev/null +++ b/.agent/skills/gcp-migration/references/gcp-migration-guide.md @@ -0,0 +1,253 @@ +# GCP Project Migration Guide + +This document outlines the entire process of migrating a GCP project to a new project. 
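The image-migration phase in the checklist below rebuilds each Artifact Registry path from the project and region. Before any real pull/push, the rewrite can be sanity-checked with plain shell parameter expansion — all project, region, repo, and tag values here are illustrative placeholders, not names from this repository:

```shell
# Illustrative values only -- substitute your real projects/regions
OLD_PROJECT="old-proj";   NEW_PROJECT="new-proj"
OLD_REGION="us-central1"; NEW_REGION="europe-west1"

OLD_IMAGE="${OLD_REGION}-docker.pkg.dev/${OLD_PROJECT}/app/api:v1.2.3"

# Strip everything up to and including the old project segment,
# keeping the repo/image:tag suffix
SUFFIX="${OLD_IMAGE#*.pkg.dev/${OLD_PROJECT}/}"
NEW_IMAGE="${NEW_REGION}-docker.pkg.dev/${NEW_PROJECT}/${SUFFIX}"

echo "$NEW_IMAGE"   # europe-west1-docker.pkg.dev/new-proj/app/api:v1.2.3
```

The same suffix-preserving rewrite is what the `docker pull` → `tag` → `push` commands below perform on real images.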
+ +## 📋 Migration Checklist + +### Phase 1: Preparation +- [ ] Create new GCP Project +- [ ] Enable specific APIs (see list below) +- [ ] Configure infrastructure with Terraform (`terraform apply`) +- [ ] Create Cloud SQL instance and DB + +### Phase 2: Data Migration +- [ ] Cloud SQL database dump and restore +- [ ] GCS bucket data copy (prod, backup) +- [ ] Container image migration (api, web) + +### Phase 3: Configuration Changes +- [ ] Update GitHub Secrets +- [ ] Change source code configuration (`config.py`, etc.) +- [ ] Change Terraform environment variables + +### Phase 4: Deployment and Verification +- [ ] Terraform apply +- [ ] GitHub Actions deployment test +- [ ] API/Web endpoint testing + +--- + +## 🔧 Required API List + +APIs to enable in the new project: + +```bash +gcloud services enable \ + sqladmin.googleapis.com \ + run.googleapis.com \ + artifactregistry.googleapis.com \ + iamcredentials.googleapis.com \ + cloudtasks.googleapis.com \ + pubsub.googleapis.com \ + storage.googleapis.com \ + compute.googleapis.com \ + vpcaccess.googleapis.com \ + aiplatform.googleapis.com \ + --project=NEW_PROJECT_ID +``` + +--- + +## 1️⃣ Database Migration + +### Cloud SQL Export (Old Project) + +```bash +# Grant bucket permissions to old project's Cloud SQL service account +OLD_SA=$(gcloud sql instances describe OLD_INSTANCE \ + --project=OLD_PROJECT --format="value(serviceAccountEmailAddress)") + +gcloud storage buckets add-iam-policy-binding gs://MIGRATION_BUCKET \ + --member="serviceAccount:$OLD_SA" \ + --role="roles/storage.objectAdmin" + +# Create DB Dump +gcloud sql export sql OLD_INSTANCE gs://MIGRATION_BUCKET/dump.sql \ + --database=your-db-name \ + --project=OLD_PROJECT +``` + +### Cloud SQL Import (New Project) + +```bash +# Grant bucket permissions to new project's Cloud SQL service account +NEW_SA=$(gcloud sql instances describe NEW_INSTANCE \ + --project=NEW_PROJECT --format="value(serviceAccountEmailAddress)") + +gcloud storage buckets 
add-iam-policy-binding gs://MIGRATION_BUCKET \ + --member="serviceAccount:$NEW_SA" \ + --role="roles/storage.objectViewer" + +# DB Import +gcloud sql import sql NEW_INSTANCE gs://MIGRATION_BUCKET/dump.sql \ + --database=your-db-name \ + --user=postgres \ + --project=NEW_PROJECT +``` + +### Direct Import (using psql) + +For Private IP instances, either temporarily enable Public IP or use Cloud SQL Proxy: + +```bash +# Enable Public IP +gcloud sql instances patch NEW_INSTANCE --assign-ip --project=NEW_PROJECT + +# Authorize IP +MY_IP=$(curl -s ifconfig.me) +gcloud sql instances patch NEW_INSTANCE \ + --authorized-networks=$MY_IP \ + --project=NEW_PROJECT + +# Import directly with psql +PGPASSWORD='PASSWORD' psql -h PUBLIC_IP -U your-db-user -d your-db-name -f dump.sql +``` + +--- + +## 2️⃣ GCS Bucket Migration + +```bash +# Grant read permission to new account on existing bucket (using old account) +gcloud storage buckets add-iam-policy-binding gs://OLD_BUCKET \ + --member="user:NEW_ACCOUNT@gmail.com" \ + --role="roles/storage.objectViewer" + +# Create new bucket and sync (using new account) +gcloud storage buckets create gs://NEW_BUCKET --location=REGION --project=NEW_PROJECT +gcloud storage rsync -r gs://OLD_BUCKET gs://NEW_BUCKET +``` + +### Required Bucket List +| Old Bucket | New Bucket | +| ---------------- | ------------------- | +| `PROJECT-prod` | `PROJECT-v2-prod` | +| `PROJECT-backup` | `PROJECT-v2-backup` | + +--- + +## 3️⃣ Container Image Migration + +```bash +# Configure Docker Authentication +gcloud auth configure-docker OLD_REGION-docker.pkg.dev,NEW_REGION-docker.pkg.dev + +# Image Pull -> Tag -> Push +docker pull OLD_REGION-docker.pkg.dev/OLD_PROJECT/REPO/IMAGE:TAG +docker tag OLD_REGION-docker.pkg.dev/OLD_PROJECT/REPO/IMAGE:TAG \ + NEW_REGION-docker.pkg.dev/NEW_PROJECT/REPO/IMAGE:TAG +docker push NEW_REGION-docker.pkg.dev/NEW_PROJECT/REPO/IMAGE:TAG +``` + +--- + +## 4️⃣ Update GitHub Secrets + +Update in Repository Settings > Secrets and variables > 
Actions:
+
+| Secret                           | Format                                                                                         |
+| -------------------------------- | ---------------------------------------------------------------------------------------------- |
+| `GCP_WORKLOAD_IDENTITY_PROVIDER` | `projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL_ID/providers/PROVIDER_ID` |
+| `GCP_SERVICE_ACCOUNT`            | `SERVICE_ACCOUNT@PROJECT_ID.iam.gserviceaccount.com`                                           |
+
+### How to Retrieve Values
+
+```bash
+# Project Number
+gcloud projects describe NEW_PROJECT --format="value(projectNumber)"
+
+# WIF Provider Full Path
+gcloud iam workload-identity-pools providers describe PROVIDER_ID \
+  --workload-identity-pool=POOL_ID \
+  --location=global \
+  --project=NEW_PROJECT \
+  --format="value(name)"
+```
+
+---
+
+## 5️⃣ Source Code Changes
+
+### `apps/api/src/lib/config.py`
+```python
+gcs_bucket_name: str = "NEW_BUCKET_NAME"
+```
+
+### `apps/infra/compute-*.tf`
+```hcl
+env {
+  name  = "GCS_BUCKET_NAME"
+  value = "NEW_BUCKET_NAME"
+}
+```
+
+---
+
+## 🚀 Automation Script
+
+To automate the entire migration:
+
+```bash
+./.agent/skills/gcp-migration/scripts/migrate-gcp-project.sh \
+  --old-project OLD_PROJECT \
+  --new-project NEW_PROJECT \
+  --old-region OLD_REGION \
+  --new-region NEW_REGION
+```
+
+---
+
+## ⚠️ Notes
+
+1. **Cloud SQL Private IP**: Cannot be reached directly from a local machine; temporarily enable Public IP or connect through the Cloud SQL Auth Proxy.
+2. **Account Switching**: `gcloud config set account` is required whenever you switch between the old and new projects.
+3. **Permission Propagation Delay**: IAM changes may take a few minutes to propagate.
+4. **Terraform lifecycle.ignore_changes**: Image changes should be performed separately via `gcloud run deploy`.
+
+---
+
+## 🔧 Terraform Variable Structure
+
+The Terraform files parameterize the project ID and region, so during migration you normally only need to edit the variables at the top of `variables.tf`. 
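The backend block is the one part of the configuration that cannot reference these variables, because Terraform initializes the backend before evaluating them. As a sketch of one possible workaround (bucket name and prefix are illustrative), the bucket can be left out of the block and supplied per environment at init time via Terraform's partial backend configuration:

```hcl
terraform {
  backend "gcs" {
    # bucket deliberately omitted; supply it per environment with:
    #   terraform init -reconfigure -backend-config="bucket=NEW_PROJECT-tfstate"
    prefix = "terraform/state"
  }
}
```

With this arrangement, switching projects only changes the `-backend-config` argument rather than a committed file.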
+
+### Key Variables (Top of `variables.tf`)
+
+```hcl
+variable "project_id" {
+  description = "GCP Project ID"
+  type        = string
+  default     = "your-project-id" # ← Change to new project ID
+}
+
+variable "region" {
+  description = "GCP Region"
+  type        = string
+  default     = "your-region" # ← Change if necessary
+}
+```
+
+### Parameterized Resource List
+
+| File           | Parameterized Items                            |
+| -------------- | ---------------------------------------------- |
+| `provider.tf`  | `project`, `region`                            |
+| `storage.tf`   | Bucket name (`${var.project_id}-prod`), Region |
+| `database.tf`  | `project`, `region`, VPC path                  |
+| `iam.tf`       | `project`, SA email                            |
+| `compute-*.tf` | Image path, `GOOGLE_CLOUD_PROJECT_ID` env var  |
+
+### ⚠️ Items Requiring Manual Changes
+
+The following items cannot be parameterized due to Terraform limitations and must be changed by hand:
+
+1. **`provider.tf` - backend bucket**
+   ```hcl
+   backend "gcs" {
+     bucket = "NEW_PROJECT-tfstate" # Manual change
+   }
+   ```
+
+2. **`database.tf` - Instance Name** (keep the existing name if you are reusing the old instance; change it only if you are creating a new one)
+   ```hcl
+   name = "your-instance-name"
+   ```
diff --git a/.agent/skills/gcp-migration/scripts/migrate-gcp-project.sh b/.agent/skills/gcp-migration/scripts/migrate-gcp-project.sh
new file mode 100755
index 0000000..6bb3d1f
--- /dev/null
+++ b/.agent/skills/gcp-migration/scripts/migrate-gcp-project.sh
@@ -0,0 +1,369 @@
+#!/bin/bash
+# =============================================================================
+# GCP Project Migration Script
+# =============================================================================
+# This script performs migration from an existing GCP project to a new project. 
+# +# Usage: +# ./migrate-gcp-project.sh --old-project OLD_PROJECT_ID --new-project NEW_PROJECT_ID +# +# Example: +# ./migrate-gcp-project.sh --old-project your-old-project --new-project your-new-project +# +# Prerequisites: +# - gcloud CLI installed and authenticated +# - Owner/Editor permissions for both projects +# - psql installed (for DB migration) +# - docker installed (for container image migration) +# ============================================================================= + +set -e + +# Colors for output +RED='\033[0;31m' +GREEN='\033[0;32m' +YELLOW='\033[1;33m' +BLUE='\033[0;34m' +NC='\033[0m' # No Color + +# Default values +OLD_PROJECT="" +NEW_PROJECT="" +OLD_REGION="your-old-region" +NEW_REGION="your-new-region" +OLD_BUCKET_PREFIX="your-old-bucket-prefix" +NEW_BUCKET_PREFIX="your-new-bucket-prefix" +SKIP_DB=false +SKIP_STORAGE=false +SKIP_IMAGES=false +SKIP_APIS=false + +# Parse arguments +while [[ $# -gt 0 ]]; do + case $1 in + --old-project) + OLD_PROJECT="$2" + shift 2 + ;; + --new-project) + NEW_PROJECT="$2" + shift 2 + ;; + --old-region) + OLD_REGION="$2" + shift 2 + ;; + --new-region) + NEW_REGION="$2" + shift 2 + ;; + --skip-db) + SKIP_DB=true + shift + ;; + --skip-storage) + SKIP_STORAGE=true + shift + ;; + --skip-images) + SKIP_IMAGES=true + shift + ;; + --skip-apis) + SKIP_APIS=true + shift + ;; + -h|--help) + echo "Usage: $0 --old-project OLD_PROJECT_ID --new-project NEW_PROJECT_ID [options]" + echo "" + echo "Options:" + echo " --old-project PROJECT_ID Old project ID" + echo " --new-project PROJECT_ID New project ID" + echo " --old-region REGION Old project region (default: your-old-region)" + echo " --new-region REGION New project region (default: your-new-region)" + echo " --skip-db Skip database migration" + echo " --skip-storage Skip storage bucket migration" + echo " --skip-images Skip container image migration" + echo " --skip-apis Skip API enablement" + exit 0 + ;; + *) + echo "Unknown option: $1" + exit 1 + ;; + esac +done + +# 
Validate required arguments +if [[ -z "$OLD_PROJECT" || -z "$NEW_PROJECT" ]]; then + echo -e "${RED}Error: --old-project and --new-project are required${NC}" + exit 1 +fi + +log_step() { + echo -e "\n${BLUE}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}" + echo -e "${GREEN}[STEP]${NC} $1" + echo -e "${BLUE}━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━${NC}" +} + +log_info() { + echo -e "${YELLOW}[INFO]${NC} $1" +} + +log_success() { + echo -e "${GREEN}[SUCCESS]${NC} $1" +} + +log_error() { + echo -e "${RED}[ERROR]${NC} $1" +} + +# ============================================================================= +# Step 1: Enable Required APIs +# ============================================================================= +enable_apis() { + log_step "Enabling required APIs in new project" + + APIS=( + "sqladmin.googleapis.com" + "run.googleapis.com" + "artifactregistry.googleapis.com" + "iamcredentials.googleapis.com" + "cloudtasks.googleapis.com" + "pubsub.googleapis.com" + "storage.googleapis.com" + "compute.googleapis.com" + "vpcaccess.googleapis.com" + "aiplatform.googleapis.com" + ) + + for api in "${APIS[@]}"; do + log_info "Enabling $api..." 
+ gcloud services enable "$api" --project="$NEW_PROJECT" 2>/dev/null || true + done + + log_success "All APIs enabled" +} + +# ============================================================================= +# Step 2: Database Migration +# ============================================================================= +migrate_database() { + log_step "Migrating Cloud SQL Database" + + # Get old instance info + OLD_INSTANCE=$(gcloud sql instances list --project="$OLD_PROJECT" --format="value(NAME)" | head -1) + if [[ -z "$OLD_INSTANCE" ]]; then + log_error "No Cloud SQL instance found in old project" + return 1 + fi + log_info "Old instance: $OLD_INSTANCE" + + # Get new instance info + NEW_INSTANCE=$(gcloud sql instances list --project="$NEW_PROJECT" --format="value(NAME)" | head -1) + if [[ -z "$NEW_INSTANCE" ]]; then + log_error "No Cloud SQL instance found in new project" + return 1 + fi + log_info "New instance: $NEW_INSTANCE" + + # Get service account for old instance + OLD_SA=$(gcloud sql instances describe "$OLD_INSTANCE" --project="$OLD_PROJECT" --format="value(serviceAccountEmailAddress)") + + # Create temp bucket for migration + MIGRATION_BUCKET="gs://${NEW_PROJECT}-migration" + log_info "Creating migration bucket: $MIGRATION_BUCKET" + gcloud storage buckets create "$MIGRATION_BUCKET" --location="$NEW_REGION" --project="$NEW_PROJECT" 2>/dev/null || true + + # Grant old Cloud SQL SA access to migration bucket + log_info "Granting bucket access to old Cloud SQL service account..." + gcloud storage buckets add-iam-policy-binding "$MIGRATION_BUCKET" \ + --member="serviceAccount:$OLD_SA" \ + --role="roles/storage.objectAdmin" \ + --project="$NEW_PROJECT" + + # Export database from old instance + TIMESTAMP=$(date +%Y%m%d_%H%M%S) + DUMP_FILE="$MIGRATION_BUCKET/db_dump_$TIMESTAMP.sql" + log_info "Exporting database to $DUMP_FILE..." 
+ gcloud sql export sql "$OLD_INSTANCE" "$DUMP_FILE" \ + --database=your-old-database \ + --project="$OLD_PROJECT" + + # Get new Cloud SQL SA + NEW_SA=$(gcloud sql instances describe "$NEW_INSTANCE" --project="$NEW_PROJECT" --format="value(serviceAccountEmailAddress)") + + # Grant new Cloud SQL SA access to migration bucket + log_info "Granting bucket access to new Cloud SQL service account..." + gcloud storage buckets add-iam-policy-binding "$MIGRATION_BUCKET" \ + --member="serviceAccount:$NEW_SA" \ + --role="roles/storage.objectViewer" \ + --project="$NEW_PROJECT" + + # Create database if not exists + log_info "Creating database in new instance..." + gcloud sql databases create your-new-database --instance="$NEW_INSTANCE" --project="$NEW_PROJECT" 2>/dev/null || true + + # Import database to new instance + log_info "Importing database from $DUMP_FILE..." + gcloud sql import sql "$NEW_INSTANCE" "$DUMP_FILE" \ + --database=your-new-database \ + --user=postgres \ + --project="$NEW_PROJECT" \ + --quiet + + log_success "Database migration completed" +} + +# ============================================================================= +# Step 3: Storage Bucket Migration +# ============================================================================= +migrate_storage() { + log_step "Migrating GCS Buckets" + + # Define bucket pairs (old -> new) + declare -A BUCKET_PAIRS=( + ["${OLD_BUCKET_PREFIX}-prod"]="${NEW_BUCKET_PREFIX}-prod" + ["${OLD_BUCKET_PREFIX}-backup"]="${NEW_BUCKET_PREFIX}-backup" + ) + + for OLD_BUCKET in "${!BUCKET_PAIRS[@]}"; do + NEW_BUCKET="${BUCKET_PAIRS[$OLD_BUCKET]}" + log_info "Migrating gs://$OLD_BUCKET -> gs://$NEW_BUCKET" + + # Create new bucket if not exists + gcloud storage buckets create "gs://$NEW_BUCKET" \ + --location="$NEW_REGION" \ + --project="$NEW_PROJECT" 2>/dev/null || true + + # Sync data + log_info "Syncing data..." 
+ gcloud storage rsync -r "gs://$OLD_BUCKET" "gs://$NEW_BUCKET" 2>/dev/null || true + + log_success "Migrated $OLD_BUCKET -> $NEW_BUCKET" + done + + log_success "Storage migration completed" +} + +# ============================================================================= +# Step 4: Container Image Migration +# ============================================================================= +migrate_images() { + log_step "Migrating Container Images" + + # Configure docker auth for both registries + log_info "Configuring Docker authentication..." + gcloud auth configure-docker "${OLD_REGION}-docker.pkg.dev,${NEW_REGION}-docker.pkg.dev" --quiet + + # Get running images from old project Cloud Run services + SERVICES=("your-old-image-1" "your-old-image-2" "your-old-image-3") + + for SERVICE in "${SERVICES[@]}"; do + log_info "Processing service: $SERVICE" + + # Get current image + OLD_IMAGE=$(gcloud run services describe "$SERVICE" \ + --region="$OLD_REGION" \ + --project="$OLD_PROJECT" \ + --format="value(spec.template.spec.containers[0].image)" 2>/dev/null) || continue + + if [[ -z "$OLD_IMAGE" ]]; then + log_info "Service $SERVICE not found in old project, skipping" + continue + fi + + log_info "Old image: $OLD_IMAGE" + + # Extract image name and tag + IMAGE_TAG=$(echo "$OLD_IMAGE" | grep -oE ':[^:]+$' | sed 's/://') + + # Determine new image path + case "$SERVICE" in + your-old-image-1|your-old-image-2|your-old-image-3) + NEW_IMAGE="${NEW_REGION}-docker.pkg.dev/${NEW_PROJECT}/your-new-image-1:${IMAGE_TAG}" + ;; + esac + + log_info "New image: $NEW_IMAGE" + + # Pull, tag, push + docker pull "$OLD_IMAGE" + docker tag "$OLD_IMAGE" "$NEW_IMAGE" + docker push "$NEW_IMAGE" + + # Also push as latest + LATEST_IMAGE=$(echo "$NEW_IMAGE" | sed "s/:${IMAGE_TAG}/:latest/") + docker tag "$OLD_IMAGE" "$LATEST_IMAGE" + docker push "$LATEST_IMAGE" + + log_success "Migrated image for $SERVICE" + done + + log_success "Container image migration completed" +} + +# 
============================================================================= +# Main Execution +# ============================================================================= +main() { + echo -e "${GREEN}" + echo "╔══════════════════════════════════════════════════════════════╗" + echo "║ GCP Project Migration for $OLD_PROJECT to $NEW_PROJECT ║" + echo "╠══════════════════════════════════════════════════════════════╣" + echo "║ Old Project: $OLD_PROJECT" + echo "║ New Project: $NEW_PROJECT" + echo "║ Old Region: $OLD_REGION" + echo "║ New Region: $NEW_REGION" + echo "╚══════════════════════════════════════════════════════════════╝" + echo -e "${NC}" + + # Confirm + read -p "Proceed with migration? (y/N) " -n 1 -r + echo + if [[ ! $REPLY =~ ^[Yy]$ ]]; then + echo "Migration cancelled" + exit 0 + fi + + # Enable APIs + if [[ "$SKIP_APIS" == false ]]; then + enable_apis + fi + + # Migrate database + if [[ "$SKIP_DB" == false ]]; then + migrate_database + fi + + # Migrate storage + if [[ "$SKIP_STORAGE" == false ]]; then + migrate_storage + fi + + # Migrate images + if [[ "$SKIP_IMAGES" == false ]]; then + migrate_images + fi + + echo -e "\n${GREEN}" + echo "╔══════════════════════════════════════════════════════════════╗" + echo "║ Migration Complete! ║" + echo "╚══════════════════════════════════════════════════════════════╝" + echo -e "${NC}" + + echo -e "${YELLOW}Next Steps:${NC}" + echo "1. Update GitHub Secrets:" + echo " - GCP_WORKLOAD_IDENTITY_PROVIDER" + echo " - GCP_SERVICE_ACCOUNT" + echo "" + echo "2. Update app configuration files:" + echo " - apps/api/src/lib/config.py (gcs_bucket_name)" + echo " - apps/infra/variables.tf (project_id, region variables)" + echo "" + echo "3. Run Terraform apply in new project" + echo "" + echo "4. 
Test API and Web endpoints"
+}
+
+main
diff --git a/.github/workflows/review.yml b/.github/workflows/review.yml
index 00338df..297026e 100644
--- a/.github/workflows/review.yml
+++ b/.github/workflows/review.yml
@@ -85,7 +85,18 @@ jobs:
         working-directory: apps/web
         run: |
           pnpm biome check src --reporter=json 2>&1 | \
-            reviewdog -f=biome -reporter=github-pr-review -fail-level=error
+            jq -c '.diagnostics[] | {
+              message: .message,
+              location: {
+                path: .location.path.file,
+                range: {
+                  start: { line: .location.span[0].line, column: .location.span[0].column },
+                  end: { line: .location.span[1].line, column: .location.span[1].column }
+                }
+              },
+              severity: (if .severity == "error" then "ERROR" elif .severity == "warning" then "WARNING" else "INFO" end)
+            }' | \
+            reviewdog -f=rdjsonl -reporter=github-pr-review -fail-level=error

   ruff:
     name: Ruff (Python)
@@ -109,7 +120,19 @@ jobs:
       run: |
         uv sync --frozen
         uv run ruff check --output-format=json . 2>&1 | \
-          reviewdog -f=ruff -reporter=github-pr-review -fail-level=error
+          jq -c '.[] | {
+            message: .message,
+            location: {
+              path: .filename,
+              range: {
+                start: { line: .location.row, column: .location.column },
+                end: { line: .end_location.row, column: .end_location.column }
+              }
+            },
+            code: { value: .code },
+            severity: "WARNING"
+          }' | \
+          reviewdog -f=rdjsonl -name="ruff-api" -reporter=github-pr-review -fail-level=error

     - name: Run Ruff with reviewdog (Worker)
       env:
       run: |
         uv sync --frozen
         uv run ruff check --output-format=json . 2>&1 | \
-          reviewdog -f=ruff -reporter=github-pr-review -fail-level=error
+          jq -c '.[] | {
+            message: .message,
+            location: {
+              path: .filename,
+              range: {
+                start: { line: .location.row, column: .location.column },
+                end: { line: .end_location.row, column: .end_location.column }
+              }
+            },
+            code: { value: .code },
+            severity: "WARNING"
+          }' | \
+          reviewdog -f=rdjsonl -name="ruff-worker" -reporter=github-pr-review -fail-level=error

 flutter:
   name: Flutter Analyze