SV1_SCAN_BIGQUERY deployment var; allow BQ-only cleanup #1050
Conversation
Revocations still not enough:

- gcp:sql:User sv-1-user-bqdatastream deleting (1s)
  error: sdk-v2/provider2.go:566: sdk.helper_schema: Error, failed to deleteuser bqdatastream in instance sv-1-cn-apps-pg-2964ec6: googleapi: Error 400: Invalid request: failed to delete user bqdatastream: . role "bqdatastream" cannot be dropped because some objects depend on it Details: 32 objects in database scan_sv_1., invalid: [email protected]

Signed-off-by: Stephen Compall <[email protected]>
…are-devnet-bigquery [ci] Signed-off-by: Stephen Compall <[email protected]>
Pull Request Overview
This PR prepares for conditional activation (and later deactivation) of BigQuery scanning for the SV1 service by adding an environment flag and refactoring SQL import commands.
- Introduce the `SV1_SCAN_BIGQUERY` env flag to toggle the `scanBigQuery` config in `svConfigs.ts` (see the sketch after this list)
- Add `retainOnDelete` to the Postgres replicator user and extract common shell command chunks into `databaseCommandBracket` in `bigQuery.ts`
- Refactor `createPublicationAndReplicationSlots` to use the new header/footer helper
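Conceptually, the flag gates the config block with a conditional spread. A minimal sketch, assuming a plain process.env lookup (the helper actually used in svConfigs.ts to read deployment variables may differ); the dataset and prefix values match the snippet quoted further down:

// Sketch only: how SV1_SCAN_BIGQUERY could gate the scanBigQuery block.
const sv1ScanBigQuery = process.env.SV1_SCAN_BIGQUERY === 'true';

const sv1Config = {
  // ...other SV1 settings elided...
  ...(sv1ScanBigQuery
    ? { scanBigQuery: { dataset: 'mainnet_da2_scan', prefix: 'da2' } }
    : {}),
};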
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| cluster/pulumi/common-sv/src/svConfigs.ts | Add SV1_SCAN_BIGQUERY flag and conditional scanBigQuery |
| cluster/pulumi/canton-network/src/bigQuery.ts | Refactor SQL import commands into databaseCommandBracket, update replicator user creation |
Comments suppressed due to low confidence (2)
cluster/pulumi/common-sv/src/svConfigs.ts:63
- Consider adding unit or integration tests to verify that enabling or disabling `SV1_SCAN_BIGQUERY` correctly includes or omits the `scanBigQuery` configuration.
...(sv1ScanBigQuery
? { scanBigQuery: { dataset: 'mainnet_da2_scan', prefix: 'da2' } }
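A hedged sketch of what such a test could look like, using a Jest-style runner; `buildSv1Config` is a hypothetical accessor standing in for however svConfigs.ts actually exposes the SV1 config:

import { expect, test } from '@jest/globals';

// Hypothetical accessor mirroring the conditional spread quoted above.
const buildSv1Config = () => ({
  ...(process.env.SV1_SCAN_BIGQUERY === 'true'
    ? { scanBigQuery: { dataset: 'mainnet_da2_scan', prefix: 'da2' } }
    : {}),
});

test('setting SV1_SCAN_BIGQUERY includes the scanBigQuery config', () => {
  process.env.SV1_SCAN_BIGQUERY = 'true';
  expect(buildSv1Config()).toHaveProperty('scanBigQuery');
});

test('leaving SV1_SCAN_BIGQUERY unset omits the scanBigQuery config', () => {
  delete process.env.SV1_SCAN_BIGQUERY;
  expect(buildSv1Config()).not.toHaveProperty('scanBigQuery');
});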
cluster/pulumi/canton-network/src/bigQuery.ts:314
- [nitpick] The constant `name` is very generic. Rename it to something like `resourceName` or `userResourceName` for clearer intent.
const name = `${postgres.namespace.logicalName}-user-${replicatorUserName}`;
Copilot AI (Jun 10, 2025)
[nitpick] Add a doc comment above databaseCommandBracket explaining its purpose and the structure of the header/footer shapes to improve readability and maintainability.
/**
 * Generates a set of shell commands for managing temporary SQL files and importing them into a Cloud SQL database.
 *
 * @param postgres - The CloudPostgres instance containing database and service account details.
 * @returns An object with `header` and `footer` properties:
 *   - `header`: A shell script segment that sets up a temporary GCS bucket, grants access, and prepares a SQL file.
 *   - `footer`: A shell script segment that uploads the SQL file to GCS, imports it into the database, and cleans up resources.
 */
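For context, a rough sketch of the shape such a helper could take; the parameter names, bucket handling, and exact gcloud/gsutil invocations here are illustrative assumptions, not the implementation in bigQuery.ts:

interface CommandBracket {
  header: string;
  footer: string;
}

// Illustrative sketch only: the real databaseCommandBracket takes the
// CloudPostgres instance and also grants the Cloud SQL service account
// access to the bucket, per the doc comment above.
function databaseCommandBracket(
  bucket: string, // temporary GCS bucket used for the import
  instance: string, // Cloud SQL instance name
  database: string // target database for the import
): CommandBracket {
  const object = `${database}-import.sql`;
  return {
    // header: create the temporary bucket and a local SQL file; commands
    // emitted between header and footer append SQL statements to $SQL_FILE
    header: [
      'set -euo pipefail',
      `gsutil mb gs://${bucket}`,
      'SQL_FILE=$(mktemp --suffix=.sql)',
    ].join('\n'),
    // footer: upload the file, import it into Cloud SQL, then clean up
    footer: [
      `gsutil cp "$SQL_FILE" gs://${bucket}/${object}`,
      `gcloud sql import sql ${instance} gs://${bucket}/${object} --database=${database} --quiet`,
      `gsutil rm gs://${bucket}/${object}`,
      `gsutil rb gs://${bucket}`,
      'rm -f "$SQL_FILE"',
    ].join('\n'),
  };
}

createPublicationAndReplicationSlots would then, presumably, emit its publication and replication-slot SQL between the two fragments.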
ray-roestenburg-da left a comment
Thanks!
* SV1_SCAN_BIGQUERY env option
* remove bqdatastream cleanup, just leave the user alive

Revocations still not enough:

- gcp:sql:User sv-1-user-bqdatastream deleting (1s)
  error: sdk-v2/provider2.go:566: sdk.helper_schema: Error, failed to deleteuser bqdatastream in instance sv-1-cn-apps-pg-2964ec6: googleapi: Error 400: Invalid request: failed to delete user bqdatastream: . role "bqdatastream" cannot be dropped because some objects depend on it Details: 32 objects in database scan_sv_1., invalid: [email protected]

Signed-off-by: Stephen Compall <[email protected]>
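The retainOnDelete change sidesteps exactly this failure: rather than attempting the DROP that Postgres rejects while scan's objects still depend on the role, Pulumi forgets the resource and leaves the user in place. A minimal sketch of the option on the replicator user (instance name and password handling are placeholders, not the repo's actual configuration):

import * as gcp from '@pulumi/gcp';
import * as pulumi from '@pulumi/pulumi';

// Placeholder argument values; the retainOnDelete resource option is the point.
const replicatorUser = new gcp.sql.User(
  'sv-1-user-bqdatastream',
  {
    name: 'bqdatastream',
    instance: 'sv-1-cn-apps-pg', // placeholder Cloud SQL instance name
    password: pulumi.secret('replace-me'), // placeholder secret handling
  },
  {
    // On removal from the program, drop the resource from Pulumi state
    // without deleting the underlying Postgres role.
    retainOnDelete: true,
  }
);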
Fixes #1060. Tested by deploying scratchb with the new SV1_SCAN_BIGQUERY variable, manually removing the replication slot and publication from the CloudSQL instance (and disabling the variable), and reapplying.

Pull Request Checklist
Cluster Testing
- /cluster_test on this PR to request it, and ping someone with access to the DA-internal system to approve it.
- /hdm_test on this PR to request it, and ping someone with access to the DA-internal system to approve it.

PR Guidelines
- Fixes #n, and mention issues worked on using #n

Merge Guidelines