
Commit 6405808

Merge pull request #2 from derek10cloud/create-s3-gcs-bucket-example
Add a use case for creating s3 and gcs (multi-cloud stack)
2 parents 6fe7502 + 5ad5f39 commit 6405808

5 files changed, +232 -1 lines changed

.gitignore

Lines changed: 4 additions & 1 deletion
@@ -6,4 +6,7 @@ stackql
 stackql-zip
 stackql-aws-cloud-shell.sh
 stackql-azure-cloud-shell.sh
-stackql-google-cloud-shell.sh
+stackql-google-cloud-shell.sh
+.envrc
+
+.DS_Store
Lines changed: 75 additions & 0 deletions
@@ -0,0 +1,75 @@

# `stackql-deploy` starter project for `aws`

> for starter projects using other providers, try `stackql-deploy my_stack --provider=azure` or `stackql-deploy my_stack --provider=google`

see the following links for more information on `stackql`, `stackql-deploy` and the `aws` provider:

- [`aws` provider docs](https://stackql.io/registry/aws)
- [`stackql`](https://github.com/stackql/stackql)
- [`stackql-deploy` PyPI home page](https://pypi.org/project/stackql-deploy/)
- [`stackql-deploy` GitHub repo](https://github.com/stackql/stackql-deploy)

## Overview

__`stackql-deploy`__ is a stateless, declarative, SQL-driven Infrastructure-as-Code (IaC) framework. No state file is required, as the current state of each resource is assessed at runtime. __`stackql-deploy`__ can provision, deprovision and test a stack that includes resources across different providers, for example a stack spanning `aws` and `azure`.
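
As an illustration of the SQL-driven approach, current state is assessed with ordinary SQL queries against the provider. A minimal sketch using the `stackql` CLI directly (not part of this commit; the region and bucket name are placeholders, and the query mirrors the `exists` check in the s3 resource file added in this commit):

```bash
# illustrative sketch only – placeholder region and bucket name
stackql exec "SELECT COUNT(*) as count FROM aws.s3.buckets WHERE region = 'ap-southeast-2' AND bucket_name = 'my-example-bucket'"
```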

## Prerequisites

This example requires `stackql-deploy`, installed using __`pip install stackql-deploy`__. The host used to run `stackql-deploy` needs the necessary environment variables set to authenticate to your specific provider. In the case of the `aws` provider, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY` and optionally `AWS_SESSION_TOKEN` must be set; for more information on authentication to `aws`, see the [`aws` provider documentation](https://aws.stackql.io/providers/aws).
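
For example, the variables above could be exported in the shell before running `stackql-deploy` (a minimal sketch with placeholder values):

```bash
# placeholder values – replace with your own credentials
export AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"
export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
# optional – only needed when using temporary (STS) credentials
export AWS_SESSION_TOKEN="xxxxxxxx"
# note: this stack also uses the google provider, which typically reads a
# service account key from GOOGLE_CREDENTIALS – verify against the google provider docs
# export GOOGLE_CREDENTIALS=$(cat ./my-service-account-key.json)
```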

> __Note for macOS users__
> to install `stackql-deploy` in a virtual environment (which may be necessary on __macOS__), use the following:
> ```bash
> python3 -m venv myenv
> source myenv/bin/activate
> pip install stackql-deploy
> ```

## Usage

Adjust the values in the [__`stackql_manifest.yml`__](stackql_manifest.yml) file if desired. The [__`stackql_manifest.yml`__](stackql_manifest.yml) file contains resource configuration variables to support multiple deployment environments; these are used in the `stackql` queries in the `resources` folder.

The syntax for the `stackql-deploy` command is as follows:

```bash
stackql-deploy { build | test | teardown } { stack-directory } { deployment environment } [ optional flags ]
```

### Deploying a stack

For example, to deploy the stack to an environment labeled `sit`, run the following:

```bash
stackql-deploy build \
examples/aws/aws-stack sit \
-e AWS_REGION=ap-southeast-2
```
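
Since the manifest for this stack (shown later in this commit) also declares `GCP_REGION` and `MY_PROJECT_NAME` globals for the `google` provider, those values would presumably need to be supplied in the same way. A hedged sketch, assuming `-e` can be passed more than once and using placeholder GCP values:

```bash
# assumption: -e can be repeated; the GCP region and project values are placeholders
stackql-deploy build \
examples/aws/aws-stack sit \
-e AWS_REGION=ap-southeast-2 \
-e GCP_REGION=australia-southeast1 \
-e MY_PROJECT_NAME=my-gcp-project
```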

Use the `--dry-run` flag to view the queries to be run without actually running them, for example:

```bash
stackql-deploy build \
examples/aws/aws-stack sit \
-e AWS_REGION=ap-southeast-2 \
--dry-run
```

### Testing a stack

To test a stack to ensure that all resources are present and in the desired state, run the following (in our `sit` deployment example):

```bash
stackql-deploy test \
examples/aws/aws-stack sit \
-e AWS_REGION=ap-southeast-2
```

### Tearing down a stack

To destroy or deprovision all resources in a stack for our `sit` deployment example, run the following:

```bash
stackql-deploy teardown \
examples/aws/aws-stack sit \
-e AWS_REGION=ap-southeast-2
```
Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
/*+ exists */
SELECT COUNT(*) as count FROM google.storage.buckets
WHERE name = '{{ gcp_bucket_name }}'
AND project = '{{ gcp_project_id }}';

/*+ create */
INSERT INTO google.storage.buckets
(
  data__name,
  data__location,
  data__labels,
  project
)
SELECT
'{{ gcp_bucket_name }}',
'{{ gcp_region }}',
'{{ bucket_labels }}',
'{{ gcp_project_id }}';

/*+ statecheck, retries=3, retry_delay=1 */
SELECT COUNT(*) as count
FROM google.storage.buckets
WHERE name = '{{ gcp_bucket_name }}'
AND project = '{{ gcp_project_id }}'
AND LOWER(location) = LOWER('{{ gcp_region }}');

/*+ exports, retries=3, retry_delay=1 */
SELECT
projectNumber as "gcp_project_number",
name as "gcs_bucket_name",
id as "gcs_id",
labels as "gcs_labels"
FROM google.storage.buckets
WHERE LOWER(location) = LOWER('{{ gcp_region }}')
AND name = '{{ gcp_bucket_name }}'
AND project = '{{ gcp_project_id }}';

/*+ delete */
DELETE FROM google.storage.buckets
WHERE bucket = '{{ gcp_bucket_name }}';
Lines changed: 44 additions & 0 deletions
@@ -0,0 +1,44 @@
/*+ exists */
SELECT COUNT(*) as count FROM
(
SELECT bucket_name
FROM aws.s3.buckets
WHERE region = '{{ aws_region }}'
AND bucket_name = '{{ aws_bucket_name }}'
) t;

/*+ create */
INSERT INTO aws.s3.buckets (
  BucketName,
  Tags,
  region
)
SELECT
'{{ aws_bucket_name }}',
'{{ bucket_tags }}',
'{{ aws_region }}';

/*+ statecheck, retries=3, retry_delay=1 */
SELECT COUNT(*) as count FROM
(
SELECT bucket_name,
arn
FROM aws.s3.buckets
WHERE region = '{{ aws_region }}'
AND bucket_name = '{{ aws_bucket_name }}'
) t
WHERE arn = 'arn:aws:s3:::{{ aws_bucket_name }}';

/*+ exports, retries=3, retry_delay=1 */
SELECT aws_bucket_name, arn, aws_tags FROM
(
SELECT bucket_name as "aws_bucket_name", arn, tags as "aws_tags"
FROM aws.s3.buckets
WHERE region = '{{ aws_region }}'
AND bucket_name = '{{ aws_bucket_name }}'
) t;

/*+ delete */
DELETE FROM aws.s3.buckets
WHERE data__Identifier = '{{ aws_bucket_name }}'
AND region = '{{ aws_region }}';
Lines changed: 66 additions & 0 deletions
@@ -0,0 +1,66 @@
#
# aws + google (multi-cloud) stack manifest file, add and update values as needed
#
version: 1
name: "s3-gcs-bucket"
description: creates an aws s3 bucket and a google cloud storage bucket (multi-cloud stack)
providers:
  - aws
  - google
globals:
  # aws global values
  - name: aws_region
    description: aws region
    value: "{{ AWS_REGION }}"
  - name: global_tags
    value:
      - Key: Provisioner
        Value: stackql
      - Key: StackName
        Value: "{{ stack_name }}"
      - Key: StackEnv
        Value: "{{ stack_env }}"
      - Key: Provider
        Value: aws
  # google global values
  - name: gcp_region
    description: gcp region
    value: "{{ GCP_REGION }}"
  - name: gcp_project_id
    description: google project name
    value: "{{ MY_PROJECT_NAME }}"
  - name: global_labels
    value:
      provisioner: stackql
      stack_name: "{{ stack_name }}"
      stack_env: "{{ stack_env }}"
      provider: gcp
resources:
  - name: example_s3_bucket
    props:
      - name: aws_bucket_name
        value: "{{ stack_env }}-{{ stack_name }}"
      - name: bucket_tags
        value:
          - Key: Name
            Value: "{{ stack_env }}-{{ stack_name }}"
        merge:
          - global_tags
    exports:
      - aws_bucket_name
      - arn
      - aws_tags
  - name: example_gcs_bucket
    props:
      - name: gcp_bucket_name
        value: "{{ stack_env }}-{{ stack_name }}"
      - name: bucket_labels
        value:
          name: "{{ stack_env }}-{{ stack_name }}"
        merge:
          - global_labels
    exports:
      - gcp_project_number
      - gcs_bucket_name
      - gcs_id
      - gcs_labels
