[WIP] Co-execution client for Google Cloud Platform Batch v1 #404


Status: Open — wants to merge 1 commit into base: master
16 changes: 16 additions & 0 deletions docs/containers.rst
@@ -132,6 +132,22 @@ GA4GH TES

GA4GH TES job execution with Conda dependencies for the tool and no message queue.

Google Cloud Platform Batch
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. figure:: pulsar_gcp_coexecution_deployment.plantuml.svg

   Google Cloud Platform Batch job execution with a biocontainer for the tool and no message queue.

.. figure:: pulsar_gcp_deployment.plantuml.svg

   Google Cloud Platform Batch job execution with Conda dependencies for the tool and no message queue.

Pulsar job destination options to configure these scenarios:

.. figure:: job_destination_parameters_gcp.png
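
The figure above renders the ``GcpJobParams`` model as an ERD; the actual field names live in ``pulsar.client.container_job_config``. As a loose illustration of how destination options might be handled as a plain mapping, here is a small sketch — every key below is a hypothetical stand-in, not the real ``GcpJobParams`` schema:

```python
# All keys here are hypothetical stand-ins for GcpJobParams fields;
# consult the generated ERD (job_destination_parameters_gcp.png) for
# the real schema.
def validate_destination(params):
    """Return any missing keys from a hypothetical required set."""
    required = {"project", "region"}
    return sorted(required - params.keys())


dest = {"project": "my-gcp-project", "region": "us-central1"}
print(validate_destination(dest))                      # -> []
print(validate_destination({"region": "us-central1"}))  # -> ['project']
```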


AWS Batch
~~~~~~~~~~

18 changes: 18 additions & 0 deletions docs/gen_erd_diagrams.py
@@ -0,0 +1,18 @@
import os
import sys

import erdantic as erd

sys.path.insert(1, os.path.abspath(os.path.join(os.path.dirname(__file__), os.pardir)))

from pulsar.client.container_job_config import (
GcpJobParams,
)

DOC_SOURCE_DIR = os.path.abspath(os.path.dirname(__file__))
class_to_diagram = {
GcpJobParams: "job_destination_parameters_gcp",
}

for clazz, diagram_name in class_to_diagram.items():
erd.draw(clazz, out=f"{DOC_SOURCE_DIR}/{diagram_name}.png")
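
The script above maps model classes to output filenames and renders one ERD per model via ``erd.draw``. A self-contained sketch of the same pattern, using a stand-in model (with hypothetical fields) and a stub renderer so it runs without ``erdantic`` or the Pulsar package on the path:

```python
from dataclasses import dataclass


# Stand-in for pulsar.client.container_job_config.GcpJobParams;
# the field names here are hypothetical.
@dataclass
class FakeGcpJobParams:
    project: str
    region: str


def draw_stub(model_class, out):
    # Stub for erd.draw(clazz, out=...): record what would be rendered
    # instead of writing a PNG to disk.
    return f"{model_class.__name__} -> {out}"


class_to_diagram = {FakeGcpJobParams: "job_destination_parameters_gcp"}

rendered = [
    draw_stub(clazz, out=f"docs/{name}.png")
    for clazz, name in class_to_diagram.items()
]
print(rendered)  # -> ['FakeGcpJobParams -> docs/job_destination_parameters_gcp.png']
```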
Binary file added docs/job_destination_parameters_gcp.png
170 changes: 170 additions & 0 deletions docs/pulsar_gcp_coexecution_deployment.plantuml.svg
48 changes: 48 additions & 0 deletions docs/pulsar_gcp_coexecution_deployment.plantuml.txt
@@ -0,0 +1,48 @@
@startuml

!include plantuml_options.txt

component galaxy as "galaxy" {

}

storage disk as "Object Store" {

}

note as disknote
Disk is unrestricted and does
not need to be shared between
Pulsar and Galaxy.
end note

disk ... disknote

cloud cluster as "Google Cloud Platform" {
queue api as "Google Batch v1 API" {

}

frame pod as "Job" {

component staging as "pulsar Task" {
}

component tool as "biocontainer Task" {
}

}

note as stagingnote
Pulsar Tasks run with a batch_v1.AllocationPolicy.Disk of type "local-ssd"
that is used for Pulsar's staging directory and shared across pulsar stagings and
tool/biocontainer containers.
end note
pod ... stagingnote
}

galaxy --> disk
galaxy --> api : create, delete, get (for status)
api -[dashed]-> pod : [manages]
staging --> galaxy : stage in and out
@enduml
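
The staging note in the diagram can be sketched as the Batch v1 job payload such a submission would carry: one job whose task runs two container runnables (the Pulsar staging task and the biocontainer tool task) sharing a volume backed by a ``local-ssd`` disk. Field names below follow the public Batch v1 REST schema; the image URIs, disk size, and mount path are illustrative assumptions, not values taken from the Pulsar client:

```python
# Illustrative JSON-style sketch of a Batch v1 job body; the concrete
# images, disk size, and mount path are assumptions.
def make_coexecution_job(staging_image, tool_image, mount_path="/pulsar_staging"):
    shared_volume = f"{mount_path}:{mount_path}"  # hostPath:containerPath
    return {
        "taskGroups": [{
            "taskSpec": {
                "runnables": [
                    # Pulsar staging task and the biocontainer tool task
                    # mount the same staging volume.
                    {"container": {"imageUri": staging_image, "volumes": [shared_volume]}},
                    {"container": {"imageUri": tool_image, "volumes": [shared_volume]}},
                ],
                "volumes": [{"deviceName": "pulsar-staging", "mountPath": mount_path}],
            },
        }],
        "allocationPolicy": {
            "instances": [{
                "policy": {
                    "disks": [{
                        "deviceName": "pulsar-staging",
                        # The "local-ssd" AllocationPolicy.Disk type from
                        # the note above.
                        "newDisk": {"type": "local-ssd", "sizeGb": 375},
                    }],
                },
            }],
        },
    }


job = make_coexecution_job("galaxy/pulsar:latest", "quay.io/biocontainers/samtools:1.19")
print(job["allocationPolicy"]["instances"][0]["policy"]["disks"][0]["newDisk"]["type"])
```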