Unfortunately, that is not currently possible. Databricks only lets a job cluster be reused by tasks within the same Databricks workflow, and a Dagster step configured with the databricks_pyspark_step_launcher maps to a single Databricks workflow containing a single task. Put differently, Databricks does not keep a job cluster alive between workflows, and every Dagster step configured with the databricks_pyspark_step_launcher is submitted to Databricks as its own workflow.
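
To make the one-workflow-per-step behaviour concrete, here is a rough sketch of how the launcher is usually wired up. The exact config keys (e.g. `run_config.cluster.new`, `local_dagster_job_package_path`, `staging_prefix`) are written from memory and may differ between dagster-databricks versions, so treat the schema as an assumption and check the package docs; the point is that every op carrying the launcher resource becomes its own single-task Databricks run with its own fresh job cluster.

```python
# Minimal sketch, assuming a recent dagster / dagster-databricks install.
# Config field names below are illustrative and may vary by package version.
from dagster import job, op
from dagster_databricks import databricks_pyspark_step_launcher

# Each op that requires this resource is submitted as its *own* one-task
# Databricks run, so "new" here means a new job cluster per step.
step_launcher = databricks_pyspark_step_launcher.configured(
    {
        "databricks_host": {"env": "DATABRICKS_HOST"},
        "databricks_token": {"env": "DATABRICKS_TOKEN"},
        "run_config": {
            "run_name": "dagster-step",
            "cluster": {
                "new": {
                    "size": {"num_workers": 2},
                    "spark_version": "11.3.x-scala2.12",
                    "nodes": {"node_types": {"node_type_id": "i3.xlarge"}},
                }
            },
        },
        # Assumed/illustrative values:
        "local_dagster_job_package_path": ".",
        "staging_prefix": "/dbfs/tmp/dagster-staging",
    }
)


@op(required_resource_keys={"pyspark_step_launcher"})
def extract(context):
    ...  # runs in its own Databricks workflow / job cluster


@op(required_resource_keys={"pyspark_step_launcher"})
def transform(context, upstream):
    ...  # runs in a *separate* Databricks workflow / job cluster


@job(resource_defs={"pyspark_step_launcher": step_launcher})
def my_job():
    transform(extract())
```

Because the two ops are launched as separate Databricks runs, there is no place in this model to say "reuse the cluster from the previous step".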

I too would like to be able to reuse job clusters across steps in a Dagster job, but it seems that the step launcher may not be the right abstraction. It's a bit difficult to imagine how one could …
