Hi, we are currently building out our Dagster ECS solution and are looking for advice on how to manage multiple modules. Our pipelines are categorized by module, and some modules contain similarly named jobs, like run-all. When we try to get all modules to run by passing the -m flag for each module, we run into issues with pipeline names mismatching, among other problems. Some of the issues are hard to diagnose because there are no errors anywhere; we simply see only a few of the jobs available. One workaround we have seen is deploying each module as a separate code location, but with Dagster ECS that means deploying a separate container per module. Is there a way to avoid running a separate container for each module? The question was originally asked in Dagster Slack.
It is not currently possible to have a code location (or deployed container in ECS) that loads from multiple modules - each code location accepts a single Python file or Python module as its entry point. One thing you can do is create a single entry point module that loads the definitions from each of your other modules, then specify that entry point module in your workspace.yaml.
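A sketch of that single-entry-point pattern might look like the following. The module names (`my_project.module_a`, `my_project.module_b`) and the convention that each sub-module exposes a `Definitions` object named `defs` are assumptions for illustration, and this assumes a Dagster release where `Definitions.merge` is available:

```python
# my_project/all_definitions.py -- hypothetical single entry point module
from dagster import Definitions

# Assumption: each sub-module exposes its own Definitions object named `defs`.
from my_project.module_a import defs as module_a_defs
from my_project.module_b import defs as module_b_defs

# Merge every module's definitions into one Definitions object; Dagster
# loads this module as the code location's entry point.
defs = Definitions.merge(module_a_defs, module_b_defs)
```

The workspace.yaml for the single code location would then point at that entry point module:

```yaml
# workspace.yaml -- one code location, one container, loading all modules
load_from:
  - python_module: my_project.all_definitions
```

Note that merging does not make duplicate names coexist: if two modules each define a job named run-all, the merged definitions may still conflict, so such jobs would likely need to be renamed or prefixed per module.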