---
title: Speculate
emoji: 💨
colorFrom: blue
colorTo: yellow
sdk: docker
pinned: true
license: gpl-3.0
short_description: A sirocco emulator for astrophysical spectra
---
An emulator for Sirocco, enabling faster inference of an observational spectrum's outflow parameters.
### Option A: Conda

This method automatically handles system dependencies (Git LFS) and NVIDIA drivers.
- Create the environment and fetch assets:

```bash
conda env create -f environment.yml
conda activate speculate_env
git lfs install && git lfs pull
```
### Option B: Standard Python (pip)
If you are not using Conda, you must point pip at the NVIDIA package index and handle LFS manually.
- Set up a virtual environment:

```bash
python -m venv speculate_env
source speculate_env/bin/activate
```
- Install Python libraries:

```bash
pip install --extra-index-url https://pypi.nvidia.com -r requirements.txt
```
- Fetch assets (images/videos). 🚨 Ensure you have git-lfs installed on your system first. You can verify this with:

```bash
git lfs version
```

  If the command does not return a version number, download the correct setup for your system. Once it does, run:

```bash
git lfs install && git lfs pull
```
LFS isn't critical for functionality; it is only used to load Speculate's image/video assets. If you can live without them (i.e. the interface just won't look very pretty), the conda/virtual environment with the packages installed from requirements.txt is all you need.
Simply follow the link https://huggingface.co/Sirocco-rt and click on the Speculate Space. Hint: the Hugging Face Space may go to sleep if unused for a while. If so, just restart the Space when prompted!
To get started, navigate to Speculate's root directory, activate the environment created during installation, and then simply run:

```bash
python run.py
```

Speculate doesn't require a GPU to run; however, for the larger models, CPU computation time may be prohibitive.
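If you're unsure whether a usable NVIDIA GPU is present before committing to one of the larger models, a quick check like the following works on most Linux systems (this is a generic check, not something Speculate itself provides):

```shell
# Report GPU name and memory if the NVIDIA driver is installed,
# otherwise note that computation will fall back to the CPU.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,memory.total --format=csv
else
    echo "No NVIDIA driver found; expect CPU-only (slower) runs."
fi
```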
This requires a couple of extra steps, as you will have to port-forward the interface to your local browser. Also, as good practice, DON'T run Speculate on login nodes!
First boot up an interactive compute node: TODO
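Once the interface is running on the compute node, the port forwarding can be sketched as below. Note that the port, hostnames, and username here are placeholders (assumptions, not Speculate defaults): substitute the port that `run.py` reports and your cluster's actual node and login names.

```shell
# Run from your LOCAL machine: tunnel the (assumed) interface port 8080
# on the compute node back to localhost:8080 via the cluster login node.
ssh -L 8080:compute-node:8080 username@cluster-login-node
# Then open http://localhost:8080 in your local browser.
```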
Check out marimo at https://github.com/marimo-team/marimo

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference