luminolabs/ml-experiments

lumino-scripts

All the Lumino Scripts

Inference script with command-line interface.

To run the inference script, run the following command from the repo root:

> python -m inference.llm.inference-lora-llm --original_model_path meta-llama/Llama-3.1-8B-Instruct --max_length 500

In the above command, original_model_path can be any model identifier or local path accepted by from_pretrained, as documented in the Hugging Face Transformers documentation.
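The script's actual argument handling isn't shown in this README, but the two flags used above suggest a standard argparse interface. The sketch below is a hypothetical reconstruction of how those flags might be parsed; the flag names come from the command above, while the parser itself is an assumption.

```python
import argparse

# Hypothetical sketch of the script's CLI parsing, inferred from the
# flags shown in the example command; not the actual implementation.
parser = argparse.ArgumentParser(description="LoRA LLM inference")
parser.add_argument(
    "--original_model_path", type=str, required=True,
    help="Any model identifier or local path accepted by from_pretrained",
)
parser.add_argument(
    "--max_length", type=int, default=500,
    help="Maximum generation length in tokens",
)

# Simulate the command line from the example above.
args = parser.parse_args(
    ["--original_model_path", "meta-llama/Llama-3.1-8B-Instruct",
     "--max_length", "500"]
)
print(args.original_model_path)
print(args.max_length)
```

Running the block prints the parsed model path and max length, mirroring the values passed on the example command line.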

Help for additional parameters is available via:

> python -m inference.llm.inference-lora-llm -h
