egentway/llama_cpp_orin

llama-cpp-orin

Nix flake to run llama.cpp with CUDA acceleration on the Jetson Orin Nano.

To run the default server config:

nix run

To run llama-server in router mode:

nix run .#llama-server-router
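
The two commands above correspond to flake apps. As a rough illustration of how such a flake could be laid out (a minimal sketch, not the repository's actual flake: the `cudaSupport` override, ports, and script names are assumptions, and the router-mode arguments are deliberately left out):

```nix
{
  description = "llama.cpp with CUDA for Jetson Orin Nano (illustrative sketch)";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      system = "aarch64-linux";  # Jetson Orin Nano is an ARM64 platform
      pkgs = import nixpkgs {
        inherit system;
        config.allowUnfree = true;  # required for the CUDA toolkit
      };
      # Assumption: build nixpkgs' llama-cpp with CUDA enabled
      llama = pkgs.llama-cpp.override { cudaSupport = true; };
    in {
      apps.${system} = {
        # `nix run` picks up the default app
        default = {
          type = "app";
          program = toString (pkgs.writeShellScript "serve" ''
            exec ${llama}/bin/llama-server --host 0.0.0.0 --port 8080
          '');
        };
        # `nix run .#llama-server-router`
        llama-server-router = {
          type = "app";
          program = toString (pkgs.writeShellScript "serve-router" ''
            # router-mode arguments omitted here; consult `llama-server --help`
            exec ${llama}/bin/llama-server --host 0.0.0.0 --port 8080
          '');
        };
      };
    };
}
```

Pinning `system` to `aarch64-linux` reflects the Orin Nano target; building CUDA-enabled packages on the device itself can be slow, so a binary cache or cross-compilation setup is worth considering.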
