Have you ever had a real conversation with a local LLM? Or even taken a VoIP (SIP) phone call with one? #576
mrs83 started this conversation in Show and tell
Check out Kurtis E1: A Fully On-Device MLX Voice Agent.
The entire stack runs on-device, leveraging MLX-LM on Apple Silicon:
ethicalabs/Kurtis-E1.1-Qwen2.5-3B-Instruct
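If you want to try the model locally, here is a minimal sketch using the `mlx-lm` Python API (assumes an Apple Silicon Mac with `mlx-lm` installed via pip; the prompt is just an example, and the weights download from Hugging Face on first run):

```python
# Minimal sketch: chat with Kurtis E1 locally via mlx-lm (Apple Silicon only).
# Assumes `pip install mlx-lm`; model weights are fetched on the first call.
from mlx_lm import load, generate

model, tokenizer = load("ethicalabs/Kurtis-E1.1-Qwen2.5-3B-Instruct")

# Format the conversation with the model's own chat template.
messages = [{"role": "user", "content": "I had a rough day. Can we talk?"}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

# Generation runs entirely on-device -- no network call after the download.
reply = generate(model, tokenizer, prompt=prompt, max_tokens=200)
print(reply)
```

After the initial download, everything (tokenization, inference, decoding) happens on the local GPU/ANE via MLX, which is what makes the offline phone-call scenario below possible.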
This showcases the power of local AI/ML. I'm also actively developing the SIP (#VoIP) integration, now in testing. The goal: let you pick up a phone call and talk directly with your private agent, even without a computer or an internet connection.
While Kurtis isn't built for math/coding, it shows a valuable path forward for on-device workflows.
We are actively looking for partners and clients to build out these POCs into real-world use cases.
https://www.ethicalabs.ai/ isn't a startup. We're not looking for VCs, equity deals, or grants: we're an open-source project.
If you like the R&D, you can support it directly: https://github.com/sponsors/ethicalabs-ai?frequency=one-time