This is great and similar to what I was thinking of doing at some point. I just wasn't sure if it needed to be specific to Sweep Local or if it could be a generic llama.cpp provider.


I was thinking about this too. Zed officially supports self-hosting Zeta, so one option would be to create a proxy that speaks the Zeta wire format but is backed by llama.cpp (or any model backend). In the proxy you could configure prompts, context, templates, etc., while still using a production build of Zed. I'll give it a shot if I have time.
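For a rough idea, the translation layer of such a proxy might look like the sketch below. The Zeta field names here are assumptions for illustration (the real wire format would have to be taken from Zed's source); the target is llama.cpp's OpenAI-compatible /v1/completions endpoint, which accepts a plain `prompt` plus sampling parameters:

```python
# Hypothetical sketch: map a Zeta-style edit-prediction request onto a
# payload for llama.cpp's OpenAI-compatible /v1/completions endpoint.
# Field names like "input_excerpt" / "input_events" are assumptions,
# not the actual Zeta wire format.

def zeta_to_completion(zeta_req: dict) -> dict:
    """Translate an assumed Zeta-style request into a completion payload."""
    excerpt = zeta_req.get("input_excerpt", "")   # code around the cursor
    events = zeta_req.get("input_events", "")     # recent edit history
    # This is where the proxy could apply a configurable prompt template.
    prompt = f"{events}\n{excerpt}"
    return {
        "prompt": prompt,
        "max_tokens": zeta_req.get("max_tokens", 256),
        "temperature": 0.0,  # deterministic edit predictions
    }

payload = zeta_to_completion(
    {"input_excerpt": "fn main() {", "input_events": "user typed fn"}
)
print(payload["prompt"])
```

The proxy itself would then just be an HTTP server that runs this translation on incoming requests, forwards the payload to the llama.cpp server, and converts the completion back into a Zeta-shaped response.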



