Only a matter of time before using coding agents with local LLMs is a viable alternative.


I’m quite happy with my offline AI solution:

https://news.ycombinator.com/item?id=45845049


Why does it need to work only on a Mac? And why is that better than running gpt-oss with llama.cpp and Codex on my Linux box?
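
(For context, llama.cpp's llama-server exposes an OpenAI-compatible API, so a locally served gpt-oss model can be queried from any OpenAI-style client or agent roughly like this; the port and model name below are placeholders, not from this thread:)

  # Minimal sketch: query a gpt-oss model served locally by llama-server.
  # Assumes llama-server is already running on port 8080 with a gpt-oss GGUF loaded.
  from openai import OpenAI

  client = OpenAI(
      base_url="http://localhost:8080/v1",  # llama-server's OpenAI-compatible endpoint
      api_key="not-needed",                 # the local server ignores the key
  )

  resp = client.chat.completions.create(
      model="gpt-oss-20b",  # placeholder; use whatever model the server loaded
      messages=[{"role": "user", "content": "Write a function that reverses a string."}],
  )
  print(resp.choices[0].message.content)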


Our model is bigger and more capable than gpt-oss and can run at full context at 40 tokens/s.

We are rolling out on Mac to start, with plans to release Windows and Linux versions within 3 months.


"Join the Waitlist"


Mac only :(


We will have Windows and Linux early next year! We're just starting with Mac for the first beta testers.



