
I have an M4 MBP and I also think Apple is set up quite nicely to take real advantage of local models.

They already work on the most expensive Apple hardware. I expect that price to come down in the next few years.

It’s really just the UX that’s bad, but that’s solvable.

Apple also isn’t paying for each user’s power and usage. They sell the hardware once, and folks pay with their own electricity to run it.
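
For what "already works" looks like in practice, here's a minimal sketch of on-device inference via the mlx-lm package on Apple Silicon. The model name, prompt, and parameters below are illustrative assumptions, not a specific recommendation:

    # Minimal sketch: on-device inference on Apple Silicon with mlx-lm.
    # Assumes `pip install mlx-lm`; the model name below is an assumption
    # (any mlx-community quantized model should load the same way).
    from mlx_lm import load, generate

    # Weights download on first run; after that, everything stays local.
    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

    response = generate(
        model,
        tokenizer,
        prompt="Why does on-device inference cost the vendor nothing per query?",
        max_tokens=200,
    )
    print(response)

No server round-trip and no per-token billing: the compute and electricity are the user's, which is exactly the point above.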



Your comment made me realize there's also the benefit of not having to handle hardware depreciation: it's pushed to the customer. And now Apple has renewed arguments for selling better machines more often ("you had ChatGPT-3-like performance locally last year; now you can get ChatGPT-4-like performance if you buy the new model").

I know folks who still use old Apple laptops, 5+ years old, since they don't see the point in upgrading (and indeed, if you don't work in IT, play video games, or run other power-hungry workloads, I'm not sure it's worth it). New models with a performant local LLM built in might change that for the average user.


Lmao, it took Apple about a decade after everyone else to offer 16 GB of RAM as the default.

You won't be getting cheap Apple machines chock full of RAM any time soon, I can tell you that. It goes against Apple's entire pricing structure/money-making machine.
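
Rough weights-only arithmetic shows why RAM is the gatekeeper here (a sketch assuming 4-bit quantization; KV cache and runtime overhead come on top):

    # Back-of-envelope, weights-only memory estimate for local LLMs.
    # Assumes 4-bit quantization; real usage needs extra headroom.
    def weights_gb(params_billions: float, bits_per_weight: int) -> float:
        """Approximate size of the model weights in GB."""
        return params_billions * 1e9 * bits_per_weight / 8 / 1e9

    for params in (7, 13, 70):
        print(f"{params}B @ 4-bit ~ {weights_gb(params, 4):.1f} GB")
    # 7B  ~  3.5 GB -> comfortable on 16 GB
    # 13B ~  6.5 GB -> workable on 16 GB
    # 70B ~ 35.0 GB -> needs exactly the high-RAM machines Apple charges a premium for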



