Hacker News | Bolwin's comments

They're not, though. They're supporting Debian and Bazzite, which is Fedora-based, and they have worked with Fedora extensively. See https://frame.work/de/en/blog/framework-sponsorships

Third-party providers rarely support caching.

With caching, the expensive US models end up being only around 2x the price (e.g. Sonnet), and often much cheaper (e.g. GPT-5 mini).

If they start caching, then US companies will be completely priced out.


Ah yes, so the only ones left are who, Israel?

The health ministry is generally being quite conservative, and this counts only confirmed, properly documented deaths. More realistic estimates put the death toll at over 200K.


Lovely writing. Posts like these make me hopeful about tech again.

A little strange that, for someone who cares so much about privacy, you only post on Instagram and Substack.


That's bizarre, because I would expect the opposite. For reasoning you go step by step, and when you're done you quickly diffuse the answer.


Unification in logic programming isn't a forwards-only process, so there's no reason to expect deduction in an AI to proceed in a sort of procedural step by step fashion either. What ultimately matters is that all of the various deductions unify coherently in the end.
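Not from the thread, but a minimal sketch of what "unify coherently" means in practice. The term representation (strings starting with `?` as variables, tuples as compound terms) and the example facts are my own assumptions for illustration:

```python
def walk(term, subst):
    # Follow variable bindings until we reach a non-variable or an unbound variable.
    while isinstance(term, str) and term.startswith('?') and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst=None):
    # Return an extended substitution if a and b unify, else None.
    # Occurs check omitted for brevity.
    if subst is None:
        subst = {}
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith('?'):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# Bindings can be established in any order; what matters is that they
# all cohere in the final substitution.
s = unify(('parent', '?x', 'bob'), ('parent', 'alice', '?y'))
# s == {'?x': 'alice', '?y': 'bob'}
```

Note that neither variable is resolved "forwards" from the other; each binding falls out wherever the terms happen to meet, and the result is just a substitution in which everything agrees.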


Exactly.

If you add a "cheat" rule that lets you deduce anything from anything else, then replacing those cheat-rule applications with real subgoal proofs is denoising for Natural Deduction.


However, after step 4 you might notice that you made a mistake in step 2 and revise it. You might think in steps, but the state you're building is formed in a somewhat diffusion-like way.

Pretty sure CSS has a sin() function; that's half your work.
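For example, something like the following (a hypothetical sketch using CSS's sin() inside calc(); the `--i` custom property, class name, and values are assumptions, not from the thread):

```css
/* Offset each item along a sine wave; --i is assumed to be set
   per element, e.g. style="--i: 3" */
.item {
  transform: translateY(calc(sin(var(--i) * 30deg) * 40px));
}
```

sin(), cos(), and tan() are part of the CSS trigonometric functions supported in current browsers, so no JavaScript is needed for the waveform itself.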


Yes, it trades blows with GLM for the title of best open-source model.


Really? They must've switched recently, because that was around before Kimi came out.


Yes, this is recent. Before, it was one or more other models; I'm not sure which.


I use it extensively. For years WhatsApp had a lovely native Windows app, and now they're replacing it with this horrible bloated thing.


I don't speak any faster than I type, despite what the transcription companies claim


Most people don't, even amongst technical people; 150wpm is the typical speaking speed. For regular questions that don't involve precise syntax, as in maths and programming, speech would be faster. Though reading the output would be faster than hearing it spoken.

