niutech's comments | Hacker News

Llamafile (https://llamafile.ai) can easily exceed 4 GB because it embeds the LLM weights in the executable. But remember that Windows cannot run executable files larger than 4 GB.

> I doubt it’ll be radically different from other implementations of AI + Browser in such a way that it stands above the others.

Mozilla distinguishes itself by letting you use any AI in its chatbot, including open-source, private, local LLMs, not only a proprietary one like Gemini in Chrome or Copilot in Edge. That is more privacy-friendly.


Why not generate good old RTF files instead of PDF? They are much simpler and support more than the ASCII character set.
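As a rough sketch (file name and text are made up, not from any particular report), emitting RTF by hand in C is just a few prints, and non-ASCII characters fit via \uN? escapes:

    /* minimal_rtf.c - a sketch of generating a tiny RTF document by hand.
     * File name and contents are illustrative. */
    #include <stdio.h>

    int main(void) {
        FILE *f = fopen("report.rtf", "w");
        if (!f) return 1;

        /* RTF is plain text: a header, a font table, then escaped content. */
        fputs("{\\rtf1\\ansi\\deff0"
              "{\\fonttbl{\\f0 Times New Roman;}}\n", f);

        /* Non-ASCII characters use \uN? escapes: N is the decimal code point,
         * '?' is the fallback glyph for very old readers. Below: U+0144, U+015B. */
        fputs("\\f0\\fs24 Zako\\u324?czono pomy\\u347?lnie.\\par\n", f);

        fputs("}\n", f);
        fclose(f);
        return 0;
    }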

How about comparing with Duktape (https://duktape.org)?
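For reference, embedding Duktape is roughly this much code (a sketch assuming the stock duktape.c/duktape.h from the distribution, compiled with something like cc duk_hello.c duktape.c -lm):

    /* duk_hello.c - minimal Duktape embedding sketch. */
    #include <stdio.h>
    #include "duktape.h"

    int main(void) {
        /* Create a default heap: no custom allocators, no fatal handler. */
        duk_context *ctx = duk_create_heap_default();
        if (!ctx) return 1;

        /* Evaluate a script and read the result from the value stack. */
        duk_eval_string(ctx, "1 + 2 * 3");
        printf("result = %d\n", (int) duk_get_int(ctx, -1));

        duk_destroy_heap(ctx);
        return 0;
    }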

There are still Espruino JS devices.

And also Duktape (https://duktape.org)

Tauri also targets the lightweight Servo engine, so you don't need to bundle a bloated CEF.


Why not use Flatpak instead of AppImage, then?


But the Win32 API is stable: I can run Windows 95 apps on Windows 11!
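As an illustration (my own minimal sketch, not an official sample), this kind of program has compiled and run unchanged since the Windows 95 days:

    /* hello_win32.c - the same source builds against a 1990s SDK and
     * today's, and the resulting .exe still runs on Windows 11. */
    #include <windows.h>

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                       LPSTR lpCmdLine, int nCmdShow) {
        /* MessageBoxA has kept the same signature for the whole Win32 era. */
        MessageBoxA(NULL, "Hello from a 1995-style Win32 program!",
                    "Win32 API", MB_OK | MB_ICONINFORMATION);
        return 0;
    }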


There are even more lightweight alternatives to Electron than Tauri: NodeGui, Neutralino.js, Sciter, DeskGap, and more: https://github.com/sudhakar3697/awesome-electron-alternative...

