I would be delighted to beta test. I would want to work on my spoken Portuguese so I could interact more naturally with my Brazilian colleagues.
I study French, German, Swedish, Mandarin, Japanese, Portuguese, and Latin, and I'm dabbling in Polish, though it's hard to find shows dubbed in Latin :). Someday I'll get to Russian, and maybe learn Icelandic as a way of getting closer to the roots of English… but alas, life is not forever.
I’ve written some LLM-based software for generating podcasts (www.anyglot.com, though the server is currently offline). That project showed me that GPT-4 is excellent at generating content in English and at various NLP tasks, but not at translation, which was better left to Google. ElevenLabs voices are fantastic, but their Japanese would invent weird kanji readings, though that was when Multilingual V2 had just come out, so maybe they’ve fixed that already.
Not exactly. Most Docker containers rely on distro package managers: you're usually running apt or apk inside your Dockerfile. And the container's system rootfs needs to be laid out somehow. It's a non-trivial amount of work to do that and keep it up to date.
I am a fan of building a traditional native package in a multi-stage Docker build, so the final container stage can be as simple as `RUN dpkg -i my-pkg.deb`.
I find that targeting traditional system packages has benefits: 1) it's not really that hard, and 2) it forces you to lay things out consistently; at the very least, the distro's conventions are helpful.
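As a sketch of that multi-stage approach (the package name, version, and build commands here are hypothetical placeholders, assuming a standard Debian packaging setup):

```dockerfile
# Stage 1: build the .deb with the full toolchain
FROM debian:bookworm AS build
RUN apt-get update && apt-get install -y build-essential debhelper
COPY . /src
WORKDIR /src
# dpkg-buildpackage drops the .deb in the parent directory
RUN dpkg-buildpackage -us -uc -b

# Stage 2: the final image just installs the package
FROM debian:bookworm-slim
COPY --from=build /my-pkg_1.0_amd64.deb /tmp/
# apt-get can install a local .deb and resolve its dependencies,
# unlike bare `dpkg -i`
RUN apt-get update \
    && apt-get install -y /tmp/my-pkg_1.0_amd64.deb \
    && rm -rf /tmp/*.deb /var/lib/apt/lists/*
```

The build toolchain never reaches the final image, and the package's own dependency metadata, file layout, and maintainer scripts do the installation work.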
NOPE. Docker neatly encapsulates the problem and lets you ship a somewhat reproducible deployment... until something needs to be updated. Then you either rebuild your image (which may not be reproducible) or patch it (which comes with its own share of problems). Caching can also be a nightmare if your image is built from common stages. Dealing with vulnerabilities is also a pain, especially for things already in production.
Docker (and container images in general) are great, but they solve a limited set of problems well and tend to hide others.
No, I don't understand why this myth persists. Docker fetches tarballs, runs commands, and tars up directories. Often, Docker is used to run package management commands (e.g. `apt`, `dpkg`, `yum`, `cargo`, `mvn`, `nix`, `cabal`, `sbt`, `pip`, `npm`, `gradle`, `stack`, `guix`, etc.); the latter are the actual package managers.
Docker "solves" package management in the same way Bash scripts "solve" package management: you can use them to run actual package managers, but you probably shouldn't (Nix, for example, is better at creating Docker images than Docker is).
The question is a bit terse, so I'm inclined to say: no, the problems they solve are only mildly related. But maybe you have a specific thing in mind that Docker solves, so feel free to share what you're thinking so that someone (or I) can say something more useful about it!
It absolutely is? They wrote this piece, and context matters because it can shed light on many things, including hypocrisy. In this case, that's exactly what the commenter is pointing out.