I just learned about the whole homelab thing a week ago; it's a much deeper rabbit hole than I expected. In fact, I'm planning to set up Proxmox for the first time today and retire my Ubuntu Server setup running on a NUC that's served me well for the last couple of years.
I hadn't heard about Mealie yet, but it sounds like a great one to install.
I've set up half a dozen different home labs over the years but never used anywhere near the compute or disk capacity I had. It was more about learning things, I guess. I laughed when he mentioned the number of cores he has available.
I used to have a large server serving a couple important things.
I was able to put everything on a fanless Zotac box with a 2.5" SATA SSD, and it has served me well for many years (using quite a bit less electricity, too, even online 24/7).
My PBS server has two backup targets: a local external drive and Backblaze B2. I snapshot to the local drive frequently throughout the day and to B2 once in the evening.
Yeah, I don't back up my media zpool at all. It can all be replaced quite easily; not worth paying for the backup storage.
In my scenario, PBS runs in a VM on my Synology. The Synology does automated backups to Backblaze B2 daily, which averages about $5/TB in storage costs. I only back up the critical stuff I don't want to lose.
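For what it's worth, a nightly B2 push can be as simple as a scheduled rclone sync. Here's a rough sketch: the remote name, bucket, and paths below are made-up examples, and it assumes you've already configured an rclone remote for B2. (Synology's own Hyper Backup can also target B2 directly through a UI, which may be what's doing the work here.)

```
# crontab fragment - hypothetical names throughout:
# "b2" is an rclone remote configured for Backblaze B2,
# /volume1/backup is the local backup folder, "my-backups" the bucket.
# Runs once each evening at 22:00.
0 22 * * * rclone sync /volume1/backup b2:my-backups/nas --fast-list
```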
If you want to go down another, related rabbit hole, check out the DataHoarder subreddit. But don't blame me if you end up buying terabytes of storage over the next few months :)
Data hoarding is a bit more involved than just a homelab. You don't want your data hoard to go down or go missing while you're labbing new tech and protocols.
I can vouch for Mealie. My wife and I run it locally for family recipes and to pull down recipes from websites. I have a DNS ad blocker running, but most recipe sites are still a mess to navigate on mobile.
You can also distill recipes down. A lot of good recipes online include a lot of hand-holding within the steps, which I can just eliminate.
As others have said, Mealie is an excellent app for any homelab. My wife and I use the meal planning feature and connect it to our Home Assistant calendar that is displayed on a wall-mounted tablet. The ingredient parsing update is amazing and being able to scale recipes up/down is such a time saver.
I've had a ton of fun with CasaOS in the past few months. I don't mind managing docker-compose files by hand, but CasaOS comes with a simple UI and an "App Store" that make the process really easy, without overcomplicating things when you want to customize something about a container.
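For comparison, the hand-rolled route for something like Mealie is still just a small compose file. A minimal sketch follows; the image tag, host port, and data path are examples and should be checked against Mealie's current install docs:

```
# docker-compose.yml sketch - host port and data path are arbitrary examples
services:
  mealie:
    image: ghcr.io/mealie-recipes/mealie:latest
    restart: unless-stopped
    ports:
      - "9925:9000"   # Mealie listens on 9000 inside the container
    volumes:
      - ./mealie-data:/app/data
```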
I have Proxmox running on top of a clean Debian install on my NUC. I wanted to let Plex use hardware decoding, and it got a bit fiddly trying to do that with Plex running in a VM, so Plex runs on the host and I use VMs for everything else.
The only downside is that you essentially lock the GPU to one VM, which there's nothing wrong with doing. With LXC, at least, you can share the device across multiple containers.
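For reference, sharing an Intel iGPU into an LXC container on Proxmox usually comes down to a couple of lines in the container's config file (/etc/pve/lxc/<vmid>.conf). A sketch, assuming the render devices show up under /dev/dri on the host:

```
# /etc/pve/lxc/<vmid>.conf - allow the DRI character devices (major 226)
# and bind-mount /dev/dri into the container
lxc.cgroup2.devices.allow: c 226:* rwm
lxc.mount.entry: /dev/dri dev/dri none bind,optional,create=dir
```

Because it's a bind mount rather than PCI passthrough, several containers can use the same device this way.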
I have an Intel (12th Gen i5-12450H) mini-PC and at first had issues getting the GPU firmware loaded and working in Debian 12. However, upgrading to Debian 13 (trixie) and running apt update and upgrade resolved the issue, and I was able to pass the onboard Intel GPU through Docker to a Jellyfin container just fine. I believe the issue is related to older Linux kernels and GPU firmware compatibility. Perhaps that's your issue too.
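In case it helps anyone else, passing the iGPU into a Jellyfin container generally only takes exposing /dev/dri. A compose-style sketch with example paths (afterwards, enable Intel QSV/VA-API under Jellyfin's playback settings):

```
# docker-compose.yml sketch - volume paths are examples
services:
  jellyfin:
    image: jellyfin/jellyfin:latest
    devices:
      - /dev/dri:/dev/dri   # pass the Intel render nodes through
    ports:
      - "8096:8096"
    volumes:
      - ./jellyfin-config:/config
      - /mnt/media:/media
```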
I've had the same experience - I spent 6 months last year really digging into Rust and came to the conclusion that for the software I'm writing it's trying to save me from problems that I just don't run into enough to make it worth it.
I ended up jumping over to Zig and have been really enjoying it. I ported the same hobby 2D game engine project from C++ to Rust, and then over to Zig. A simple tile map loader and renderer took me about a week to implement in Rust and 3 hours in Zig. The difference was a single memory bug that took 15 minutes to figure out.
I dual-booted Windows on my desktop and laptop for a few years and also noticed lots of weird issues: reduced battery life on my laptop, sleep/hibernate being broken, GRUB occasionally just dying on me. I eventually got rid of Windows altogether and now just run Manjaro. I was surprised that the suspend issues and poor battery life on my laptop, for instance, completely went away.
The main thing that kept me on Windows for years was games, but once I jumped into using Proton via Steam on Linux (and now the tweaked Proton GE build), I can run almost all of my game library at full speed. The few games I can't play are due to anti-cheat software like BattlEye.
Fastmail was knocked offline by a couple of DDoS attacks recently. Both of them impacted my ability to access Fastmail; I suppose you just didn't happen to try to access your account during those attacks.
Nice work. FPGA design appears to be very similar to GPU shader programming. First time I've read anything about FPGA design that connected. Usually FPGA stories get lost in data flow jargon and I learn nothing.
You'll notice I didn't apply the term 'programming' to FPGA. Reading the post I noted this was a likely hang up among FPGA designers and carefully employed the preferred jargon. I imagine this sensitivity is the product of much frustration with forever being conflated with mere programmers. Must be awful.
Then no programming exists at all. When writing C code you are describing a program that runs on the C abstract machine. The same holds for all "programming" languages.
Sorry, I'm not ready for a philosophical discussion. We can take the definition from Wikipedia: https://en.m.wikipedia.org/wiki/Computer_programming Programming involves code execution on a computer. There is no computer in an FPGA.
Technically, an FPGA is a piece of memory. The functionality of the device depends on how the bits in that memory are set, and the size of this memory is constant. The bitstream format is not public; brave hackers are working hard to reverse engineer it. You can make a CPU in an FPGA, but performance-wise there's no way to do the opposite. A complex simulation of even a couple of 4K-resolution frames takes days.
Edit: the people here are decent enough to start a discussion.
There is a term "variable program" in your link. When you add peripherals to the chip on the printed circuit board, it loses flexibility very fast. The whole system is built for a very specific task. But yes, you've convinced me that an FPGA might be treated as a computer in an extreme case.
An FPGA accepts "variable programs". What you are talking about are peripherals. A CPU with certain peripherals can also be completely inflexible. That's completely outside the scope of what a CPU or an FPGA is, though.
Thanks! I too think GPU programming is quite similar, in that you need to think of things in more of a data-streaming sense. It's sort of functional that way too: building up pipelines of transforms.
The IceStick is also nice and a bit cheaper at ~$25, but it has a smaller 1k-logic-element iCE40 FPGA on it, whereas the TinyFPGA-BX has a larger 8k-logic-element part.
I agree about using C++ for actual IP block implementation. My experience has been pretty mixed, mostly because the tools (Intel HLS in my case) don't always give you a great idea of which constructs cause you to generate inefficient HDL.
For example, passing a variable by reference in one context cost me an extra 10% in logic blocks, and in another it lowered usage by 10%. It became a bit of a shotgun approach to optimising.
One does not pass a variable in an HDL design ;-). Trying to graft software principles onto FPGAs wastes so much performance. Become one with the underlying hardware and map your problem onto it, not onto an intermediate software-like representation. As another comment mentioned, become one with the clock and your design will fly.
I find this to be true for a lot of applications. Sometimes it seems like FPGAs are hammers looking for a nail.
Where they can shine is when you need some odd combination of peripherals attached to a microcontroller: think of something like a part with four UARTs or multiple separate I2C buses.
Anywhere you need a lot of parallel processing that you can guarantee won't be interrupted, like a video processing pipeline, is also a good fit.
Really excited to see this out. I've been learning Elixir and Phoenix with the 1.3rc and have really been enjoying it!
I'm a fan of Contexts myself as that is typically how I architect apps on mobile as well. I like having everything separated more explicitly and testable individually and Contexts seem to promote that in a really nice way.