I think perhaps you are conflating user-friendly and user-focused.
Linux, and open source in general, is infinitely more user-focused than anything from Microsoft, since open source is often built for users and by users.
But if you don't have great computer skills already, Linux can be extremely un-friendly the moment you step off the beaten path.
I mean, unless you know the various arcane aspects of Windows, it's pretty hilariously un-friendly when you step off the path, too. After a decade of using Gnome exclusively, whenever a friend asks for help with Windows, all I can do is shrug and suggest reinstalling and/or living with the pain.
It's user-focused in the sense that the user's goals drive the design. The good non-profit distributions, such as Debian and Arch, would never even try to require or push an online account, since that is contrary to the user's interests.
Not disagreeing with you, but your comment brought back memories of Ubuntu One, and the Amazon spyware(?) search thing. Ubuntu is kind of the Windows of the GNU/Linux world in that they repeatedly do user-hostile things that test everyone's limits.
Yeah, I won't use Ubuntu if I can help it. I'd still rather use it than Windows, though. This is why I specifically said "The good non-profit distributions," and not "Linux distributions" or some other broader phrase.
I'm sure that's why they weren't included in the examples of "the good non-profit distributions". It's not like Ubuntu was simply overlooked. But yes, they are malicious.
The snap disaster really was the final nail in the coffin for me.
That bug report about ~/snap has to be the hottest bug in their bugtracker, and they simply don't seem to give a shit and pretend it's fine.
All the while naive users like my father or colleagues at my workplace shoot themselves in the foot by thinking "what's that folder doing in my home directory? Delete."
I'm not sure if that's still the case, but there was a time when that simply hosed your whole snap installation.
It's also completely ridiculous when you run "docker run ubuntu; apt install whatever" only to find out that "whatever" is now a snap and won't run w/o getting into nested containerization.
For packages that got the snap treatment, window tracking for the Gnome dash was broken for ages if, god forbid, you wanted to create a custom .desktop file to add some parameters. Completely broke the custom launchers I created.
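For context, the launchers in question were nothing exotic, just a desktop entry along these lines (contents illustrative, not copied from my actual setup). As far as I understand it, the dash associates a running window with an entry via its window class, and the snap builds reported a class that no longer matched, so custom entries like this one lost their window tracking:

    [Desktop Entry]
    Type=Application
    Name=Firefox (work profile)
    Exec=firefox -P work --new-window %u
    Icon=firefox
    # The dash matches windows to this entry by window class; with the snap
    # builds the reported class stopped lining up, so the icon never lit up.
    StartupWMClass=firefox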
I filed bug reports, I tried to work with them. Others did, too. Some of those reports are approaching ten years old now.
I am purging Ubuntu from all of my employer's systems and replacing it with Rocky Linux.
Only one major application still to go.
Friends and family get Debian, that transition is already completed.
I want to do the same, but there was some heavy discord at the top of the community a year or so ago that left me fearing for the org's future. If there was a satisfactory resolution, I haven't heard about it.
That's concerning to hear. What discord? The number one thing I want from Debian is predictability, dependability. Other than that, it's not even that great of a distro. I don't use it for my own machines.
Nobody forces you to use Ubuntu. That's the thing. If Ubuntu fucks up, I can switch to another distro in the blink of an eye and nothing of value was lost.
If the user is a Linux nerd, well, yes. For more casual users there are way too many weird annoyances and problems. Maybe not within a single version, but when migrating between versions or at the end of LTS support...
I beg to differ. There is less corporate BS on Linux than any mainstream OS.
The software is largely by users, for users.
Obviously it caters to the power user, but it also works well for complete novices. It's those already savvy with Windows/Mac who get screwed when switching. I'd encourage them to put a bit more effort into trying.
I wanted to validate whether there’s real market interest and if people would actually be willing to pay for a tool like this, since it solves a real problem.
The user can give example documents... have SBERT test them against Dewey Decimal classifications, or Library of Congress, plus categories for home users (bills, manuals, bank documents, etc.) and standard business categories (HR, production, and so on). The user verifies the categories.
Onboarding would ask the user about their work, research, and hobby interests. An LLM could generate word lists and ask the user whether they match their understanding. And so on.
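Roughly the matching step I have in mind, as a sketch (the model choice, category list, and confirmation flow are placeholders, and it assumes the sentence-transformers library):

    # Sketch only: embed candidate categories and example documents, suggest the
    # closest category for each document, and let the user confirm or correct it.
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder model choice

    # A mix of home and business categories; a real tool could also load
    # Dewey Decimal or Library of Congress class captions here.
    categories = [
        "bills and utilities", "manuals and warranties", "bank documents",
        "HR and payroll", "production and operations",
    ]
    cat_emb = model.encode(categories, convert_to_tensor=True)

    documents = {
        "scan_0042.txt": "Your electricity invoice for March is due on ...",
        "scan_0043.txt": "Employment contract between the company and ...",
    }

    for name, text in documents.items():
        doc_emb = model.encode(text, convert_to_tensor=True)
        scores = util.cos_sim(doc_emb, cat_emb)[0]  # similarity to every category
        best = int(scores.argmax())
        # Surface the guess instead of filing silently; the user verifies it.
        print(f"{name}: suggest '{categories[best]}' "
              f"(score {float(scores[best]):.2f}) - confirm? [y/n]")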
It's not clear to me why that is so. An LLM trained on IR (intermediate representation) for the purpose of compilation is not quite what we're looking for here, but it is in the same territory.
And, please, it isn't code, it is a Hardware Description Language. We all use "code" for short, but let's not lose sight of what it is.
Many people come to FPGA design treating the thing as software. It isn't software. It's a hardware design, and it is a hardware description language. Maybe it helps that my work in electrical engineering predates FPGAs and even PALs/PLAs. In other words, I spent years designing "raw" electronics.
When typing Verilog I think about circuits, not software, and I don't make a lot of mistakes, because the circuits are designed on paper before any code gets typed. The code is the hardware description, not the design environment.
I find that older hardware engineers are far better at this. Younger engineers treat it like software and go into this crazy type->debug->type->debug cycle that simply isn't the way you design hardware. Decades ago you had to know your shit. You couldn't just throw a bunch of chips at a board and then respin it over simple mistakes. Again, it ain't software.
So, no, I have no issues with Verilog debugging. I can't remember any serious debugging events in, say, twenty years.
What you are describing is just one aspect of programming with an HDL, and it sounds similar to what one would do with a schematic editor. Fortunately, with an HDL you can work at a primitive level, at a more abstract behavioral level, or anywhere in between. What if someone wants to design a library without knowing the specific device that will be targeted? You could use behavioral, algorithmic-style code with parameterized functions and lots of generate statements that could support multiple architectures. In that case there would be a lot of "code" that has nothing to do with the actual circuit but is still perfectly valid HDL. It is the software-like functionality of HDL that made schematic editors obsolete, in my opinion.
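To make that concrete, here is a toy sketch (a made-up module, not from any real design): the width and input count are parameters, the summation is written behaviorally, and a generate branch selects a registered or combinational output at elaboration time, with no device-specific primitives anywhere:

    // Toy example: parameterized, architecture-agnostic behavioral code.
    module vector_sum #(
        parameter integer WIDTH     = 8,
        parameter integer INPUTS    = 4,
        parameter         PIPELINED = 1
    ) (
        input  wire                            clk,
        input  wire [INPUTS*WIDTH-1:0]         data_in,  // flattened input bus
        output reg  [WIDTH+$clog2(INPUTS)-1:0] sum
    );
        integer i;
        reg [WIDTH+$clog2(INPUTS)-1:0] acc;

        // Behavioral summation; no structural primitives, no target device.
        always @* begin
            acc = 0;
            for (i = 0; i < INPUTS; i = i + 1)
                acc = acc + data_in[i*WIDTH +: WIDTH];
        end

        // The generate statement picks an implementation variant at elaboration.
        generate
            if (PIPELINED) begin : g_reg
                always @(posedge clk) sum <= acc;  // registered output
            end else begin : g_comb
                always @* sum = acc;               // combinational output
            end
        endgenerate
    endmodule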
Same with the "did nuclear fusion in their bedroom" crowd. Both (planes and fusion) do demonstrate a certain level of capability, but the steps to take are already known; doing it takes initiative, the ability to understand the steps, and the money to pay for it.