Why is slop assumed inevitable? These models are plagiarism and copyright-laundering machines. We need a great AI model reset whereby all published works are assumed to have opted out of training and companies pay to train on your data. We've seen what AI can do; now fund the creators.
Good luck, there are too many forces working against that.
Only big creative companies like Disney can play the game of making licensing agreements. And they are ok with it because it gives them an edge over smaller, less organized creators without a legal department.
I'd rather we democratize ownership [1]. Instead of taxing the owning class and being paid UBI peanuts, how about becoming the owning class and reaping the rewards directly?
> reflects the software engineering philosophies of the 1980s.
It has a microkernel architecture. That's already an improvement over the "modern" monolithic kernels we are stuck with today. Given Big Tech's interest in hardening security and sandboxing, you'd think this would get more attention.
True, but it's not exactly new. I remember Andrew Tanenbaum and Linus Torvalds' heated discussions in the early '90s :) Minix featured a microkernel before Linux existed.
Yeah, but we are still far from making it mainstream beyond some key use cases: QNX, INTEGRITY, language runtimes on top of type 1 hypervisors, kernel extension points being pushed into userspace across Apple, Google, and Microsoft offerings, the Nintendo Switch, ...
Given the tectonic shift in priorities for Linux kernel development over the past decade, I'm willing to bet that many key developers would be more open to a microkernel architecture now than they were ~25 years ago. CPUs now have hardware features (such as PCID-tagged TLBs) that reduce the overhead of MMU context switches, which removes a significant part of the cost of using isolated address spaces to contain code. The Meltdown and Spectre attacks really forced the security issue to the point where major performance costs to improve security became acceptable in a way they were not in the '90s or '00s.
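Not from the thread, but a minimal sketch of the cost being traded here: timing a bare user-to-kernel round trip on Linux, the boundary a microkernel crosses far more often via IPC, and the one that KPTI-style Meltdown mitigations made more expensive. The iteration count and the choice of SYS_getpid are arbitrary, purely for illustration.

    #define _GNU_SOURCE
    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>
    #include <sys/syscall.h>

    /* Time a bare user->kernel transition. SYS_getpid does almost no
       work, so the measurement is dominated by the privilege-boundary
       crossing itself (plus KPTI's address-space switch when that
       mitigation is enabled). */
    static long long nsec_now(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (long long)ts.tv_sec * 1000000000LL + ts.tv_nsec;
    }

    int main(void) {
        const int iters = 1000000;
        long long t0 = nsec_now();
        for (int i = 0; i < iters; i++)
            syscall(SYS_getpid);   /* forces a real syscall, no libc caching */
        long long t1 = nsec_now();
        printf("avg user->kernel round trip: %lld ns\n", (t1 - t0) / iters);
        return 0;
    }

Run it with and without mitigations (e.g. booting with mitigations=off) and the difference is exactly the kind of overhead the comment is talking about.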
I'm being pedantic, but on modern hardware, the ISA is an abstraction over microarchitecture and microcode. It's no longer a 1-to-1 representation of hardware execution. But, as programmers, it's as low as we can go, so the distinction is academic.
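A small illustration of the "microcode below the ISA" point (not part of the original comment, and specific to x86 Linux): /proc/cpuinfo reports the loaded microcode revision, i.e. an updatable layer sitting underneath the instruction set you actually program against.

    #include <stdio.h>
    #include <string.h>

    /* Print the "microcode" line from /proc/cpuinfo (x86 Linux only).
       Its existence shows there is updatable execution machinery below
       the ISA that the ISA itself does not describe. */
    int main(void) {
        FILE *f = fopen("/proc/cpuinfo", "r");
        if (!f) { perror("fopen"); return 1; }
        char line[256];
        while (fgets(line, sizeof line, f)) {
            if (strncmp(line, "microcode", 9) == 0) {
                fputs(line, stdout);   /* e.g. "microcode : 0xf4" */
                break;
            }
        }
        fclose(f);
        return 0;
    }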