trabant00's comments

Yawn, another article that hand-picks success stories. What about the failures? Where's the graph of flying cars? Humanoid house-servant robots? 3D TVs? Decentralized crypto banking for everyone? Etc.

Anybody who tells you they can predict the future is shoveling shit into his mouth and then smiling brown-toothed at the audience. 10 years from now there's a real possibility of "AI" being remembered as that "stuff that almost got to a single 9 of reliability but stopped there".


> the real "value" being delivered by the commercial software providers is often the setup, support, and hand-holding provided to customers who pay the crazy amounts

That is also possible, and even usual, with open source. The difference is that you can choose the provider for each of those things, they can all be different, and you are not locked in.


> It feels like we have sometimes accreted an amalgam of these pithy takes based on very small, one off, studies (never replicated) that let us comfortably assemble an affirmation of our broader takes.

The patterns are there and are hard to deny. The reasoning and explanations in these types of books? Don't take them for granted: do your own research if anything is of particular interest, think for yourself, etc. The books can be of value without being 100% correct.


> “It seems that perfection is attained not when there is nothing more to add, but when there is nothing more to remove.”

If we go by the above, then the Sagrada Familia is far from perfect. I guess it depends on taste, but I found it extremely kitschy. The lighted signs inside made me think more of a bar than a church, and I found the actual Barcelona Cathedral beautiful. There's also a pretty heated debate over whether the present building is what Gaudi intended.


I'm in a similar minority: I simply don't understand Gaudi's visions. Touring another building he designed left me in a similar state of confusion. His work strikes me as kitschy and impractical. The trick lighting is genuinely cool, but it hits me the same way a Disney show does: a skillfully designed light show meant to temporarily overwhelm your senses. Like a Disney show, I don't feel moved by the beauty; it's more the sense that you just saw something cool with a lot of attention to detail. But unlike the Disney show (which isn't supposed to serve a purpose beyond entertainment), I leave a Gaudi building confused, because the tour guide kept touting how practical everything is and I just don't see it that way.


> Failure to prove dolus specialis

This one is sufficient for me. And I think classifying it as genocide is a big mistake if your goal is protecting the civilians in Gaza: an accusation that is easily proven wrong overshadows the fact that Israel could have taken things more slowly and carefully. Which I think (with little experience or knowledge) they could have, since the power difference between the sides is huge.


I was hoping for a review from a server perspective. That's where Debian shines, in my opinion. I feel like the desktop is a secondary priority for them. That's not a criticism; there's no other distribution I would use in production if it were my choice. On the desktop, though, they are a bit too stable. Even if one uses testing or unstable, the focus on long-term versions is still there.


Long-term usually equates to being a bit stale/out of date with distributions that only release every few years. That's appropriate for stuff you don't really care about.

That's why I use rolling-release distributions on my desktop. For Debian, people usually recommend testing, and that's fine. Maybe they should just call it Debian Rolling and rename stable to Debian LTS. I think that's more appropriate to how people actually use these things.

Manjaro is not without issues, but I've had it on one of my laptops for the last four years and it's nice to have the latest driver updates, kernels, etc. working together. It also helps that the community is focused on current versions of stuff and on fixing minor integration issues with released packages, rather than working around issues in some long-forgotten release with distribution-specific patches, etc. You find relatively little of that in Arch (which underlies Manjaro).

For production servers, the server just needs to boot my docker containers and get out of the way. IMHO there's no need to support >10K packages for god knows what there; most of that stuff probably has no business being installed on a server. I'm actually leaning towards immutable distributions and servers for that reason. Manually fiddling with servers in a production environment is something I'm trying to avoid/do less of. They shouldn't need a package manager if they are properly immutable.


> On the desktop though they are a bit too stable.

You're obviously correct here. But perhaps there are users who prefer stable packages on the desktop too. Corporate users, most likely (yes, there are such users too). It helps with their security strategy and gives them a development environment similar to their servers.

To be very honest, I think the stable, security-oriented approach is better than that of a rapid-update distro. You should probably use an overlay package manager like Flatpak, mise (for dev tools) or even Nix/Guix for anything modern. Preferably something with minimal installs and good sandboxing features. If anybody has better suggestions to offer, please let us know.
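For example, all three give you current tools on top of a stable base (package names here are just illustrations):

    # Flatpak, desktop apps from Flathub
    flatpak install flathub org.mozilla.firefox

    # mise, per-project or global dev tools
    mise use -g node@22

    # Nix (with flakes enabled), CLI tools
    nix profile install nixpkgs#ripgrep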


I'm such a user. I've been mostly running Debian stable since the '90s, at work and privately. I cheated when I got a new computer at the beginning of August this year and installed Trixie a couple of weeks before release.

My reasoning is quite simple: I really don't need the latest versions of everything. Were computers useful two years ago? Yeah? OK then, a computer is obviously still useful today with software that is two years old. I'll get the new software eventually, with most of the kinks ironed out, and I'll have had time to read up on the changes before they hit me in the face.

Sure, it was a bit painful with hardware support some twenty years ago or so, but I can barely remember the last time that was an issue.

For the very few select pieces of software where stable doesn't quite cut it, there are backports, fasttrack and other side channels.
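For example, pulling a single package from backports while everything else stays on stable (using trixie as the release name):

    # add the backports repo
    echo 'deb http://deb.debian.org/debian trixie-backports main' | \
        sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt update

    # install just this one package from backports
    sudo apt install -t trixie-backports <package>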


I prefer stable packages on my desktop and laptop, both for professional and for personal use. I hate the current Javascript/Python/Rust bleeding-edge, left-pad, update-to-yesterday's-latest-version-or-everything-breaks culture.

I like to build things which last. I like to craft a software system and then use it for decades, moving it from machine to machine and intentionally upgrading the components at my pace.


Same opinion. I tried Fedora and I really liked it, but the constant cache updating frustrated me quickly. I just want something that works, that I can update without doing more than running the command.


I use Debian Stable on my laptop and workstation. For most packages you don't need newer versions; I don't need the latest version of GNOME or Gedit or whatever.

I don't understand why people like the rigmarole of constantly updating their systems. The only things that come down the wire are security updates.

Installing newer software can be managed. I use the following strategy (rough commands are sketched after the list):

- Discord / Slack / <something that needs to be the newest>: I can normally use Flatpak.

- Third-party repos: for Brave, Node and some other things I use the vendor's repository.

- Open source stuff: smaller things that are easy to compile from source, e.g. vim / neovim, I just build myself so I have the newest versions.

- Python apps / NPM tooling: I install them in my local user directory.

- Docker: installed in rootless mode.
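Roughly, in commands (package names are examples; check each vendor's docs for the exact repo setup):

    # 1. Flatpak for chat apps
    flatpak install flathub com.discordapp.Discord

    # 2. vendor repo: add their key + apt source per their docs, then
    sudo apt install brave-browser

    # 3. build small tools from source into ~/.local
    git clone https://github.com/vim/vim.git
    cd vim && ./configure --prefix=$HOME/.local && make && make install

    # 4. Python / NPM tooling in the user directory
    pipx install httpie
    npm config set prefix ~/.local && npm install -g typescript

    # 5. rootless docker (needs the docker-ce-rootless-extras package)
    dockerd-rootless-setuptool.sh install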


> On the desktop though they are a bit too stable.

>> You're obviously correct here.

It's neither obvious nor correct; the expected trade-off between stability and features is completely subjective. I run Debian Stable on my desktop because I've almost never needed newer versions of anything, and when I did I could usually jump to testing (i.e. the upcoming release) rather than unstable. Even then, the next release usually wasn't that far away, so it was still very stable.

As other commenters have pointed out, you can run Debian Sid (unstable), but I'll also agree that if that is what you want long-term then maybe running something like Arch makes more sense anyway.


I'm one of those users, but only because I don't need to be on the bleeding edge.

The only problem I had on the Debian 11 desktop was related to the new openssl libraries: I could not install the latest nodes and rubies because 11 shipped older versions. However, there are workarounds based on setting some environment variables (from memory: some legacy_providers_*), so after a little googling I made them work on my dev machine (and on an old server belonging to a customer of mine). I'm installing Debian 13 these days, so no more workarounds for a few years.
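For anyone hitting the same thing: I don't remember the exact variables, but the usual OpenSSL 3 legacy-provider workaround looks roughly like this (file name and location are up to you):

    # legacy-openssl.cnf
    openssl_conf = openssl_init

    [openssl_init]
    providers = provider_sect

    [provider_sect]
    default = default_sect
    legacy = legacy_sect

    [default_sect]
    activate = 1

    [legacy_sect]
    activate = 1

Then run the affected tool with OPENSSL_CONF pointing at that file.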

Everything else worked fine. I don't install much on this machine: no flatpaks, no appimages, no snaps (I left Ubuntu because of them). Only debs and docker images. I install languages through their version managers, never through the OS: the OS could give me only one version of each, which is useless. Same for databases; there are hardly two projects on the same language and db version. I could be using LibreOffice and GIMP from 20 years ago: they already had all the features I need.


I use incus for my dev needs. But for work computers, I’ve mostly needed one version of everything.


In my experience, corporate users have moved on to using containers (or VMs) for their development environments.

It's a tricky thing to solve. On the one hand, you don't want your system to stop working due to an update, but you also want to keep the software you use updated, both in terms of security and functionality.

Mark Shuttleworth talked about this many years ago, before snaps were introduced as a solution to it. The idea at the time was that a rolling-release distro is too much of a hassle to maintain and even the 6-month cycle was getting to be too much. So he talked about having a stable core with a long release cycle and rolling releases for software that needs to be frequently updated, both desktop and server software. The idea was great, but the details of the execution left a bitter taste for many users.


Atomic distributions can be a nice solution for that. But the current portal ecosystem is a bit poor for integration between flatpaks.


Indeed, though with the tmpfs move (/tmp in RAM) it sounds like they have desktops more in mind.

You don't want to use RAM for tmp files you probably can't do capacity planning for, and you don't want to enable swap on a server either.


I honestly don't understand that change, as most desktops are RAM-limited as well, especially since Debian is regularly used on older machines which aren't supported by Windows 11 anymore.


Is it common for scripts to download multiple gigabytes to /tmp?

I sometimes manually changed /tmp to be in memory, or used /dev/shm, which is in memory by default. I haven't run into any problems just yet, but then again it's just a home server.
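For reference, putting /tmp in RAM manually is a one-line fstab entry (the size cap is whatever fits your RAM):

    # /etc/fstab
    tmpfs /tmp tmpfs defaults,size=2G,mode=1777 0 0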


Not sure about scripts, but I download and store everything I know I'll only need until the next reboot in /tmp, and naturally that tends to be quite a lot from time to time. That has worked fine for decades, so I'm not sure what the benefit of storing the contents of /tmp in memory is.
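If the new default bothers you, it should be easy to opt out; masking the systemd mount unit is the usual way on systemd distros (I believe this is also what the Debian 13 release notes suggest):

    sudo systemctl mask tmp.mount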


Now you can use /var/tmp I think.


I define AGI as the only real AI that can exist; anything less is just mimicry. To give a stupid/simplified example: a specialized truck-driving "AI" would not be able to decide whether or not to stop when somebody steps in front of it, making it trivial to rob "AI"-driven trucks. To decide, it needs to understand the kinds of people that exist, their motivations, the laws, etc. So it needs to be an AGI. Otherwise it will make horrible mistakes we don't even think are possible, or ones so uncommon that when they happen to a human they make the news.


> I don’t think there is a single key to intelligence but rather that, unfortunately for both the philosophers and dreamers, intelligence is a vast, complex collection of simpler processes.

I don't think intelligence can be separated from the physical body, world and the interactions between them. A human brain grown in a jar would not be intelligent. Even if you could somehow communicate with it. Human abstractions that stray too far from empirical experience are nothing but hallucinations.

Nor can intelligence be separated from general intelligence. A domain-specific "AI" will always have unacceptable shortcomings, for example a programming "AI" not being able to deal with the XY problem.

TLDR: I am betting on AI being at least a century away. And not being a sure thing even in a millennium.


Might not be worth much, but I just want to thank you for being willing to put in the work to make such discussions possible, even though clearly (wink) the vast majority of commenters don't want to have a discussion. I would have shut it down, writing it off as too much work for almost no result.

I don't even want to comment on-topic because I already know nobody will seriously consider my point of view, but just downvote and attack me.


If you watch TheQuantifiedScientist you must have found out by now that optical sensors on the wrist have no chance of ever being accurate enough for health and fitness tracking. No matter how much they massage their algorithms, they simply don't have the right sensors at the right positions on the body.

At the same time, the fitness features add cost, bulk and the uncomfortable sensor bump, and they cost battery life. The original Pebble didn't have any of that and in my opinion was better for it. I also see little point in competing with the numerous already-existing options for fitness tracking, even if you only look at the ones without a subscription.


Modern optical sensors are pretty dang accurate on Garmin watches.


I've watched it, and TheQuantifiedScientist is totally missing the point. Current optical sensors on the wrist are plenty accurate for general health and fitness tracking. If you don't believe me, you can literally count your pulse with your finger and compare against the watch: very close. Optical sensors aren't great for high-intensity training, so for those activities everyone knows you need a chest strap if you want accurate data.

For a more practical take on heart rate accuracy see the DC Rainmaker reviews instead.

