Hacker News | hyperbovine's comments

On the flip side, if the rumored AI crash/depression/apocalypse does in fact materialize, those things will become super cheap.

but no one will have money to buy them anymore. :(

I don't understand why this comment is downvoted, it is undeniably true.

GNOME and KDE have stepped up with their design and user experience. I recommend you give them another try.

Often, when we don't understand something, asking questions helps us learn. Happy to answer any you might have, to help you understand.

Whereas on my laptop and my distro it works. And a lot of other people probably feel the same way. I use Linux at work and have never had issues with it in the last 6 years. Prior to that, yes.

Because for most of us, it's simply not true. It's at least as stable as macOS, if not more so.

The word "stable" literally does not appear in the comment to which I was responding.

Maybe I'm just scarred from laboring much too hard in the 90s and aughts to get desktop and laptop Linux working, but here is my current take:

- Yes, there is fragmentation. Perhaps there are not hundreds of Linux distros but, off the top of my head: Debian, Ubuntu, Mint, Fedora, RHEL, CentOS, Rocky, Alma, Arch, Manjaro, openSUSE, Kali, Pop!_OS, elementary OS, Zorin, Gentoo, Alpine, NixOS are all viable options. Next, pick a desktop: GNOME, KDE Plasma, Xfce, LXQt, Cinnamon, MATE, Budgie, Pantheon, Deepin, Enlightenment. Each has different UX conventions, configuration systems, and integration quality. There is no single Linux desktop, and it's bewildering.

- Power management now "works" in the sense that, when you close your laptop lid and re-open it, yay! the machine (mostly) comes back to life instead of just crashing. It took us at least 15 years to get to that point. However, PM does not work in the sense that battery life on my M4 MacBook Air is literally 2x what I would get from a comparably priced Linux laptop. Part of that is better hardware, but _a lot_ of that is better power management.

- Audio now mostly works without glitching, just like it did in OS X circa 2002. But God help you if you're not using a well-supported setup and find yourself manually having to dick around with kernel drivers, ALSA, PulseAudio. (Just typing these words gives me PTSD.) Here is a typical "solution" from *within the past year* for audio troubles in Linux: https://www.linux.org/threads/troubleshooting-audio-problems.... There are thousands more threads like this to be found online. For typical, 99%-of-the-time use cases, experiences of this sort are rarely if ever encountered on a Mac.

- Printing is arguably the closest because, as previously noted, they are both using the same underlying system (CUPS). But printing, thanks to AirPrint, is still smoother and more pain-free on Mac than on Linux.

- Don't even get me started on Bluetooth.

It's not that I'm anti-Linux; I wanted sooo bad for Linux on the desktop and laptop to succeed, for a variety of reasons. But Steve J came along 25-30 years ago and completely pulled that rug out from under us.


But hey, he also owns See's Candies.

I still don't understand why certain performance aspects of the CUDA platform are so poorly documented. Why is successfully pushing the hw to its performance envelope considered a novel research result? Shouldn't I be able to look this stuff up on the Nvidia website?


One reason is clearly the fast pace at which Nvidia is evolving the hardware. I would consider CUDA a very well-documented platform in general. What they lack is low-level tutorials, but this is where posts like this one can be a good resource.
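Posts like that usually start from a roofline-style sanity check: compare the throughput a kernel actually achieves against the theoretical peak you can compute from the spec sheet. A minimal sketch in Python (the clock and bus-width numbers below are illustrative, not taken from any particular Nvidia datasheet):

```python
def peak_memory_bandwidth_gbps(mem_clock_mhz, bus_width_bits, pumps=2):
    """Theoretical peak DRAM bandwidth in GB/s.

    pumps: data transfers per clock cycle (2 for plain DDR signaling;
    GDDR6X-class memory is often marketed with the multiplier already
    folded into an "effective" rate).
    """
    bytes_per_sec = mem_clock_mhz * 1e6 * pumps * (bus_width_bits / 8)
    return bytes_per_sec / 1e9

# Illustrative numbers only: 1250 MHz memory clock, 384-bit bus,
# 16 transfers per clock -> 960 GB/s theoretical peak.
print(peak_memory_bandwidth_gbps(1250, 384, pumps=16))  # 960.0
```

If your measured kernel bandwidth lands at, say, 70-80% of this figure, you are near the envelope; far below it, and the post-style detective work (coalescing, occupancy, cache behavior) begins.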


The bar is low.


> 1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.

No way that is true any more. Five years ago, maybe.

https://www.reddit.com/r/pcmasterrace/comments/1izlt9w/nvidi...


on the contrary. this is the place they can try out new tech, new cores, new drivers, new everything with very little risk. driver crash? the gamer will just restart their game. the AI workload will stall and cost a lot of money.

basically, the gaming segment is the beta-test ground for the datacenter segment. and you have beta testers eager to pay high prices!

we see the same in CPUs by the way, where the datacenter lineup of both intel and amd lags behind the consumer lineup. gives time to iron out bios, microcode and optimizations.


LOL of course they don't want to own Anthropic, else they themselves would be responsible for coming up with the $10s of billions in Monopoly money that Anthropic has committed to pay AMZN for compute in the next few years. Better to take an impressive looking stake and leave some other idiot holding the buck.


Isn't taking an impressive looking stake, in effect, leaving them holding the buck?


Now I’m no big city spreadsheet man but I bet you “company that owes us billions went belly up” looks better on the books than “company we bought that owes us billions went belly up.”


It’s pretty crazy that Amazon’s $8B investment didn’t even get them a board seat. It’s basically a lot of cloud credits though. I bet both Google and Amazon invested in Anthropic at least partially to stress test and harden their own AI / GPU offerings. They now have a good showcase.


Yeah. I bet there’s a win-win in the details where it gets to sound like a lot of investment for both parties to look good but really wasn’t actually much real risk.

Like if I offered you $8 billion in soft serve ice cream so long as you keep bringing birthday parties to my bowling alley. The moment the music stops and the parents want their children back, it’s not like I’m out $8 billion.


Echoes of Enron-style accounting "tricks"...


Why does everybody keep insisting on this "Enron accounting" stuff? LLM companies need shitloads of compute for a specialized use case. The cloud vendor wants to become a big player in selling compute for that specialized use case, and has compute available.

Cloud provider gives credit to LLM provider in exchange for a part of the company.

These are really normal business deals.


Amazon gave away datacenter time share in exchange for stock in a startup. That has nothing to do with electricity futures and private credit revolvers.


They said echoes of the tricks.


I've read thousands of pages of Enron history, this has nothing to do with it.


This is my thought too. They de-risked the choice of AWS as a platform for any other AI startup. If the hype continues, AWS will get their 30% margin on something growing like rocket emoji; if it doesn't, at least they didn't miss the boat.


No! Anthropic goes bankrupt or at least teeters and Amazon buys for pennies


this might be another possibility -- but why even bother to buy anthropic anyways, if not for the patents?

the talent will move out naturally -- amazon can just scoop up with its bucket (*not s3)


[flagged]


Why would they buy Anthropic when they already have access to all the tech and source code of Anthropic for free?

Not only the models but also training data, model architecture, documentation, weights and latest R&D experiments?

Take an instance -> Snapshot -> Investigate.

Unless they get caught it is not illegal.


GP is just commenting on the use of a mixed metaphor.


hell if passing the buck is the opposite of holding the bag then maybe we should mix em

maybe the full array of options is: pass the hot potato, hold the buck, or drop it like a bag.


Agree.


There's nothing to solve. The curse of dimensionality kills you no matter what. P=NP, or maybe quantum computing, is the only hope of making serious progress on large-scale combinatorial optimization.


Eh, those applications (incl. protein folding) existed for a decade-plus before LLMs came onto the scene, and there was absolutely nothing like the scale of capex that we're seeing right now. It's like literally 100-1000x larger than what GPU hosting providers were spending previously.

