
I don't really want robot weapons but I would love to see a firing range demonstration with one, just to see what's possible when you get super-human aiming.

I don't see a future for humanoid bots in war. Missiles and drones seem to render anything else redundant. Policing in a few more generations seems possible, at least at a technical level.


If you can't tell who the mark is, it's you.


In most cases the server code is home-rolled, especially for realtime networked games. Handling multiplayer at scale is hard to solve in-house, especially for popular titles, and that's usually handled by third parties such as Multiplay (https://multiplay.com/about/)


C++ fills a niche, and if you use it all on your own you probably don't even see the problems. But if you use it at scale with multiple developers, you can see it's layers of leaky abstraction laid down since the 80s that can never really be cleaned up, because too much is built on that foundation.

This slightly over 100 page book on move semantics https://leanpub.com/cppmove shows some of the iceberg-like complexities dotted all over the place. Some developers probably never use and have never even heard of move semantics.
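
To give a flavour (a toy sketch of my own, not taken from the book, and Message is just a made-up type), even a minimal movable type drags in questions about noexcept and about what a moved-from object is allowed to look like:

    #include <iostream>
    #include <string>
    #include <utility>

    struct Message {
        std::string body;

        explicit Message(std::string b) : body(std::move(b)) {}

        // noexcept matters: without it, containers such as std::vector will
        // prefer copying over moving when they reallocate.
        Message(Message&& other) noexcept : body(std::move(other.body)) {}

        // Declaring a move constructor suppresses the implicitly generated
        // copy operations and move assignment, so they have to be brought
        // back explicitly if you still want them.
        Message(const Message&) = default;
        Message& operator=(const Message&) = default;
        Message& operator=(Message&&) noexcept = default;
    };

    int main() {
        Message a("hello");
        Message b(std::move(a));              // a is now "valid but unspecified"
        std::cout << b.body << '\n';          // prints "hello"
        std::cout << a.body.size() << '\n';   // legal to call, but the value is anyone's guess
    }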

I've been programming in C++ pretty much my entire career. For personal or small projects I'd use Ruby or Python first. I'd only ever use C++ (or a very specific subset of it) if I want to make something extremely performant.

C++ will eventually peak and die off, but the timeframe is probably measured in decades, and I'm sure there'll be places where it holds on even then, like Fortran today. I certainly wouldn't recommend it to a programmer starting out today; instead, C or Rust would be better places to get an idea of system-level programming.


> This slightly over 100 page book on move semantics https://leanpub.com/cppmove shows some of the iceberg-like complexities dotted all over the place. Some developers probably never use and have never even heard of move semantics.

I think Scott Meyers' "Effective Modern C++", which is in large part a collection of caveats and descriptions of things that don't always fit together, also underlines that impression:

https://www.amazon.com/Effective-Modern-Specific-Ways-Improv...


Hard agree. I was enthusiastic about reading that book when I got it, as I'd been very late to the bandwagon of new C++11 and C++14 features.

I ended the book on a pessimistic note. It read more like a cookbook of pitfalls _everywhere_.

Every new feature looked exciting, but came with a list of cases where the language decides to leave you on your own when it gets too uncomfortable (the "well that's undefined behaviour, I'm sorry, you suck! bye!" escape hatch). So in the end it's like you said: a new list of gotchas to learn by heart.
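
A toy example of the kind of escape hatch I mean (my own, not from the book): it compiles cleanly, looks reasonable, and is still undefined behaviour.

    #include <iostream>
    #include <string>

    // Returns a reference to whichever argument is shorter.
    const std::string& shorter(const std::string& a, const std::string& b) {
        return a.size() < b.size() ? a : b;
    }

    int main() {
        // Both arguments are temporaries, so they die at the end of this full
        // expression. Lifetime extension does not reach through the function
        // return, so `s` dangles immediately.
        const std::string& s = shorter(std::string("hi"), std::string("there"));
        std::cout << s << '\n';   // undefined behaviour: may "work", crash, or print garbage
    }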


I think in Scott Meyers' case it's that he basically never used C++ in production, which in some sense is OK, since he makes his living as a language expert, but sometimes this shines through. Also, from a D conference talk I remember, even he got sick of it and switched to D.

Also, let me use this comment to recommend John Lakos, his talks, his books. He knows how to build large-scale applications, and if you saw his talk on the 'is_working_day' function (if I remember correctly) you know you were not overthinking.


Thanks for the recommendation, I'll have a look at his work, for sure.

Also I'd be curious to see that talk you mention, do you remember anything about it that could help in searching for it?


I think it's the same talk but a different recording: https://www.youtube.com/watch?v=MZUJsCh1wOY The example I had in mind is from around 1:08:30 onwards (maybe even before).


For me, that disillusion has been one of several steps.

Scott Meyers book was one point.

Then I started to learn Clojure, motivated by the idea that we need better concepts for the upcoming massively parallel hardware (a big influence for me was the article "The free lunch is over" by Herb Sutter: http://www.gotw.ca/publications/concurrency-ddj.htm). I now think that immutability by default is clearly the better way to go, even in close-to-the-machine applications like embedded devices and industrial control applications.
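
To illustrate what I mean, even in C++ terms (a toy sketch of my own, not from the article; Config is made up): data that is immutable after construction can be shared across threads without any locking discipline at all. Compile with -pthread; the point is just that no mutex appears anywhere.

    #include <iostream>
    #include <memory>
    #include <string>
    #include <thread>
    #include <vector>

    // Immutable-after-construction configuration object.
    struct Config {
        const std::string name;
        const int retries;
    };

    int main() {
        // Built once, never mutated afterwards; shared_ptr<const Config>
        // makes that promise explicit in the type.
        auto cfg = std::make_shared<const Config>(Config{"prod", 3});

        std::vector<std::thread> workers;
        for (int i = 0; i < 4; ++i) {
            workers.emplace_back([cfg, i] {
                // Concurrent reads of immutable data need no mutex.
                std::cout << "worker " << i << ": " << cfg->name
                          << ", retries=" << cfg->retries << '\n';
            });
        }
        for (auto& w : workers) w.join();
    }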

Then, I saw this article on the different options and syntax for initializing variables in modern C++: http://mikelui.io/2019/01/03/seriously-bonkers.html
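
Just to give a flavour of the kind of thing that article walks through (a quick sketch of my own, not copied from it):

    #include <vector>

    int main() {
        // A handful of the ways to "initialize an int with 1": all legal, not all equivalent.
        int a = 1;      // copy initialization
        int b(1);       // direct initialization
        int c{1};       // direct list initialization (narrowing would be an error)
        int d = {1};    // copy list initialization
        auto e = 1;     // int
        auto f{1};      // int today (was std::initializer_list<int> under the original C++11 rules)
        auto g = {1};   // std::initializer_list<int>, still

        // And braces can silently change what a constructor call means:
        std::vector<int> v1(3, 7);   // three elements, all of them 7
        std::vector<int> v2{3, 7};   // two elements: 3 and 7

        (void)a; (void)b; (void)c; (void)d; (void)e; (void)f; (void)g;
        (void)v1; (void)v2;
    }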

And I was like, no, this can't be serious. I think this was the point where I began to distance myself from the language (though I still use it at work when I need to).

I also got the impression that actual language use in C++ is undergoing a serious split. Compare the C++ Core Guidelines with Google's C++ style guide:

https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines

https://google.github.io/styleguide/cppguide.html

These already look somewhat like different languages, except that they can be compiled with the same compiler.

And then came the C++17 and C++20 standard iterations; at this point it's just "this is too much! What is this good for?"


My entry into professional C++ programming was accompanied by Scott's earlier books. The moment I gradually began to move away from the language as my main tool coincides (coincidence?) with his retirement.


> if you use it all on your own you don't even see the problems

100x this. If >90% of your serious coding is in one language, you'll become desensitized to its bad parts but you'll be hyper sensitive to anything that seems inconvenient in any alternative. I've seen this over the years with Fortran and C coders. C++ coders do it to C all the time. Lisp coders are notorious for it. It's Selective Perception 101, counting the hits and ignoring the misses. We're all prone to it, and have to correct for it.

If you really want to compare any two languages, you have to give both a more than casual try, and keep an open mind throughout. Otherwise your conclusions are likely to be unhelpful at best.


One can even think of long compile times as a feature (more time for coffee breaks), or that nightly builds actually sound cool.


FORTRAN is still quite popular in many fields of scientific computing and C++ hasn't become ubiquitously mainstream so there's probably quite a lot of potential here.


> C++ will eventually peak and die off, but the timeframe is probably measured in decades, and I'm sure there'll be places where it holds on even then, like Fortran today. I certainly wouldn't recommend it to a programmer starting out today; instead, C or Rust would be better places to get an idea of system-level programming.

Unlikely. The only candidates for replacement are Rust and D, and neither is anywhere near the adoption of C++.

What C++ can do is get better at being a language and then maybe give developers tools to phase out outdated constructs or relics from C. It is not going away, because it's the best replacement for C humanity will ever get, at least for the next century.


> Unlikely. The only candidates for replacement are Rust and D, and neither is anywhere near the adoption of C++.

Neither C nor C++ has replaced Fortran in that sense, either. Yet, as far as I can tell, Fortran peaked a long time ago.


This is the best way to use twitter.

You get a clear stream of useful, relevant content. Unfortunately this also gets framed as a "filter bubble".

I want a filter bubble.

The problem in practice is that twitter actively fights you doing this. It tries to pour raw sewage into your nice clean stream and you can't stop it. You see what other people have liked, or what's popular or lots of other things I don't want to see.

At this point in time the biggest value of Twitter is its userbase, its critical mass; people are on Twitter, they're not on Mastodon.


> The problem in practice is that twitter actively fights you doing this. It tries to pour raw sewage into your nice clean stream and you can't stop it. You see what other people have liked, or what's popular or lots of other things I don't want to see.

Even with all the barriers in place, the truth is that Twitter still works better through a third party application.

Your feed comes through relatively unmolested, with just the things you want to follow in the order you want to follow them.


> It tries to pour raw sewage into your nice clean stream and you can't stop it. You see what other people have liked, or what's popular or lots of other things I don't want to see.

Not if you use Tweetdeck.

https://tweetdeck.twitter.com/


> You see what other people have liked, or what's popular or lots of other things I don't want to see.

If you use a third-party app, you won't get most of that. You'll just get a reverse-chronological timeline of people's tweets and retweets, and I think some allow you to filter out retweets.


> ... Unfortunately this also gets framed as a "filter bubble".
> I want a filter bubble.

I think describing a "topic-focused, outrage-free" curation strategy as 'filter bubble' is somewhat asinine.

"Filter bubble" is a dirty term because it suggests at best an isolated, distorted view of how things actually are, at worst active exclusion of valid dissent. The point is that important details aren't being noticed in the domain of what has attention.

I understand a lot of outrage comes from righteous minds who want to make the world a better place (or prevent it from becoming a worse place). All the same, just excluding outrage feels more like it's (at worst) ignoring 'allegedly important, unrelated things', rather than ignoring 'important, related things'.


Relevant from my old comment here:

I have thought about this before. I believe the internet lacks enough echo chambers. In the real world, people automatically segregate based on their income bracket, community, values, lifestyle, job, family (kids or not?), healthcare, accessibility to various things, etc. People have all sorts of stereotypes and things they expect other people to conform to based on visible factors. This isn't possible online. There is too little information, and our stereotypes will never be correct (the numbers are too big). It shocks people. In real life: oh, that's a Catholic person, of course I would expect them to say this. Too much difference in opinions leads to defensiveness rather than acceptance. You cannot accept 180 degrees, but you can accept 10 degrees slightly left.

https://news.ycombinator.com/item?id=23937973


> "topic-focused, outrage-free" curation strategy

In the old days this work was done by an editor and published in a periodical fashion. My recurring quip about social media is that it’s helping us collectively realize the value of good editors.


Try using the Lists feature of Twitter. It's still free from Twitter's other crap. Besides, it helps you organize the accounts you follow and doesn't show you any ads (at least this was the case until recently).


You can more or less remove the likes from your feed by using 'Not interested in this Tweet', then 'Show less likes from XXX'. Twitter will show those again in a few days, but at least your feed is cleaner for a few days. (I'm only browsing Twitter in the browser, so I don't know how that works in the app.)

Or maybe using lists


Twitter is a filter bubble in its entirety.


For me, Windows 10 is where I felt like I was no longer in control of my own computer.

Microsoft doesn't do discrete OS updates anymore, but I definitely feel like it might be time to give Linux another go next time I upgrade my hardware. There's less and less software that holds me to MS. This is partly due to so many programs going to the web (the Google Docs suite, dictionaries, etc.) and partly due to better Linux support for software I use, such as Unity, or games via Wine. What's held me back until now has been poor graphics driver support.


IMO Linux still has pretty poor graphics drivers, and the whole Wayland thing is in a weird spot right now too. My primary use for my PC is as a work machine, for which Linux is very well suited. I also game on my PC. For that, I have tried limiting myself to native Linux games, Wine, direct-assigned graphics with a Windows VM, and plain old dual booting. In the end I always end up dual booting. Gaming graphics cards don't work well under Linux. However, I've had great luck with low-end workstation graphics cards, especially those running the new AMDGPU driver.

If you are an expert, there has never been a better time to move to Linux. If you are a novice, there still hasn't been a better time, but it's also still not for everyone.


> IMO Linux still has pretty poor graphics drivers

The situation is like this: https://preview.redd.it/79011affulj11.png?width=800&format=p...

nVidia: I've been gaming on nVidia proprietary drivers without any issues, though the most demanding games were CS:GO and The Talos Principle.

AMD: At work I'm on a workstation with a roughly two-year-old Radeon and the AMDGPU driver (mainline). No issues so far.

Intel: I'm using it on my laptop. No issues so far.

What poor graphics drivers do you have in mind?


I mean you linked the meme? That pretty well sums things up. Like I said in my original comment, workstation graphics cards tend to do great, as do older graphics cards.


I've used a mix of old and new graphics cards and I have never experienced driver issues related to how recent the graphics card was. The few issues I had were always with the driver itself. And I've never used a workstation-grade graphics card, always consumer ones.


Not to get too off-topic, but Vulkan performance on Linux routinely beats native Windows performance. https://www.protondb.com/

https://www.phoronix.com/scan.php?page=article&item=dota2-ma...

At present there are about 6500 working games on Linux that take advantage of advanced 3D graphics. These include the latest Doom, Fallout 76, and Borderlands 3. I'm currently running a Radeon RX 470 and playing everything from Aragami to P.A.M.E.L.A. and Wolfenstein New Colossus at 60fps or higher.


Yeah, performance certainly has come a long way. The bigger issues these days are stability and compatibility. Both nouveau and nVidia proprietary are pretty crashy (and not just for me). nVidia proprietary isn't great with wayland. Catalyst is pretty bad. AMDGPU is great, but I haven't really pushed it beyond a low end workstation card. Intel is fine, but intel doesn't offer high end graphics.


> the whole Wayland thing is in a weird spot right now too

You can still use X though; speaking from experience, Nvidia drivers don't work at all on Wayland.

Furthermore, this is a great resource which will certainly be useful to the OP:

https://www.forbes.com/sites/jasonevangelho/2019/03/15/linux...


I actually bought a second graphics card for use with linux because I was sick of deciding between slow (nouveau) and crashy (nv proprietary). I have a WX2100 and it has been completely rock solid. I can't use X11 without some as-yet-to-be-discovered workaround because it won't give up on using my nvidia card, even though I have that card attached to vfio-pci. Wayland is solid though.


> IMO Linux still has pretty poor graphics drivers.

IMO Intel is generally very well supported with the open-source kernel driver and amdgpu is getting there. This feels like a trope that always gets repeated but never updated, kind of like "you need CLI to use Linux".

> the whole Wayland thing is in a weird spot right now too

The whole Wayland thing is already much smoother for many than X11 ever was, and it actually takes security seriously. I've been running GNOME on Wayland full time since 2018, and by this point there's nothing I miss from X11. Any specifics?


> there's nothing I miss from X11, any specifics?

There are plenty of applications that still depend on Xwayland, and the nvidia proprietary driver can't accelerate these. There are some other weird stability issues related to the nvidia driver, as of mid last year at least. Wayland is missing a bunch of features from X, which bothers some people. I personally miss network transparency from X. That said, I do run Wayland full time, and have for a few years now. I just can't recommend it without reservations.

I can't comment on catalyst, or AMDGPU on current generation AMD gaming graphics cards. What I can say is that you can pretty much blindly assume that whatever gpu you want to buy will work well under Windows.

> IMO Intel is generally very well supported

I would add AMDGPU to that list as well. Still, AMDGPU is new, and Intel doesn't offer a high end graphics card. nVidia proprietary causes problems, such as those it causes for wayland. I am very happy with my AMDGPU compatible WX2100 for linux, but it is certainly not a good gaming GPU.


Sure, nvidia is problematic, hence Linus's now-famous rant, but that's to be expected; they don't cooperate at all.

I understand it may be the only option for high-end gaming, but in that case one probably doesn't have switchable graphics, which causes the most issues, so X/Xwayland is the one compromise one has to make.

This however does not mean that Linux graphics is a mess in general. I have both Intel and amdgpu machines that work out of the box, no problem. And this never gets asked in relation to macOS, because everyone "knows" nvidia just wasn't a thing there in recent years. So perhaps we should assume nvidia just isn't a thing here as well, unless you're willing to put up with crap?

So does Linux, and Linux on the desktop, work for everybody? No. Does it work as well for a high-end gaming machine as Windows? No. Does Windows work as the most productive frontend/backend/systems OS for programming out there? No. Does Linux? Yeah.

So while they both have their strengths and weaknesses, Linux works perfectly well for a lot of use-cases, the graphics driver situation included.

As for Wayland not being network-transparent: the fact that it worked in X the way it did was in fact a massive hack and a security hole, and while sometimes convenient, I personally almost never needed GUI over SSH, apart from playing around, so I'd take a proper security architecture, like Wayland has, over network transparency.

This is also not a feature supported by Windows/macOS, so not really a point against Linux in my book.

The early complaints, like screenshots etc. re Wayland were all resolved some time ago too.


> X/Xwayland is the one compromise one has to make.

No you don't. You could use Windows.

> So perhaps we should assume nvidia just isn't a thing as well.

Sure, that is what I do. How do you explain that to a novice that is just trying to run linux on their computer for the first time?

Look, I use linux full time for most tasks, both at home and at work. Professionally I am a kernel engineer. I am very pro linux, and pro new linux users. I think that we do linux a disservice by pretending that there aren't issues. The simple fact is that the linux graphics experience is not seamless like it is on other operating systems. We shouldn't pretend that it is just because experts can make it work.


> No you don't. You could use Windows.

Of course, provided you don't mind the ads, the telemetry, the updates that delete your files [1], this is assuming you're actually interested in Linux.

> How do you explain that to a novice that is just trying to run linux

That perhaps you do look for Linux HW before purchasing and do not install Linux on any old crap?

Look, I could hand somebody a Chromebook and ask them to try how well Windows would run on it. The experience wouldn't be great. Same on a Talos II or some other obviously problematic HW for Windows where Linux fares much better, but it's a pointless game.

Most people don't run into this situation because BestBuy laptops come preinstalled with Windows. If they came with Linux, there'd be a reverse situation.

But with macOS people somehow don't assume they can run it on a random POS HW and indeed know that it's best to get one designed to run macOS.

Why is the standard Linux is evaluated against always different?

Just get a Linux-compatible machine and you're good, that's it. I know as I've done it many times now.

It's not about pretending there aren't issues. But I've realized we're being held back by always trying to snatch this mythical 'new user' with his BestBuy laptop, or someone with knowingly problematic HW (unless nVidia cooperates, there's little we could do), and Windows will continue to come preinstalled on the majority of PCs, so a mass exodus isn't happening anytime soon.

What we could do, instead of being this "honest" and just reinforcing the idea that Linux doesn't work, we could be even more frank and just say avoid this vendor, but this and this one is fine.

Reinforcing the idea that Linux is crap because of some shitty corporation that refuses to support it gives too much importance and power to said corporation when they don't deserve it.

1 - https://www.tomsguide.com/news/massive-windows-10-fail-new-u...


> Gaming graphics cards don't work well under Linux

They work fine, haven't had a problem with them for years, using nvidia cards. Most distros just ship the proprietary drivers now, and those just work.

There just aren't masses of native games. That said, Steam now has a built in version of wine called "Proton" that's worth checking out.


I actually did, after I installed Windows 8, and never installed Windows 10. Don't get me wrong, Linux IS a mess: not the kernel, but the whole userspace is a catastrophe, from systemd to NetworkManager. I never liked it, but I started to use it, not because it became better but because Windows became so much worse. I would prefer FreeBSD, as on my servers, but as most development is going on on Linux it doesn't stand a chance. Which is a pity.


I have never really understood the "Linux is a mess" argument. While I do agree that it is a mess, it is not as though Windows is any better. The big difference is that the Linux development process is open and the end user is more likely to tinker so a lot of criticism bubbles to the surface. Contrast that to Windows, where there is very little point in discussing how things work under the hood. You may have to know enough about it for setup, maintenance, and troubleshooting. That is about all.

The arguments also ignore some of the great things about Linux. One of the things that I marvel at is how easy it is to setup and maintain a Linux system. Things like software typically being installed and updated through a single source is a tremendous time saver. While many device drivers are lacking specialized features, most hardware will work out of the box. There are also fewer concerns about what is happening behind the scenes. Walkthroughs like this one are fairly common in the Windows world: disable advertising, delay updates, block telemetry, improve performance.

None of this is meant to imply Linux is good and Windows is bad. Both platforms have their benefits and drawbacks. Neither is ideal for everyone. It's all about which trade-offs you're willing to accept.


Just listen to this. I bet you will recognize the speaker.

https://youtu.be/5PmHRSeA2c8?t=474

It is 100% to the point and nothing has changed in the 6 years since (and not only on Debian), except that the mess is being solved by Docker. Which is crazy. And Windows is heading in the same direction (when I saw the manifests, the first thing I said was... omg... .dll.1.4.3).

Just additional info to help you get into my mindset: systems-level developer for Linux AND Windows for 25 years.

(anyway, I have thrown tomatoes in the direction of the two largest fanboy camps, guess what will happen? =/ )


As someone who uses all 3 OSes to various degrees, the whole "mess" thing just strikes me as something that's supposed to be said about Linux, mic drop, as if that in itself meant something.

Linux is often installed on shoddy hardware that Windows no longer runs acceptably on, and then people complain that it runs shoddily compared to the new MacBook/Dell XPS they just purchased.

This very topic shows that Windows is a mess too; there have now been multiple updates that deleted user data as well. Why is that not a mess? Or why is so much more attention paid to the Linux mess? macOS allowed users to log in without a password, leaked encrypted HDD secrets, borked curl & SSH, borked user machines with an update, throttled user machines, etc., and that's just off the top of my head. And that's bearing in mind that Apple controls almost all the variables.


Well, to be fair, these user space components are often judged as catastrophes from the point of view of Linux/UNIX users/admins. In other words, Windows components for the same tasks were never scrutinized at this level. Windows users often used to scoff at me when I'm troubleshooting problems with my Linux boxes. But the fact is, I can troubleshoot and in most cases fix problems with systemd/NetworkManager, whereas when corresponding problems arise on Windows boxes there's really not much you can lean on, because the inner workings of these components are not public, so 95% of Google searches result in total crap, and the remaining 5% reveal that the design of these components isn't at all that perfect. (The NetworkManager equivalent in Windows is a big pile of steaming crap.)

So yeah, I don't like the design of systemd. But hell, compared to dealing with Windows, it's precious.


> I don't like the design of systemd

I feel like many who level this charge have unfortunately never really looked into its design or compared it realistically to what came before. If I'm comparing the ability to write a systemd service declaratively, and have it work with the same set of commands across all major distros, against shoddy-quality bash scripts that vary across distros just slightly enough to annoy and that rely on a patchwork of PID files that is very easy to mess up, I'm taking systemd any day of the week.


I did all that and I like unit files and all the rest that you mention. But there are little things here and there that suck.


Oh sure. But that's the case with all software I can think of. And there was a lot more of these things in SysVinit than systemd.

I am not saying it's perfect. I am just saying it is generally better than what came before.


"For me, Windows 10 is where I felt like I was no longer in control of my computer."

If Microsoft cannot stop Windows computers from becoming controlled by a botnet, the logical solution is to create their own. Well, that is what it looks like.

Big tech companies will tell users that they will "protect" them from this, that and the other thing (including protecting users from themselves), but they cannot protect users from the company. Obviously, showing ads demonstrates that the user (or rather what Microsoft knows about her) is now a "product". Nevertheless, it is fair to ask why any user would prefer to control their own computer 100% instead of letting Microsoft control it for them. It is possible that, outside of HN commenters and voters, most Windows users are ambivalent.

I can remember when Windows had no way to access the internet, only local area networks. It was much faster. Why not temporarily bring down the interface to the internet gateway when it is not needed, or just delete the default gateway from the routing table? For example, if one is only using a program that does not need internet access, e.g., a word processor, spreadsheet, dictionary, calculator, etc., there is no need to have a connection to the internet. A script could be used to automate this procedure. This way the user has some control over when Microsoft can install updates, send telemetry data, etc., i.e., when the company can "take control". When the user is done with her offline work or game playing, she can bring up the interface or re-add the default gateway and let Microsoft have control.


> I can remember when Windows had no way to access the internet, only local area networks. It was much faster.

> When the user is done with her offline work or game playing, she can bring up the interface or re-add the default gateway and let Microsoft have control.

By that reasoning, deleting the System32 folder is also an option, and should make your computer 10 times faster /s


Windows has always been like that. Want to shutdown? Oh I'm sorry windows I didn't realise you planned on doing an update! How about I pull the power cord out and we discuss this again later? Computers are our slaves, I don't want to know what is most convenient for them.


Installing updates during a shutdown or restart is a recent phenomenon in the lifetime of the Windows NT family. Windows XP post-SP2, if memory serves, attempted to offer-up updates via the Explorer shutdown/restart/logoff dialog. Windows 8 (again, if memory serves) was the first version that was more "militant" about forcing update installation. Windows 7 was the last of the client-oriented versions of Windows that permitted you to (easily) defer updates indefinitely.


Windows 7 regularly wanted to force me to update on shutdown. I remember it well, because at the time, I was doing my first startup and had a train to catch in the evening, so I regularly had to hit the power because the train wasn't going to wait for Windows to update. It infuriated me so much.


Windows 7 update installation was easy to bypass, however, and you could still get a clean shutdown. You could use the command-line shutdown command to restart or shut down without installing updates. In Windows 8 and following they put that idiocy into logonUI.exe (which runs outside your logon session) to force update installation.


Well, if there was a way to bypass it, I never found it. Often when I shut down, it gave me the "windows is updating, please wait and don't turn your computer off" message with no options.


It's the reason I developed the muscle memory for Windows-key / R / shutdown -r -t 1 -f / Enter.


Shift-click and it would shut down without loading them.


A few years too late. Pity it wasn't discoverable, because I never knew about it. I'm happily running anything but windows nowadays. (MacOS for work, Linux at home)


Crazy feature. Many times you are just shutting down to dash out the door, but no, windows wants to amend my schedule.


Input to computers used to be treated as commands. You would command computers to do things and they did them. Now, our input is treated either as unreliable ("Are you sure?" dialogs), treated more like a suggestion ("close app" vs "force close"), misinterpreted (oh you didn't want to run update when you issued a shutdown command?) or ignored entirely. It's gotten me irrationally angry.

I remember getting angry when I first had to ask the computer "pretty please, could you shut down," and having that possibly fail, rather than just throwing the power switch.

I remember getting angry when I first had to ask the computer "pretty please, could you eject the disk," with a non-zero chance that no, the computer is busy and you can't have your disk back.

I remember when undelete was an option, and delete was delete.

Now everything is a request. Everything second-guesses the user. Every button and switch is soft button that puts "software with lots of excuses" between you and what you want to command the computer to do. Every software iteration puts the user further and further into the backseat as a passenger rather than where they should be as the driver.


For me, OS X has always been worse about that. Want to shutdown? Better get ready to manually close every single application, because if any of them disagree then that shutdown ain't happening.


Windows does that sometimes too though. Is there no “This program isn’t responding. Do you want to force kill it?” like thing?


There technically is, but the dialog takes too long to pop up, and sometimes never does for some inexplicable reason.


Linux: sudo shutdown -h now

It does my bidding.


I've always typed poweroff. I guess that's because of my time at Sun, since that is the fastest way to shutdown a Solaris machine.

I was always under the impression that shutdown -h ran more shutdown scripts?


I was about to say—this is how I shut down macOS when it doesn't feel like cooperating.

It would of course be nice if the GUI just consistently did what it's supposed to, but oh well...


This also works in osx, and it’s much faster than the gui based shutdown.


> Want to shutdown? Oh I'm sorry windows I didn't realise you planned on doing an update!

Also recently "shut down" became "hibernate without fully taking the filesystem offline". Because clearly, that's what I meant. Not actually shut down.


I wish I had a tenner for every time I swore I disabled the last power-sucking feature only to find my ThinkPad battery was again dead after being shut down for three days.


Don't forget that Ubuntu is moving to the same OUR_USERS_ARE_IDIOTS model. Obligatory reminder: https://forum.snapcraft.io/t/disabling-automatic-refresh-for... Avoid Ubuntu at all costs or you're just going to step from the frying pan into the fire.


Debian is a great distro that works, does what it is told.


Flatpaks work perfectly fine on Ubuntu, just remove the snaps and you'll be fine.

```flatpak permission-set flatpak updates $APPID no```


Which is why I wish people would stop recommending Ubuntu as "the" Linux distro.


at this point Windows 10 LTSC will be the last version of Windows I use


They are trying to kill the idea of local apps and exe's, aka they can finally turn "files" into property by using active directory and NTFS for the entire internet. You all seem clueless as to the last 20 years of software theft in the game industry.

Everyone 20 years ago on Slashdot was worried about software and hardware DRM; Windows 10 is the first version of Windows where they are trying to turn the PC into a locked-down mobile platform because of the success of walled gardens like Steam, World of Warcraft, and the app stores like Apple/Google Play.

The level of stupid on hacker news is disturbing, we used to get complete PC games in the 90's and early 2000's before the public fell for mmo scam of the late 90's which put PC gaming on the path of massive game theft.

Valve is basically a criminal game stealing empire by infecting games with client-server code. The whole industry wants to take us to mainframe dumb client computing.

https://tifca.com/wp-content/uploads/2019/09/ClienttoCloud_V...


> Valve is basically a criminal game stealing empire by infecting games with client-server code.

Uhh.. Valve release all their game server binaries for anyone to run, and sometimes even the source code (Half-Life, Half-Life 2). If you're going to rag on any games company, it shouldn't be them.


(Replying to dead comment.)

> Valve did so because they had steam money to rely on, they were boiling the frog slowly.

Citation needed. Valve have been very good to the TF2/Dota 2 communities. Again, you _can_ run the servers for these products. Worst comes to worst, if Valve turns evil, then pirating the software is easy.

There are good points to be made that Overwatch is terrible for not releasing the server, but Valve are not.


You don't seem to understand that Valve is an outlier because they have so much power; for the rest of the AAA game industry, dedicated servers have largely been under attack. Client-server and DRM-ridden software allows them to put in in-game stores. They don't want people to have dedicated servers if it interferes with their microtransaction business model. AKA in-game stores mean further eroding software ownership.

You don't grasp that the reason Quake Champions is a server-locked game is because of gamers buying client-server-coded software.

You're a corporate fanboy. You don't grasp that the reason Doom 2016 has no level editor is specifically because of the last 15 years of the war on PC game ownership that began with MMOs in the late 90s.

All those "MMO/free-to-play" games on Steam would have been boxed products with LAN/server exe's in a former era; if the public had not taken them up to begin with, Valve would have never come up with Steam. Steam was a direct reaction to Ultima Online.

Valve's long-term plan was to remove ownership from its customers, and Valve no longer needs to produce games because their long-term agenda was making money; they want to be the middleman that skims money from every game sold.


Valve did so because they had steam money to rely on, they were boiling the frog slowly.

Notice what they did to TF2, and now they are making Artifact (a totally microtransaction-based game).

Notice how all that money from Dota 2 and TF2 was the end game. You don't get that the Steam malware client was ultimately Valve's way to get to in-game stores. Valve was very careful in how they manipulated you into not owning your own games so they could shove in-game stores and mtx into them.

For other games like Battlefield or Modern Warfare, and even Overwatch... the problem remains that they can shut down the game remotely. AKA Overwatch is not coded in an honest way where you own the game outright; you can't use its multiplayer as a standalone app. AKA you're using stolen software that is coded in a criminally screwed-up way so that Activision maintains control.

The whole goal for the game industry was in game stores and mtx in every game and to do that they need to undermine game ownership.

Fortnite, Dota 2, and League of Legends would have been fully boxed product games with LAN multiplayer/dedicated servers in a past era; for most AAA games we no longer get server exe's and instead get matchmaking.

Quake champions is literally a stolen quake game where you don't get to own it, quake fans wanted a quake they owned and controlled like the first 4 quakes, with QeRadiant, dedicated servers, etc.

Quake champions can only exist in a world where companies have won and games are being coded in criminally underhanded ways to deny local application game files.


I think the GP means steam "ecosystem".

Over the last 10 years and 1000+ hours of gaming, I've literally never used a single "feature" or had a single positive interaction with Steam other than downloading the games (for which a zipped installer would have worked just fine), yet I've had many cases where I couldn't play because Steam forced an update but couldn't download it due to a slow connection / really wanted to go online on the plane because offline mode used to be buggy / had the update install and completely break Steam / couldn't log in or connect to Valve servers, etc.

They were fixing these but new stuff always comes up (and is 0% justified), and some stuff is by "design" e.g. "offline" mode still installs steam updates. At this point I have steam completely firewalled out and I turn that off only when I want to get a specific game or a DLC.

Steam is malware (so are Win10 ads, forced updates, etc.)


Unrelated to the "walled-garden" debate Valve has invested heavily into their Linux game compatibility tools and invested in the upstream projects that make them happen. That allows people who game on PC to break out of the pending Windows walled garden dystopia that the original commenter fears.


The polemic style writing is really off-putting, but I wholeheartedly agree with this.

I know Microsoft keeps trying and failing, but I think they will eventually succeed with a separate version of Windows that is clearly a separate "walled garden" product: one that is far more secure, has far better performance, only supports UWP apps, and is uncoupled from the backwards-compatibility hell that the "main" Windows variant is stuck in.

Such a product "powered by Microsoft Office" would be an incredibly compelling Chromebook competitor. As to whether or not it will support ARM, support AD, or even be open source, only time will tell. I could definitely see something like Windows X or a rebuilt Windows S (locked into S) being a glimpse into this future.


It sounds a lot like Windows RT.

Marketing it would be an incredibly tiny needle to thread: promoting that the OS can run Microsoft Office, while immediately changing the message when people ask about every old no-source-available compiled-for-Windows-98 app.

It feels like the people who can switch easily probably jumped to Chromebooks already.


It is a lot like Windows RT, because I feel like they've been trying this model every year. Windows RT, Windows 10 S Mode, Windows 10X...

The latter (designed for dual screen laptops) is coming to single screen devices. https://www.theverge.com/2020/5/4/21246561/microsoft-windows...


I don't personally find it off-putting. OP made some great points, and they deserve to be addressed on their merits, not on their tone.


You can make great points without saying everyone else on the site is "levels of stupid". That's unnecessary, unhelpful, and just escalates the debate immediately. OP is the one who brought an incensed and polemical tone to this thread - asking everyone else to only react to the substance just gives them a free pass on what is innately anti-social behavior that detracts from a civil discussion.


No, I know. I get the ideological desire for the point you make. But I find some levels of rancor endearing, genuine and humanizing especially when the points seem to merit it.


Some of it is also that the OP seems to have a history of just going around making incendiary posts and insulting the intelligence of anyone they disagree with. It's endearing the first time; when you've got a pattern of it, and often don't actually bother to reply or discuss with anyone, you're just flame-bait.


In what way is World of Warcraft a "walled garden"? It's a single game. If anything, it's one of the most extensible online games made by a major studio. Do you just mean "walled garden" as "not FOSS"?

It's a game as a service, sure, but that doesn't make it a scam. That's like saying that renting an apartment is a scam because you don't own it. You can question if it's a good financial decision, but that doesn't make it not a fair exchange of money for a product.


But... when I buy a gun in one game, I'd like to be able to use it in another.


I was with you until MMO Scam. The people behind Mazewar, Ultima Online, MUDs, etc. were not trying to scam you out of money and ownership. They were innovating heavily on what games could look like in a networked platform. It was revolutionary at the time.

Not everyone in the industry is on one side here, and if you want to fight this war efficiently and intelligently, you need to know precisely who your enemies are.


Also with such highly populated games they have to do a TON of ongoing work to support the infrastructure to the point that they would lose money on customers in a matter of months if they didn't charge subscription fees.


Hot rhetoric aside, this comment is insightful, so I'm vouching it.


Seems like this is a continuation of another account which has posted nothing but the same kind of thing: https://news.ycombinator.com/threads?id=som33


There is still CD Projekt Red, which seems to be the last bastion of DRM-free games. I hope Activision (aka Tencent) will never buy them out. China is a cancer in the gaming industry as well.


You should be able to (within reason) use your computer without an internet connection.


Orbital Operations by Warren Ellis http://www.warrenellis.com/orbital-operations-a-new-newslett...

Various musings from author Warren Ellis: comic books, script writing, and weird fiction are probably the mainstays, but it can include all sorts.


I agree with this. A basic income program could have an unexpected dividend of a new innovation boom.


> You need very good practices to make C++ compile fast.

I would reframe this as the design of C++ makes it hard to write code that compiles fast.
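
The usual mitigation is exactly that kind of "very good practice": keep heavy includes out of headers with forward declarations and the pimpl idiom, because every #include in a header is textually pasted into every translation unit that uses it. A hypothetical sketch (Widget, Engine and engine.h are all made up):

    // widget.h: nothing heavy is included here, so editing the implementation
    // does not force every user of Widget to recompile.
    #pragma once
    #include <memory>

    class Engine;    // forward declaration instead of #include "engine.h"

    class Widget {
    public:
        Widget();
        ~Widget();   // defined in the .cpp, where Impl is a complete type
        void frobnicate();
    private:
        struct Impl; // pimpl: implementation details stay out of the header
        std::unique_ptr<Impl> impl_;
    };

    // widget.cpp: the expensive include is confined to one translation unit.
    #include "widget.h"
    #include "engine.h"

    struct Widget::Impl {
        Engine engine;
    };

    Widget::Widget() : impl_(std::make_unique<Impl>()) {}
    Widget::~Widget() = default;
    void Widget::frobnicate() { /* drive impl_->engine here */ }

It works, but the amount of ceremony needed just to keep builds fast is kind of the point.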


That makes me think there are probably a lot of easy wins in the funnel of getting people successfully through the course.


I think you sometimes need a new word or term to better capture an idea. I quite like the idea of micromastery. We'll have to wait and see if it's one that survives, though.

