
This has been my experience exactly. Even over just the last few weeks I’ve noticed a dramatic drop in having to undo what the agents have done.


I think the concern in this case is that, unlike before where machines were built for other people to use, we’re now building machines that may be able to use themselves.


Not that much of a difference tbh. If one traditional machine allows one worker to do the work of twenty in half the time, that's still a big net loss in those jobs, even if it technically creates one.

The real issue is that AI/robotics are machines that can theoretically replace any job -- at a certain point, there's nowhere for people to reskill to. The fact that it's been most disruptive in fields that have always been seen as immune to automation kind of underscores that point.


The concern is the same: people want to be taken care of by society, even if they don't have a job, for whatever reason.


In the old times, this was a "want" because the only people without work were those unqualified or unable to work. In the new times, it will be a "need" because everyone will be unemployed, and no one will be able to work competitively.


Why would someone try to hurt this guy? This site is great.


I have found that there are people who just want to watch the world burn. There are many reasons, but, at its most basic, hurt people hurt people.

That's something I like to keep in mind, when I'm reacting to someone being ... less than friendly ... By reacting badly, I then make it all right for them to justify doing it again, to someone else. I've found that I can defend myself, without becoming a foaming-at-the-mouth maniac. We can enforce our boundaries with water pistols, most of the time. We don't need nukes.

Everything is connected. This chap may be naive, but he's actually trying to set good connections in motion. I applaud that.


I feel like a lot of people desperately want to make some kind of impact on the world. Something that causes some number of people to acknowledge their existence (or the effects of it).

It takes real effort to do that in a positive way with a society built around surfacing negativity.


I think we need to recognize that there are people who genuinely get off on hurting others, and it really isn't any more interesting than that.


I dunno, that reads a little too simple to me. People aren't magical black boxes of mysterious drives and unfathomable causes, they're components of a larger system and reflections of their environment.

Speaking as a reformed 'teen who wanted to watch the world burn', for some it isn't simple omnidirectional malice, but rather a deep and confusing sense that the world is out to get you (spoiler: in some ways it absolutely is) and an instinct to throw a haymaker just so you feel you didn't go down without a fight.

Once this kind of person begins investigating the causes of their discontent - I myself have come to the conclusion that outdated institutions and capitalism are prime suspects - you can do quite a bit more to focus down that energy on the deserving. If you're young and/or dumb enough to not know the difference between the mynoise guy and 'the system' it's almost a forgivable mistake.

That said, from a practical standpoint, yes. Some people just kinda suck real bad. The why isn't always going to get you closer to a cure.


Most folks find that it’s a lot easier to tear down than build up.

I’ve always really enjoyed building up, but it’s definitely not the easiest path.

I have managed to make a couple of mid-sized splashes, but many folks have no idea that I was behind them, which is fine with me.


“One wants to be loved, in lack thereof admired, in lack thereof feared, in lack thereof loathed and despised. One wants to instill some sort of emotion in people. The soul trembles before emptiness and desires contact at any price.”

— From Doctor Glas (1905) by Hjalmar Söderberg


> I have found that there are people who just want to watch the world burn. There are many reasons, but, at its most basic, hurt people hurt people.

I'm not sure that it's even malicious. I think many hackers look at a website or a service as a game to play. They aren't thinking so far as the person that this action affects, just as far as "I wonder if I could get all the data off that site?" or something similar. And on top of that, some view the rate-limiting as a challenge.

I think it's the same thing that drives the excessive snark or cruelty in comments. They don't think of the person on the other end as a person, they think of them as an endpoint.


You are correct about the way we dehumanize others on the Internet, but I think hackers have changed, quite a bit, since War Games.

Hacking, these days, isn’t just for the lulz. Hackers have a purpose, and that’s usually monetary or military (sometimes both).

Hacking crews, these days, run professional organizations that would make a lot of SV C-Suiters green with envy.


It’s not targeted. Having maintained a site, my experience is that the internet is a wasteland of AI crawlers, script kiddies trying to turn every form into an amplification vector, and vulnerability probers.


Right? I got his email too and man, I love what he does, I'm glad he's recovering from his illness etc. I've donated many times. But it's insane for him to think this was targeted. This was some bot/AI gone rogue. Or similar.

If someone wants to take you DOWN, they will. And not by downloading a heap of files.


It's just possible that someone at some big AI company just pressed a button to add this to their collection of training material. And lazily or otherwise just hit the checkboxes for 'repeat' and 'forever'.


The article explains that it started as a hacking attempt with requests to inject code. It’s not a simple AI company scraper.


Without knowing more about it, this could also just be one of the countless automated scanners that constantly iterate over the whole internet trying to find wp-admin etc. type vulnerabilities.


If all my years on the internet have taught me anything, it's that some people are just severely mentally unwell and will attempt to destroy anything they can get their hands on, purely because they can. Sometimes it's for attention, sometimes they just want to watch the world burn, but either way, asking "what did their target do to deserve it?" is pointless because the attacker likely never asked themselves that question either, and could very well just be a straight up sociopath.

As the internet grows, so grows the number of such people on it. In days gone, these people would've been rightly shunned from society, and their ability to cause harm to others was severely limited, unless they were willing to resort to more... extreme methods that would usually come with serious consequences. But the internet has given them a new outlet, a new way to ruin things for people from across the world that would've been far, far beyond their reach before, usually without any risk of punishment.


In the case of unsubscribe links I think it’s more about having your sending reputation destroyed by ISPs, because they will penalize you heavily if people have to use the spam button to unsubscribe. Our company makes it as easy as possible and practically encourages people to unsubscribe because of this.


I’ve been working with React Native and Flutter, and every time I have to interact directly with iOS/Android, I find that Android is much easier to work with and feels much better designed from a software/api/config perspective. Where Apple wins, however, imho is in hardware. The iPhone is a masterpiece and users can tell, even ~16 years in. I feel that when Apple finally chokes on hardware, or some player in the Android space releases something incredible, the game will change quickly.


It’s highly unlikely for Apple to choke on hardware given their cash.

And as someone who’s done native for both, Android’s native SDK is such a mess that even Android devs actually hate it.

Meanwhile, iOS’ SDK is incredibly exhaustive and coherent. I don’t know what your basis is for “better designed software”, but being able to fork a desktop OS from 20 years prior, make it into a mobile OS, then to a tablet OS, then to a watch and a headset OS, and then have billions of users on it all and make a trillion-dollar company out of it⸺does that not sound like good engineering to you? All while the competition can hardly build anything that actually lasts.


> being able to fork a desktop OS from 20 years prior, make it into a mobile OS, then to a tablet OS, then to a watch and a headset OS, and then have billions of users on it all and make a trillion-dollar company out of it⸺does that not sound like good engineering to you?

Microsoft and Google basically did the same thing, and in neither case is it really a testament to how well their respective software is engineered. If the amount of driver cruft on MacOS is anything to go by, the engineering underneath iOS and WatchOS is probably a fucking nightmare in most respects.


People round here hate this, but it's true.

I used to be "the Android guy" at a big games publisher. In my time the billing component had to be rewritten three times solely because of Google changes. The Apple one was written once and left alone.

We can't even discuss why those Google changes happened because doing so would get you shot, or worse.

The tech direction that was going on at Apple was enormously better than other companies. It does feel like they've gone off the rails a bit, but things like Swift are underappreciated entirely because they're so successful, just with the wrong sort of developer.


> It’s highly unlikely for Apple to choke on hardware given their cash.

It just means that it will take a while, like Intel, or what is happening with search and Google.


Apple is, apparently, good at marketing


Interesting, I have the exact opposite experience: I'm also a React Native developer and it's _always_ Android that creates all sorts of problems when developing, where iOS is just fine. And it's not just me: many devs in my team (and all the teams I've worked in previously) think the same way.

Though I'd agree that provisioning+codesigning can be a mess on iOS.


I think that this boils down to people wanting a handheld computer that sometimes can make phone calls (android), or a phone that can do other stuff (apple).

Just compare how android and iOS handle backgrounding.


I think that comparison would need some support. It is exceedingly rare that I hear any normal person mention doing something on Android that they couldn’t do on iOS, and the number of enthusiasts isn’t enough to drive a market that large.


Can I easily put a compiler on either platform? My understanding is that you can't, which makes both platforms kind of bad.


You can easily put compilers on Android. Put the whole system on it if you want https://play.google.com/store/apps/details?id=tech.ula


Huh! I really was under the impression that you couldn't. It's been many years since I last checked though.

Thank you very much for correcting me!


It depends on the intended use case: compiling source code isn't the intended use case of either of those platforms. That doesn't make them "bad".


As of today, there is no player in the smartphone space with even remotely the secured income needed to come up with a similarly volume-scaled device, and there is little incentive for anyone to enter this space.

A new entrant would be unable to secure the investment, because even if they produced the exact same piece of hardware with the same quality, the carrier distribution channels, the brand image and (walled garden) ecosystem of Apple would prevent users from even noticing and adopting the product, and the press would jump on it and rip it to pieces.

So how would this normally work?

--> You disrupt the market by doing something particularly good, while being average in other areas, succeed, then iterate.

But this doesn't work in the Smartphone space as:

1.) iOS users are unlikely to leave their ecosystem because they can't take _anything_ with them

2.) the Google ecosystem leaves little room to disrupt and secure a return on investment, and

3.) for Android you need to (re)build your own ecosystem to _match_ Google/Apple from the start.

That's why it's not a competitive market anymore, and needs to be (wait for it:) regulated to restore a level playing field for Hardware, Applications and Services.

But yeah...not a popular opinion here, I know...


What would I take with me? My photos and email will move just fine. The last app I bought was a while ago, and it was an app to block Google AMP. I’m honestly not sure I use any other paid apps.


So no Apps.

Also no iTunes, Apple Music, Apple Messages, Apple Pay, Apple Fitness, any kind of native Mac integration (Safari Bookmark sharing, Shared Bluetooth devices, clipboard sharing, Continuity Camera, AirPlay,...)?

No Apple Wireless charger, Apple Watch, Airpods, Apple-specific Accessories, Apple App-based carkeys or Apple CarPlay?

That's quite rare.


As an experiment I recently switched from iPhone (last 10 years?) to Android. It's been a little painful but:

- nearly all apps support Android as well. The ones I used (Navionics, banking apps, WhatsApp) just require logging in on Android, no cost involved.
- most Apple first-party apps have a Google equivalent (Google Wallet, Google Keep notes, Google Messages, etc.) that is very similar
- my AirPods work equally well with Android


Fine - but that took Google billions and a decade of work to reach near-parity. A new entrant will not have any of that. Web apps can do much more than they could 10 or 15 years ago but still takes massive effort.


Sorry to be rude, but what are you smoking?

Google's been ahead of Apple on tons of core user-facing features since the start (widgets, backgrounds, folders). The two platforms have extremely slowly converged to near-total feature parity. The only "advantage" of Apple's total ecosystem lock-in is relative seamlessness due to the vertical integration between their various services.

The thing is, it's barely any harder to set up an equivalent Google/Android ecosystem and has been for well over a decade as well. The real issue on the Google side of things is the renaming/shifting of services. Messages -> Gmail Chat -> Talk -> Duo -> Messages, Google Play Music -> Youtube Music, etc.

The feature parity's been there


Do you use MFA? How about meetings (zoom/teams)? What about MS Office or Google Apps? Is the new email client up to snuff? All of these are much better as apps.

Users do not want to browse the web on mobile for all their activities, when Apps are generally faster, more secure, and have all their prefs recorded, EVEN if a webapp is functionally equivalent (and most are only 70-90% equivalent).

So the new entrant has to curry favor with all these large software vendors (some of whom are now competitors) and offer something for some key uses of a smartphone.


You're right that apps can be better, but phone apps seem to always miss functionality compared to desktop web versions of the same thing. Even phone web version of Google doesn't have functional parity with desktop web Google. The phone app for Google is even worse.


If you used iMessage, Apple phones remember this and will continue sending you iMessages which you won't receive on your non-Apple phone.



A Flutter developer only sees the shitty parts of iOS and Android anyway. I imagine that, as a dumb carrier for Flutter, Android is nicer.


60hz lmao


Not a series of prime numbers?


What’s most fascinating to me is the certainty with which they believed that other worlds were inhabited. Horace Walpole, for example, even abandoned his faith over the theological difficulties it presented. Hundreds of years later, however, we’re left with the Fermi Paradox.


>Hundreds of years later, however, we’re left with the Fermi Paradox.

The only paradox about the Fermi Paradox is why so many people who claim to be educated take it seriously.

Faster-than-light travel is impossible.

There aren't galaxy-spanning civilizations because creating a galaxy-spanning civilization is impossible. You cannot have a civilization where it takes 106,000 years for a message to traverse it.

If there are multi-stellar civilizations, it is impossible to detect them at present. If a civilization like ours lived on an earth-like planet orbiting Proxima Centauri, we would not be able to detect it at all, even with every single radio telescope on earth acting as a giant interferometer.

I assert that any intelligence that has mastered the mass-energy equivalence skills needed for interstellar travel (again, at slow-ass speeds) no longer cares about planets or biological life anyways. They would park off the ends of masses that are radiating energy and harvest it to turn into endless expanses of fantastical constructions that make planets look like jokes.


The Fermi Paradox doesn't demonstrate that other worlds aren't inhabited, just that our assumptions about infinite exponential growth and resource consumption and the inevitability of interstellar civilizations emerging as a function of intelligent life may not apply.


Because “carbon” in these conversations is a proxy for “stuff” as an important input to everything. Of course, the rich have more stuff. Interestingly, focusing on carbon allows for applying a moral dimension to having more stuff: it’s not only wrong to have more because I have less and don’t like that, but because you are hurting us all.


No not really. Carbon emissions refers to carbon emissions which are put into our shared air and affect our shared environment.


All code may be technical debt but the interest rate can vary.


“sub-nanosecond synchronization”

“typical distances of 10 km between nodes”

Light travels 30cm in a nanosecond. How do they achieve sub-nanosecond accuracy over long distances?


Because they know the speed of light and the distance between nodes, they can account for the propagation delays of light due to distance.

They're not talking about sub-millisecond latency in communications.


No, we don't know the distance between nodes (although we could deduce it). But using timestamps, we can know the round-trip time.

See https://www.ohwr.org/project/white-rabbit/uploads/2b9d42b664... (page 9 and later for the principle).

If you want all the details, see https://ohwr.org/project/white-rabbit/uploads/6a357829064b9e...
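
To make the principle in those slides concrete, here's a rough sketch (my own illustration, not White Rabbit's actual code) of two-way time transfer: A stamps when it sends (t1), B stamps reception (t2) and its reply (t3), and A stamps the reply's arrival (t4). Assuming the link delay is symmetric, the offset and one-way delay fall out of simple arithmetic; the function name and the numbers in the example are made up.

    def two_way_time_transfer(t1: float, t2: float, t3: float, t4: float):
        """Estimate (offset of B's clock relative to A, one-way link delay)
        from four timestamps, assuming equal delay in both directions."""
        round_trip = (t4 - t1) - (t3 - t2)    # total time on the wire, both ways
        one_way = round_trip / 2              # the symmetry assumption
        offset = ((t2 - t1) - (t4 - t3)) / 2  # B's clock minus A's clock
        return offset, one_way

    # Example: B runs 40 ns ahead of A, the fiber adds 50 us each way,
    # and B takes 100 ns to turn the message around.
    offset, delay = two_way_time_transfer(
        t1=0.0, t2=50_040e-9, t3=50_140e-9, t4=100_100e-9)
    print(offset, delay)  # ~4e-08 s offset, ~5e-05 s one-way delay

As I understand it, the real protocol additionally distributes frequency over the link and measures phase in hardware to get below a nanosecond; the arithmetic above is just the core idea of how a round trip yields both the delay and the clock offset.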


When synchronizing two nodes A and B, where there is a persistent difference in the travel times A->B and B->A, how do you achieve synchronization when knowing A->B->A or B->A->B?


Delay symmetry is a critical assumption in any two-way time transfer process. White Rabbit goes to extreme lengths to maintain that property.

This includes mandating use of cables that share a single optical fiber, with specific wavelength pairs and fiber types so you can calibrate for unavoidable differences in propagation time.

More info on their wiki:

https://ohwr.org/projects/white-rabbit/wiki/SFP


For a first approximation, you can assume A->B and B->A travel times are equal.

And because optical links are used, the asymmetry is mainly due to the wavelength difference, which is known.
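
To make the wavelength point concrete: if calibration (or the known refractive indices at the two wavelengths) gives you the ratio between the A->B and B->A propagation delays, you can split the measured round trip unevenly instead of just halving it. A rough sketch below; the ratio value and function name are invented for illustration, and White Rabbit expresses the same idea with its own asymmetry coefficient rather than a raw ratio.

    def split_round_trip(round_trip: float, ratio_ab_over_ba: float):
        """Split a measured round-trip delay into (delay_AB, delay_BA),
        given the known ratio delay_AB / delay_BA."""
        delay_ba = round_trip / (1.0 + ratio_ab_over_ba)
        delay_ab = round_trip - delay_ba
        return delay_ab, delay_ba

    # 100 us round trip, forward path assumed 0.2% slower than the return path.
    ab, ba = split_round_trip(100e-6, 1.002)
    print(ab - ba)  # ~1e-07 s: the error a purely symmetric split would make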


You can't. You can only assume that they are equal and attempt to make them as equal as possible. (The same issue arises when measuring the speed of light: it's actually not possible to distinguish whether the speed of light differs from one direction to the other; we only accurately know the average over both directions.)


What happens when the roundtrip time isn't consistent?


The roundtrip time is never consistent. Light travels at a different speed in fiber depending on the temperature. This is why you calibrate every second.


Even better, the actual in situ delays are measured and compensated for, and it works independent of the physical connection (and through fiber/copper, switch layers, etc.).


Only over fiber. Copper SFPs are not deterministic enough to precisely synchronize networks.


And even better: they suggest the use of a single medium for both transmission and receiving (1000BASE-BX10) to minimize asymmetry.

https://ohwr.org/project/white-rabbit/wikis/SFP


Indeed. It's exactly the same (albeit on a different scale) as NTP synchronization, where you can frequently (ha!) reach a few ms accuracy over a hundred ms latency network.


It seems that you're implying that nodes cannot be synchronized within the time it takes for light to travel between the nodes.

Imagine both nodes having their own atomic clocks. Now allow them to timestamp transmitted and received messages with very high precision.


In a White Rabbit network you don't need atomic clocks on each node. One atomic clock is enough; its frequency is distributed over the network.

