This is too short and the justification provided is flimsy at best.
I predict that normal people will begin to get comfortable with ignoring SSL errors, even more than they already are. Perhaps we will see a proliferation of HTTPS-stripping proxies too.
> We know exactly how to do these things digitally. Many European countries have had stored-value payment schemes in the 90s. Japan still does today.
But how do they prevent people from double-spending the same amount? Say someone has $100 and boards a plane. During the trip, this person buys a bag of potato chips sold for $90. At the same time, his bank account is automatically charged $90 for a bill.
With credit cards, handling this case is baked into the system. As far as I am aware, direct debit has no equivalent.
> But how do they prevent people double spending the same amount?
Both payment cards and merchant terminals (essentially also using embedded or removable smartcards) are tamper-resistant and hold symmetric keys only known to the payment scheme or issuer.
The terminal essentially creates a cryptographic secure channel between two smartcards, and they transactionally agree to decrement the balance on one, and increment the one on the other correspondingly.
The really neat thing is that this theoretically even works without the need for central accounts, and is as such very privacy friendly. (Practically, even just one key leaking would have catastrophic consequences though, and to detect whether that has happened, systems usually aggregate all transactions asynchronously and check money movements for plausibility.)
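Roughly, in toy form (a sketch only: the shared key, amounts, and class names here are made up, and a real scheme would derive per-session keys and include transaction counters to prevent replay):

```python
import hmac
import hashlib

# Hypothetical scheme-wide symmetric key; in reality it never leaves
# the tamper-resistant smartcards, which is why one leak is catastrophic.
SCHEME_KEY = b"issuer-secret-known-only-to-cards"

class Card:
    def __init__(self, balance_cents: int):
        self.balance = balance_cents

    def authorize_debit(self, amount: int) -> bytes:
        # The payer card proves the debit is genuine by MACing it.
        if amount > self.balance:
            raise ValueError("insufficient stored value")
        msg = b"debit:" + str(amount).encode()
        return hmac.new(SCHEME_KEY, msg, hashlib.sha256).digest()

    def apply_debit(self, amount: int) -> None:
        self.balance -= amount

    def apply_credit(self, amount: int, tag: bytes) -> None:
        # The payee card only increments its balance if the MAC checks
        # out, i.e. a genuine card really decremented the same amount.
        msg = b"debit:" + str(amount).encode()
        expected = hmac.new(SCHEME_KEY, msg, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("forged transaction")
        self.balance += amount

# The terminal only shuttles messages between the two cards; it never
# sees SCHEME_KEY or touches the balances itself.
payer, payee = Card(10_000), Card(0)
tag = payer.authorize_debit(2_50)
payer.apply_debit(2_50)
payee.apply_credit(2_50, tag)
assert payer.balance + payee.balance == 10_000
```

No central account is consulted anywhere in that flow, which is exactly the privacy property described above.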
IIUC, you have to "transfer" the funds from your bank account to the card device (a wallet, in a way), from where you can then spend these funds without interaction with the bank? That would work.
Otherwise, without the initial withdrawal from your bank account, you could spend the money twice.
There's already some fraud, waste, loss, inefficiencies, accidents (packages lost, chargebacks by mistake, packages arriving weeks later)
....
That said, the chips have some physical protection; it's not trivial to clone them.
And the chip has a variable where it stores how much more you can spend without online confirmation.
Of course, these are cheap protective measures, but cracking them would probably take more effort than the total credit that's assigned for offline spending.
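Something like this, as a rough sketch (class and field names are made up, not any real EMV data element):

```python
class ChipCard:
    # Issuer-set ceiling on spending without an online authorization.
    OFFLINE_LIMIT = 150_00  # cents

    def __init__(self):
        self.offline_remaining = self.OFFLINE_LIMIT

    def purchase(self, amount: int, online: bool) -> str:
        if online:
            # The issuer approved this transaction; the allowance resets.
            self.offline_remaining = self.OFFLINE_LIMIT
            return "approved online"
        if amount > self.offline_remaining:
            # The terminal is forced to contact the issuer.
            return "declined: online authorization required"
        self.offline_remaining -= amount
        return "approved offline"
```

So even a cloned card could only spend the remaining offline allowance before some terminal forces it online and the issuer notices.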
What's the information theory connection in your view?
> these are cheap protective measures,
They're holding up extremely well. I'm not aware of any cryptographic or physical key extraction compromise in EMV, for example. All known bugs are protocol design oopsies, as far as I'm aware.
I'll just repost what I posted in the last thread:
> There is no on-device scanning without compromising privacy. Scanning that can detect child abuse can also detect human rights activists, investigative journalists, and so on. I imagine this technology can be easily used by the government to identify journalists by scanning for material related to their investigation.
> On-device scanning is a fabrication that Apple foolishly introduced to the mainstream, and one that rabid politicians bit into and refuse to let go.
And I'll add this: citizens' loss of their right to privacy is the death of their democracy.
This is my plan as well. If we lose the ability to run GrapheneOS, I'll get an iPhone and treat it as a locked-down home router, and use a Linux phone with open source apps for my calls/messaging.
I will also do everything in my power to halt support for Android in favor of web apps. No sense duplicating work for two separate platforms if one is just a crappy clone of another.
This is the right answer. I'm willing to stick my neck out and assert that languages with a "minimal" standard library are defective by design. The argument of APIs being stuck is moot with approaches like Rust's editions or "strict mode".
Standard libraries should include everything needed to interact with modern systems. This means HTTP parsing, HTTP requests, and JSON parsing. Some languages are excellent (like Python), while some are halfway there (like Go), and some are just broken (Rust).
External libraries are for niche or specialized functionality. External libraries are not for functionality that is used by most modern software. To bury your head in the sand and insist otherwise is madness and will lead to ridiculous outcomes like this.
> Standard libraries should include everything needed to interact with modern systems.
This is great when the stdlib is well-designed and kept current when new standards and so on become available, but often "batteries included" approaches fail to cover all needs adequately, are slow to adopt new standards or introduce poorly designed modules that then cannot be easily changed, and/or fail to keep up-to-date with the evolution of the language.
I think the best approach is to have a stdlib of a size that can be adequately maintained/improved, then bless a number of externally developed libraries (maybe even making them available in some official "community" module or something with weaker stability guarantees than the stdlib).
I find it a bit funny that you specifically say HTTP handling and JSON are the elements required when that's only a small subset of things needed for modern systems. For instance, cryptography is something that's frequently required, and built-in modules for it often suck and are just ignored in favor of external libraries.
EDIT: Actually, I think my biggest issue with what you've said is that you're comparing Python, Go, and Rust. These languages all have vastly different design considerations.

In a language like Python, you basically want to be able to just bash together some code quickly that can get things working. While I might dislike it, a "batteries included" approach makes sense here. Go is somewhat similar, since it's designed to take someone from no knowledge of the language to productive quickly. Including a lot in the stdlib makes sense here since it's easier to find stuff that way.

While Rust can be used like Python and Go, that's not really its main purpose. It's really meant as an alternative to C++ and the various niches C/C++ have dominated for years. In a language like that, where performance is often key, I'd rather have a higher-quality external library than just something shoved into the stdlib.
The tradeoff of “batteries included” vs not is real: Python developers famously reach for community libraries like requests right away to avoid using the built-in tooling.
And yet, there are times when all I've had access to was the stdlib. I was damn glad for urllib2 at those times. It's worth it to have a batteries-included stdlib, even if parts of it don't wind up being the most commonly used by the community.
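For what it's worth, the stdlib-only flow is still serviceable today. A minimal sketch with urllib.request (urllib2's Python 3 successor) plus the built-in json module, using a placeholder endpoint:

```python
import json
from urllib.request import Request, urlopen

# httpbin.org is just an example echo service for illustration.
req = Request("https://httpbin.org/get", headers={"Accept": "application/json"})
with urlopen(req, timeout=10) as resp:
    data = json.load(resp)  # parse the JSON body straight off the response
print(data["url"])
```

Less ergonomic than requests, sure, but it works on any box with a bare Python install.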
The fact that there is a 'urllib2' implies that there's a 'urllib', which tells us something pretty important about the dangers of kitchen-sink standard libraries.
But nothing prevents a language from having a rich and OPTIONAL stdlib, so that devs can choose different solutions without linking in a bunch of junk they don't use.
Really, a good stdlib still allows you to use better-suited 3rd party libraries. The lack of a good stdlib doesn't add anything.
> This is the right answer. I'm willing to stick my head out and assert that languages with a "minimal" standard library are defective by design.
> Standard libraries should include everything needed to interact with modern systems. This means HTTP parsing, HTTP requests, and JSON parsing.
There is another way. Why not make the standard library itself pluggable? Rust has a standard library and a core library. The standard library is optional, especially for bare-metal targets.
Make the core library as light as possible, with just enough functionality to implement other libraries, including the interfaces/shims for absolutely necessary modules like allocators and basic data structures like vectors, hashmaps, etc. Then move all the other stuff into the standard library. The official standard library can be minimal, like the Rust standard library is now. However, we should be able to replace the official standard library with a 3rd party standard library of our choice. (What I mean by standard library here is the 'base library', not the official library.) A third-party standard library can be as light or as comprehensive as you might want. That would also make auditing the default codebase possible.
I don't know how realistic this is, but something similar is already there in Rust. While Rust has language features that support async programming, the actual implementation is in an external runtime like Tokio or smol. The clever bit here is that the other third party async libraries don't enforce or restrict your choice of the async runtime. The application developer can still choose whatever async runtime they want. Similarly, the 3rd party standard library must not restrict the choice of standard libraries. That means adding some interfaces in the core, as mentioned earlier.
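As a conceptual sketch of that split (Python stand-ins for what would be compile-time mechanisms in Rust, along the lines of #[global_allocator]; every name here is made up):

```python
from abc import ABC, abstractmethod

# --- "core": only interfaces and a registration hook, no implementations ---
class Allocator(ABC):
    @abstractmethod
    def alloc(self, size: int) -> bytearray: ...

_allocator = None

def set_allocator(a: Allocator) -> None:
    """Chosen once at startup, analogous to Rust's #[global_allocator]."""
    global _allocator
    _allocator = a

# --- a swappable "std": one possible implementation of the core interface ---
class DefaultAllocator(Allocator):
    def alloc(self, size: int) -> bytearray:
        return bytearray(size)  # trivial stand-in for a real heap allocator

# The application picks its standard library; code written against the
# core interface neither knows nor cares which one was plugged in.
set_allocator(DefaultAllocator())
buf = _allocator.alloc(64)
```

The point is that core owns the interface, so competing standard libraries stay interchangeable instead of each inventing its own incompatible hooks.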
This is the philosophy used by the Java world. Big parts of the standard library are plugin-based. For example, database access (JDBC), filesystem access (NIO), cryptography (JCA). The standard library defines the interfaces and sometimes provides a default implementation, but it can be extended or replaced.
It works well, but the downside of that approach is people complaining about how abstract things are.
That makes sense. Just adding a clarification here: I wasn't suggesting replacing the standard library with interfaces (traits in this case). I was saying that the core library/runtime should have the interfaces for the standard library to implement some bare-minimum functionality, like allocators. Their use is more or less transparent to the application and 3rd party library developers.
Meanwhile, the public API of the selected standard library need not be abstract at all. Let's say that the bare minimum functionality expected from a 3rd party standard library is the same as the official standard library. They can just reimplement the official standard library at the minimum.
> External libraries are not for functionality that is used by most modern software.
Where do you draw the line though? It seems like you mostly spend your time writing HTTP servers that read/write JSON, but is that what everyone else spends their time doing, too? You'll end up with a standard library weighing GBs just because "most developers write HTTP servers", which doesn't sound like a better solution.
I'm willing to stick my neck out the other way, and say I think the languages of today are too large. Instead, they should have a smaller core, and the language should be designed in a way that lets you extend it via libraries. Basically, more languages should be inspired by Lisps, and everything should be a library.
That's exactly npm's problem, though. What everybody is avoiding saying is that you need a concept of "trusted vendors". And, for the "OSS accelerates me" business crowd, that means paying for the stuff you use.
But who would want that when you're busy chasing "market fit".
I don't think that's the problem with npm. The problem with npm is that no packages are signed, at all, so it ends up trivial for hackers to push new package versions, which they obviously shouldn't be able to do.
Since Shai-Hulud scanned maintainers' computers, if the signing key was stored there too (without a password), couldn't the attackers have published signed packages?
That is, how does signing prevent publishing of malware, exactly?
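To make the failure mode concrete, a sketch using the third-party cryptography package (the ed25519 API shown is real; the scenario is hypothetical):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Imagine this key sits unencrypted in the maintainer's home directory.
maintainer_key = Ed25519PrivateKey.generate()
registry_pubkey = maintainer_key.public_key()

def registry_accepts(package: bytes, signature: bytes) -> bool:
    try:
        registry_pubkey.verify(signature, package)
        return True
    except InvalidSignature:
        return False

legit = b"package-1.2.3"
malware = b"package-1.2.4-with-payload"

print(registry_accepts(legit, maintainer_key.sign(legit)))      # True
# A worm that exfiltrated the key mints equally valid signatures:
print(registry_accepts(malware, maintainer_key.sign(malware)))  # also True
```

If the key lives on the same machine the worm scanned, the signature proves nothing about intent; that's why exfiltration-resistant options like hardware tokens matter.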
I don't think things being libraries (modular) is at odds with a standard library.
If you have a well-vetted base library that is frequently reviewed and undergoes regular security and quality checks, then you should be minimally concerned about the quality of the code that goes on top.
In a well designed language, you can still export just what you need, or even replace parts of that standard library if you so choose.
This approach even handles your question: as use cases become more common, an active, invested community (either paying or actively contributing) can add and vet modules, or remove old ones that no longer serve an active purpose.
But as soon as you find yourself "downloading the web" to get stuff done, something has probably gone horribly wrong.
Doing it the right way would create friction, developers might need to actually understand what the code is doing rather than pulling in random libraries.
Try explaining to your CTO that development will slow down to verify the entire dependency chain.
I'm more thinking C# or Java. If Microsoft or Oracle is providing a library you can hope it's safe.
You *could* have a development ecosystem called Safe C# which only comes with vetted libraries and doesn't allow anything else.
Except that "clearance" invariably consists of bureaucratic rubber stamping and actually decreases security by making it harder and slower to fix newly discovered vulnerabilities.
> Doing it the right way would create friction, developers might need to actually understand what the code is doing rather than pulling in random libraries.
Then let's add friction. Developers understanding code is what they should be doing.
CTOs understand the high cost of ransomware and disruption of service.
Java has been around for much longer and has exactly the same architecture re transitive dependencies, yet doesn't suffer from weekly attacks like these that affect half the world. Not technically impossible, yet not happening (at least not at this scale).
If you want an actual solution, look for differences. If you somehow end up figuring out it's about the type of people using those ecosystems, then there is no easy technical solution.
> Standard libraries should include everything needed to interact with modern systems.
So, databases? Which then raises the question: which one - Postgres, MySQL, SQLite, MS SQL, etc.? And some NoSQL, because modern systems might need it.
That basically means you need to pull in everything and the kitchen sink. And freeze it in time (because of backwards compatibility). HTML parsing, HTTP parsing, and SHA-256 are perfectly reasonable now; wait two decades, and they might be as antiquated as XML.
So your language designers end up having to work on XML parsing, HTTP, and JSON libraries rather than designing a language.
If the JS way is madness, having everything available is another form of madness.
It is not madness. Java is a good example of a rich and modular standard library. Some components of it are eventually deprecated and removed (e.g. Applets), and this process takes long enough. Its standard library does include good crypto and an HTTP client, a database abstraction API (JDBC) which is implemented by database drivers, etc.
Yeah, and Java was always corporately funded, and to my knowledge hardly anyone really used either the HTTP client or the XML parser. You basically have a collection of dead-weight libs that people have to begrudgingly maintain.
Granted, some (JDBC) are more useful than others. Although JDBC is more of an API and less of a library.
HttpClient is relatively new and getting HTTP/3 support next spring, so it’s certainly not falling into the dead weight category. You are probably confusing it with an older version from Java 1.1/1.4.
As for XML, JAXP was a common way to deal with it. Yes, there's XStream etc., but it doesn't mean any of the standard XML APIs are obsolete.
Spot on. I'd rather have a Python, Java, .NET, ... standard library that may have a few warts but works everywhere there is a fully compliant implementation, than play Lego with libraries that might not even support all platforms and are more easily open to such attacks.
Is java.util.logging.Logger not that great?
Sure, yet everyone that used it had a good night's rest when the Log4j exploit came to be.
There is no on-device scanning without compromising privacy. Scanning that can detect child abuse can also detect human rights activists, investigative journalists, and so on. I imagine this technology can be easily used by the government to identify journalists by scanning for material related to their investigation.
On-device scanning is a fabrication that Apple foolishly introduced to the mainstream, and one that rabid politicians bit into and refuse to let go.
Apple has never supported your privacy though, not really. Spyware company issues spyware, news at 11. They're better than Google, but they're not good.
That is exactly the problem. I can still imagine them coming up with some scheme as a compromise, one that specifically targets encrypted group chats along with all kinds of server-side automatic scanning, which, as you mention, could be abused at least by intelligence agencies to track non-CSAM content. I wonder what other 'compromise' will actually be effectively possible.