Why? Disrupting a speaker to the point of preventing them from speaking (or preventing others from listening) is an action that clearly infringes on the rights of others and restricts freedom of speech, regardless of content.
Speaking at the same time, but in a slightly different place? If you really care about your own speech (and not about silencing the speech of others), that is clearly acceptable.
I'm not silencing you. We both have the freedom to speak at the same time, in the same place. Such as a political speaker and opposing protesters. Both have the freedom.
Programming is free if you do not consider the price of your time. If you do consider it, that price is much higher than the AI-associated costs. And even with AI-associated costs, programming is still much cheaper than most other engineering professions, where physical realization is orders of magnitude more costly.
If you were a young person with plenty of time, the best way you could spend it would be learning to program without AI, whether you have money or not.
LLMs aren't a requirement, though... and if you're learning, you're probably better off without them. I was pretty down and out after the .com bubble burst and was staying in a house a friend was renovating, without internet access for a while... I learned C# from a big fat book and the beta command-line compiler... for years, I knew the language and tools better than my peers.
You can't get that level of depth with an LLM, because you generally won't be digging in... for that matter, if you're vibe coding, you're even further removed from the details of how things are being done, for better or worse.
LLM providers are interested in maximizing their profits, not minimizing your costs. The eventual goal of the providers (and the reason they have trillion-dollar valuations) is to capture the market and then raise prices to capture the value of whatever time you save by using them. In other words, if your time savings amount to $100 per hour by using LLMs, their goal is to eventually charge you $99.99 per hour for the privilege of using them.
> As a non-English speaker I can really relate to this.

I think the real mistake was Apple allowing a non-ASCII password to be entered in the first place.
As a non-English speaker (Czech, actually), it is clear to me not to use non-ASCII characters in passwords, and more generally to avoid characters that sit at different positions on the default English keyboard and on locally used keyboards, i.e. to use only ASCII alphanumeric characters, except 'Y' and 'Z'.
As the keyboard layout is a per-user setting, the layout at the login screen may differ from the one on the regular desktop (and in post-login password prompts).
Also, most devices nowadays ARE single user. And most (all?) OSes allow you to use alternative keyboards at the user-selection screen.
Also, all orgs recommend special characters in passwords. Czech keyboards default to accented letters on the top row instead of numbers, so why wouldn't your average Czech use those?
> It is a profoundly erroneous truism ... that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them.
As bits within a byte are generally not addressable or ordered, it makes no sense to call a CPU architecture big- or little-bit-endian. Bit order only makes sense for serial lines/buses.
'Arabic' numerals come originally from India, from the Brahmi numerals, and the Brahmi script was written left to right. So big-endian was 'normal' even originally; it was the Arabs who kept left-to-right numbers within their right-to-left script (and who therefore use little-endian order relative to the direction of Arabic script).
To give an example, you could read 1234 as either 'one thousand and two hundred and four and thirty' or 'four and thirty and two hundred and one thousand'.
Now that I think about it though, I've only seen the latter way used for the year in a date.
There is one reason not mentioned in the article why it is worth testing code on big-endian systems: some bugs are more visible there than on little-endian systems. For example, accessing an integer variable through a pointer of the wrong (smaller) type often passes silently on little-endian systems (the higher bytes are just ignored), while it reads/writes bad values on big-endian systems.
Setting aside the fact that you certainly can do radiative cooling in the desert at night just fine: you have air, which, even if hot by desert standards during the day, is still orders of magnitude more effective for cooling via direct heat transfer than radiating heat away into vacuum.