
wouldn't the curators need SCBA gear and airlocks?

Turning museums into a Resident Evil house is a cool idea.


I installed Gentoo in 2014 and getting PulseAudio working was much easier than ALSA. It was also much better.

I get that ALSA followed the Unix philosophy of doing one thing, but I want my audio mixer to play multiple sounds at once.


"Filling in timesheets" sucks until you want to qualify for R&D credits.

I've heard real estate people call this legalized extortion, since you have to select a graffiti artist with enough reputation that others don't mess with the piece.

>real estate people call this legalized extortion

I hope they know what some say of the real estate agent.


I’ve heard such reputations involve not only the caliber of the art, but also the retributive consequences the artist and friends are thought to impose on people who deface their work…

Virtually all of the top performers at my school left for the USA immediately after graduation.

I think somewhere between 70-90% of Waterloo graduates in CS leave every year.

Turns out doubling or tripling your take home compensation is absolutely worth it.

You can buy a house instead of renting an apartment with roommates. You can afford to marry and have children. You can buy all the things the government would've provided you had it not been dysfunctional.

Plus, there are just more jobs in SWE in the USA. Many of my classmates who graduated last June are still unemployed, since you have to be exceptional to get a job here.

Pretty much everyone who can get a TN1/H1B/L1B does, unless they were born wealthy, have an extreme sense of patriotism, or have a very strong attachment to family.


That's definitely true, though probably only limited to CS (and maybe the top 5% of people going to investment banking). The vast majority of the most intelligent people at any Canadian university will stay in Canada, and given the current political situation, I think that's probably only going to become more true.

Also, anecdotally, I would wager that schools with significant numbers of Americans (e.g., McGill) probably have more US students staying in Canada than vice versa at this point (with perhaps the exception of CS).


> Virtually all of the top performers at my school left for the USA immediately after graduation. [...] Turns out doubling or tripling your take home compensation is absolutely worth it. [...] You can buy a house instead of renting an apartment with roommates. You can afford to marry and have children. You can buy all the things the government would've provided you had it not been dysfunctional.

And how does the "dysfunction" of the current Canadian government compare to what is happening in the US, in your eyes?

> Plus, there are just more jobs in SWE in the USA.

There is the rational answer... for graduates in software.


I am a GPU programmer (on the compute side), and the biggest challenge is lack of tooling.

For host-side code the agent can throw in a bunch of logging statements and usually printf its way to success. For device-side code there isn't a good way to output debugging info into a textual format understandable by the agent. Graphical trace viewers are great for humans, not so great for AI right now.

On the other hand, Cline's harness can interact with my website and click on stuff until the bugs are gone.


(Shameless plug) I've been using my debugger-cli [1] to enable agents to debug code using debuggers that support the Debug Adapter Protocol. It looks like cuda-gdb supports DAP, so I'd love to add support. I just need help from someone who can test it adequately (kernels/warps/etc. don't quite translate to a generic DAP client implementation).

[1] https://github.com/akiselev/debugger-cli


This is great. I hate LLMs fiddling around with logging calls to get some debugging capability.

Now they can be promoted from junior coders into mid-level coders :)


> For device-side code there isn't a good way to output debugging info into a textual format understandable by the agent

Codex seems to work just fine with nsys and sqlite3 available to it; I've had success using it to debug CUDA code that was crashing, and also to optimize code.


there is no guarantee `char` is 8 bits, nor that it represents text, or even a particular encoding.

If your codebase has those guarantees, go ahead and use it.


> there is no guarantee `char` is 8 bits, nor that it represents text, or even a particular encoding.

True, but sizeof(char) is defined to be 1. In section 7.6.2.5:

"The result of sizeof applied to any of the narrow character types is 1"

In fact, char and associated types are the only types in the standard where the size is not implementation-defined.

So the only way that a C++ implementation can conform to the standard and have a char type that is not 8 bits is if the size of a byte is not 8 bits. There are historical systems that meet that constraint but no modern systems that I am aware of.

[1] https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2023/n49...
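
To make the distinction concrete, here is a minimal sketch (assuming an ordinary hosted C++ compiler) separating what the standard pins down from what is merely a property of mainstream hardware:

    #include <climits>

    // What the standard guarantees: sizeof of the narrow character types is 1.
    static_assert(sizeof(char) == 1, "always holds, by definition");
    static_assert(sizeof(unsigned char) == 1, "always holds, by definition");

    // What it does not guarantee: a byte being 8 bits. This passes on every
    // mainstream platform, but it is a property of the platform, not the language.
    static_assert(CHAR_BIT == 8, "fails only on non-octet-byte hardware");

    int main() { return 0; }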


That would be any CPU with word-addressing only. Which, granted, is very exotic today, but they do still exist: https://www.analog.com/en/products/adsp1802.html

Don't some modern DSPs still have 32 bits as the minimum addressable unit? Or is that a thing of the past?

If you're on such a system, and you write code that uses char, then perhaps you deserve whatever mess that causes you.

char8_t also isn't guaranteed to be 8 bits, because sizeof(char8_t) == 1 and a byte isn't required to be 8 bits. On a platform where char is 16 bits, char8_t will be 16 bits as well.

The C++ standard explicitly says that it has the same size, signedness, and alignment as unsigned char, but it's a distinct type. So it's pretty useless, and badly named.
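
A small illustration of that point, assuming a C++20 compiler; the assertions mirror what the standard says rather than anything platform-specific:

    #include <type_traits>

    // char8_t mirrors unsigned char in size, signedness, and alignment...
    static_assert(sizeof(char8_t) == sizeof(unsigned char));
    static_assert(alignof(char8_t) == alignof(unsigned char));
    static_assert(std::is_unsigned_v<char8_t>);
    // ...but it remains a distinct type for overload resolution and type traits.
    static_assert(!std::is_same_v<char8_t, unsigned char>);

    int main() { return 0; }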


Wouldn't it rather be the case that char8_t just wouldn't exist on that platform? At least that's the case with the uintN_t types; they are just not available everywhere. If you want something that is always available you need to use uint_leastN_t or uint_fastN_t.


It is pretty consistent. It is part of the C Standard and a feature meant to make string handling better; it would be crazy if it wasn't a complete clusterfuck.

There's no guarantee char8_t is 8 bits either, it's only guaranteed to be at least 8 bits.

> There's no guarantee char8_t is 8 bits either, it's only guaranteed to be at least 8 bits.

Have you read the standard? It says: "The result of sizeof applied to any of the narrow character types is 1." Here, "narrow character types" means char and char8_t. So technically they aren't guaranteed to be 8 bits, but they are guaranteed to be one byte.


Yes, but the byte is not guaranteed to be 8 bits, because on many ancient computers it wasn't.

The poster to whom you have replied has read the standard correctly.


What platforms have char8_t as more than 8 bits?

Well, platforms with CHAR_BIT != 8. In C and C++, char (and therefore the byte) is at least 8 bits, not exactly 8 bits. POSIX does force CHAR_BIT == 8. I think the only place you still see this is embedded, and even then only on some DSP- or ASIC-like devices. So in practice most code will break on those platforms, and they are very rare, but they are still technically supported by the C and C++ standards. Similarly, C still supported non-two's-complement architectures until 2023.

That's where the standard should come in and say something like "starting with C++26, char is always 1 byte and signed, and std::string is always UTF-8." Done, Unicode in C++ is fixed.

But instead we get this mess. I guess it's because there's too much Microsoft in the standard, and they're the only ones who still don't have UTF-8 everywhere in Windows.


char is always 1 byte. What it's not always is 1 octet.

You're right. What I meant was that it should always be 8 bits, too.

std::string is not UTF-8 and can't be made UTF-8. It's encoding agnostic, its API is in terms of bytes not codepoints.

Of course it can be made UTF-8. Just add a codepoints_size() method and other helpers.

But it isn't really needed anyway: I'm using std::string for UTF-8 (with helper functions for the 1% of cases where I need codepoints) and it works fine. Starting with C++20, though, it's getting annoying because I have to reinterpret_cast to the useless u8 versions.
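
For what it's worth, such a helper is only a few lines on top of std::string. A hedged sketch (the name codepoint_count is illustrative, not a standard or proposed API, and it assumes the string already holds valid UTF-8):

    #include <cstddef>
    #include <string>

    // Counts codepoints by skipping UTF-8 continuation bytes (10xxxxxx).
    // Assumes the input is already valid UTF-8; it does not validate.
    std::size_t codepoint_count(const std::string& s) {
        std::size_t n = 0;
        for (unsigned char c : s) {
            if ((c & 0xC0) != 0x80) ++n;
        }
        return n;
    }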


First, because of existing constraints like mutability through direct buffer access, a hypothetical codepoints_size() would require recomputation each time, which would be prohibitively expensive, in particular because std::string is virtually unbounded.

Second, there is also no way to guarantee that a string encodes valid UTF-8; it could just be whatever.

You can still just use std::string to store valid UTF-8, you just have to be a little bit careful. And functions like codepoints_size() are pretty fringe -- unless you're doing specialized Unicode transformations, it's more common to just treat strings as opaque byte slices in a typical C++ application.


Perfect is the enemy of good. Or do you think the current mess is better?

std::string _cannot_ be made "always UTF-8". Is that really so contentious?

You can still use it to contain UTF-8 data. It is commonly done.


I never said always. Just add some new methods for which it has to be UTF-8. All current functions that need an encoding (e.g. text IO) also switch to UTF-8. Of course you could still save arbitrary binary data in it.

How many non-8-bit-char platforms are there with char8_t support, and how many do we expect in the future?

Mostly DSPs

Is there a single esoteric DSP in active use that supports C++20? This is the umpteenth time I've seen DSPs brought up in casual conversations about C/C++ standards, so I did a little digging:

Texas Instruments' compiler seems to be celebrating C++14 support: https://www.ti.com/tool/C6000-CGT

CrossCore Embedded Studio apparently supports C++11 if you pass a switch in requesting it, though this FAQ answer suggests the underlying standard library is still C++03: https://ez.analog.com/dsp/software-and-development-tools/cce...

Everything I've found CodeWarrior related suggests that it is C++03-only: https://community.nxp.com/pwmxy87654/attachments/pwmxy87654/...

Aside from that, from what I can tell, those esoteric architectures are being phased out in favor of running DSP workloads on Cortex-M, which is just ARM.

I'd love it if someone who was more familiar with DSP workloads would chime in, but it really does seem that trying to be the language for all possible and potential architectures might not be the right play for C++ in 202x.

Besides, it's not like those old standards or compilers are going anywhere.


Cadence DSPs have a C++17-compatible compiler and will have C++20 soon, as do new CEVA cores (both are clang-based). TI C7x is still C++14 (C6000 is an ancient core, yet it still got C++14 support, as you mentioned). AFAIR the Cadence ASIP generator will give you a C++17 toolchain, and C++20 is on the roadmap, but I'm not 100% sure.

But for those devices you use a limited subset of language features, and you'd be better off not linking the C++ stdlib, or even the C stdlib at all (so junior developers don't have room to do stupid things ;))


Green Hills Software's compiler supports more recent versions of C++ (it uses the EDG frontend) and targets some DSPs.

Back when I worked in the embedded space, chips like ZSP were around that used 16-bit bytes. I am twenty years out of date on that space though.


How common is it to use Green Hills compilers for those DSP targets? I was under the impression that their bread was buttered by more-familiar-looking embedded targets, and more recently ARM Cortex.

Dunno! My last project there was to add support for one of the TI DSPs, but as I said, that's decades past now.

Anyway, I think there are two takeaways:

1. There probably do exist non-8-bit-byte architectures targeted by compilers that provide support for at-least-somewhat-recent C++ versions

2. Such cases are certainly rare

Where that leaves things, in terms of what the C++ standard should specify, I don't know. IIRC JF Bastien or one of the other Apple folks who've driven things like "two's complement is the only integer representation C++ supports" tried to push for "bytes are 8 bits" and got shot down?


> but it really does seem that trying to be the language for all possible and potential architectures might not be the right play for C++ in 202x.

Portability was always a selling point of C++. I'd personally advise those who find it uncomfortable to choose a different PL, perhaps Rust.


> Portability was always a selling point of C++.

Judging by the lack of modern C++ in these crufty embedded compilers, maybe modern C++ is throwing too much good effort after bad. C++03 isn't going away, and it's not like these compilers always stuck to the standard anyway in terms of runtime type information, exceptions, and full template support.

Besides, I would argue that the selling point of C++ wasn't portability per se, but the fact that it was largely compatible with existing C codebases. It was embrace, extend, extinguish in language form.


> Judging by the lack of modern C++ in these crufty embedded compilers,

Being conservative with features and deliberately not implementing them are two different things. Some embedded compilers go through certification to be allowed to be used for producing mission-critical software. Chasing features is prohibitively expensive, for no obvious benefit. I'd bet that in the 2030s most embedded compilers will support C++14 or even 17. Good enough for me.


> Being conservative with features and deliberately not implementing them are two different things.

There is no version of the C++ standard that lacks features like exceptions, RTTI, and fully functional templates.

If the compiler isn't implementing all of a particular standard then it's not standard C++. If an implementation has no interest in standard C++, why give those implementations a seat at the table in the first place? Those implementations can continue on with their C++ fork without mandating requirements to anyone else.


> If the compiler isn't implementing all of a particular standard then it's not standard C++.

C++ has historically been driven by practicalities, and has violated the standard on a regular basis when that was deemed useful.

> Those implementations can continue on with their C++ fork without mandating requirements to anyone else.

Then they will diverge too much, like what happened with countless other languages, such as Lisp.


> Then they will diverge too much, like what happened with countless other languages, such as Lisp.

Forgive me if I am unconvinced that the existence of DSP-friendly dialects of C++ will cause the kinds of language fracturing that befell Lisp.

DSP workloads are relatively rare compared to the other kinds of workloads C++ is tasked with, and even in those instances a lot of DSP work is starting to be done on more traditional architectures like ARM Cortex-M.


Non-8-bit-char DSPs would have char8_t support? Definitely not something I expected, links would be cool.

Why not? Except that it is the same as `unsigned char` and can be larger than 8 bits.

ISO/IEC 9899:2024 section 7.30

> char8_t which is an unsigned integer type used for 8-bit characters and is the same type as unsigned char;


> Why not?

Because "it supports Unicode" is not an expected use case for a non-8-bit DSP?

Do you have a link to a single one that does support it?


The exact size types are never present on platforms that don't support them.

TI C2000 is one example

Thank you. I assume you're correct, though for some reason I can't find references claiming C++20 being supported with some cursory searches.

char on Linux ARM is unsigned, which makes for fun surprises when you've only ever dealt with x86 and assumed char is signed everywhere.
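
A minimal sketch of the kind of surprise this causes; which branch runs depends on the target ABI or on flags like -funsigned-char, not on anything in the snippet itself:

    #include <cstdio>

    int main() {
        // 0xFF stored into a plain char: negative if char is signed (typical
        // x86 Linux), 255 if char is unsigned (typical ARM Linux).
        char c = static_cast<char>(0xFF);
        if (c < 0) {
            std::puts("plain char is signed here");
        } else {
            std::puts("plain char is unsigned here");
        }
        return 0;
    }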

This bit us in Chromium. We at least discussed forcing the compiler to use unsigned char on all platforms; I don't recall if that actually happened.

I recall that google3 switched to -funsigned-char for x86-64 a long time ago.

A cursory Chromium code search does not find anything outside third_party/ forcing either signed or unsigned char.

I suspect if I dug into the archives, I'd find a discussion on cxx@ with some comments about how doing this would result in some esoteric risk. If I was still on the Chrome team I'd go looking and see if it made sense to reraise the issue now; I know we had at least one stable branch security bug this caused.


Related: in C at least (C++ standards are tl;dr), type names like `int32_t` are not required to exist. Most uses, in portable code, should be `int_least32_t`, which is required.
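
A quick sketch of that distinction, using the C++ spellings: the exact-width alias is optional, the least-width one is mandatory.

    #include <cstdint>

    // Always available: at least 32 bits wide.
    std::int_least32_t counter = 0;

    // Only available when the target actually has an exact 32-bit type
    // (true on all mainstream platforms, but not required by the standard).
    #if defined(INT32_MAX)
    std::int32_t exact = 0;
    #endif

    int main() { return 0; }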

Because agentic AI can parse unstructured data and make purchasing decisions regardless of whether your site allows it, which avoids the chicken-and-egg problem.

It's similar to DoorDash. If your restaurant didn't want to sign up, they added you anyways and took orders on your behalf, then sent a physical courier over with a prepaid card to order takeout. Sometimes the menus were parsed incorrectly and customers blamed the restaurant.

This forced restaurants to sign up, claim their page, and keep their menus up to date, since not offering delivery wasn't an option.

At least 1 agentic AI tool will ignore these new terms and buy stuff on eBay anyways. Inevitably there'll be bugs or it won't get the best deal. At first this won't matter, but eventually competitors will offer a bug-free purchasing experience and consumers will move over.


>Because agentic AI can parse unstructured data and make purchasing decisions regardless of whether your site allows it, which avoids the chicken-and-egg problem.

People can do that too, and also benefit from actual structured data. But the avoidance of the chicken and egg problem didn't seem to result in widespread structured data stores beating out the SEO-spam-style listings.


The revealed preference of players is for terrible AI so games are easier. That's why AI has been going downhill.

Payday 2 is my favourite example since they've had a bug since Day 1 that lobotomizes the AI subsystem.

Specifically, there's a global cap of 1 action for all enemies per game tick, so when there are too many enemies the reaction time is about 5 seconds.
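
To make the arithmetic concrete, here is a hedged sketch of that throttling pattern; the numbers are illustrative, not the actual Payday 2 implementation:

    #include <cstdio>

    int main() {
        // A global budget of one enemy "action" per tick means reaction latency
        // grows linearly with the number of enemies queued up.
        const int enemies          = 150;        // a large assault wave
        const int actions_per_tick = 1;          // the global cap
        const double tick_seconds  = 1.0 / 30.0; // assumed 30 Hz logic tick

        double worst_case = double(enemies) / actions_per_tick * tick_seconds;
        std::printf("worst-case reaction delay: %.1f s\n", worst_case); // ~5.0 s
        return 0;
    }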

The mod Full Speed Swarm fixes this bug and the game is unplayable without collaboration and a lot of skill.

It's also unnoticeable until you die or do a ton of research on the AI. I used to host pubs with the mod to troll other players, who suddenly found the game impossible to play at lower difficulties.

I think it's possible we get an AI-driven RTS, but the demand is too small right now unless it's a recruitment vehicle for the military.


Whether better AI is fun or not depends on (A) the sorts of advantages/disadvantages it is given and (B) its skill level compared to the player.

When people say they want better AI, they mean they want smarter + human-like AI without "cheating" (more resources, perfect aim, rubber banding, perfect knowledge, etc).

And even if you build a human-like AI, it's not going to be fun for players if it's too good, just like how humans have the most fun playing against other players at their level, not against the top-ranked players in the world.

So to keep AI fun, there is always going to be artificial limitations, and "people want terrible AI" doesn't really capture that.

One example might be games where you can hear footsteps or a branch snap. With stereo audio, you could technically derive exactly where someone is, possibly even which room they are in. When that raw signal is provided directly to a machine, it might accidentally be too good, know which bush you crawled into, and shoot into it.

Yet humans would only be able to go "I think I heard a branch snap" or "someone must be creeping up on me" even though they are given the same signal as the AI.
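
As a rough illustration of why the raw signal is "too good" for a machine (made-up numbers, not any engine's actual audio model), the left/right delay alone pins down a bearing:

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    int main() {
        const double pi          = 3.141592653589793;
        const double sound_speed = 343.0;    // m/s
        const double ear_spacing = 0.20;     // m, assumed virtual listener width
        const double delay       = 0.0003;   // s, measured inter-channel delay

        // sin(bearing) = delay * c / ear_spacing, clamped to a valid range.
        double s = std::clamp(delay * sound_speed / ear_spacing, -1.0, 1.0);
        double bearing_deg = std::asin(s) * 180.0 / pi;
        std::printf("estimated bearing: %.1f degrees off-center\n", bearing_deg);
        return 0;
    }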


I don't know about this... I think that the "expert" AIs in RTS games just get more resources (not compute resources, but in-game resources) so they can create more units. I'd love to play against an AI opponent that has the same resource collection speed as the human player.

I still remember the mod scene in original StarCraft (+ Brood War) - it got to the point of supplying alternative AI algorithms, and there were plenty of non-cheating ones that were a noticeable step up in difficulty. At that point I already played competitively with friends and on-line, so they were only very challenging to me; for a new player, or someone casually doing the single player campaign, or just me a year earlier, they'd be impossible to beat.

I wonder how strong such an agent would be today, given that AlphaStar was on par with pros years ago. All those fun projects (like the Dota 2 one) died out when LLMs took over the scene and RL fell out of favor.

Yeah, cheating with resources or vision is disappointingly common. I remember the Green Tea AI for StarCraft 2 being pretty hard, but it looks like it cheats above "medium" (though the TL wiki does say its "very easy", which gives you 10 minutes to set up, is equivalent to Blizzard's "harder").

I have 1 and 8 on my cheap RAV4 from 7 years ago. Heated seats too.
