I've heard real estate people call this legalized extortion, since you have to select a graffiti artist with enough reputation that others don't mess with the piece.
I’ve heard such reputations involve not only the caliber of the art, but also the retributive consequences the artist and friends are thought to impose on people who deface their work…
Virtually all of the top performers at my school left for the USA immediately after graduation.
I think somewhere between 70% and 90% of Waterloo CS graduates leave every year.
Turns out doubling or tripling your take-home compensation is absolutely worth it.
You can buy a house instead of renting an apartment with roommates. You can afford to marry and have children. You can buy all the things the government would've provided you had it not been dysfunctional.
Plus, there are just more jobs in SWE in the USA. Many of my classmates who graduated last June are still unemployed, since you have to be exceptional to get a job here.
Pretty much anyone who can get a TN1/H1B/L1B does, unless they were born wealthy, have an extreme sense of patriotism, or have a very strong attachment to family.
That's definitely true, though probably limited to CS (and maybe the top 5% of people going into investment banking). The vast majority of the most intelligent people at any Canadian university will stay in Canada, and given the current political situation, I think that's probably only going to become more true.
Also, anecdotally, I would wager that schools with significant numbers of Americans (e.g., McGill) probably have more US students staying in Canada than vice versa at this point (with perhaps the exception of CS).
> Virtually all of the top performers at my school left for the USA immediately after graduation. [...] Turns out doubling or tripling your take home compensation is absolutely worth it. [...] You can buy a house instead of renting an apartment with roommates. You can afford to marry and have children. You can buy all the things the government would've provided you had it not been dysfunctional.
And how does the "dysfunction" of the current Canadian government compare to what is happening in the US, in your eyes?
> Plus, there are just more jobs in SWE in the USA.
There is the rational answer... for graduates in software.
I am a GPU programmer (on the compute side), and the biggest challenge is lack of tooling.
For host-side code the agent can throw in a bunch of logging statements and usually printf its way to success. For device-side code there isn't a good way to output debugging info into a textual format understandable by the agent. Graphical trace viewers are great for humans, not so great for AI right now.
On the other hand, Cline's harness can interact with my website and click on stuff until the bugs are gone.
(Shameless plug) I've been using my debugger-cli [1] to enable agents to debug code using debuggers that support the Debug Adapter Protocol. It looks like cuda-gdb supports DAP, so I'd love to add support. I just need help from someone who can test it adequately (kernels/warps/etc don't quite translate to a generic DAP client implementation).
> For device-side code there isn't a good way to output debugging info into a textual format understandable by the agent
Seems Codex works just fine with nsys and sqlite3 available to it; I've had success using it to debug CUDA code that was crashing, and also to optimize code.
> there is no guarantee `char` is 8 bits, nor that it represents text, or even a particular encoding.
True, but sizeof(char) is defined to be 1. In section 7.6.2.5:
"The result of sizeof applied to any of the narrow character types is 1"
In fact, char and associated types are the only types in the standard where the size is not implementation-defined.
So the only way that a C++ implementation can conform to the standard and have a char type that is not 8 bits is if the size of a byte is not 8 bits. There are historical systems that meet that constraint but no modern systems that I am aware of.
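To make that concrete, a minimal sketch (plain standard C++, nothing implementation-specific): sizeof counts bytes, while CHAR_BIT from <climits> tells you how many bits a byte has on that implementation.

```cpp
// sizeof() counts bytes; CHAR_BIT says how many bits a byte has.
#include <climits>  // CHAR_BIT
#include <cstdio>

int main() {
    static_assert(sizeof(char) == 1, "always holds, by definition");
    static_assert(CHAR_BIT >= 8, "a byte is at least 8 bits");
    // On a hypothetical DSP with CHAR_BIT == 16, sizeof(char) is still 1,
    // but a "byte" (and therefore char) is 16 bits wide.
    std::printf("char is %d bits here\n", CHAR_BIT);
}
```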
char8_t also isn't guaranteed to be 8 bits: sizeof(char8_t) == 1, the same as sizeof(char), so on a platform where char is 16 bits, char8_t will be 16 bits as well.
The C++ standard explicitly says that it has the same size, signedness, and alignment as unsigned char, but it's a distinct type. So it's pretty useless, and badly named.
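For reference, those guarantees spelled out as a small C++20 snippet (this is just the standard behaviour, nothing vendor-specific):

```cpp
// char8_t matches unsigned char in size, signedness, and alignment...
#include <type_traits>

static_assert(sizeof(char8_t) == sizeof(unsigned char));
static_assert(alignof(char8_t) == alignof(unsigned char));
static_assert(std::is_unsigned_v<char8_t>);
// ...but it is a distinct type, so it won't interoperate with char-based APIs:
static_assert(!std::is_same_v<char8_t, unsigned char>);
static_assert(!std::is_same_v<char8_t, char>);
```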
Wouldn't it rather be the case that char8_t just wouldn't exist on that platform? At least that's the case with the uintN_t types; they are just not available everywhere. If you want something that is always available you need to use uint_leastN_t or uint_fastN_t.
It is pretty consistent. It is part of the C standard and a feature meant to make string handling better; it would be crazy if it wasn't a complete clusterfuck.
> There's no guarantee char8_t is 8 bits either, it's only guaranteed to be at least 8 bits.
Have you read the standard? It says: "The result of sizeof applied to any of the narrow character types is 1." Here, "narrow character types" means char and char8_t. So technically they aren't guaranteed to be 8 bits, but they are guaranteed to be one byte.
Well, platforms with CHAR_BIT != 8. In C and C++, char (and therefore a byte) is at least 8 bits, not exactly 8 bits. POSIX does force CHAR_BIT == 8. I think the only place you still see anything else is in embedded, and even there only on some DSP- or ASIC-like devices. So in practice most code will break on those platforms, and they are very rare, but they are still technically supported by the C and C++ standards. Similarly, C still supported non-two's-complement architectures until C23.
That's where the standard should come in and say something like, "Starting with C++26, char is always 8 bits and signed, and std::string is always UTF-8." Done, Unicode in C++ fixed.
But instead we get this mess. I guess it's because there's too much Microsoft in the standard, and they're the only ones who still don't have UTF-8 everywhere in Windows.
Of course it can be made UTF-8. Just add a codepoints_size() method and other helpers.
But it isn't really needed anyway: I'm using it for UTF-8 (with helper functions for the 1% of cases where I need code points) and it works fine. Since C++20, though, it's getting annoying because I have to reinterpret_cast to the useless u8 versions.
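To illustrate, here's roughly what I mean (codepoints_size is just my hypothetical name, and it assumes the std::string already holds valid UTF-8), plus the C++20 cast annoyance:

```cpp
// Hypothetical helper: count code points in a std::string assumed to hold valid UTF-8.
// UTF-8 continuation bytes look like 10xxxxxx; every other byte starts a code point.
#include <cstddef>
#include <string>

std::size_t codepoints_size(const std::string& s) {
    std::size_t n = 0;
    for (unsigned char c : s) {
        if ((c & 0xC0) != 0x80) ++n;  // not a continuation byte
    }
    return n;
}

// The C++20 annoyance: u8"" literals are now char8_t-based, so handing them
// to char-based code needs a reinterpret_cast.
const char* greeting = reinterpret_cast<const char*>(u8"héllo");
```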
First, because of existing constraints like mutability through direct buffer access, a hypothetical codepoints_size() would require recomputation each time, which would be prohibitively expensive, in particular because std::string is virtually unbounded.
Second, there is also no way to be able to guarantee that a string encodes valid UTF-8, it could just be whatever.
You can still just use std::string to store valid encoded UTF-8, you just have to be a little bit careful. And functions like codepoints_size() are pretty fringe: unless you're doing specialized Unicode transformations, it's more common to just treat strings as opaque byte slices in a typical C++ application.
I never said always. Just add some new methods for which it has to be UTF-8. All current functions that need an encoding (e.g. text IO) also switch to UTF-8.
Of course you could still save arbitrary binary data in it.
Is there a single esoteric DSP in active use that supports C++20? This is the umpteenth time I've seen DSPs brought up in casual conversations about C/C++ standards, so I did a little digging:
Aside from that, from what I can tell, those esoteric architectures are being phased out in favor of running DSP workloads on Cortex-M, which is just ARM.
I'd love it if someone who was more familiar with DSP workloads would chime in, but it really does seem that trying to be the language for all possible and potential architectures might not be the right play for C++ in 202x.
Besides, it's not like those old standards or compilers are going anywhere.
Cadence DSPs have a C++17-compatible compiler and will get C++20 soon; the new CEVA cores do as well (both are Clang-based).
TI's C7x is still on C++14 (C6000 is an ancient core, yet it still got C++14 support, as you mentioned).
AFAIR the Cadence ASIP generator will give you a C++17 toolchain, and C++20 is on the roadmap, but I'm not 100% sure.
But for those devices you use a limited subset of language features, and you're better off not linking the C++ stdlib, or even the C stdlib, at all (so junior developers don't have room to do stupid things ;))
How common is it to use Green Hills compilers for those DSP targets? I was under the impression that their bread was buttered by more-familiar-looking embedded targets, and more recently ARM Cortex.
Dunno! My last project there was to add support for one of the TI DSPs, but as I said, that's decades past now.
Anyway, I think there are two takeaways:
1. There probably do exist non-8-bit-byte architectures targeted by compilers that provide support for at-least-somewhat-recent C++ versions
2. Such cases are certainly rare
Where that leaves things, in terms of what the C++ standard should specify, I don't know. IIRC JF Bastien or one of the other Apple folks who've driven things like "two's complement is the only integer representation C++ supports" tried to push for "bytes are 8 bits" and got shot down?
Judging by the lack of modern C++ in these crufty embedded compilers, maybe modern C++ is throwing too much good effort after bad. C++03 isn't going away, and it's not like these compilers always stuck to the standard anyway in terms of runtime type information, exceptions, and full template support.
Besides, I would argue that the selling point of C++ wasn't portability per se, but the fact that it was largely compatible with existing C codebases. It was embrace, extend, extinguish in language form.
> Judging by the lack of modern C++ in these crufty embedded compilers,
Being conservative with features and deliberately not implementing them are two different things. Some embedded compilers go through certification to be allowed for use in producing mission-critical software. Chasing features is prohibitively expensive, for no obvious benefit. I'd bet that in the 2030s most embedded compilers will support C++14 or even 17. Good enough for me.
> Being conservative with features and deliberately not implementing them are two different things.
There is no version of the C++ standard that lacks features like exceptions, RTTI, and fully functional templates.
If the compiler isn't implementing all of a particular standard then it's not standard C++. If an implementation has no interest in standard C++, why give those implementations a seat at the table in the first place? Those implementations can continue on with their C++ fork without mandating requirements to anyone else.
> Then they will diverge too much, like it happened with countless number of other languages, like Lisp.
Forgive me if I am unconvinced that the existence of DSP-friendly dialects of C++ will cause the kinds of language fracturing that befell Lisp.
DSP workloads are relatively rare compared to the other kinds of workloads C++ is tasked with, and even in those instances a lot of DSP work is starting to be done on more traditional architectures like ARM Cortex-M.
A cursory Chromium code search does not find anything outside third_party/ forcing either signed or unsigned char.
I suspect if I dug into the archives, I'd find a discussion on cxx@ with some comments about how doing this would result in some esoteric risk. If I were still on the Chrome team I'd go looking and see whether it made sense to re-raise the issue now; I know we had at least one stable-branch security bug that this caused.
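I don't remember the specifics of that bug, but for anyone wondering what kind of thing char signedness breaks, a hypothetical sketch (not Chromium's actual code):

```cpp
// Classic char-signedness pitfall: on platforms where plain char is signed,
// bytes >= 0x80 arrive as negative values and can turn into negative indices.
#include <cstdio>
#include <string>

int classify(char c) {
    static const int table[256] = {};  // some lookup table (all zeros here)
    // table[c] would be a buggy out-of-bounds read for negative c;
    // the portable fix is to index with an explicit unsigned char.
    return table[static_cast<unsigned char>(c)];
}

int main() {
    std::string bytes = "caf\xC3\xA9";  // UTF-8 bytes above 0x7F exercise the pitfall
    int sum = 0;
    for (char c : bytes) sum += classify(c);
    std::printf("%d\n", sum);
}
```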
Related: in C at least (C++ standards are tl;dr), type names like `int32_t` are not required to exist. Most uses, in portable code, should be `int_least32_t`, which is required.
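A quick sketch of the difference (shown with C++'s <cstdint>; the same types come from C's <stdint.h>):

```cpp
// The exact-width aliases are optional; the least-width ones are mandatory.
#include <climits>
#include <cstdint>

// std::int32_t exists only if the target has a 32-bit, two's-complement,
// padding-free integer type; std::int_least32_t must always exist.
std::int_least32_t counter = 0;  // portable: at least 32 bits wide
#if defined(INT32_MAX)
std::int32_t exact = 0;          // only when the implementation provides it
#endif

static_assert(sizeof(std::int_least32_t) * CHAR_BIT >= 32,
              "guaranteed at least 32 bits of storage");
```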
Because agentic AI can parse unstructured data and make purchasing decisions regardless of whether your site allows it, which avoids the chicken-and-egg problem.
It's similar to DoorDash. If your restaurant didn't want to sign up, they added you anyway and took orders on your behalf, then sent a physical courier over with a prepaid card to order takeout. Sometimes the menus were parsed incorrectly and customers blamed the restaurant.
This forced restaurants to sign up, claim their page, and keep their menus up to date, since not offering delivery wasn't an option.
At least one agentic AI tool will ignore these new terms and buy stuff on eBay anyway. Inevitably there'll be bugs, or it won't get the best deal. At first this won't matter, but eventually competitors will offer a bug-free purchasing experience and consumers will move over.
>Because agentic AI can parse unstructured data and make purchasing decisions regardless of whether your site allows it, which avoids the chicken-and-egg problem.
People can do that too, and also benefit from actual structured data. But the avoidance of the chicken and egg problem didn't seem to result in widespread structured data stores beating out the SEO-spam-style listings.
The revealed preference of players is for terrible AI, because it makes games easier. That's why game AI has been going downhill.
Payday 2 is my favourite example since they've had a bug since Day 1 that lobotomizes the AI subsystem.
Specifically, there's a global cap of one action for all enemies per game tick, so when there are too many enemies the reaction time balloons to 5 seconds.
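Not Payday 2's actual code, but a hypothetical sketch of why a global one-action-per-tick budget does that (the 30 Hz tick rate is an assumption): with 150 enemies queued, each one gets to act about once every 5 seconds.

```cpp
// Hypothetical: a single global budget of one enemy action per game tick.
#include <cstdio>
#include <deque>

struct Enemy { int id; };

int main() {
    const int tick_rate = 30;  // assumed ticks per second
    std::deque<Enemy> action_queue;
    for (int i = 0; i < 150; ++i) action_queue.push_back({i});

    // 150 queued enemies / 30 ticks per second = ~5 seconds between actions each.
    std::printf("each enemy reacts every %.1f s\n",
                static_cast<double>(action_queue.size()) / tick_rate);

    // Per tick: only the enemy at the front gets to act; everyone else waits.
    Enemy acting = action_queue.front();
    action_queue.pop_front();
    action_queue.push_back(acting);
    std::printf("enemy %d acts this tick\n", acting.id);
}
```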
The mod Full Speed Swarm fixes this bug, and with the fix the game is unplayable without collaboration and a lot of skill.
It's also unnoticeable until you die, or unless you do a ton of research on the AI. I used to host pubs with the mod to troll other players, who suddenly found the game impossible to play at lower difficulties.
I think it's possible we get an AI-driven RTS, but the demand is too small right now unless it's a recruitment vehicle for the military.
Whether better AI is fun or not depends on (A) the sorts of advantages/disadvantages it is given and (B) its skill level compared to the player.
When people say they want better AI, they mean they want smarter + human-like AI without "cheating" (more resources, perfect aim, rubber banding, perfect knowledge, etc).
And even if you build a human-like AI, it's not going to be fun for players if it's too good, just like how humans have the most fun playing against other players at their level, not against the top-ranked players in the world.
So to keep AI fun, there are always going to be artificial limitations, and "people want terrible AI" doesn't really capture that.
One example might be games where you can hear footsteps or a branch snap. With stereo audio, you could technically derive exactly where someone is, possibly even which room they are in. When that raw signal is provided directly to a machine, it might accidentally be too good, know which bush you crawled into, and shoot into it.
Yet humans would only be able to go "I think I heard a branch snap" or "someone must be creeping up on me" even though they are given the same signal as the AI.
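As a sketch of that kind of handicap (everything here is hypothetical, not from any particular game): instead of handing the AI the exact source of a sound, blur it by an error that grows with distance, so it gets the same fuzzy "over there somewhere" a human would.

```cpp
// Hypothetical: blur a heard sound's position before the AI sees it, with the
// error growing with distance, so it knows roughly where you are, not which bush.
#include <cmath>
#include <random>

struct Vec2 { float x, y; };

Vec2 fuzz_heard_position(Vec2 listener, Vec2 source, std::mt19937& rng) {
    const float dx = source.x - listener.x;
    const float dy = source.y - listener.y;
    const float dist = std::sqrt(dx * dx + dy * dy);
    // Assumed tuning: a small floor plus ~20% of the distance as positional error.
    std::normal_distribution<float> noise(0.0f, 0.05f + 0.2f * dist);
    return { source.x + noise(rng), source.y + noise(rng) };
}
```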
I don't know about this... I think that the "expert" AIs in RTS games just get more resources (not compute resources, but in-game resources) so they can create more units. I'd love to play against an AI opponent that has the same resource collection speed as the human player.
I still remember the mod scene in original StarCraft (+ Brood War) - it got to the point of supplying alternative AI algorithms, and there were plenty of non-cheating ones that were a noticeable step up in difficulty. At that point I already played competitively with friends and on-line, so they were only very challenging to me; for a new player, or someone casually doing the single player campaign, or just me a year earlier, they'd be impossible to beat.
I wonder how strong such an agent would be today, given that years ago AlphaStar was already on par with pros. All those fun projects (like the Dota 2 one) died out when LLMs took over the scene and RL fell out of favor.
Yeah, cheating with resources or vision is disappointingly common. I remember the Green Tea AI for StarCraft 2 being pretty hard, but it looks like it cheats above "medium" (though the TL wiki does say its "very easy" setting, which gives you 10 minutes to set up, is equivalent to Blizzard's "harder").
Turning museums into a Resident Evil house is a cool idea.