Crystal has an explicit static type system and is actually optimized at the language level for AOT compilation. These features are pretty much required for compiling and maintaining large programs.
This is for a limited subset of Ruby - almost no popular Ruby gems would run under it. It's more like PreScheme [1] (i.e. a subset of a language oriented at compilation to C).
I don't think these compete in the same niches right now. Full Ruby almost certainly requires a JIT.
It's a similar subset to mruby, and it might well end up influencing mruby, which does have its users. But it's almost a different language in some ways.
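As a rough illustration (my own toy example, not taken from any of these projects), the kind of Ruby that resists static compilation is anything where methods only come into existence at runtime:

```ruby
# Statically analyzable: names, arities, and call sites are all
# visible ahead of time, so an AOT subset can handle this.
def area(width, height)
  width * height
end

# Dynamic: these methods don't exist until runtime, so an AOT
# compiler for a Ruby subset can't see them (method_missing is
# worse still).
class Config
  def self.define_setting(name)
    define_method(name) { (@values ||= {})[name] }
    define_method("#{name}=") { |v| (@values ||= {})[name] = v }
  end
end

Config.define_setting(:timeout)

c = Config.new
c.timeout = 30
puts area(3, 4)  # 12
puts c.timeout   # 30
```

A static subset typically has to either forbid `define_method`/`method_missing` outright or restrict them to compile-time-resolvable cases, which is a big part of why most popular gems won't run under it.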
This is what I've been wondering after only a cursory glance ("It...generates optimized C code" from the OP). Interesting that mruby itself got a major version update around the same time (in just the past few days) https://github.com/mruby/mruby/blob/master/doc/mruby4.0.md
There are ways to partially improve, or at least quantify, the specification gap using LLMs: analyze the variance across the formal specifications generated from a single natural language description (e.g. by sampling many formal specs from the same input and comparing them).
See, e.g., "Draft-and-Prune: Improving the Reliability of Auto-formalization for Logical Reasoning" [1].
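A crude sketch of the variance idea (entirely my own toy, not the paper's method): treat each generated formal spec as an executable predicate and measure how often the candidates disagree on sampled inputs. High disagreement flags inputs where the natural language description was ambiguous, i.e. a likely specification gap.

```ruby
# Toy sketch: each "formal spec" is a predicate over inputs.
# Pretend an LLM produced these three candidates from the prompt
# "the function returns true for positive even numbers".
specs = [
  ->(n) { n > 0 && n.even? },   # matches the intent
  ->(n) { n.even? },            # forgot positivity
  ->(n) { n >= 0 && n.even? },  # off-by-one on zero
]

# Fraction of sampled inputs on which the candidate specs
# don't all give the same answer.
def disagreement_rate(specs, samples)
  disagreements = samples.count do |x|
    specs.map { |s| s.call(x) }.uniq.size > 1
  end
  disagreements.to_f / samples.size
end

puts disagreement_rate(specs, (-5..5).to_a)
```

Here the candidates disagree on -4, -2, and 0, so the rate is 3/11; a real pipeline would then prune or re-prompt around exactly those inputs.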
No. Did you even read the article? It talks about the "specification gap", which is the difference between the formalized semantics and the intended semantics.
Every formal method has that problem (including the mentioned trivial ones like SAT and SMT).
C# is a great language with almost unlimited power and great ergonomics (as the article shows), but the .NET CLR (runtime) is a bit overcomplicated with a distinct "Java smell", and packaging and distribution is still meh.
If they could make the developer experience similar to Go, it would rule the world...
You also have the option to do single file deployment where it self-extracts the runtime when you run it. It's not as nice but it works and maintains full compatibility.
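For reference, the single-file, self-contained publish looks roughly like this (check the exact flags against the .NET docs for your SDK version):

```shell
dotnet publish -c Release -r linux-x64 \
  -p:PublishSingleFile=true \
  --self-contained true
```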
Pretty much, yes. For example, reflection is severely limited in .NET AOT compared to the JIT, and runtime-generated code, which is more common than you'd think, can't be done under AOT at all. Go was designed for AOT from the start, so its ecosystem was built around those limitations; it never supported anything more.
It'll just take time for .NET to catch up to the point where the dependencies you need work with AOT builds out of the box.
I actually really like the CLR developer experience next to Java, ngl. I reach for C# in lieu of Java (less J2EE SingletonBeanFactoryManagerInstance slop), but F# in particular is pretty nice to use. Haskell has bad tooling; OCaml is getting better thanks to Jane Street (and if OxCaml gets wide adoption, unboxed types are a big perf win), but if nothing else the lack of a Rider-esque debugger is just a big time sink.
I drank the Go kool-aid, then tried to do some high-performance things the Go way: it didn't work (channels are slow), and I got over it. Still think Go is great for web backends and the like, with a production-grade stdlib.
Maybe the content is great, but the AI writing style is really grating with its staccato sentences and faux-"profoundness". Can't bear it any more, stopped reading.
"You’re not checking logic. You’re checking shape.". Ugh.
Sorry for that, everyone. I did use AI to help me with structure and English. I thought I'd proofread and edited it enough to be readable, but apparently it still smells. I'll update the wording soon.
Or you can just write in your native language, and let us machine-translate it? Just a thought. We are, perhaps, letting ourselves be held back by norms that no longer bear any load.
That's a great idea, in fact. I'll try it out next time. Maybe even a mix, because I do sometimes want to be very specific about some expressions and experiment with wordplay.
The mental model that I am using for online writing is that it is analogous to the spectrum of `pretending <-> acting`. The worst writing (AI or otherwise) looks, sounds, and feels like pretense, like a kid who tucks a towel into his shirt and runs around pretending to be a superhero. Meanwhile, acting, true acting, is invisible; it is a synonym for _being_[1].
That said, a lot of the AI writing feels "procedural", in the sense that most corporate writing (whitepapers, press releases, etc) feel procedural (i.e. the result of a constructed procedure). Before AI, the constructed procedure was basically that a piece of writing passes through a bunch of people (e.g. engineering -> management -> marketing -> website/email), and the output is a bland, forgettable pablum designed to (1) be SEO-friendly, (2) be spam-filter friendly, (3) be easy to ingest, (4) look superficially trustworthy and authoritative (e.g. inflated page count, extra jargon, numbers, plots), (5) look like it belongs to the "scene" or "industry" by imitating all the other corporate writings out there[2].
AI is interesting, in the same way that computers or the internet or an encyclopedia are interesting: how people choose to use it tells you a lot about them. All of those technologies can be used to compensate for a lack of skill (it helps one pretend), or they can be used to forge a skill (it helps one become).
One has to pretend before they can act (I guess? Feels intuitively correct to me). So perhaps AI (and the web, and the computer, and the encyclopedia) is only harmful to the extent that it does not nudge a person towards becoming[3]? And if so, that's a _cultural_ limitation, not a technological one.
[1]: I am not an actor, and so I might be wrong, but that is the impression I get from just watching and analyzing the acting in various films.
[2]: this becomes frustrating when you get criticized for producing something that "reads like $famousSomething", and then you get criticized again for producing something that "does not read like $typeOfFamousSomething".
[3]: No clue how you (plural -- let's bring back "yous") will convince your boss that you did not take the shortcut, because you were trying to "become more".
Maybe for resume cover letters and LinkedIn posts but I haven't met anyone with half decent taste who prefers AI writing, even well prompted, to skillful human writing. I'm not a stranger to using AI for writing tasks by any means but it's only ever a starting point that gets heavily rewritten by both myself and the model.
It's not hard to get them to copy a style, you just have to provide examples and they will happily produce similar text including grammatical and spelling mistakes. The trouble is with the composition and novelty. Most of the big models have had all of the interesting parts hidden behind a wall of RLHF. Local models are better since you can use ones that are not indoctrinated as a "helpful assistant" and also control the system prompt, temperature and see the top K alternate tokens which let you steer them in interesting ways.
>Maybe for resume cover letters and LinkedIn posts but I haven't met anyone with half decent taste who prefers AI writing, even well prompted, to skillful human writing.
That attitude is one, maybe two generations away from extinction. Taste is created by the market, which caters to the young. When enough people have been born into a world in which AI-generated culture and communication is the norm, that is what will define what good taste is. People like you (and me) will just come off like old people yelling at clouds.
We can already see this happening at the fringes. People have relationships with AI, they prefer AIs to real people, they use AI as a primary source of truth, they consider AI generated art to be superior to human work, they trust AI more than people. People identify as AI. AI is filling an emotional, sociological and creative space that an increasingly alienating and hostile society denies to people, for better or worse. Generative AI has only been a thing in popular culture for four years or so and it has already completely transformed human society and human sociology.
Barring a complete collapse of the AI bubble, which seems existentially impossible at this point given how invested our economies and government are in it, that's just what normal is going to be in a decade or so.
Popular taste is guaranteed to be awful since it is driven by economics and fads. That's the type you point out as created by the market and catering to the young. It's a disposable product of consumption used to sell shoes and overpriced paintings.
I don't disagree that it will permeate everything, it already does. It'll just be written by an AI instead of people being paid to find the next style to cop. I don't think it will extinguish human writing, you'll just have AI writing that you feed to official or public channels and then real writing that goes in private or pseudonymous channels. Using AI writing among friends or an in group will still be a faux pas and cringe because it will have become the norm to be rebelled against.
Tangent, but.. It must’ve picked up the faux profoundness on LinkedIn. Those posts I find truly unreadable. It half seriously makes me think anyone being able to post anything was a bad move.
I can 100% guarantee you that if you have a computer with 8GB of RAM, it wouldn't start swapping if you brought up a new process that needed 4 of those 8GB, even though the operating system says "8GB RAM" is in use: most of that "used" memory is file cache that can be reclaimed instantly.