I'd have to take a contrary view on that. It'll take some time for the technologies to be developed, but ultimately managed JIT compilation has the potential to exceed native compiled speeds. It'll be a fun journey getting there though.
The initial order-of-magnitude jump in perf that JITs provided took us from the 2-5x overhead for managed runtimes down to some (1 + delta)x. That was driven by runtime type inference combined with a type-aware JIT compiler.
I expect that there's another significant, but smaller perf jump that we haven't really plumbed out - mostly to be gained from dynamic _value_ inference that's sensitive to _transient_ meta-stability in values flowing through the program.
Basically you can gather actual values flowing through code at runtime, look for patterns, and then inline / type-specialize those by deriving runtime types that are _tighter_ than the annotated types.
I think there's a reasonable amount of juice left in combining those techniques with partial specialization and JIT compilation, and that should get us over the hump from "slightly slower than native" to "slightly faster than native".
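To make that concrete, here's a hand-written sketch in plain JavaScript (the function names are invented for illustration, not taken from any engine) of the kind of transformation a value-profiling JIT could perform automatically: observe that a hot call site almost always sees small-integer arrays, emit a specialized fast path behind a cheap guard, and fall back to the generic path when the guard fails.

```js
// Generic function: must handle any element type the annotations allow.
function sumGeneric(xs) {
  let total = 0;
  for (const x of xs) total += Number(x); // coercion on every element
  return total;
}

// Specialized variant a value-profiling JIT might emit after observing that,
// at this call site, `xs` is always a dense array of small integers.
function sumSpecialized(xs) {
  let total = 0;
  for (let i = 0; i < xs.length; i++) {
    const x = xs[i];
    if (!Number.isInteger(x)) return sumGeneric(xs); // guard fails: "deoptimize"
    total += x;
  }
  return total;
}

console.log(sumSpecialized([1, 2, 3]));     // fast path: 6
console.log(sumSpecialized([1, "2", 3.5])); // guard fails, generic fallback: 6.5
```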
I get it's an outlier viewpoint though. Whenever I hear "managed jitcode will never be as fast as native", I interpret that as a friendly bet :)
> JIT compilation has the potential to exceed native compiled speeds
The battlecry of Java developers riding their tortoises.
Don’t we have decades of real-world experience showing native code almost always performs better?
For most things it doesn’t matter, but it always rubs me the wrong way when people mention this about JIT since it almost never works that way in the real world (you can look at web framework benchmarks as an easy example)
It's not that surprising to people who are old enough to have lived through the "reality" of "interpreted languages will never be faster than about 2x compiled languages".
The idea that an absurdly dynamic language like JS, where all objects are arbitrary property bags with prototype chains that are mutable at runtime, would execute at a cost under 2x raw native performance was just a matter-of-fact impossibility.
Until it wasn't. And the technology reason it ended up happening was research that was done in the 80s.
It's not surprising to me that it hasn't happened yet. This stuff is not easy to engineer and implement. Even the research isn't really there yet. Most of the modern dynamic language JIT ideas which came to the fore in the mid-2000s were directly adapting research work on Self from about two decades prior.
Dynamic runtime optimization isn't too hot in research right now, and it never was to be honest. Most of the language theory folks tend to lean more in the type theory direction.
The industry attention too has shifted away. Browsers were cutting edge a while back and there was a lot of investment in core research tech associated with that, but that's shifting more to the AI space now.
Overall the market value prop and the landscape for it just don't quite exist yet. Hard things are hard.
You nailed it -- the tech enabling JS to match native speed was Self research from the 80s, adapted two decades later. Let me fill in some specifics from people whose papers I highly recommend, and who I've asked questions of and had interesting discussions with!
Vanessa Freudenberg [1], Craig Latta [2], Dave Ungar [3], Dan Ingalls, and Alan Kay had some great historical and fresh insights. Vanessa passed recently -- here's a thread where we discussed these exact issues:
Vanessa had this exactly right. I asked her what she thought of using WASM with its new GC support for her SqueakJS [1] Smalltalk VM.
Everyone keeps asking why we don't just target WebAssembly instead of JavaScript. Vanessa's answer -- backed by real systems, not thought experiments -- was: why would you throw away the best dynamic runtime ever built?
To understand why, you need to know where V8 came from -- and it's not where JavaScript came from.
David Ungar and Randall B. Smith created Self [3] in 1986. Self was radical, but the radicalism was in service of simplicity: no classes, just objects with slots. Objects delegate to parent objects -- multiple parents, dynamically added and removed at runtime. That's it.
The Self team -- Ungar, Craig Chambers, Urs Hoelzle, Lars Bak -- invented most of what makes dynamic languages fast: maps (hidden classes), polymorphic inline caches, adaptive optimization, dynamic deoptimization [4], on-stack replacement. Hoelzle's 1992 deoptimization paper blew my mind -- they delivered simplicity AND performance AND debugging.
That team built Strongtalk [5] (high-performance Smalltalk), got acquired by Sun and built HotSpot (why Java got fast), then Lars Bak went to Google and built V8 [6] (why JavaScript got fast). Same playbook: hidden classes, inline caching, tiered compilation. Self's legacy is inside every browser engine.
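For anyone who hasn't run into those terms, here's a toy, hand-rolled sketch of the inline-caching idea in plain JavaScript (names like makeCachedGetX are made up; real engines key the cache on an internal hidden class and patch generated machine code, not library code): remember the shape last seen at a property-access site and take a fast path while it stays the same.

```js
// A toy monomorphic inline cache for `obj.x`. Real engines record the
// property's offset so the hit path becomes a single indexed load; here we
// just count hits and misses to show the cache behaviour.
function makeCachedGetX() {
  let cachedShape = null;                     // stand-in for a hidden class
  const stats = { hits: 0, misses: 0 };
  function getX(obj) {
    const shape = Object.keys(obj).join(","); // stand-in shape check
    if (shape === cachedShape) {
      stats.hits++;                           // fast path: shape already seen
    } else {
      stats.misses++;                         // slow path: re-learn the shape
      cachedShape = shape;
    }
    return obj.x;
  }
  return { getX, stats };
}

const { getX, stats } = makeCachedGetX();
getX({ x: 1, y: 2 });   // miss: learns shape "x,y"
getX({ x: 3, y: 4 });   // hit: same shape
getX({ x: 5 });         // miss: shape changed
console.log(stats);     // { hits: 1, misses: 2 }
```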
Brendan Eich claims JavaScript was inspired by Self. This is an exaggeration based on a deep misunderstanding that borders on insult. The whole point of Self was simplicity -- objects with slots, multiple parents, dynamic delegation, everything just another object.
JavaScript took "prototypes" and made them harder than classes: __proto__ vs .prototype (two different things that sound the same), constructor functions you must call with "new" (forget it and "this" binds wrong -- silent corruption), only one constructor per prototype, single inheritance only. And of course == -- type coercion so broken you need a separate === operator to get actual equality. Brendan has a pattern of not understanding equality.
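A few runnable lines (plain JavaScript) showing the traps being described, for anyone who hasn't been bitten by them: .prototype lives on the constructor function while the instance's delegation link is __proto__ / Object.getPrototypeOf, forgetting "new" silently mis-binds "this", and == coerces where === doesn't.

```js
function Point(x) { this.x = x; }

const p = new Point(1);

// Two different things that sound the same:
console.log(Object.getPrototypeOf(p) === Point.prototype); // true
console.log(Point.prototype === Point.__proto__);          // false

// Forgetting `new`: in sloppy mode `this` binds to the global object and the
// call silently pollutes it; in strict mode / modules it throws. Either way,
// no Point is constructed.
try {
  const q = Point(2);            // note: no `new`
  console.log(q, globalThis.x);  // undefined 2 (sloppy mode: global leaked)
} catch (e) {
  console.log("strict mode:", e.constructor.name); // TypeError
}

// Coercing equality vs actual equality:
console.log(0 == "");            // true  (coercion)
console.log(0 === "");           // false
console.log(null == undefined);  // true
console.log(null === undefined); // false
```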
The ES6 "class" syntax was basically an admission that the prototype model was too confusing for anyone to use correctly. They bolted classes back on top -- but it's just syntax sugar over the same broken constructor/prototype mess underneath. Twenty years to arrive back at what Smalltalk had in 1980, except worse.
Self's simplicity was the point. JavaScript's prototype system is more complicated than classes, not less. It's prototype theater. The engines are brilliant -- Self's legacy. The language design fumbled the thing it claimed to borrow.
Vanessa Freudenberg worked for over two decades on live, self-supporting systems [9]. She contributed to Squeak EToys, Scratch, and Lively. She was co-founder of Croquet Corp and principal engineer of the Teatime client/server architecture that makes Croquet's replicated computation work. She brought Alan Kay's vision of computing into browsers and multiplayer worlds.
SqueakJS [7] was her masterpiece -- a bit-compatible Squeak/Smalltalk VM written entirely in JavaScript. Not a port, not a subset -- the real thing, running in your browser, with the image, the debugger, the inspector, live all the way down. It received the Dynamic Languages Symposium Most Notable Paper Award in 2024, ten years after publication [1].
The genius of her approach was the garbage collection integration. It amazed me how she pulled a rabbit out of a hat -- representing Squeak objects as plain JavaScript objects and cooperating with the host GC instead of fighting it. Most VM implementations end up with two garbage collectors in a knife fight over the heap. She made them cooperate through a hybrid scheme that allowed Squeak object enumeration without a dedicated object table. No dueling collectors. Just leverage the machinery you've already paid for.
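I won't pretend to reproduce her actual hybrid scheme here, but a highly simplified sketch of the underlying idea (plain JavaScript, invented class names, not SqueakJS code): represent each guest object as an ordinary host object so the host GC can reclaim it, and keep only tenured objects on an explicit list so the VM can still enumerate its heap without a separate object table or a second collector.

```js
// Extremely simplified illustration of "cooperate with the host GC".
// Guest objects are ordinary JS objects; only "old space" is linked into an
// explicit list the VM can walk. Young objects that become unreachable are
// simply collected by the JavaScript GC -- no second collector needed.
class GuestObject {
  constructor(className, fields) {
    this.className = className;
    this.fields = fields;   // references to other GuestObjects or primitives
    this.nextObject = null; // non-null only once tenured into old space
  }
}

class GuestHeap {
  constructor() { this.firstOldObject = null; }

  // Tenure an object: link it so heap enumeration (allInstances-style
  // operations in a Smalltalk VM) can find it by walking the list.
  tenure(obj) {
    obj.nextObject = this.firstOldObject;
    this.firstOldObject = obj;
  }

  *enumerate() {
    for (let o = this.firstOldObject; o !== null; o = o.nextObject) yield o;
  }
}

const heap = new GuestHeap();
const point = new GuestObject("Point", { x: 3, y: 4 });
heap.tenure(point);
new GuestObject("Temp", {});  // never tenured: the host GC reclaims it
console.log([...heap.enumerate()].map(o => o.className)); // ["Point"]
```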
But it wasn't just technical cleverness -- it was philosophy. She wrote:
"I just love coding and debugging in a dynamic high-level language. The only thing we could potentially gain from WASM is speed, but we would lose a lot in readability, flexibility, and to be honest, fun."
"I'd much rather make the SqueakJS JIT produce code that the JavaScript JIT can optimize well. That would potentially give us more speed than even WASM."
Her guiding principle: do as little as necessary to leverage the enormous engineering achievements in modern JS runtimes [8]. Structure your generated code so the host JIT can optimize it. Don't fight the platform -- ride it.
She was clear-eyed about WASM: yes, it helps for tight inner loops like BitBlt. But for the VM as a whole? You gain some speed and lose readability, flexibility, debuggability, and joy. Bad trade.
This wasn't conservatism. It was confidence.
Vanessa understood that JS-the-engine isn't the enemy -- it's the substrate. Work with it instead of against it, and you can go faster than "native" while keeping the system alive and humane. Keep the debugger working. Keep the image snapshotable. Keep programming joyful. Vanessa knew that, and proved it!
Yeah I've heard this my whole career, and while it sounds great it's been long enough that we'd be able to list some major examples by now.
What are the real world chances that a) one's compiled code benefits strongly from runtime data flow analysis AND b) no one did that analysis at the compilation stage?
Some sort of crazy off-label use is the only situation I think qualifies, and that's not enough.
Compiled Lua vs LuaJIT is a major example imho, though maybe it's not especially pertinent given the looseness of the Lua language. I do think it demonstrates that having a tighter type system at runtime than at compile time (which can in turn yield real performance benefits) is a sound concept, however.
The major Javascript engines already have the concept of a type system that applies at runtime. Their JITs will learn the 'shapes' of objects that commonly go through hot-path functions and will JIT against those with appropriate bailout paths to slower dynamic implementations in case a value with an unexpected 'shape' ends up being used instead.
There's a lot of lore you pick up with Javascript when you start getting into serious optimization with it; and one of the first things you learn in that area is to avoid changing the shapes of your objects because it invalidates JIT assumptions and results in your code running slower -- even though it's 100% valid Javascript.
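A minimal illustration of that lore (plain JavaScript, helper names made up): both factories below are perfectly valid, but the first gives every object one stable shape while the second forces the engine to track several, which tends to de-specialize the hot code that consumes them.

```js
// Monomorphic: every object is created with the same shape, in the same order.
function makePointStable(x, y) {
  return { x: x, y: y };
}

// Polymorphic: properties are added after construction and one is optional,
// so consumers of these objects see several different shapes.
function makePointUnstable(x, y, label) {
  const p = { x: x };
  p.y = y;                                   // shape transition after construction
  if (label !== undefined) p.label = label;  // optional property: more shapes
  return p;
}

function norm2(p) { return p.x * p.x + p.y * p.y; }  // hot consumer

// Both are 100% valid JavaScript; the second just gives the JIT a harder time.
console.log(norm2(makePointStable(3, 4)));        // 25
console.log(norm2(makePointUnstable(3, 4, "a"))); // 25
```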
Totally agree on js, but it doesn't have the same easy same-language comparison that you get from compiled Lua vs LuaJIT. Although I suppose you could pre-compile JavaScript to a binary with e.g. QuickJS, I don't think that's as apples-to-apples a comparison as compiled Lua vs LuaJIT.
Any optimizations discovered at runtime by a JIT can also be applied to precompiled code. The precompiled code is then not spending runtime cycles looking for patterns, or only doing so in the minimally necessary way. So for projects which are maximally sensitive to performance, native will always be capable of outperforming JIT.
It's then just a matter of how your team values runtime performance vs other considerations such as workflow, binary portability, etc. Virtually all projects have an acceptable range of these competing values, which is where JIT shines, in giving you almost all of the performance with much better dev economics.
I think you can capture that constraint as "anything that requires finely deterministic high performance is out of reach of JIT-compiled outputs".
Obviously JITting means you'll have a compiler executing sometimes along with the program which implies a runtime by construction, and some notion of warmup to get to a steady state.
Where I think there's probably untapped opportunity is in identifying these meta-stable situations in program execution. My expectation is that there are execution "modes" that cluster together more finely than static typing would allow you to infer. This would apply to runtimes like wasm too - where the modes of execution would be characterized by the actual clusters of numeric values flowing to different code locations and influencing different code-paths to pick different control flows.
You're right that, on the balance of things, trying to, say, allocate registers at runtime will necessarily allow for less optimization scope than doing it ahead of time.
But, if you can be clever enough to identify, at runtime, preferred code-paths with higher resolution than what (generic) PGO allows (because now you can respond to temporal changes in those code-path profiles), then you can actually eliminate entire codepaths from the compiler's consideration. That tends to greatly affect the register pressure (for the better).
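As a toy illustration of that mode-detection idea, here's a hand-rolled sketch in plain JavaScript (names invented; a real JIT would do this below the language level, and with a much cheaper guard): count which value mode a hot function actually sees, and once one mode dominates, install a variant with the cold path removed, behind a guard that falls back to the generic code if the mode shifts.

```js
// Toy "temporal profiling": count which value mode a hot function actually
// sees, and once one mode dominates, install a specialized variant with the
// cold path removed, guarded so we can bail out if the mode shifts.
// (The guard here is deliberately naive and O(n).)
function makeScale(threshold = 1000) {
  let nonNegativeCalls = 0;
  let fastPath = null;

  function generic(xs, k) {
    return xs.map(x => (x < 0 ? 0 : x * k)); // clamps negatives to zero
  }

  return function scale(xs, k) {
    const allNonNegative = xs.every(x => x >= 0);
    if (fastPath) {
      if (allNonNegative) return fastPath(xs, k); // guard holds: fast path
      fastPath = null;                            // mode shifted: bail out
    }
    if (allNonNegative && ++nonNegativeCalls > threshold) {
      // Observed mode: values are never negative here, so drop the clamp.
      fastPath = (ys, k2) => ys.map(y => y * k2);
    }
    return generic(xs, k);
  };
}

const scale = makeScale(3);
for (let i = 0; i < 5; i++) scale([1, 2, 3], 2); // fast path installs after the 4th call
console.log(scale([1, 2, 3], 2));  // [ 2, 4, 6 ] via the specialized variant
console.log(scale([-1, 2], 2));    // guard fails: generic path -> [ 0, 4 ]
```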
It might be interesting just to profile some wasm executions of common programs and see if there are transient clusterings of control-flow paths that manifest during execution. It'd be a fun exercise...
The American working class doesn't like to acknowledge its own existence or assert its self-worth. There's no real self identity for that class in America.
In fact, a huge number of the people that are in that class would resent you for classifying them in this way. And the same is true for those in the upper middle class, or elites.
Secondly, trying to scope the xenophobia problem to just the working class is itself a bit of a misdirection. Plenty of that comes from the swaths of upper middle class white collar folks. And plenty of it comes from second gen immigrants who are eager to be counted among the natives.
The xenophobia _is_ the substitute American culture provides as a filler for the vacuum left by the lack of any sort of class identity. Everybody falls over themselves demonstrating how they can be "more American" in one way or the other. Who is a "real" American, what their qualities are, whether this particular thing or that particular thing is more or less American, etc. etc.
It's an alternate focus toward which to direct all the shame the culture demands from the poor.
A considerable part of this is the fact that in a society where utilizing these programs is stigmatized to the degree it is in the USA, people who see themselves as honest tend to avoid utilizing them.
And even those who are less than honest, but have a sense of propriety, would understand that the correct, culturally approved time to engage in these activities is AFTER one acquires a significant amount of wealth, when entitlements are knighted to become "economic incentives".
I really don't understand what any of this has to do with "trust", especially of the project or code. If anything, people who want to gain undeserved trust would be incentivized to appear to follow a higher standard of norms publicly. The public comments would be nice and polite and gregarious and professional, and the behaviour that didn't meet that standard would be private.
FWIW I've never programmed a line of code in zig and I don't know who this developer is.
All I got from it was "seems like GitHub is starting to deteriorate pretty hard and this guy's fed up and moving his project and leaving some snark behind".
I think there's still a category theoretic expression of this, but it's not necessarily easy to capture in language type systems.
The notion of `f` producing a lazy sequence of values, `g` consuming them, and possibly that construct getting built up into some closed set of structures - (e.g. sequences, or trees, or if you like dags).
I've only read a smattering of Pi theory, but if I remember correctly it concerns itself more with the behaviour of `f` and `g`, and more generally bridging between local behavioural descriptions of components like `f` and `g` and the global behaviour of a heterogeneous system that is composed of some arbitrary graph of those sending messages to each other.
I'm getting a bit beyond my depth here, but it feels like Pi theory leans more towards operational semantics for reasoning about asynchronicity and something like category theory / monads / arrows and related concepts lean more towards reasoning about combinatorial algebras of computational models.
You'd want to have the alteration reference existing guides to the current implementation.
I haven't jumped in headfirst to the "AI revolution", but I have been systematically evaluating the tooling against various use cases.
The approach that tends to have the best result for me combines a collection of `RFI` (request for implementation) markdown documents to describe the work to be done, as well as "guide" documents.
The guide documents need to keep getting updated as the code changes. I do this manually but probably the more enthusiastic AI workflow users would make this an automated part of their AI workflow.
It's important to keep the guides brief. If they get too long they eat context for no good reason. When LLMs write for humans, they tend to be very descriptive. When generating the guide documents, I always add an instruction to tell the LLM to "be succinct and terse", followed by "don't be verbose". This makes the guides into valuable high-density context documents.
The RFIs are then used in a process. For complex problems, I first get the LLM to generate a design doc, then an implementation plan from that design doc, then finally I ask it to implement it while referencing the RFI, design doc, impl doc, and relevant guide docs as context.
If you're altering the spec, you wouldn't ask it to regen from scratch, but use the guide documents to compute the changes needed to implement the alteration.
Yes, but that is usually more related to pay/benefits. At Google (from what I heard) contractors are put on the bad projects, maintenance work, or support functions. As in, there is a big separation between work done by full-time employees and contractors in most teams.
I think FTE is mostly used as a 'unit'. E.g. if two people work on something 50% of the time, you get one "FTE-equivalent", as there is roughly one full-time employee of effort put in.
Though in this context it just seems to be the number of people working on the code on a consistent basis.
* “Full Time Employee” (which can itself mean “not a part-timer” in a place that employs both, or “not a temp/contractor” [in which case the “full-time” really means “regular/permanent”]) or
* “Full Time Equivalent” (a budgeting unit equal to either a full time worker or a combination of part time workers with the same aggregate [usually weekly] hours as constitute the standard for full-time in the system being used.)
In a society where having a job is, for the vast majority not in the gilded classes, the only mechanism by which a person can secure their core needs.. losing a job is indeed a pitiable situation for most.
If we've built a society that when it "pivots" leaves swathes of people smeared out as residual waste, I'd argue we should feel bad.
We've certainly reached a point of technological advancement where many of these consequences at the individual level are avoidable. If they're still happening, it's because we've chosen this outcome - perhaps passively. But the clear implication would be that we're collectively failing ourselves, as a species that takes some degree of pride in its intelligence.
And we should feel bad about that failure. It's OK to feel bad about that failure. We tend not to improve things we don't feel bad about.
Unfortunately, the practical effect of whatever policy comes out of this theorycrafting has left your media landscape an absurd and abject failure, where the idea of objective truth being open to the highest bidder and allowed to change week by week, or day by day, without challenge.. is a reality Americans now live every day.
If the theory is "sensible", who cares? At some point you do want to connect it to reality and outcomes, no?
Unfortunately it isn’t that simple. The opposite of our media landscape is countries that think they have free speech but really don’t, like most of Europe.
I’ll take having all the information in the world (true or false, purposefully curated for propaganda or organically reported) over any society that locks people up for social media posts deemed “fake”.
I have faith both in the marketplace of ideas leading to the best outcomes, and that the ability to lock people up over false speech will be weaponized eventually.
The American media landscape is the only possible result of true freedom of speech combined with the internet. It’s faaaaar from perfect but I do believe it’ll be the best in the end.
But right now America is factually less free than either Europe or my own country. You simply do not have due process anymore. Most of the protections of your constitution have been interpreted away to nothingness.
I just don't see how these so-called valuable principles have actually materially served your people in being able to protect or defend the values you claim to hold.
Faith is fine, but you do need to evaluate ground truth at the end of the day. Outcomes matter.
That’s just propaganda. If we didn’t have due process, Donald Trump would be in jail. Or, if you think he’s the reason we lost it, half the Biden administration would. The idea that the legal system has somehow melted down in the last nine months is just scaremongering.
Norms are being violated, for sure, and the courts are being pushed to determine the bounds of the law. I won’t say I’m a fan of most of it, but it’s a far cry from lack of freedom.
I don’t know what your country is, so I can’t respond, but if you can be locked up over a social media post (assuming reasonable exemptions like direct incitement to violence) you’re not free. You just have been told you are.
The keystone freedom is free speech and almost nowhere else truly has it. It’s a spectrum for sure, and Europe is a lot closer than, say, China, but we’re the far extreme.
Any good outcomes also come from that same freedom of speech. It’s a double-edged sword, for sure. You have to take your anti-vax movement along with your Wikipedia.
> That’s just propaganda. If we didn’t have due process, Donald Trump would be in jail.
Well, your country does still offer its protections to the wealthy and powerful. It's just that regular people have less of them than we do in freer countries.
> Or, if you think he’s the reason we lost it, half the Biden administration would. The idea that the legal system has somehow melted down in the last nine months is just scaremongering.
I see this as coping, to be honest. Americans simply have been told that they are the vanguard of freedom for so long that they cannot imagine a world where their freedom is somehow lesser than others.
But as someone who grew up in America and emigrated out, I can tell you for a fact that Americans are less free than Canadians.
On average, the Canadian government gives itself less of a leeway to abuse people.
If you insult a police officer in America, that officer can abuse you and take away your rights and the probability of consequences for that officer is far lower than the probability in my country.
My country doesn't have a constitution that protects me from unreasonable searches and seizures, and yet Americans have less protection from unreasonable search and seizure despite their constitution - due to the loophole of civil forfeiture.
In your country, your government can pass a law to criminalize you, and then that makes it legal for the government to turn you into a slave. That's not allowed in my country.
Speedy trials in your country are only reserved for the rich.
Americans simply have less freedom than the rest of the first world. It's extremely hard for them to accept because of the propaganda they've been subject to.
But as someone who grew up all over America, and has seen and lived and experienced more of it than most Americans, I know for a fact that they are wrong.
If you don't have the money to pay for freedoms in America, you have far fewer than someone from my country does.
Yeah again, this is all a cartoon. Cops everywhere now have body cameras on nearly all the time. Freedom of speech gives us the right to video them and they can barely do anything in public anymore without five people doing so. Contempt of cop beatings still exist sometimes I’m sure, everywhere, but it’s hardly a thing most people are exposed to. If we were having this discussion in 1975 I’d grant you this point, it’s dramatically reduced now.
I was prosecuted for a misdemeanor when I was 18 and broke. I got a free lawyer (as the constitution says) who did a great job and the whole thing was over in a month. I was not rich. I don’t know what TV shows make you think our government is just locking people up willy nilly, it isn’t. (Our drug laws lock a lot of people up, but they aren’t that different or more draconian than most places, just the number of people who do drugs is, and there are countries that execute people for drug offenses that are misdemeanors here.)
The government cannot pass a law to criminalize you; criminalizing things is never retroactive. I assume by the slavery thing you mean prison labor. That actually is in the constitution, and is crazy. We're working on it. Same with asset forfeiture.
The idea that because we have some areas in which we are less free than other countries we are less free in total is ridiculous. The fact that you say things like “Americans don’t have due process” is a strong indicator of internalized propaganda.
And I’m not some flag waving patriot American Exceptionalist by any means, I’ve traveled quite a bit more than most. But the one thing we do best is individual liberties. It’s why we’re where we are in the grand scheme of human history and Canada is basically just our suburb enjoying all of the benefits (national security with next to no defense budget, unlimited free trade a short truck ride away) while avoiding the cost.
Although we are disagreeing, I hope that this is not in an antagonistic sense - I do find this conversation interesting because it's not often the opportunity arises to discuss this topic.
As someone who was raised American, went to American schools, lived and breathed American culture for over a decade as I went from child to adult.. moving away and living and breathing Canadian culture has been an enlightening experience.
Getting back to the topic..
These rebuttals really fall flat to my ears. They sound like technicalities that are constructed to paper over the underlying reality. My feelings on this topic aren't from propaganda, but from having experienced how people feel, act, and behave when I was growing up in America.
It's only after I moved to Canada that I realized that most Americans have to live in fear of police. Police are able to break laws at whim and abuse people's rights, and the mechanism for recourse is so inaccessible to the average person that it might as well not exist. I thought this was normal and didn't detract from "freedom" when I was growing up.
Now, this happens in Canada too, but on average they are _less_ able to abuse people. They still do, but the government and society does a better job of ensuring consequences in more of those situations.
The institutionalized pipeline to slavery that exists in America doesn't exist in Canada. Now, this one is something that affected me less on a personal level, because that institutionalized pipeline is targeted largely at black people, and I'm not black.
That said, if I was black, and in America.. the processed plant flower I'm lighting up and enjoying this saturday in my basement would be very much a direct threat to my freedoms. That would be enough, in many parts of America, to brand me as a dangerous threat to society. And it would be enough for my freedoms to be taken away by the state, and then for my labour to be rented out to private companies against my will.
This is not a hypothetical circumstance. This is a reality that tens of thousands of Americans live. This is on the ground reality.
But really for me, the emotional aspect is how people just live in less fear of the government here. Their government, on average, abuses them less. It's less capricious. It's less mean to them. It doesn't step on them as much as the American government steps on Americans.
But you do have to live and breathe it to understand the change in mentality.
> But the one thing we do best is individual liberties
This is a cultural mythology. An earnest review of the evidence shows that America is, in real terms of delivering liberties to its people, at the back of the pack of the cohort of first world nations.
> It’s why we’re where we are in the grand scheme of human history and Canada is basically just our suburb enjoying all of the benefits (national security with next to no defense budget, unlimited free trade a short truck ride away) while avoiding the cost.
I'm not too concerned about the place of Canada in "human history". The human suffering it seems to entail to gain that acclaim doesn't really seem worth it.
You're entirely right about your other points though. Canada has benefited greatly from the US's economic engine. In fact, I think part of the reason Canadians enjoy more freedom than Americans is because of this.
It's adjacent to the American market, but segregated enough to make it a much smaller market. This has historically made it less interesting for powerful commercial interests to come meddle in Canadian political affairs and laws, and over time that means Canada has been able to protect its individual liberties better.
That pressure to undermine freedoms through loopholes, creative interpretation, and just straight up ignoring some of them.. that hasn't been as high in Canada, and that's definitely a circumstantial reality having to do with its proximity to the USA.
I tend to agree with this sentiment, but my takeaway is slightly different.
People who would describe themselves as supporters of "capitalism", as well as supporters of "communism" or "socialism", are not able to admit that their belief systems are actually religious in structure. Not spiritual perhaps, but effectively "secular religions".
Both capitalism and its nemesis arose in the mid 1900s, when humanity was obsessed with modernist thinking about "solving problems once and for all". And in that context, the people fell in love with these two "clean systems". A more perfect set of rules.
Sure, capitalism doesn't claim to be the most powerful god. But as a surrogate, it claims to be "the least imperfect system". Which is structurally the same claim: declaring the scripture to be some apex that is not surpassable.
The main difference between communism and capitalism was in how each was implemented. The USSR went full-tilt ideologically rigid, and collapsed very quickly. The US didn't go full-tilt capitalism. It implemented a hybrid system with a high marginal tax, welfare programs, subsidies, labour unions, public works projects, along with a market system, and that hybrid non-ideologically-rigid model served it well.
Around the time it was clear the USSR was collapsing, the USA went hard tilt in favour of ideological purity in capitalism: a systematic series of clawbacks in the tax regime, privatization, and the elimination of labour unions.
As they leaned into the religion, it was used against them, much like the communist religion was used against the people of the USSR. And now they have been robbed of their prosperity, of the value of their efforts, much like the people in the USSR were robbed.
Nice read, but we also have democracy to prevent these things, yet it still feels effectively hijacked by fictional constructs like capitalism and lobbying power.
Theoretically we should be able to think of the majority, or of ourselves, and end up with a good system,
but there's also a feeling of a lack of choice, I suppose; the elections feel like a pick between just two parties, choosing the lesser evil (I think Zohran is cool though, in the Democratic party, and maybe he could signify some good things I guess).
Personally I feel we need to focus on the incentives and competency of people more than anything, and try to vote on that rather than on what they say, I suppose.
We don't have democracy, because the people with the most money can use a century of learning how to manipulate people through mass propaganda, advertising, PR, and spin to get the results they want. People don't form political opinions in a vacuum; they are formed by the messages they receive.
'Both capitalism and its nemesis arose in the mid 1900s, when humanity was obsessed with modernist thinking about "solving problems once and for all". And in that context, the people fell in love with these two "clean systems". A more perfect set of rules.'
All of this is junk. Karl Polanyi famously puts the birth of capitalism very late compared to other important thinkers, in 1834, by defining it as characterised by markets of fictitious commodities, i.e. stuff like labour, land, money. More mainstream would be to point to the Renaissance or the British 16th century.
The idea that capitalism and communism would be dependent on an art movement of the early 20th century is quite bizarre; the Communist Manifesto was published in 1848, and by the late 19th century, when modernism started to form, unions and communist parties were already common.
Actually, modernism is a reaction to the apparent stalling of 'progress', WWI, and nostalgia for the optimism of the early modern period, i.e. from 1500 to the late 1800s. In part it was also a reaction to what is usually called modern physics, i.e. things like Newtonianism and ether hypotheses breaking down due to Michelson-Morley and the early study of quantum phenomena, relativity and so on.
> All of this is junk. Karl Polanyi famously puts the birth of capitalism very late compared to other important thinkers, in 1834, by defining it as characterised by markets of fictitious commodities, i.e. stuff like labour, land, money. More mainstream would be to point to the Renaissance or the British 16th century.
Once again, I'm not referring to theorycraft here. I'm talking about the pragmatics of it.
"Capitalism" as an ideological polemic that stood opposed to "Communism" was a concept that society adopted in the mid 1900s.
What you're talking about is some labeling of some social and economic mechanisms.
Marx might have described communism. But when the USSR came to power, the specific brand of communist _ideology_ that was adopted by the government was its own thing, its own creature and entity.
Likewise, many theorists might have described a loose economic structure as "capitalism", but "Capitalism, Freedom, and American Pie", the ideological fixpoint that was sold to society as something to aspire to, was something entirely different from the academic theorycraft you're referring to.
another absurd ahistorical comment on HN, where capitalism apparently arose in the mid 20th century despite the long-standing pre-existence of stock issuing multinationals, wage laborers, currency-mediated trade, reserve banking, etc.
The American ideological fixture of Capitalism certainly did arise then. I'm not talking about the general descriptive academic theory that labels certain loose economic and social models as capitalism. I'm talking about the capital C capitalism, standing opposite to capital C communism. The USA vs USSR, the grand battle of ideologies.
We remember that right?
The ideology was born in the mid 1900s, in the middle of modernist fervour where humanity believed itself to be on the cusp of some sort of transformation into a kind of godhood. We had invented flight, we had harnessed light itself, we had controlled temperature, we had learned how to build buildings of any shape and size. And likewise we turned our attention to building a machine for people.
Set up the right rules, and everything else will follow, the ideologies posit.