'widespread armed conflict in an increasingly vast fantasy universe, the introduction of housing seems to have spurred Blizzard into also pitching it to a new audience as a cozy house-building simulator. It’s simply that to get new furniture, you occasionally may have to go to another dimension and beat it out of a dragon.'
I wonder if a similar philosophy could be argued for encrypted containers, given that analysis of the binary would produce no discernible file structures, names, or matching content at any scale.
Consider a software download site which only hosted fully encrypted container binaries. Those binaries use a time-based key (say, decryption only succeeds when the clock returns some value x representing a future or past date, reached naturally or set manually...). Go a step further: a hidden container which always decrypts to a partition with decoy content unless the clock returns a non-existent date (Feb 29 in a non-leap year?)
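A minimal sketch of that decoy scheme, assuming nothing beyond SHA-256 key derivation and Fernet symmetric encryption from Python's cryptography package; every name here (derive_key, build_container, the impossible HIDDEN_DATE) is hypothetical illustration, not any real tool's API:

```python
import base64
import hashlib
from cryptography.fernet import Fernet, InvalidToken

HIDDEN_DATE = "2025-02-30"  # impossible date: no honest clock ever returns it

def derive_key(passphrase: str, date_str: str) -> bytes:
    # Fernet wants a urlsafe-base64 32-byte key; SHA-256 yields exactly 32 bytes.
    digest = hashlib.sha256(f"{passphrase}|{date_str}".encode()).digest()
    return base64.urlsafe_b64encode(digest)

def build_container(passphrase: str, decoy: bytes, hidden: bytes):
    # Decoy volume is keyed off a fixed label, hidden volume off the impossible date.
    return (
        Fernet(derive_key(passphrase, "decoy")).encrypt(decoy),
        Fernet(derive_key(passphrase, HIDDEN_DATE)).encrypt(hidden),
    )

def open_container(container, passphrase: str, clock_date: str) -> bytes:
    decoy_blob, hidden_blob = container
    try:
        # Only a deliberately mis-set clock unlocks the hidden payload.
        return Fernet(derive_key(passphrase, clock_date)).decrypt(hidden_blob)
    except InvalidToken:
        # Every real date falls through to the decoy.
        return Fernet(derive_key(passphrase, "decoy")).decrypt(decoy_blob)

c = build_container("hunter2", b"vacation photos", b"the real payload")
print(open_container(c, "hunter2", "2026-02-14"))  # b'vacation photos'
print(open_container(c, "hunter2", HIDDEN_DATE))   # b'the real payload'
```

Of course, shipping two visibly distinct blobs gives the game away; a real design would make the hidden volume indistinguishable from the container's free space, which is roughly the plausible-deniability trick VeraCrypt's hidden volumes already use, minus the clock gimmick.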
IP ambivalence at its finest; AI tech bros would approve.
That this is an actual rule, and that other versions of it have been a thing for years, makes me further convinced we are on the falling edge of capitalist society.
TLDR:
"soon people will be able to sit at a computer and create a movie 'indistinguishable from what Hollywood now releases.'"
To be frank, 'soon' is now: most Hollywood movies are already cut into action shots under 15 seconds, and dialog scenes can be green-screened by anyone with a grid suit and enough time.
The key word is "feel", which has a direct and causal relationship to societal programming, which in turn is directly impacted, if not dictated, by media/marketing. Both are heavily influenced by big players who encourage the consumer to feel guilty while paying both for the resource they are using AND its markup, which is ostensibly spent on marketing, and so the consumption-guilt "feeling" feedback loop grows.
I wouldn't say NVDA is completely out, but the chess moves in response are tough: release a new chip, obsoleting the existing lines, and take the hit of billions in defaults on hardware.
TLDR: For now, everyone is sold out of tokens: a ridiculous percentage of Nvidia cards are selling every token they generate, every token generated by Google's TPUs sells, ditto Amazon's Trainium and Groq's silicon giants (they don't really name their chips, and the chips are like 30 cm in diameter, so let's go with giants) ... and Nvidia B200s are by far the cheapest way to generate tokens, and are selling at something like double the speed they can be produced.
Once the AI craze slows, the most surprising thing is going to happen: Nvidia sales will go up. Why? Because it's older cards that will get priced out first, and it will become a matter of survival for datacenter companies to fill datacenters that currently run older hardware with the newest Nvidia hardware ...
That's the bull case. Under unlimited token demand, Nvidia wins big. Under slowing token demand, Nvidia actually wins bigger, for a while, and only then slows. For now, everything certainly seems to indicate demand is not slowing. Ironically, under slowing demand, it's China that will suffer in this market.
And the threat? Well, it is possible to beat Nvidia's best cards in intelligence, in usefulness, because the human mind is doing it, on 20W per head (200W for the "full machine"). And long story short: we don't know how, but obviously it's possible. Someone might figure it out.
“Nvidia wins either way” assumes the game stays the same — but Google, Amazon, and Meta aren’t building custom silicon to beat Nvidia on price, they’re building it to never need Nvidia at all. The moat isn’t the chips, it’s CUDA lock-in, and every major player is racing to break it.
I would argue it just means the game doesn't suddenly change all at once. If the game changes slowly, the short term will still be good for Nvidia; it will take quite a while for the shift to actually hurt them.
Google, Amazon, and Meta are to some extent solving the wrong problem, or not solving the whole problem. They're designing chips ... which they can't build, because they don't have the infrastructure and don't have contracts as long-running as Nvidia's. They can't match Nvidia even at 3nm, or at 10nm ... Now, maybe they can go with Intel (though several have tried and given up), but ...
Nvidia GPUs are still, at their core, reliant on the PC architecture.
Inferencing on Nvidia cores will soon be like encoding an H.265 stream on a CPU.
I expect custom-built TPUs will gain progressively more advanced hardware acceleration, while legacy aspects of the CUDA architecture (PCIe, the NVMe bus, CPU interrupts, reliance on system RAM for index tables, etc...) will eventually limit Nvidia's innovation without architecture changes, filling in their moat and leveling the playing field for Google/Amazon/eventually Apple.