Visit this link and click on the "Thought for 11m38s" text: https://chatgpt.com/share/693ca54b-f770-8006-904b-9f31a58518... - that will show you exactly what it spent those 11 minutes doing, most of which was executing Python code using the reportlab library to generate PDF files, then visually inspecting those PDF files and deciding to make further tweaks to the code that generates them.
I had a major WTF moment there, until I realized that's probably for a hex driver (and thus something totally different than what I think of when someone says "impact wrench").
There are documented cases of pet dogs with great relationships with their owners eating their owners' corpses after the owners died of some unrelated cause. Possibly due to starvation in some cases.
In that video, were the ewe and lion cub pets or wild animals?
> They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.
Reminds me of the Crack.com interview with Jonathan Clark:
Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist began by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say, this is not the best way to model a brick wall.
this is very, very common, as there's only a handful of schools that teach this. Displacement mapping on a single poly is the answer. Game-dev-focused schools cover this, but at any other visual media school it's "build a brick, array the brick 10,000 times".
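The contrast can be sketched in a few lines. This is a hypothetical numpy toy, not anything from a real pipeline: instead of instancing thousands of brick meshes, you keep one flat grid and push each vertex out along its normal according to a height texture.

```python
import numpy as np

def displace_grid(heightmap, scale=1.0):
    """Turn a flat grid into wall geometry by displacing each
    vertex along +Z according to a height texture."""
    h, w = heightmap.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # One vertex per texel: (x, y, z), with z read from the heightmap.
    verts = np.stack([xs, ys, heightmap * scale], axis=-1).astype(float)
    return verts.reshape(-1, 3)

# Hypothetical "mortar groove" height texture: bricks at 1, grooves at 0.
tex = np.ones((4, 8))
tex[::2, ::4] = 0.0
wall = displace_grid(tex, scale=0.2)
print(wall.shape)  # one displaced vertex per texel, instead of thousands of brick meshes
```

The same idea on a GPU would displace in the vertex shader (or fake it entirely with a normal map), but the point stands: the detail lives in a texture, not in geometry.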
I had the opposite reaction. As someone who was on team PSX, the wobbly jank is pleasingly nostalgic. Didn't someone say that the limitations and artifacts of the obsolete media of the past become the sought-after aesthetics of the future?
They are certainly sometimes a key part of the retro look that makes things nostalgic.
But even during the PSX era I found it distracting and annoying to look at so I can't say I have any nostalgia for it even now in the way I do for N64-style low-poly 3-D games or good pixel art.
This is all subjective, so I suppose I should add an IMO. Even back then, many games were preferable on the N64, like Mega Man Legends. What the PS1 offered that was superior was storage, which allowed for more music and FMVs, and also for voice acting, which is probably why MGS is still talked about to this day. My guess is that the lack of detail helps immersion the same way reading a novel does, and I imagine the PS1, with its storage, would have been the perfect vehicle for visual novels, but that genre still isn't popular anywhere outside Japan.
Even with realism, Dreamcast ports were better overall, and considering the latest port of Final Fantasy Tactics doesn't emulate any of its PS1 limitations, I don't think a lot of people strive for or like the aesthetic.
As someone who was team N64 I do agree PSX has more of a "trademark look" compared to the N64 which is pretty much just a very limited version of a modern graphics rasterizer.
That "expansion pak" RDRAM upgrade was designed to give the N64DD enough buffer space so devs could continue using about 4MB of RAM for everything else, so I can't imagine how expensive the N64 would have been if they had to ship it with 8MB of RDRAM to maintain the same standard of visual quality and a 2X CD-ROM drive.
Then again, the good games would have been $50 instead of $70, and there would have been a lot more developers willing to pay $0.20 per unit to ship games than $20 per unit for the common 12MB and 16MB ROM chips.
However, I don't know if Ocarina of Time or Majora's Mask would have worked as well without that ability to load entire scenes in < 500ms. Diddy Kong Racing and Indiana Jones & The Infernal Machine relied on the ability to stream data from the cartridge in real time to smoothly transition between scenes/areas. DKR only used it in the overworld AFAIK, but it was still impressive.
Not saying you're wrong, just that I'm glad things turned out the way they did because Ocarina and Majora's Mask likely could not have been done on a Saturn or PS1 beefed up with the N64's GPU.
I could be wrong, and some experienced romhackers could conjure up enough clever optimizations to make a faithful PS1 port of Ocarina of Time that doesn't have noticeable load times, but it would have been the result of years of research with no deadline pressure. I admit I'm just speculating, but not in a presumptuous and baseless way.
There was actually an unauthorized third-party CD-ROM drive for it, the Bung Doctor V64[1]. It didn't actually expand the available ROM space beyond what was possible with cartridges, but it's still interesting in that it was allegedly used by licensed Nintendo devs as a lower-cost alternative to the devkits officially provided to them.
The RAMBUS speed is the main issue. The RDP can literally be stalled over 70% of the time waiting for memory. It's extremely flawed.
They could have used SDRAM and it would have performed so much better, and I believe the cost would have been around the same.
If you wanted to cut something, cut the antialiasing. While very cool, it is a bit wasted on CRTs. Worst of all, for some reason they have this blur filter which smears the picture horizontally. Luckily it can be deblurred by applying the inverse operation.
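The real VI filter is more involved than this, but the idea of undoing a horizontal smear can be sketched with a toy 2-tap blur, which is exactly invertible by a running recurrence (the filter here is an assumption for illustration, not the actual N64 one):

```python
import numpy as np

def blur_row(x):
    # Toy 2-tap horizontal blur: each pixel is averaged with its
    # right neighbour; the last pixel is passed through unchanged.
    y = x.astype(float).copy()
    y[:-1] = (x[:-1] + x[1:]) / 2.0
    return y

def deblur_row(y):
    # Exact inverse: since y[i] = (x[i] + x[i+1]) / 2, recover x
    # right-to-left via x[i] = 2*y[i] - x[i+1], seeded by the
    # pass-through last pixel.
    x = np.empty_like(y)
    x[-1] = y[-1]
    for i in range(len(y) - 2, -1, -1):
        x[i] = 2 * y[i] - x[i + 1]
    return x

row = np.array([10.0, 20.0, 30.0, 40.0])
assert np.allclose(deblur_row(blur_row(row)), row)
```

Any fixed, known linear filter can in principle be inverted this way (or via deconvolution), as long as it doesn't fully destroy a frequency band.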
I think the main reason is that when they architected it, RDRAM seemed like the better choice based on price and bandwidth at that time, and they underestimated the performance issues it would cause (RDRAM has amazing bandwidth but atrocious latency).
By the time the N64 launched, SDRAM was better and cheaper, and they considered it was too late to make the switch. Allegedly SGI wanted to make changes but Nintendo refused.
Basically they made the wrong bet and didn't want to change it closer to release.
OK, I also just read that basically Nintendo bet on RAM bandwidth, but ignored latency.
A more general lesson: Nintendo bet on cutting edge, speculative technology with RDRAM, instead of concentrating on 'Lateral Thinking with Withered Technology'.
The whole thing about the texture cache being the worst design decision in the N64 just gets parroted so much, but nobody can cogently explain which corner should have been cut instead to fit the budget.
The N64's CPU, with pretty much every single game released on the platform, is just sitting there idling along at maybe 30% load tops, and usually less than that. It's a 64 bit CPU, but Nintendo's official SDK doesn't even support doubles or uint64!
Of course, Nintendo clearly cared about the CPU a lot for marketing purposes (it's in the console's name), but from a purely technological perspective, it is wasteful. Most of the actual compute is done on the RSP anyway. So, getting a much smaller CPU would have been a big corner to cut, that could have saved enough resources to increase the texture cache to a useful resolution like 128x128 or so.
It should be noted, though, that the N64 was designed with multitexturing capabilities, which would have helped with the mushy colors had games actually taken advantage of it (but they didn't, which here again, the Nintendo SDK is to blame for).
Only really in the marketing material. It's a bit like calling a 386 with an arithmetic co-processor an 80-bit machine, when it was still clearly a 32-bit machine by all metrics that matter.
However, I agree in general that the N64 CPU sits idle a lot of the time. It's overspecced compared to the rest of the system.
Yes. Although people like to point out that on the N64's CPU the external data bus is restricted to 32 bits, that's irrelevant in practice. The real limitation is the RDRAM's data bus, which is only 9 bits wide (of which the CPU uses 8 bits). The problem is that the rest of the system simply cannot match the overspecced CPU.
I wonder what the maximum addressable memory of the N64 is?
Of course, even 32 bits are massively more than they actually need for the paltry amount of memory they actually got, even if you map ROM and various devices into the same virtual address space.
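The back-of-envelope numbers make the point: a 32-bit address space dwarfs what the console shipped with.

```python
# 32-bit address space vs. the N64's actual RAM.
addr_space = 2 ** 32            # 4 GiB of addressable bytes
base_ram = 4 * 1024 * 1024      # 4 MB of RDRAM stock
expanded = 8 * 1024 * 1024      # 8 MB with the Expansion Pak
print(addr_space // expanded)   # 512: the CPU can address 512x the expanded RAM
```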
> So, getting a much smaller CPU would have been a big corner to cut, that could have saved enough resources to increase the texture cache to a useful resolution like 128x128 or so.
How? The texture RAM (TMEM) is in the RSP, not in the CPU.
How is that relevant? "Resources" really just means money, which can be allocated between different items on the BoM at-will. The N64's chips are all (more or less) bespoke, so the functionality of each individual part is completely under Nintendo's control. Spend less on the CPU, and you suddenly have money left to spend on the RSP. (And on the RDP, which contains the TMEM -- it lives on the same chip as the RSP, but is a distinct thing. I assume you know this, but just to add to the discussion for readers - the RSP is the N64's SIMD coprocessing unit, which most games use to perform vertex shading, whereas the RDP is the actual rasterization and texturing hardware.)
Realistically it wasn't even "We only have X dollars to spend". They needed the console to have a final budget and they really could have "just" added more transistors dedicated to that texture unit without significantly altering prices or profit.
But hardware was actively transitioning and what we "knew" one year was gone the next and Nintendo was lucky to have made enough right choices to support enough good games to survive the transition. They just got some bets wrong and calculated some tradeoffs poorly.
For example, almost everything Kaze is showing off, all the optimizations, was technically doable on original hardware, but devs were crunching to meet deadlines and nobody even thought to wonder whether "let's put a texture on this cube" needed another ten hours of engineering time to optimize. Cartridges needed to be constructed by Christmas. A lot of games made optimization tradeoffs that were just wrong, and didn't test them to find out. Like the Helldivers 2 game-size issue.
Sega meanwhile flubbed the transition like four different ways and died. Now they have the blue hedgehog hooked up to a milking machine. Their various transition consoles are hilariously bad. "Our cpu and rasterizer can't actually do real polygon rendering and can't fake it fast enough to do 3D graphics anyway. Oh, well what about two of them?"
You are right about the RSP/RDP distinction. My point is that removing transistors from one chip doesn't magically let you add more transistors to another chip, that's not how IC fabrication works. And the CPU was not a custom design, it was a VR4300 licensed by NEC from the original R4300.
Anyway, the real problem is that TMEM was not a hardware-managed cache, but a scratchpad RAM fully under the control of the programmer, which meant that the whole texture had to fit within a meagre 4 kB of RAM! It is the same mistake that Sony and IBM later made with the Cell.
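To see how little 4 kB buys, here's a rough texel-budget calculation. This is a simplified sketch; the real hardware has extra constraints (e.g. color-indexed formats reserve half of TMEM for the palette, which the second line assumes):

```python
# Rough texel budget for a texture that must fit entirely in TMEM.
TMEM_BYTES = 4096

def max_texels(bits_per_texel, budget=TMEM_BYTES):
    return budget * 8 // bits_per_texel

print(max_texels(16))                  # 2048 texels: e.g. a 32x64 16-bit texture
print(max_texels(4, TMEM_BYTES // 2))  # 4096 texels for 4-bit CI with half of TMEM left for the palette
```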
You could have saved a lot of money by using CDs instead of cartridges.
If you sell games for roughly the same amount as before (or even a bit cheaper), you have extra surplus you can use to subsidise the cost of the console a bit.
Effectively, you'd be cutting a corner on worse load times, I guess?
Keep in mind that the above ignores questions of piracy. I don't know what the actual impact of a CD based solution would have been, but I can tell for sure that the officials at Nintendo thought it would have made a difference when they made their decision.
IMHO, Nintendo had a hard enough time preventing piracy and unlicensed games with the NES and SNES, and then saw the PS1 get modded within a year, even with the special black-coated discs to hide the tracks. There wasn't a lot of optical/compact disc copy protection magic at the time, and CD-Rs and writers started getting popular quickly as well. PS1 in 1994, N64 in 1996, backwards Dreamcast GD-ROMs and the beginnings of larger discs and DVDs in '98.
> I agree that the PS1 had more piracy, but I'm not sure that actually diminished its success?
At least in my corner of the world (Spain), piracy improved its success. Everybody wanted the PSX due to how cheap it was, I think it outsold the N64 10:1.
A million years ago we had Microsoft Office, PerfectOffice, Lotus SmartSuite, Lotus Symphony (which became one of the free suites), and others I can't remember.
Then we had a bunch of Java and web versions built of various office applications.
It would be a massive undertaking to create a new office suite from scratch.
As much as everyone complains about Microsoft Office, the historic alternatives were all much worse and eventually all collapsed under their own weight.
Companies that had a successful niche, like Lotus, failed to keep up.
It seems like most users got tired of the unknowns with CentOS and went to Alma/Rocky. It doesn't help that most third-party software vendors also didn't bother to support it.