Are they cheap, though? I often avoid shops the likes of GS25, Żabka or 7/11, because the prices are often 125-150% of the normal supermarket price.
In the area I stayed in Tokyo, there was a konbini and a supermarket across the street from my nearest subway station. Food at the supermarket is definitely cheaper, with more selection and better quality. I started making notes on Google Maps for any supermarket I can find along my routes.
A 25-50% markup is rather small, given the convenience. In the US, a 100% premium is not unheard of. And Costco may be half the price of the supermarket if you're willing to buy in bulk.
The part about bluetooth is all too relatable.
I'm wondering if there is some other way to do wireless communication that is much less annoying to set up?
Or maybe it's because if you're well-off, there's no reason to buy an expensive villa that looks just like the one next to it. Better to buy a smaller, more tasteful house that's uniquely yours?
IMO neighborhoods with a single house design copy-pasted are really unappealing, no matter the country.
Teleport that whole thing to London or the Bay Area and they'd sell for millions each. Some people are fine with cookie-cutter affluence. And after good landscaping and tree growth, it wouldn't look so weird.
They don't sell for millions because of "cookie-cutter" affluence in London or the Bay Area; they sell because of the location and because there are no other options.
Depends on the design/typology. Uniform rowhouses can often look very beautiful: take any number of streets in Brooklyn Heights, or the newer projects by Peter Barber in London. It's not "repetition" as a general principle that's unappealing; it's the particular execution that's so common in cheap sprawl developments.
There are national differences to this: in Europe and Asia, affluent people tend to move into city centers, whereas in NA the opposite is often the case (exceptions of course do apply), which makes these developments even more baffling in China.
It's literally a bunch of McMansions thrown at some plot of land. You see this a lot with real estate development in the emerging economies, as if developers looked at Western movies from the 80s and decided that is what high status looks like. A lot of really tacky retro-futuristic skylines that look terrible and out of place have come out of this as well in China or the gulf states.
Oof. I've lived in townhouses for so long that I didn't even stop and think for a second that they all look the same. I kind of thought my previous house was a bit ugly, but I couldn't afford the ones I actually liked the looks of so I guess my expectations are just super low.
They don't care about the home, they care about the land (the buyers that is). Rules in China penalize holding undeveloped land, so if it's approved for a housing development, something has to be built. And if you end up renting it out, what do they care what it's like?
Oh no, the conclusion is so funny. That's the problem with "magic frameworks" like ASP.NET, where it's not obvious how to extend things if need be (if it's even possible).
I wonder how common full game VMs were in the 90s. For a game older than myself, wouldn't a VM layer have incurred a big performance penalty on PCs of that era?
It was far more important to have the same software work on Amiga, x86 (DOS), Mac and the whole slew of different machines that came and went.
Today we have far fewer platforms than during the explosive growth of the 80s.
Consider that most 'software' today is JavaScript interpreted by the web browser. It's not like those portability concerns didn't exist in the 80s; if anything, it was harder then, because you had to make your own interpreter.
---------
Many (maybe most?) video games seem to have been written in a VM, at least before Doom / high performance 3d graphics.
I think console games were in C/Assembly for performance.
But 'computer' games of that era came before the IBM PC became the standard, or at least before the PC won and Microsoft achieved dominance. When you didn't know whether the Amiga, PC-98, IBM PC, Mac, or something else would win, it only made sense to write a VM.
SCUMM (Monkey Island and many others) comes to mind.
And as it happens, early versions of Excel used a bytecode running on a VM instead of native code. Though the motivation was not portability, but rather memory requirements:
> In most cases, p-code can reduce the size of an executable file by about 40 percent. For example, the Windows Project Manager version 1.0 (resource files not included) shrinks from 932K (C/C++ 7.0 with size optimizations turned on) to 556K when p-code is employed.
> Until now, p-code has been a proprietary technology developed by the applications group at Microsoft and used on a variety of internal projects. The retail releases of Microsoft Excel, Word, PowerPoint®, and other applications employ this technology to provide extensive breadth of functionality without consuming inordinate amounts of memory.
> Many (maybe most?) video games seem to have been written in a VM, at least before Doom / high performance 3d graphics.
This was not terribly common, for the obvious performance reasons. Another World ran at around 10-20 FPS on most of the systems it was released for, which is fine for a methodical game like that (and for adventure games like Monkey Island, etc.) but doesn't work for fast action games.
And of course VM games were basically impossible for the entire 8-bit era, with the exception of things like Zork (and the rest of Infocom's Z-Machine games) whose performance needs were so small that the gigantic overhead of an 8-bit VM was hardly noticeable.
Even into the 16-bit era, the majority of multi-platform games were fully rewritten ports.
Yes, but you weren't doing things like _Elite_ in these sorts of special-purpose VMs. Aside from the portability issue (extremely important when platforms had the lifespan of mayflies back then, as Moore's law blazed along at full speed), VMs also got you compression. A full compiled binary might be exorbitantly expensive in disk/tape space, not to mention RAM. But a very small VM could make a custom-tailored language to interpret on the fly, and save a ton of space where you needed to sweat every kilobyte. (Think about the difference in size between `print "Hello world!"` and the default compiled binary.) It didn't matter how fast your text adventure ran if it couldn't fit in X kb of space.
During the heyday of assembly language, VMs were common in business software as well. It made porting to different types of systems easier in a time when standards-compliant C-language compilers targeting a variety of systems did not yet exist or were very expensive.
Oh, I know about the emulation layers of early computers! But I'd assume those programs rarely required frame-perfect input, unlike video games. Wouldn't that be too wasteful and needlessly limit the player base?
Edit: after reading through Wikipedia, I think maybe a VM wouldn't be that wasteful, since the game is very simple mechanically.
Consoles (and arcades) back then had far better graphical performance than computers. So good computer games didn't require frame-perfect inputs at all... or at least, the good ones didn't (bad games like TMNT for PC / DOS did exist, but were horribly buggy and broken).
Computer games had more expansive inputs available, like Civilization, or needed the use of a mouse.
Not so much action / frame perfect stuff. Not until a bit later anyway. Eventually computers were fast enough for arcade ports but computer games just didn't really target that action niche.
------
The 'computers' with good graphics were ones like the Amiga, not x86-based DOS with Mode 13h graphics. So it was all the fallen / failed computers that had the decent action games, IIRC.
The trick to these earlier VMs, from the Infocom Z-Machine and Wizardry's interpreted Pascal code, through SCUMM, Sierra AGI and SCI, Another World, the Horrorsoft games, etc., is that they recognized that the games they were making were primarily going to be "content-delivery mechanisms": lots of text and graphical assets, driven by relatively simple computations. The authoring constraint relates to the hardware only in terms of I/O and data compression, so the code being run by the interpreter was mostly run-once "initialize the scene" logic plus some animation timers.
The opposing idea is represented more by arcade gaming and, later, stuff like Doom and Quake: the game is relatively intimate with the hardware in what it simulates, while the definition that makes up a scene is more on the order of "put a monster here and a health pickup there", which aligns it towards being map data instead of scripted logic.
Depends on what you consider 'full game VM'. Adventure games from Infocom ran all game code on a VM, and so did the graphical adventures from Sierra and LucasArts.
The latter two used some native graphics primitives of course.
Another World is on a whole other level. SCUMM is from '89 and its games got forward-ported to newer hardware like the NDS (2004). Another World came out in 1991, and because it used a VM it could be back-ported to the Apple IIGS (1986), a computer 5 years older than the game itself!
The graphics exclusively used real-time rendered polygons with support for transparency, which nobody knew was even possible at the time. Along with researching the new rendering tech, the same person created everything else except the music: the memorable and immersive world, an original story, concept and cover art, strong cinematics that were state of the art at the time, graphics and animation, innovative level design, puzzles, and the game logic, all over just 2 years. It also defined a new 'cinematic platformer' genre, with later titles like Flashback, Blackthorne, Oddworld, and the recent LUNARK. It's simply an incredible feat.
> real-time rendered polygons with support for transparency, which nobody knew was even possible at the time
Aegis Animator was doing pretty much the same sort of rendering on the Amiga, in 1985.
I never did much with it, what with being a kid at the time, but it was fun to play with and looked pretty cool. I don't think its rendering was as tightly optimized as Another World's was, though.
Earthbound (SNES, 1994) contains TWO complete scripting systems: one for the dialog system (which is occasionally used for things it shouldn't be; most of the shop logic is in it), and one for scripting sprite movement. The dialog script is actually quite impressive and easy to use; I'd consider implementing a similar system even in a modern RPG. The sprite movement script is trash, significantly harder to work with than in games that use raw assembly. Apparently that movement script system was a common in-house library at HAL, dating back to the NES era, but I don't know too much about that history.
Also most of the game's assembly was actually compiled from C, which was almost unheard of for console games at the time.
Well, the idea of a VM wasn't too foreign, if my reading of computing history is right. Consider that Java launched in 1995 and its big claim to fame was its portability between systems, and that was because of the JVM, or Java Virtual Machine.
I don’t think virtual machines and emulation are that new of a thing. Virtualizing x86 at full speed on consumer hardware has been a thing for, what, 15 to 20 years? And sure, that requires special processor features, but remember that the systems which came before, and would need to be emulated, had even smaller computational demands. IIRC, a widely used POS software package from the 80s has been running in emulation, on POS hardware that far exceeds its requirements, for at least the last 25 years.
Also, my understanding is that lots of crucial government and business software runs on many layers of virtualization.
And my last recollection, from what I’ve gathered, is that until around the mid-90s a lot of operating systems were pretty much hypervisors that ran programs that were virtual machines themselves. Multitasking was simply being able to route hardware resources to a given program, which was sort of its own environment.
Memory was often the constraint on low-end computers "back in the day", so code density was a reason to have a VM. This is why Wozniak shipped a VM (SWEET16) in the Apple II's ROM.
VMs have existed since PL/0, the prototype of Pascal. Its bytecode is also known as P-code, and to be honest it is fine, especially when you can leverage a JIT, which trades memory space for gains in speed.
If you mean bytecode as an executable format, that originates already in the late 1950s / early 1960s, with microcoded CPUs as the interpreter, of which the Burroughs Large Systems are among the most famous.
uxn is one of those things I've known of and read about for quite a long time, but never ended up trying myself. Those "limited on purpose", community-supported systems are a really cool concept.
Given that they compared rust-wasm on a selection of a rather odd number of nodes (673; what went into that particular selection?) and the title of "lies", it just reads as someone at Fastly being bitter.
All the article shows (which they even admit, blaming it on "beta" status) is that, if you want to run JavaScript on an edge server, Cloudflare is faster. On top of that, you can do it on a free tier, whereas Fastly says their free offering is only meant for a short trial.
The Agile shop I currently work for treats UML as mandatory before any code is written. Then we proceed to ignore all the designs until the next time someone has to look at the design doc again.