Tangential, but I remember when I was studying for the ACT, there was something in one of the practice books that stuck with me. I'm paraphrasing but it was something like "Good writing is clear and easy to understand. It's about communication, make sure you communicate".
It was something that I guess I logically knew but hadn't fully realized. I had always tried to be fancy with my writing and pad it out to meet minimum word counts, with "understand-ability" being somewhat of an afterthought. Just that one statement in my ACT prep book made me, in my opinion, a significantly better writer.
Good writing and good communication are also about keeping the reader engaged and concentrating, however. Even in business writing - how-tos or intranet pages, for example - varying sentence length and using the occasional rhetorical question can be helpful. I'm concerned that tools like this will tend to stamp out useful writing conventions that were picked up by LLMs precisely because they were useful.
The result? Increasingly homogeneous, boring text.
This is something I've been working on in my own professional writing for years. I used to write very long emails, thinking I was providing insight and detail, but nobody would even bother reading them because it was such a chore.
I consider more than three paragraphs, or more than two sentences per paragraph, a "writing smell." Keeping things short is relatively easy now, especially since I realized my predilection for verbosity was actually a symptom of my own insecurity, emotionalism, and indecisiveness.
I try to limit my emails to one clear, strong point—usually just factual statements—in the active voice, eschewing adverbs as much as possible. The emails almost write themselves now, because there isn't much choice about what to write anymore.
Milchick's problem was that he worshipped the company, quite literally. I do this for my own gain. I started getting much more of what I wanted out of work and other formal relationships (my kids' teachers, my doctors, etc.) when I got honest about how I was communicating and how I was being perceived, and changed my habits to suit.
My first exposure to Metal Slug was actually in regular emulators, and I never used the scanline filters, so now when I use the scanline filters in Metal Slug they feel..."wrong". In my mind, Metal Slug is supposed to have really sharp, chunky pixels.
Not my case; I'm old enough to have played it in the mid-to-late '90s in both bars and arcades.
And that's the problem with current pixel artists: they have no idea what actual pixel art looked like. Hint: look at Garou with at least scanlines (or maybe a bilinear filter) enabled. That's close to how Garou was meant to look on a CRT, far closer than raw pixel art.
I need to play Garou: Mark of the Wolves again, haven't touched that one in years. I believe that's the one that has a character named "Butt".
I've played a lot of Neo Geo games, and I even owned a full MVS machine for a while, with its own CRT (mostly playing KOF 99), but I guess the scanlines never did much for me. I grew up playing the SNES and PlayStation and N64, but I almost equally grew up with emulators, so I guess I'm just used to the raw digital signal being displayed.
It never really occurred to me that you'd want to be able to detect if something is running in DOSBox, since I figured that the point was to be as compatible as possible with MS-DOS.
I guess it makes sense to try it anyway. Now I'm wondering how I'd be able to detect something like Concurrent DOS or REAL/32 or REAL/NG.
For me it's the opposite. I would never have thought that there would be a point to trying to "detect" DOSBox, since it would be trivial to do so. After all, DOSBox is not really designed to run MS-DOS, but its own DOS-like thing, and there must be a million small details that you can use to distinguish it from MS-DOS, if you really wanted to. I mean, the default filesystem is not even FAT...
_Even_ if you run the MS-DOS kernel in DOSBox, the builtin DOS literally leaks through in many places (e.g. many API services still available instead of crashing), with only some of the more recent forks even trying to hide it.
I encountered this and hadn't realized that DOSBox's main goal was to play games - and that lots of "non game things" didn't work. DOSBox-X covered some of them; I ended up running DOS 5 in VMware.
Testing if you're running under virtualization or emulation is a whole thing. We wrote virt-what to do this for virt and containers. It could do emulators as well if someone was motivated enough. It's basically a giant shell script. https://people.redhat.com/~rjones/virt-what/
There's also an adversarial aspect to this. Some emulators try to avoid detection and a lot of software tries to detect if it's running under virt for various reasons, eg. to stop cheating in games or stop reverse-engineering. (virt-what is deliberately not adversarial, it's very easy to "trick" it if you wanted to do that)
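For a rough sense of the non-adversarial approach, here is a minimal sketch of two common Linux-side heuristics. The /proc and /sys paths are the standard kernel interfaces, but the vendor-string list is illustrative and far from exhaustive - virt-what itself checks many more signals:

```python
# Minimal, non-adversarial virtualization hints on Linux.
# A guest that wants to hide can defeat both checks trivially,
# which is exactly the "easy to trick" property mentioned above.
from pathlib import Path

def detect_virt_hints():
    hints = []
    # 1. The CPUID hypervisor bit shows up as a flag in /proc/cpuinfo.
    cpuinfo = Path("/proc/cpuinfo")
    if cpuinfo.exists() and "hypervisor" in cpuinfo.read_text():
        hints.append("cpuid-hypervisor-bit")
    # 2. DMI vendor strings often name the hypervisor outright.
    vendor = Path("/sys/class/dmi/id/sys_vendor")
    if vendor.exists():
        v = vendor.read_text().strip().lower()
        for needle in ("qemu", "vmware", "xen", "innotek", "microsoft"):
            if needle in v:
                hints.append("dmi:" + needle)
    return hints

print(detect_virt_hints())  # empty on bare metal, one or more hints in a guest
```

Both signals are purely advisory; an adversarial hypervisor can mask the CPUID bit and spoof the DMI strings, which is why anti-cheat and malware detection end up in an arms race rather than relying on checks like these.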
Also: malware often tries to detect a VM or an emulator too, for example Windows Defender uses an emulator internally to detonate samples, and there are attempts by malware to detect this and change the behavior to something benign.
Way back in the early 90s, Thunderbyte Antivirus' TBCLEAN would use the x86 trap flag to single-step viruses up to the point where they restored the original entrypoint of the infected program, then write the "cleaned" program back to disk. They used CPU single-stepping as a hack to avoid needing to write an emulator.
The virus writer Priest figured out he could detect being run under single-stepping, and manipulate the stack and trap flag to re-vector control from TBCLEAN to a destructive routine that trashed the user's hard disk (but otherwise ran normally when not in the presence of TBCLEAN).
He later used this idea as the basis for the "emulating tracer" (in Natas, for sure-- but I think present in some earlier code too-- I don't remember what, though), using single-step interrupt calls to trace through resident antivirus programs to find the original BIOS and DOS interrupt vectors and "call past" them (to prevent detection and do stealth).
His tracer decoded the next instruction to detect every method by which the trap flag state could be leaked to or mutated by the traced code. He would emulate and step over any of these "privileged" instructions, presenting a sanitized state to the code under trace. It wasn't a full x86 emulator and could not have handled code that used trap-based anti-debug. That would have required a full emulator (and that way lies madness).
When VMware virtualized x86 I thought about Priest's code. Defender and other AV running samples under emulation makes me think about it too. So does this article.
Makes sense; when I was doing WGU they explicitly forbade virtual machines, which makes enough sense, since if you're in a VM they can't see your full screen. It wouldn't surprise me if nowadays they have some sort of software detector to see if you're in a VM.
There are detectors for VMs, and modifications to allow VMs to evade those detectors. It's an arms race.
Example: There is (was? I don't actively follow the community) a patch set for a particular piece of VM software that made it undetectable to anti-cheat in games.
While I don't use said software (I have only a casual interest in it; it would be nice to get more games working on Linux), I have to disclose that I'm against anti-cheat mechanisms. I'm a software engineer, and I've worked on a few smaller games and know the overall structure of bigger ones, and I don't think I've ever seen a game use good practices in multiplayer. Instead, they usually rely on client-side code and lean on anti-cheat software to stop cheaters.
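The alternative being alluded to is server-authoritative validation: the server checks each client claim against the game rules instead of trusting the client and policing it after the fact. A toy sketch of the idea - the function name, tuple positions, and speed limit are all made up for illustration:

```python
# Server-authoritative movement check: accept the client's claimed
# position only if it was physically reachable since the last tick.
# A speed hack that teleports the player is simply rejected.
MAX_SPEED = 5.0  # units per tick; an invented game rule

def validate_move(server_pos, claimed_pos, dt_ticks=1):
    dx = claimed_pos[0] - server_pos[0]
    dy = claimed_pos[1] - server_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= MAX_SPEED * dt_ticks:
        return claimed_pos   # plausible: accept the client's claim
    return server_pos        # impossible: keep the authoritative state

print(validate_move((0, 0), (3, 4)))    # within 5 units -> (3, 4)
print(validate_move((0, 0), (30, 40)))  # speed hack     -> (0, 0)
```

With checks like this on the server, a modified client can lie all it wants without gaining anything, which removes much of the need for intrusive client-side anti-cheat.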
Sorry, Western Governors University. It's an online school.
When taking a test they have a proctor that's watching you on a webcam, and they make you pan the webcam around the room to ensure that there's no obvious way to cheat, and they make you share your screen to ensure you only have a browser running.
When I took final exams or industry certifications, I reported in person to the Testing Center at my community college. The center is custom-built for securely proctoring exams. You check in with ID, stash your watch, phone, and wallet in a locker, and use a secure computer in a monitored quiet room.
It’s perfect for all parties, and doesn’t intrude on your personal living space or devices.
> and they make you pan the webcam around the room to ensure that there's no obvious way to cheat, and they make you share your screen to ensure you only have a browser running.
Well, that level of intrusiveness would just make me come up with something overly complicated to prove that I could cheat if I wanted to.
I think it's more about just trying to keep honest people honest. I could think of a dozen or so ways that I could have cheated, some more convoluted than others, but it's enough effort to where I don't seriously consider it, and I did all my exams legitimately.
That's why they make you share your screen; obviously there's plenty of ways to fool that but the goal is to make it so that cheating requires enough effort to where it's probably less effort just to study and do it right.
50ms is pretty high, even by LCD standards. I have one of those MiSTer Laggy measuring things, and when I have my cheap Vizio TV in "Game Mode" the latency is around 24ms - a little lower at the top of the screen and a little higher at the bottom, but still considerably lower than 50ms. Moreover, I think that OLEDs can get under 10ms nowadays (though I don't have one to test at the moment). Since most retro games ran at around 60fps (about 17ms per frame), we're talking about 1.5 frames of latency for the LCD, and about half a frame of latency for an OLED.
With something like the MiSTer, you can also enable high-speed USB polling, which I believe is roughly 1000Hz. My understanding is that it doesn't work with all controllers, but it has worked with all the controllers I have tried it with.
The composite video artifacts are definitely noticeable though; I noticed the weirdness of the waterfalls in Sonic when I was playing it recently. It doesn't bother me that much but I could see why it bothers other people.
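The frame arithmetic behind those numbers is simple enough to sanity-check (60fps assumed, as above; the latency figures are the ones quoted):

```python
# Converting display latency into frames at 60fps.
FRAME_MS = 1000 / 60  # ~16.7ms per frame

def lag_in_frames(latency_ms):
    return latency_ms / FRAME_MS

print(round(lag_in_frames(24), 2))  # cheap LCD in Game Mode: 1.44 frames
print(round(lag_in_frames(8), 2))   # modern OLED: 0.48 frames
```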
> Since most retro games ran at around 60fps, so about 17ms,
That’s an oversimplification. Many retro game consoles don’t use a frame buffer. Instead they render the game state to the screen on the fly, one scanline at a time, and they’re able to process input mid-screen because they read the controller input many times faster than 60Hz (on the order of 2kHz). In practice, this means input lag is way below even 1ms.
Lightgun games, for example, rely on very precise timing of the control input vs the CRT raster and simply do not work without a CRT.
>Lightgun games, for example, rely on very precise timing of the control input vs the CRT raster and simply do not work without a CRT.
Perhaps the most famous light gun game of all time (Duck Hunt on the NES) does not rely on especially precise timing. It draws one white rectangle per frame over each duck when you pull the trigger and checks if the Zapper can see it. LCD latency will probably still break this, but it's not like the later Super Scope for the SNES that actually does track the precise raster position. I expect it would be possible to patch the timing in software to make it work for a specific model of LCD. But even if you did this, the Zapper also includes a bandpass filter at the CRT horizontal retrace rate (about 15kHz) to better reject other light sources, so you'd need to mod it to bypass that, or mod the LCD to strobe the backlight at the right frequency.
> It draws one white rectangle per frame over each duck when you pull the trigger and checks if the Zapper can see it
Almost, but not quite. First it blanks the entire screen to solid black and uses that to calibrate the black level of the gun, then it draws a white rectangle over one duck on one frame, then a white rectangle over the other duck on the next frame.
The NES could use this information to determine where the gun was pointing by firing an interrupt at the exact moment the Zapper’s photodiode reached a threshold brightness above black, and then only registering a hit if that occurred while the game was drawing the white rectangle. I think in reality the game didn’t care that much about the timing, only that a rising edge occurred after the fully black frame but before the return to a normal colour frame.
Either way, an LCD doesn’t work because it can’t transition from full black to full white within a one-frame window. It sometimes works in the two-duck mode, but it usually records a hit on the wrong duck. In any case, it requires black-to-white latency of less than 16ms.
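The sequence described in this thread - black calibration frame, then one white rectangle per duck on successive frames - can be sketched as a toy simulation. The frame labels and boolean sensor readings are invented for illustration; the real console works off raw photodiode timing, not tidy booleans:

```python
# Toy model of Duck Hunt's hit detection: one (frame_label, gun_sees_light)
# pair per displayed frame, in order. The gun must see darkness on the
# black calibration frame, then light during exactly one duck's white frame.
def register_hit(frames_seen):
    label, lit = frames_seen[0]
    if label != "black" or lit:
        return None  # pointed at a lamp, or the calibration frame failed
    for label, lit in frames_seen[1:]:
        if lit:
            return label  # first lit white-rectangle frame identifies the duck
    return None

print(register_hit([("black", False), ("duck1", False), ("duck2", True)]))
# duck2
```

An LCD that cannot go black-to-white within a single duck's frame never reports light in the right window - or reports it a frame late, crediting the wrong duck - which is exactly the failure mode described above.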
I'm not disputing that CRTs have lower input lag than LCDs or OLEDs. I was disputing the specific 50ms-of-lag claim that the parent post made; modern LCDs aren't that bad, and OLEDs are getting close to undetectable to human eyes. Even with horizontal interrupts that could be done between scanlines, there's still a limit to how fast we can actually perceive it (and frankly I'd be skeptical of anyone who claims that the 8ms of input lag from an OLED is actually affecting their gameplay).
For light gun games, yeah, that timing might matter, but I'm not convinced it matters anywhere else.
LCDs have a further issue that CRTs do not have: transition time. When an LCD pixel is displaying black and it is driven to white, the voltage change across the driving transistor happens a lot faster than the change in brightness of the pixel (caused by the mechanical twisting of the crystal). This has opened the space for a lot of display marketers to play games with latency numbers. Often they will quote numbers for transitions between 2 similar grey levels rather than between full black and full white, which takes a lot longer.
CRTs don't have this issue at all. The phosphor lights up extremely quickly to maximum brightness, even from fully black. It's a bit slower for the phosphor to "cool back down" to black, but it's much faster than an LCD unless you're using a specific high-persistence phosphor. Typical consumer CRT monitors had a persistence in the low microseconds, except the IBM 5151 monochrome monitor which was much longer to give a stable, flicker-free image for heavy office work.
Same here: Samsung S95B QD-OLED, MiSTer Laggy tested it, and as far as I can remember it's about 8ms. Also, SNAC adapters bypass USB entirely and are pretty much zero lag as far as I understand.
RetroArch has run-ahead latency reduction etc.; I'd like to see some comparisons of that vs. MiSTer. I could do it myself but I've never got round to it. I've noticed that fiddling with latency reduction in RetroArch really works, but it is a lot of fiddling.
I did the preemptive frames thing in RetroArch with Sonic the Hedgehog 3 a couple years ago, and I certainly convinced myself that I could tell a huge difference...and then I kept taking hits and dying just as much as I was without doing anything.
It's entirely possible that someone who is better at video games can tell a huge difference (e.g. speedrunners and the like), but I'm afraid that I'm not good enough at most games to be able to realistically tell much of a difference.
I might still fiddle with it a bit; someone told me that it helps a lot with Mike Tyson's Punch Out, which is a game I have never beaten with an emulator.
Interesting. I bought Sonic Origins as a palate cleanser the other day and I really feel like I can feel the latency. Sonic 1 was the only game me and my brother had for our Mega Drive, so we know/knew everything there is to know!!
Our speed runs were crazy.
I don’t know if feeling the latency is just my age, though. Although I’m a semi-pro sim racer and still competitive in my late forties, it’s a different kind of twitch reaction.
I hadn't heard this, but looks like you are right.
Makes me feel a little conflicted having bought one of those SNK bundles a couple years ago on Steam or Humble Bundle or something. Don't love the idea of giving money to someone like that.
Someone told me that in 2013 when I was trying to do contract work. I gave him a quote that I thought was reasonable, and he thought it was too much and he told me that his thirteen year old son could do it for free.
I responded that he was the one who had reached out to me, and if he feels like his son can do it then he shouldn't have wasted his time trying to find a contractor since that will be more expensive.
The client didn't like the attitude and I didn't get the job, but I was kind of glad because it was pretty clear to me that he would have tried to weasel out of paying me regardless.
Yeah, that's really the hard part of contracting. I was in my 20s and was certain I HAD to win every bid to keep feeding myself. It took me longer than it should have to realize that 3 worthwhile jobs are far better than 10 hassles. All clients required changes, most thought they were overpaying, and many gave me a hard time when the bill was sent. Meanwhile, when I'd toss out $20,000 as a price tag, those folks were far more serious, and they paid on time! It taught me a valuable lesson: you want to quote high enough to offend the folks that aren't serious. They were gonna be such a pain.
I don't even think I was charging very much. I was still pretty junior in my career, so I think I was charging something like $40/hour, which was double my nominal wage at my previous W2 software job (doubled to cover stuff like health insurance and overhead and the like).
$40/hour is an extremely low rate for a software engineer, even at the time. I'm not sure what the guy was expecting, and I'm quite confident that he would have tried to weasel out of paying the second I delivered a product by claiming it didn't match his spec. Honestly, I don't really think I'm ever going to do contract work for small clients again regardless; a lot of them can get away with a SquareSpace site, and the ones that can't should probably spend their money buying a few courses on using Claude Code or ChatGPT.
There's also the fun scam of wannabe Steve Jobs characters. I had a person try to recruit me for a job where I worked for 2% equity in their business, where I was expected to write the entire codebase myself. Of course the remaining 98% went to them, and as far as I can tell they felt that their "idea" was just that valuable and didn't plan on contributing anything else. Fortunately, I didn't really fall for that one.
Tangential, but in my previous apartment one of my neighbors really wanted me to build him a website after he found out I work with computers.
I told him that I don't really build websites - what I mostly do is write things that move information from one computer to another - and that I haven't really enjoyed web development, so it's not something I do in my free time and I'm not good at it, and that he should check out SquareSpace or Wix or something.
He kept assuring me that it would "be easy". To shut him up, I gave him a quote of my daily rate for contract work (which honestly wasn't even that high by software engineering standards), and he backed off because he didn't realize how expensive software engineering is.
Something I did a few years ago was buy a thing on eBay of 300 random CDs for like $10.
Most of the CDs were unsurprisingly stuff that was pretty common, but I would occasionally find a few artists that I had never heard of that I ended up really liking, like "Hoss" by Lagwagon.
I haven't done this in a while, but I might do it again soonish. It was fun digging through all the CDs to find stuff I ended up actually liking.
Found a favorite band through a similar technique: pile of CDs given from a friend who worked at a music store and no one wanted them.
You have to be willing to sift through junk. Which I think is hard for many to accept. However, the algorithms are often giving you junk anyway. Kind of no way around it.
Yeah, most of the CDs there were pretty unremarkable; a lot of them were unsurprisingly stuff that was extremely popular (since those have the most CDs available). A lot of the stuff that wasn't extremely popular was pretty bad.
Still, in that 300, there were about 30 albums that I hadn't heard of that I ended up really liking.
Took a while to sift through them all, which is why I haven't done it again, but it was a fun experiment all the same.
Yeah, I've thought about buying one in the past, but $229 is kind of rich for my blood.
I bought an ODROID-GO Ultra a few months ago for about $70. It can emulate the NES, SNES, Genesis, Game Boy, and oodles of other consoles, and can play what are arguably some of the best games ever made. The Playdate is three times that price, and while I'm sure that some of the games are fun, I would have a hard time believing that any of them are beating Donkey Kong Country or Phantasy Star IV.
It might be an apples and oranges comparison, but in my mind they still occupy a similar niche.