Hacker News | chocochunks's comments

I think the first Civ I played WAS III (maybe II at a friend's house once before?) and it ain't my fav. It sits below IV and V and even VI, and I don't really like VI all that much either...

Just forget that mobile gaming exists? I think one of the cheapo Linux retro handhelds offers a better portable gaming experience than 99% of mobile games, ads or no.

Any specific models you’ve used and can recommend?

Oh and your Xbox Controller S won't work with your Xbox One S but it will work with your original Xbox.

The phone's hardware must also support it. It needs non-protected VM support, which is available on Exynos SoCs but not Qualcomm ones, which is why some Samsung phones have it but other arguably better phones don't (e.g., S25 Ultra vs. Flip 7).

Right, I enabled it, and got that exact error when starting the Terminal app on my Xiaomi 15: "Non-protected VMs are not supported on this device."

Anyone know if the Samsung Z Trifold has VM support that works for the Android Terminal?

No, not right now at least, because it uses a Qualcomm SoC.

Unfortunate. Looking forward to a trifold with AVF support, and ideally with unprivileged AVF available for third-party virtualization applications to use.

Part of the reason the Apple I is so rare is that Apple offered an Apple I trade-in program. Apple would destroy the boards of Apple Is that were traded in for Apple IIs.

* Not that there were really many to begin with.


What was the reasoning behind that?

It's because the Apple I had no built-in BASIC, and booted to a Monitor prompt. It was hard to use without a manual in front of you.

Meanwhile, the Apple II just let you put in a disk and boot a program. Huge difference in usability.


Probably to reduce support costs.

I recall my junior high school had only Apple IIs in 1995.


Probably OCR'd with no editing.

Pros and cons. Multiple choice can be frustrating for students because it's all or nothing. Spend 10+ minutes on a question, make a small calculation error, and end up with a zero. It's not a great format for a lot of questions.


They're also susceptible to old-school cheating: sharing answers. When I was in college, multiple-choice exams were almost extinct because students would form groups and collect/share answers over the years.

You can solve that but it's a combinatorial explosion.


A long time ago, when I gave exams, I used to program each question into a generator that produced not-entirely-identical variants for each student (typically, only the numeric values changed), along with the matching answers for whoever was in charge of assessing.

That was a bit time-consuming, of course.
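That kind of per-student generator is easy to sketch with a seeded RNG. A minimal illustration in Python (the question wording, parameter ranges, and student IDs here are all made up):

```python
import random

def make_question(student_id: str) -> tuple[str, float]:
    """Generate one per-student question variant plus its answer key.
    Only the numeric values change between students."""
    rng = random.Random(student_id)  # deterministic: same student, same variant
    mass = rng.randint(2, 9)         # kg
    accel = rng.randint(2, 9)        # m/s^2
    question = (f"A {mass} kg cart accelerates at {accel} m/s^2. "
                f"What net force acts on it (in N)?")
    answer = float(mass * accel)     # F = m * a
    return question, answer

# Seeding on the student ID makes regrading reproducible.
q1, a1 = make_question("alice")
q2, a2 = make_question("alice")
assert (q1, a1) == (q2, a2)
```

Because the seed is the student ID, the grader can regenerate any student's exact variant and answer on demand instead of storing every copy.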


macOS doesn't handle HiDPI screens that well either. The most common and affordable high-res monitors are 27" 4K monitors, and those don't mesh well with the way macOS does HiDPI. You either get a perfect 2x scale that looks like a giant 1080p desktop, or a blurry-ish non-integer scale that's more usable.
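The arithmetic behind that: as I understand it, macOS renders the desktop at 2x the chosen "looks like" resolution and then downsamples to the panel's native pixels, so the result is only pixel-perfect when that downsample factor is a whole ratio. A quick sketch:

```python
# macOS HiDPI (as commonly described): render at 2x the "looks like"
# resolution, then scale that buffer to the panel's native pixels.
def backing_scale(looks_like: tuple[int, int], panel: tuple[int, int]) -> float:
    render_w = looks_like[0] * 2   # 2x render buffer width
    return panel[0] / render_w     # downsample factor to native pixels

panel_4k = (3840, 2160)
print(backing_scale((1920, 1080), panel_4k))  # 1.0  -> pixel-perfect 2x, but huge UI
print(backing_scale((2560, 1440), panel_4k))  # 0.75 -> non-integer, slightly soft
```

A 5K 27" panel (5120x2880) is exactly 2x of 2560x1440, which is why it avoids the problem entirely.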

And god forbid you still have a low-DPI monitor!


Blows my mind that a 4K 27" monitor that was $500 a dozen years ago is still near top tier now.

5k has been surprisingly stagnant.


There were several promising 5K 27” MiniLED displays announced at CES a few days ago. People speculate that LG has produced the panel for the upcoming Apple Display refresh, but is also making it available for the other display manufacturers.


At some point additional resolution is a diminishing return. The human eye has limits.


5K 27” looks usefully better than 4K 27” to my middle aged eyes.

I’d prefer that to not be so, because 5K panels are so much more expensive. But in a side by side comparison it’s very obvious.

But the market has spoken: a quality 4K display is very good, certainly good enough, and the value for money is great.

I’m ok with spending more on a better display that I spend so much time with. The cost per use-hour is still very, very low.
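One rough way to put numbers on that is pixels per degree of visual angle; the ~24" viewing distance and the ~60 ppd rule of thumb for 20/20 acuity below are assumptions, not measurements:

```python
import math

def ppd(width_px: int, height_px: int, diag_in: float, dist_in: float) -> float:
    """Pixels per degree of visual angle at a given viewing distance."""
    ppi = math.hypot(width_px, height_px) / diag_in
    # One degree subtends roughly dist * tan(1 deg) inches at the screen.
    return ppi * dist_in * math.tan(math.radians(1))

print(round(ppd(3840, 2160, 27, 24)))  # 4K 27" at 24" away -> 68 ppd
print(round(ppd(5120, 2880, 27, 24)))  # 5K 27" at 24" away -> 91 ppd
# Rule of thumb: 20/20 acuity resolves ~60 ppd, but many eyes do better,
# and font rendering artifacts show up well before the raw acuity limit.
```

So a 4K 27" panel sits only a little above the nominal 20/20 threshold, which is consistent with the difference still being visible side by side.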


Agreed. I tried a 24" 4K screen as soon as they came out (it required two DP cables to run at 60Hz at the time), and with subpixel rendering turned off, I could see jagged edges on fonts from a normal sitting position (I am shortsighted, but at -3.25 I always need correction anyway, which brings my eyesight to better than 20/20). At 27" or 32", DPI is even worse.

And macOS has removed support for subpixel rendering because "retina", though I only use macOS when forced (for work).


It's not just that: the bandwidth needed to drive things above 4K or 5K is already over the limits of HDMI 2.0 (and 2.1 without all the extensions). DisplayPort is a bit better: 1.4 already has enough bandwidth for 8K@30Hz or 4K@120Hz, or 8K@60Hz with DSC.

When considering a single-cable solution like Thunderbolt or USB-C with DP altmode, if you are not going with TB5, you will either use all bandwidth for video with only USB2.0 HID interfaces, or halve the video bandwidth to keep 2 signal lanes for USB 3.x.

(I am currently trying to figure out how to run my X1 Carbon gen 13 with my 8K TV from Linux without an eGPU, so I'm deep in the trenches of color spaces, EDID tables and such, as I've only gotten it to put out 6K to the TV :/)
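The back-of-envelope math behind those DP 1.4 limits, ignoring blanking overhead (real timings need somewhat more), with HBR3's usable payload of 25.92 Gbit/s after 8b/10b coding:

```python
def payload_gbps(w: int, h: int, hz: int, bpp: int = 24) -> float:
    """Uncompressed 8-bit-per-channel video payload in Gbit/s,
    blanking overhead ignored."""
    return w * h * hz * bpp / 1e9

DP_1_4 = 25.92  # Gbit/s usable on HBR3 (32.4 raw minus 8b/10b coding)

for w, h, hz in [(3840, 2160, 120), (7680, 4320, 30), (7680, 4320, 60)]:
    need = payload_gbps(w, h, hz)
    verdict = "fits DP 1.4" if need <= DP_1_4 else "needs DSC or a faster link"
    print(f"{w}x{h}@{hz}: {need:.1f} Gbps, {verdict}")
```

4K@120 and 8K@30 both come out to ~23.9 Gbit/s (same pixel rate), just under the HBR3 payload, while 8K@60 at ~47.8 Gbit/s only works with DSC, matching the limits above.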


We're approaching that point, but we're not there yet.


You can adjust this in settings.


Adjust it to what? Making a 4K monitor look like 1440p (or a non-1080p or 4K desktop) ends up with a non-integer scale on macOS AFAIK. They also completely tore out subpixel font rendering for low DPI displays.


I use a 4K/27" display and it's as crisp as it gets at 125%.


Perhaps try a 5k/27" at 150%, or look for visual acuity correction :)

FWIW, I could see jagged edges on 4k at 24" without subpixel rendering, 27" is worse. Yes, even 4k at 32" is passable with MacOS, but Linux looks better (to the point that 4k at 43" has comparable or slightly better text quality to 4k at 32" for a Mac).

I am trying to get a 55" 8k TV to work well with my setup, which might be a bit too big (but same width as the newly announced 6k 52" monitor by Dell), but it's the first next option after prohibitively expensive 32" options.


In my experience it's a little hit-and-miss with macOS. You need a monitor that is specifically listed as supported by macOS; if not, you get rather strange results. I had a Dell monitor that, under macOS only, would sometimes freak out and flicker if you had too many Electron apps open.

In some sense it's reasonable that you need a supported monitor, it's just strange that Linux can support all these monitors, but macOS can't?


Wacom is a big name brand. You can get pen displays from brands like XP Pen and Huion for not much more (https://www.amazon.com/XP-PEN-Artist12-Battery-Free-Multi-Fu...). And they are true second displays. If all you want is a pen tablet then those are much less expensive.


That one is still twice the price and half the resolution, and I don't see anything about touch.

I mostly agree though, Wacom is far from the cheap option.


Only the PC port of the original Resident Evil was unavailable. It got a remake in 2003 and a DS port in 2006, and AFAIK the PS1 version has been and still is available for purchase on PSN for PS3, PSP and Vita since 2009, and for PS4/PS5 since 2022. That's not even counting all the sequels and movies. And the PC port is generally considered not the best.

There are better examples, like No One Lives Forever, that have been stuck unavailable for purchase because of rights issues, but RE1 is arguably not one of them.

