
A 320x200 6-bit color depth wasn't exactly a pleasure to use. I think games could double the res in a certain mode (was it called 13h?)
For OCS/ECS hardware, 2-bit HiRes (640x256 or 640x200 depending on region) was the default resolution for the OS, and you could add interlacing or raise the color depth to 3 or 4 bits at the cost of response lag; starting with OS 2.0 the resolution setting was basically limited only by chip memory and whatever your output device could actually display. I got my 1200 to display a crisp 1440x550 on my LCD just by sliding the screen parameters to max in the default display driver.
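
Opening one of those screens under OS 2.0+ looked roughly like the sketch below, using intuition.library's OpenScreenTags. Written from memory, so treat the tag details as approximate:

    /* Minimal sketch: open a 640x256, 4-bitplane (16-color) HiRes
       screen on AmigaOS 2.0+. Error handling trimmed for brevity. */
    #include <exec/types.h>
    #include <intuition/intuition.h>
    #include <graphics/modeid.h>
    #include <proto/exec.h>
    #include <proto/intuition.h>
    #include <proto/dos.h>

    struct Library *IntuitionBase;

    int main(void)
    {
        struct Screen *scr;

        IntuitionBase = OpenLibrary("intuition.library", 37);
        if (!IntuitionBase)
            return 20;

        scr = OpenScreenTags(NULL,
            SA_Width,     640,
            SA_Height,    256,
            SA_Depth,     4,            /* 2^4 = 16 colors */
            SA_DisplayID, HIRES_KEY,    /* OCS/ECS HiRes, non-laced */
            SA_Title,     (ULONG)"HiRes test",
            TAG_DONE);

        if (scr)
        {
            Delay(250);                 /* dos.library Delay: ~5 s */
            CloseScreen(scr);
        }

        CloseLibrary(IntuitionBase);
        return 0;
    }

Swap SA_DisplayID for an interlaced or promoted mode ID and the same call scales up to the kind of oversized screens described above.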

Games used either 320- or 640-wide resolutions, at 4 bits or in the fake 6-bit mode known as Extra Half-Brite, which was basically 5-bit with the other 32 colors being the same palette at half brightness. The fabled HAM mode, with its 4096 colors from a 12-bit space, was also used, even in some games and for interactive content, but not often.
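
To make EHB and HAM concrete, here is a rough decoder sketch turning those 6-bit pixel values into the 12-bit RGB of the OCS color registers. The function names are mine and it's untested, but the bit layout matches the hardware docs as I remember them:

    #include <stdint.h>

    /* Colors are 12-bit 0x0RGB words, as in OCS color registers. */

    /* Extra Half-Brite: 64 "colors" from 32 registers. Values 32-63
       reuse register (v - 32) with every channel halved. */
    static uint16_t ehb_color(const uint16_t palette[32], uint8_t v)
    {
        uint16_t c = palette[v & 31];
        if (v & 32)                /* upper half of the range */
            c = (c >> 1) & 0x0777; /* halve each 4-bit channel */
        return c;
    }

    /* HAM6: bits 5-4 are a control code. 00 = load one of the first
       16 palette entries; otherwise hold the previous pixel's color
       and replace one channel with bits 3-0:
       01 = blue, 10 = red, 11 = green. */
    static void ham6_scanline(const uint16_t palette[16],
                              const uint8_t *px, uint16_t *out, int w)
    {
        uint16_t c = palette[0];   /* left edge starts from color 0 */
        int x;
        for (x = 0; x < w; x++) {
            uint8_t d = px[x] & 0x0F;
            switch (px[x] >> 4) {
            case 0: c = palette[d];                         break;
            case 1: c = (c & 0xFF0) | d;        /* blue  */ break;
            case 2: c = (c & 0x0FF) | (d << 8); /* red   */ break;
            case 3: c = (c & 0xF0F) | (d << 4); /* green */ break;
            }
            out[x] = c;
        }
    }

You can see HAM's famous weakness right in the loop: changing more than one channel between neighboring pixels takes several pixels to complete, which is where the characteristic color fringing comes from.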


You might be thinking of DOS mode 13h, which was VGA 320x200, 8 bits per pixel.

And 6 bits per colour component.

The VGA color palette was 18-bit (256K colors), 6 bits per channel, even though the pixel value indexing into it was 8-bit. DAC components run 0-63, so if you copy them into an 8-bit-per-channel format without scaling, (63,63,63) comes out as a dark gray, visibly different from (255,255,255).
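
If anyone wants to relive it, here's a minimal mode 13h sketch in Borland-style DOS C: set the mode via int 10h, program a DAC register through ports 3C8h/3C9h (note the 0-63 range), and poke pixels straight into the A000h framebuffer. From memory, so a sketch rather than gospel:

    /* Minimal mode 13h demo for Borland Turbo C (real-mode DOS).
       DAC components are 6-bit (0-63); scale up when exporting to
       8-bit-per-channel formats: v8 = (v6 << 2) | (v6 >> 4). */
    #include <dos.h>
    #include <conio.h>

    static unsigned char far *vga =
        (unsigned char far *)MK_FP(0xA000, 0);

    static void set_mode(unsigned char mode)
    {
        union REGS r;
        r.h.ah = 0x00;            /* int 10h, AH=00h: set video mode */
        r.h.al = mode;
        int86(0x10, &r, &r);
    }

    static void set_dac(unsigned char idx, unsigned char r6,
                        unsigned char g6, unsigned char b6)
    {
        outportb(0x3C8, idx);     /* select DAC register to write */
        outportb(0x3C9, r6);      /* each component is 0-63 */
        outportb(0x3C9, g6);
        outportb(0x3C9, b6);
    }

    int main(void)
    {
        int x;
        set_mode(0x13);           /* 320x200, 256 colors */
        set_dac(1, 63, 63, 63);   /* color 1 = full white on the DAC */
        for (x = 0; x < 320; x++) /* horizontal line at y = 100 */
            vga[100 * 320 + x] = 1;
        getch();
        set_mode(0x03);           /* back to 80x25 text mode */
        return 0;
    }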

http://qzx.com/pc-gpe/tut2.txt

http://qzx.com/pc-gpe/


I remember playing with mode 13h, writing little graphics programs with my Turbo C compiler. Computers were so magical back then.


