The counterintuitive part of the "server" terminology is that, with X, your end-user workstation (which may be an incredibly dumb X terminal) is the "display server": you remote into a server (in the traditional sense) elsewhere, and that machine acts as an X client, making requests to your local machine to display windows.
The way most people think about it, "client" is your local machine and "server" is the remote machine that has lots of applications and is potentially multi-user, but X turns that backwards. The big iron is the client and the relatively dumb terminal is the server.
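If it helps to see the inversion in code, here's a minimal sketch of an Xlib client (the hostname is made up): the program runs on the remote "big iron", and whatever machine DISPLAY points at is the server that actually draws.

    /* Minimal X client. Run it on the remote machine with e.g.
     *   DISPLAY=myterminal:0 ./client
     * and the window appears on "myterminal" -- the display server.
     * Build: cc client.c -o client -lX11 */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        /* NULL = use the DISPLAY environment variable. */
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
            fprintf(stderr, "cannot connect to display server\n");
            return 1;
        }
        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         10, 10, 200, 100, 1,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, KeyPressMask);
        XMapWindow(dpy, win);  /* a request TO the server to show the window */

        XEvent ev;
        do {                   /* events come back FROM the server */
            XNextEvent(dpy, &ev);
        } while (ev.type != KeyPress);

        XCloseDisplay(dpy);
        return 0;
    }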
I think the confusion is obvious, given a little empathy for the range of people who use computers.
The server is usually a remote machine, especially back when "client-server" architecture was emerging in the mainstream (business) vernacular.
Please don't imagine that I don't fully understand this.
Nevertheless, X11 "server" and "client" have confused very smart and highly technical people. I have had the entertainment of explaining it dozens of times, though rarely recently.
And honestly, still, a server is usually a remote machine in all common usage. When "the server's down", it is usually not a problem on your local machine.
Yes, it’s simultaneously logical if you look at how it works and immensely strange if you don’t understand the architecture. (As has been noted all the way back to the UNIX-HATERS Handbook[1], although, pace 'DonHopkins, the NeWS Book uses the same terminology—possibly because it was written late enough to contain promises of X11/NeWS.)
There is nothing at all strange about the terminology. Go run ps on macOS and marvel at the "WindowServer" process. The generic architectural term is "display server".
If you are not going to implement X11 drawing ops and XRender (which I, and many others, still use heavily), what's even the point? Any 'modern' program that only does client-side rendering already supports Wayland. AFAIK GTK 3 doesn't even support DRI on X11 unless you somehow trick it into using the abandoned OpenGL Cairo backend, but that's not modern enough apparently.
It talks about trimming 'legacy' features and specifically says they are omitting 'font-related' operations. That obviously means no useful core X11 application will work (unless you count xlogo and xeyes). Whether the XRender glyph cache mechanism is included is unclear. It also says only DRI is *currently* supported, but maybe that's incidental?
XRender isn't part of the core protocol, so it could still be implemented in the future; there is already some XRender code in there. Almost no applications use core X11 protocol fonts except for the cursor, since core fonts (on the X.Org server, at least) are rendered without anti-aliasing and don't really support Unicode.
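For anyone unfamiliar with what that entails, basic XRender usage looks roughly like this (a sketch only, error handling omitted; build with -lX11 -lXrender): the client wraps a drawable in a Picture and issues render requests against it.

    #include <X11/Xlib.h>
    #include <X11/extensions/Xrender.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         0, 0, 200, 200, 0, 0,
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        /* Wrap the window in a Picture using its visual's format. */
        XRenderPictFormat *fmt =
            XRenderFindVisualFormat(dpy, DefaultVisual(dpy, scr));
        Picture pict = XRenderCreatePicture(dpy, win, fmt, 0, NULL);

        /* Channels are 16-bit; this is opaque red. */
        XRenderColor red = { 0xffff, 0x0000, 0x0000, 0xffff };

        XEvent ev;
        for (;;) {
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)   /* the fill happens server-side */
                XRenderFillRectangle(dpy, PictOpSrc, pict, &red,
                                     20, 20, 160, 160);
            if (ev.type == KeyPress) break;
        }
        XCloseDisplay(dpy);
        return 0;
    }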
Core fonts absolutely support Unicode. My (non-Xft) xterm windows are full of Unicode characters right now. It is true that anti-aliasing is not supported by the X.Org server, although scalable fonts have been supported for a while (https://www.x.org/archive/X11R7.5/doc/fonts/fonts.html#AEN49...). But you don't need anti-aliasing on a high-DPI display, and on a low-DPI display you can use any of the many beautiful bitmap fonts, unlike a lot of ‘modern apps’ these days.
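To make the Unicode point concrete, here's a rough sketch of core-protocol text drawing with an ISO 10646-encoded bitmap font (the XLFD below is the stock misc-fixed font; substitute another iso10646-1 font if your server lacks it):

    /* Core-protocol text with a Unicode (ISO 10646) bitmap font.
     * Build: cc corefont.c -o corefont -lX11 */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) return 1;
        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         0, 0, 240, 60, 0,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        XFontStruct *font = XLoadQueryFont(dpy,
            "-misc-fixed-medium-r-normal--13-120-75-75-c-60-iso10646-1");
        if (!font) { fprintf(stderr, "font not found\n"); return 1; }

        GC gc = XCreateGC(dpy, win, 0, NULL);
        XSetFont(dpy, gc, font->fid);
        XSetForeground(dpy, gc, BlackPixel(dpy, scr));

        /* "Жλ→" as big-endian 16-bit code points (XChar2b):
         * U+0416, U+03BB, U+2192. */
        XChar2b text[] = { {0x04, 0x16}, {0x03, 0xbb}, {0x21, 0x92} };

        XEvent ev;
        for (;;) {
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)
                XDrawString16(dpy, win, gc, 20, 30, text, 3);
            if (ev.type == KeyPress) break;
        }
        XCloseDisplay(dpy);
        return 0;
    }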
>If an English word looks similar to a Russian word, then the stress is likely on a _different_ syllable
Most of these are Latin and French loanwords where Russian (like German, for example) carried the accentuation over from the source language. English is the odd one out: it insists on putting the primary stress on one of the first two syllables, except in some recent loans (and even those get a secondary stress). With nouns, the preference is for the first syllable. Russian surnames get similarly butchered, notably including Nabokov, which could have been adopted unchanged.
>the only way to "fix" the birth rate is to reject humanity (education, urbanization, technology) and retvrn to monke (subsistence farming, arranged marriages, illiteracy, superstition), which no civilized country will ever do.
They won't do it willingly. That just means it will happen without their input.
sure, they could, hypothetically, close the borders and begin a campaign of forced insemination, but those babies would have no fathers to provide for them, and the state - any state - really resents footing the bill for child rearing, going as far as forcing victims of infidelity, fraud, or rape to pay child support. the state - any state - wants to give you as little as possible and to take as much as possible from you, for the delta between giving and receiving is its lifeblood.
the ideal family has two full-time working parents, paying a mortgage and car loans, consuming as many high-margin domestic products as possible, rearing as many children (future laborers and consumers) as possible, with little to no assistance from the state. and you simply can't have that by force. if you could, you might as well drop the pretense and openly treat your population as slaves.
I’m actually happy about DRAM prices and hope more people share your mindset. This is the only thing that can force developers to start optimizing memory usage instead of externalizing the costs onto the poorest users.
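As a toy illustration of the sort of optimization that cheap RAM lets people skip (made-up struct, typical 64-bit ABI assumed): merely reordering fields can cut a hot record's size by a third.

    #include <stdio.h>
    #include <stdint.h>

    /* Padded: the compiler inserts 7 bytes after `flag` and 4 after
     * `id` to align the 8-byte member -> 24 bytes on typical LP64. */
    struct wasteful {
        uint8_t  flag;
        uint64_t timestamp;
        uint32_t id;
    };

    /* Same fields, largest first -> 16 bytes. Across millions of
     * records, that's a third of the memory back. */
    struct packed {
        uint64_t timestamp;
        uint32_t id;
        uint8_t  flag;
    };

    int main(void) {
        printf("wasteful: %zu bytes\n", sizeof(struct wasteful)); /* 24 */
        printf("packed:   %zu bytes\n", sizeof(struct packed));   /* 16 */
        return 0;
    }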
I sincerely hope it works out this way instead of pricing out open-source development. A couple of open-source projects have already changed their licensing to help mitigate the burden of skyrocketing hardware costs. It'll be a sad and potentially dangerous day if most people are permanently priced out.
It's absolutely possible to be mistaken about this. The placebo effect is very strong. I'm sure there are countless things in my own workflow that feel like a huge boon to me while being a wash at best in reality. The classic keyboard vs. mouse study comes to mind: https://news.ycombinator.com/item?id=2657135
This is why it's so important to have data. So far I have not seen any evidence of a 'Cambrian explosion' or 'industrial revolution' in software.
'It's $CURRENTYEAR' is just a cheap FOMO tactic. We've been hearing these anecdotes for multiple current years now. Where is this less buggy software? Does it just happen to never reach users?
The anti-AI ‘movement’ is a minority like all partisans are a minority. You shouldn’t be comparing them to passive consumers but to enthusiasts who actively demand ‘AI’ in their browser/Paint/Notepad.
We don't really have reasonable PMs, though. Or rather, they are being paid to be unreasonable. They are ignoring everyone because the CEO five levels up wants it.
And then others wonder why customers are frustrated.
It’s easy to bash Mozilla because it is failing. Their usage share is a statistical error, and most of it comes from being shipped with Ubuntu. Firefox badly needs a value proposition beyond not being Chromium-based.
I agree, but there's nothing more frustrating than yet another niche user group imagining that the reason for this failure is Mozilla failing to address their obscure requests, while Mozilla's real goal is to create a browser for everyone. The truth is that this goal is borderline impossible, and all these double standards (I can't count the times I've heard "I'm tired of Firefox, moving to Chrome!") surely aren't helping.