Not the OP, but the Beeper client looks better and has native GIF support, among other niceties that more "normal" users would prefer over other clients.
> With Meta’s Messenger application for macOS being so close to the Texts.com model—that being a standalone desktop application—Batuhan İçöz who is leading the Meta platform project at Texts.com thought we could gain some valuable insight by analyzing it.
I wasn't able to respond to any other comment in your history, because pretty much everything you post is dead. You don't seem to respond to the obvious feedback from that, but somebody needs to recommend that you stop making flippant, dismissive, and generally worthless comments, because I've never seen an account as deeply downvoted on HN as yours.
Did you read the article? It's about what happens in negative numbers. Of course everyone agrees that 7/2=3 in integer division, but -7/2 is less obvious.
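To make the ambiguity concrete, here's a small Python sketch. Python floors integer division (rounds toward negative infinity) while C-family languages truncate (round toward zero), so the two conventions only disagree when the operands have different signs:

```python
# Two common conventions for integer division, illustrated on -7 / 2.
# The exact quotient is -3.5; flooring gives -4, truncating gives -3.
import math

print(-7 // 2)             # Python's floor division -> -4
print(math.trunc(-7 / 2))  # C-style truncation      -> -3
print(7 // 2)              # both conventions agree on positives -> 3

# The matching remainders differ too, but both satisfy a == q*b + r:
print(-7 % 2)              # floored remainder   ->  1   (-4*2 + 1  == -7)
print(math.fmod(-7, 2))    # truncated remainder -> -1.0 (-3*2 + -1 == -7)
```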
People soothe their pain and trauma with all sorts of self-destructive behavior. Trying to numb the pain doesn’t mean they are trying to crawl back in the womb.
That wording is exceptionally poor in my opinion. Just above, @blueprint does a fabulous job of summarizing what I think is essentially the same idea (but with far better wording):
"...the problem is that for some people, addiction is connected with:
* using an external means of dissociation
* to obtain instant control over feelings
* without relating to them on their own terms"
Someone without serious trauma likely won't have the same reaction to this level of dissociation.
I'll take this seriously since lots of people probably wonder this even if they don't bother to ask it.
Disability isn't a permanent state that you start with. It's something that can happen to you 5 years into your career, or 15. It can also be temporary - you break your leg and now you need crutches, a cane or a wheelchair until you heal, for example.
Accessibility also helps people who you wouldn't traditionally classify as disabled: designing a UI to be usable one-handed is obviously good for people who have one hand, but some people may be temporarily or situationally one-handed. Not just because they broke an arm and it's in a cast, but perhaps they have to hold a baby in one arm, or their other hand is holding a grocery bag, or they're lying in bed on their side.
Closed captions in multimedia software or content are obviously helpful for the deaf, but people who are in a loud nightclub or on a loud construction site could also benefit from captions, even if their ears work fine.
So, ultimately: Why should someone who's used to using a given editor have to switch any time their circumstances change? The developers of the editor could just put the effort in to begin with.
Not to defend GP, but if I suddenly went blind, I really don't know if it would take longer to learn how to use my existing tools with a screen reader or to learn new tools better designed for it. It would be a completely new and foreign workflow either way.
This is not about what tools you want to use, but what tools you're forced to use by your team.
If this were a simple, offline editor, a decision not to focus on accessibility would be far easier to swallow. But they seem to be heavily promoting their collaboration features. If those on your team collaborate using Zed and expect you to do the same, other tools aren't an option.
Have you considered that disability is not always permanent? What if you were temporarily blind? Or could only see magnified or high-contrast UIs? Or you broke both arms, but your feet are fine for using your USB driving-sim pedals as an input device for 10 weeks while your arms heal? Would you still want to learn entirely new workflows for just a few months?
A11y isn't about helping one set of users (those who have completely lost their sight); it's about helping a whole spectrum of accessibility challenges - not by prescribing boxed solutions, but by giving users options customizable to their specific needs.
Learning new tools and workflows isn't worth the time if they'll only be used for a limited period. It's much better to stick with the familiar tools in those cases.
* ramps (used by parents with babies in their strollers)
* subtitles (by people learning languages or in loud environments)
* audio description (by truck drivers who want to watch Netflix but can't look at the screen)
* audiobooks (initially designed for the blind, later picked up by the mainstream market)
* OCR (same story)
* text-to-speech, speech-to-text, and voice assistants (same story again)
* talking elevators (because it turns out they're actually convenient)
* accessibility labels on buttons (used in end-to-end testing, because they change far less often than CSS classes)

I could go on for hours.
For user interfaces specifically, programmatic access is also used by automation tools like AutoIt or AutoHotkey, testing frameworks (there's no way to do end-to-end testing without this), and sometimes even scrapers and ad blockers.
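A toy sketch of why test automation leans on accessibility attributes (the widget tree and field names here are made up for illustration; real frameworks like Playwright or Appium query a similar tree exposed through the platform's accessibility APIs):

```python
# Hypothetical widget tree, as an accessibility API might expose it.
widgets = [
    {"role": "button", "label": "Save", "css_class": "btn btn-primary v2-redesign"},
    {"role": "textbox", "label": "Search", "css_class": "input-xl dark-theme"},
]

def find_by_label(tree, role, label):
    """Locate a widget the way a screen reader or test runner would:
    by semantic role and accessible name, not by styling hooks."""
    return next(w for w in tree if w["role"] == role and w["label"] == label)

# A visual redesign churns css_class constantly but leaves the accessible
# name alone, so this locator keeps working across redesigns:
save = find_by_label(widgets, "button", "Save")
print(save["label"])  # -> Save
```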
Getting ratioed because people think “my app must follow accessibility for… oh.. uh… because big company does so we must do same ooga booga smoothbrain incapable of critical thinking”
Waste of time unless you are aiming your application AT people who use screen readers e.g. medical or public sector. EVEN THEN has anyone actually tried? Even accessible websites are garbage.
I'm guessing this was downvoted for being rude, but I think there is a valid question here. It looks like Zed is putting a lot of work into minimizing the latency between a key being typed and feedback being displayed on a visual interface which is easily parsed by a sighted user.
If a programmer is using audio for feedback, then there is probably some impedance mismatch in translating a visual interface into an audio description. Shouldn't there be a much better audio encoding of the document? There would also be many wasted cycles pushing around pixels the programmer will never see. An editor made specifically for visually impaired programmers, unencumbered by the constraints of a visual representation, would be able to explore the solution space much better than Zed.
This has been tried in Emacspeak[1] and doesn't work that well in practice. I'm in the blind community and know plenty of blind programmers, none of whom seriously use Emacspeak. VS Code is all the rage now, and for good reason: their accessibility story is excellent, and they even have audio cues for important actions (like focusing on a line with an error) now.
For instance, building up the DAG of all the build steps and then scheduling things so that various parts can be built in parallel. Buildkit does a lot that we use beyond just being a way to build things inside a runc container.
That said, we have a fork of buildkit for the various things we add that don't fit well in the upstream.
Our auto-skip feature and our branching are already implemented on top of Buildkit rather than within it. As we grow and add more build-centric features, we will probably continue to diverge.
I'm just a DevRel person though, so that's just my 2 cents. The core team may disagree with me.
I am considering writing a small library / framework to optimize scheduled tasks by building a DAG and inferring which ones are safe to run in parallel and which aren't, so this could be useful.