
UX was a field in the 1990s, when it was at its height. We still have designers, but most software houses closed their academic UI/UX research groups and now just hire people who make things that look attractive.

If you've recently tried to teach a computer-illiterate person to do something, you'll know what I mean. No consistency, no internal logic or rules, just pretty stuff that you simply have to know.



Windows 95, of all things, was actually a good example of a company doing 'proper' research driving UI work.

Btw, I loathe the term UX, because 'interface' (in UI) should already be a big enough term to not just mean pretty graphics, but the whole _interface_ between user and program. But such is the euphemism treadmill.


I remember studying this, and the difference was not only about interaction but about the general impact on the user.

I always found MacOS Finder's spatial file placement a good example (for non-MacOS users: Finder has this thing where it remembers window positions and the placement of files within a window, so one can arrange files as they please and they stick). If that feature were removed, the UI would stay the same (there are file icons, some windows, the same layout), but the feature does remove some of the cognitive load.

UX is impacted by many non-UI things: load times, responsiveness to input, reliability (hello, dreaded printer dialogs that promise to print but never will).

An anti-pattern I hate with a passion is the MacOS update bar. I want to do some work in the morning, I open my computer and it's friggin' updating. This sucks, but it happens; we got forced into this. And then there's this progress bar that jumps: 20%, 80%, 50%, 30%, 90%. A colleague asks when I'm going to be online - "oh, 10% left, probably soon" - ding - the progress bar drops back to 30%.

The UI is the same from an observer's point of view (it shows the progress, which I suppose is correct and takes multiple update phases into consideration), but the UX is dropping the ball here.
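
For what it's worth, that kind of bar doesn't have to move backwards. Here's a minimal sketch (Python, with invented phase names and weights, not how Apple's updater actually works) of folding several update phases into one overall value that is clamped so it only ever moves forward:

    # Hypothetical sketch: combine several update phases into one overall
    # progress value that never moves backwards. Phase names and weights
    # are invented for illustration.
    class OverallProgress:
        def __init__(self, phase_weights):
            # e.g. {"download": 0.3, "prepare": 0.2, "install": 0.5}
            self.weights = phase_weights
            self.done = {name: 0.0 for name in phase_weights}
            self.shown = 0.0  # last value reported to the UI

        def update(self, phase, fraction):
            """Record progress (0.0-1.0) for one phase, return what to display."""
            self.done[phase] = max(self.done[phase], min(fraction, 1.0))
            raw = sum(self.weights[p] * self.done[p] for p in self.weights)
            # Clamp so the bar the user sees only ever moves forward.
            self.shown = max(self.shown, raw)
            return self.shown

    bar = OverallProgress({"download": 0.3, "prepare": 0.2, "install": 0.5})
    print(bar.update("download", 0.5))  # 0.15
    print(bar.update("install", 0.2))   # 0.25
    print(bar.update("download", 0.1))  # still 0.25, never drops

The jumpy behaviour is a presentation choice, not a law of nature.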


'Interface' used to encompass all these things.

See https://news.ycombinator.com/item?id=43396140


OSX has had the striped progress bar for hard-to-estimate processes for as long as I can remember. Did they do away with it?

There are situations where I don't exactly care how far something has progressed, but I want to see that it at least has not hung. Fedora's dnf doing SELinux autorelabeling for two hours without any indication of progress is one of those things I hate with a passion.


There's still a progress bar on update, and there's a timer (but not always).

The timer also jumps. Once I had a ~40-minute update that kept hope-feeding me with "2 minutes left" for most of the time.

My guess is that it's not worth optimizing, but nowadays I shy away from updates if I don't have a 2h time buffer (not because I'm afraid something will break, but because I know I'll be locked out).


Their update timer always starts from 29 minutes remaining and goes from there, IIRC, and I find that it's way more accurate than Windows' 99% of the time.

Funnily, Linux (KDE) has gotten very good at its estimations for some time now. Better-behaving storage also plays a role, I presume.


The only real indication of progress being made is a log output of steps completed. All a spinner or similar indicator tells you is that the UI thread is not hung, but that isn't really useful information.


In the 90's I had this idea that in the future, completed steps would be reported to a server so that progress could be calculated for other users. Like, on system A downloading step 1 takes 1 minute and step 2 takes 3 minutes. If on system B step 1 takes 1.5 minutes, step 2 should take 4.5. Do the same for stuff that requires processing.

But we apparently chose to make things complicated in other ways.
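
That idea is simple enough to sketch. Assuming per-step durations from some reference system were available (all names and numbers below are made up), the calibration is just a ratio:

    # Rough sketch of the idea above: estimate remaining time by scaling
    # reference step durations (measured on another system) by how much
    # slower or faster this machine has been so far. Everything is invented.
    def estimate_remaining(reference_durations, measured_durations):
        """reference_durations: per-step times from a reference system.
        measured_durations: times for the steps this machine has finished."""
        n_done = len(measured_durations)
        if n_done == 0:
            return sum(reference_durations)  # no calibration data yet
        ratio = sum(measured_durations) / sum(reference_durations[:n_done])
        return ratio * sum(reference_durations[n_done:])

    # Reference system: step 1 took 60 s, step 2 took 180 s.
    # This system took 90 s for step 1, so it is 1.5x slower:
    print(estimate_remaining([60, 180], [90]))  # 270.0 s, i.e. 4.5 minutes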


Obviously, in the ideal case the indicator animation would be tied to something else, like a background process sending output, and there would be a textual description next to it.


Scrolling logs scare common people. I don't know why.


You can turn off auto update. I update my Mac when I can, not when it decides.


Bonus: if you do that you don't have to deal with disabling new Apple Intelligence "features" every time.


Spatial file placement is probably not even good on the desktop. In a file manager it's definitely an anti-pattern.


I think on earlier Windows (95, maybe?) opening a folder would also always open a new Explorer window, so you had the impression that the window actually was the folder you were opening. Whereas today we're more used to the browsing metaphor, where the window "chrome" is separate from the content. I also don't think the spatial metaphor is useful today, but it probably made more sense back then.


People get way into this desktop metaphor.

A window is a program window, not an actual window. A folder is not the same as a folder in a filing cabinet, and a save icon is a save icon, not a floppy disk. They don't have to stand for or emulate physical things.


Historically, it's both, which is how we got here.

The Xerox demo was definitely trying to make near-as-possible 1-to-1 correspondences because their entire approach was "discovery is easier if the UI abstractions are couched in known physical abstractions." UIs that hewed very closely to the original Xerox demo did things like pop open one window per folder (because when you open a folder and there's a folder inside, you still have the original folder).

As time went on and users became more comfortable with computerized abstractions in general, much of that fluff fell by the wayside. Classic Mac OS System 7, for instance, would open one window per double-click by default; modern desktop MacOS opens a folder into the current window, with the option to command-double-click it to open it in its own... tab? (Thanks, browsers; you made users comfortable enough with tabbed windows that this is a metaphor the file system browser can adopt!)


I had my folders themed on win 95. It is kinda hard to explain but the color schemes and images trigger a lot of mental background processes related to the stuff in the folder. Just seeing a green grid on a black background would load a cached version of the folder in my head and alt-tab into a linked mental process that would continue where I left it.


I think we need more visual cues for common operations to give more assurance and reinforce the action. For example, I was recently trying to back up some photos from an Android phone by plugging it into a Windows machine and copying files over. I already had an older version copied from before, and I was surprised that the copy action resulted in the same number of files after I selected "skip" in the dialog. What probably happened is that I tried to copy from Windows to Android by mistake. With everything looking the same, it's easy to miss things and have the wrong mental model of what is about to happen. It would be great to have more feedback for actions like this: maybe show the full paths, show the disk/device icons with a big fat arrow for the copy direction, or something. Basically the copy/move dialog is the same for 10 files and 10,000 files, the same for copying between devices and within a folder... and it will happily overwrite files if you click the wrong option by mistake. And unlike trashing files, I'm not sure it's possible to undo the action.
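
Even something as blunt as spelling out the direction before anything happens would help. A purely illustrative sketch (device names, paths and counts are invented, not any real file manager's API):

    # Illustrative only: a copy confirmation that spells out the direction,
    # devices, full paths and file counts before anything happens.
    def describe_copy(src_device, src_path, dst_device, dst_path,
                      n_files, n_conflicts):
        return (
            f"Copy {n_files} files\n"
            f"  FROM  [{src_device}]  {src_path}\n"
            f"  -->   [{dst_device}]  {dst_path}\n"
            f"  {n_conflicts} files already exist at the destination "
            f"and would be replaced."
        )

    print(describe_copy("Pixel phone", "/DCIM/Camera",
                        "PC", r"D:\Backups\Photos",
                        10432, 8120))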


"Experience" is more than just "interface". E.g. which actions are lightning-fast, and which are painfully slow is an important part of user experience, even if the UI is exactly the same. Performance and space limitations, things like limited / unlimited undo, interoperability with other software / supported data formats, etc are all important parts of UX that are not UI.


UI, where the I stands for "interface" just like in HCI, used to mean all those things.

But in the industry the focus turned to aesthetics, so a new term was invented to differentiate between focusing on the entire interface ("experience") vs just the look.

Just like "design" encompasses all of it, but we add qualifiers to ensure it's not misunderstood for "pretty".


And that has happened again. Changing the colours is "improving UX".


Thing is: changing the colours _could_ be improving the UX.

Eg I'm colourblind, and a careful revision of a colour scheme can make my life easier. (Though I would suggest also using other attributes to differentiate, like size, placement, texture, saturation, brightness, etc.)
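
As a concrete sketch of the "more than colour" point, here's a small matplotlib example (made-up data) that distinguishes series by linestyle and marker in addition to colour:

    # Differentiate series by linestyle and marker as well as colour,
    # which helps colourblind readers. Data and labels are made up.
    import matplotlib.pyplot as plt

    x = list(range(10))
    series = {
        "plan":   ([v * 1.0 for v in x], "solid",  "o"),
        "actual": ([v * 0.8 for v in x], "dashed", "s"),
        "target": ([v * 1.2 for v in x], "dotted", "^"),
    }
    fig, ax = plt.subplots()
    for name, (y, style, marker) in series.items():
        ax.plot(x, y, linestyle=style, marker=marker, label=name)
    ax.legend()
    plt.show()

The same idea applies in UIs: status shown by icon shape plus colour, not colour alone.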


> "Experience" is more than just "interface".

UX has become equivalent with crap. Give me back GUI.


To make a simile with books, to me the UI is the writing and the UX is the story plus how it's ingested via the writing.


That’s a good example for showing how “UI” and “UX” are essentially the same thing. At least in a practical context.

We can call an excellent story teller a “writer”. A good story can be described as “good writing”. A great story, let’s say a film being adapted as a book, can become a terrible book if it is “let down by the writing”.

In the context of books and storytelling, “writing” is the all-encompassing word that experts use to describe the whole thing. Just like “UI” used to mean the whole thing.


But UX is bigger than UI. Good UX might simplify some use case so that user might not need to see any UI at all.


The thing with not well-defined names is that they're open to interpretation. To me, the difference between UX and UI is on a completely different axis.

When I was at university, I attended a UI class which - although in the CS department - was taught by a senior psychologist. Here, the premise was very much on how to design interfaces in such a way that the user can intuitively operate a system with minimal error. That is, the design should enable the user to work with the system optimally.

I only heard the term UX much later, and when I first became aware of it, it seemed to be much less about designing for use and more about designing for feel. That is, the user should walk away from a system saying "that was quite enjoyable".

And these two concepts are, of course, not entirely orthogonal. For instance, you can hardly enjoy using a system when you just don't seem to get the damn thing to do what you want. But they still have different focuses.

If I had to put in a nutshell how I conceptualize the two disciplines, it would be "UI: psychology; UX: graphic design".

And of course such a simplification will create an outcry if your conceptualization is completely different. But that just takes us back to my very first sentence: not well-defined names are open to interpretation.


Thanks for sharing!

> Here, the premise was very much on how to design interfaces in such a way that the user can intuitively operate a system with minimal error.

Yes, that's a good default goal for most software, but not always appropriate.

Eg for safety critical equipment to be used only by trained professionals (think airplane controls or nuclear power plant controls) you'd put a lot more emphasis on 'minimal error' than on 'intuitive'.

We can also learn a lot from how games interact with their users. Most games want their interface to be a joy to use and easy to learn. So they are a good example of what you normally want to do!

But for some select few having a clunky interface is part of the point. 'Her Story' might be an interesting example of that: the game has you searching through a video database, and it's only a game, because that search feature is horribly broken.


That is still the man-machine interface.

UX is just a weaselly sales term, "Our product is not some mere (sneers) interface, no, over here it is a whole experience, you want an experience don't you?"


I wouldn't be so harsh.

It's just the euphemism treadmill. Just like people perennially come up with new technical terms for the not-so-smart that are meant to be purely technical and inoffensive, and over time they always become offensive, so someone has to come up with new technical terms.

See eg https://en.wikipedia.org/wiki/Idiot

> 'Idiot' was formerly a technical term in legal and psychiatric contexts for some kinds of profound intellectual disability where the mental age is two years or less, and the person cannot guard themself against common physical dangers. The term was gradually replaced by 'profound mental retardation', which has since been replaced by other terms.[1] Along with terms like moron, imbecile, retard and cretin, its use to describe people with mental disabilities is considered archaic and offensive.[2]


I once upon a time coined the term "scientific physics". UX is not progress; it is the astrology of UI design. The UI exists between the silicon and the wetware computer as a means to interface the two. UX aims to modify the human and invade their state of mind. Doom scrolling is an example of great UX. Interact vs subdue. I want to experience the meaning of the email, not the email application.


I don't think it's weaselly: it's not the first term that has lost its original meaning (like "hacker" or, ahem, "cloud") and required introducing specifiers to go back to the original meaning.


For fun, I did a search for "user interface" before:1996-06-01 .

I found a paper that was definitely taking the perspective that the "user interface" encompasses all the ways in which the user can accomplish something via the software. It rated the effectiveness of a user interface in terms of the time taken to complete various specific tasks. (While remarking that other metrics matter to the concept too, and also measuring user satisfaction and error rates.)

But that paper also suggested how the term might have specialized - four pieces of software were studied, and they are presented in a table that gives their "interface technology", in two cases a "character-based interface" and in the other two a "graphical user interface".

Enough usage like that and you can see how "interface" might come to mean "what the user interacts with" as opposed to "how tasks are performed".

( https://www.nngroup.com/articles/iterative-design/ . It really is dated 1993, which I made a point of checking because Google assigns the "date" of a search result based on textual analysis, and it is frequently very badly wrong. I can't really slam the approach, which I assume was necessary to get the right answer here, but the implementation isn't working.)


See my above comment: UI used to mean all of those and then it became just "pretty", so a new term was invented.


UX includes the possibility that the software will be actively influencing the user, rather than merely acting as a tool to be used. (websites selling you stuff versus a utilitarian desktop app).


> Good UX might simplify some use case so that user might not need to see any UI at all.

Yeah, just look at Windows {10,11} and Android. They simplified so much that it's unusable.


UX, as a term, didn't really exist in the 1990s: https://books.google.com/ngrams/graph?content=user+interface...

That's consistent with your timeline of the decline of UI/UX though. My sense is that the birth of the term UX marked the beginning of the decline because it meant redefining the term UI as being purely about aesthetics, implying that no one was paying attention to all of the non-aesthetic work that had previously been done in the field.


The term didn't really exist, but user experience was a thing. I took a human-computer interface class in college about designing good UIs. At my first job out of college in 1996, I got permission from my boss and the boss of the corporate trust folks to go sit with a few of my users for half a day and see how they used the software I was going to fix bugs in and add features to. Apparently, no one had done that before. The users were so happy when I suggested and implemented a few things that shaved 20 minutes of busy work off their day, things that weren't on their request list because they hadn't thought they could be done.


UX was Ergonomics back then, but the current term also implies some "desire" to return to the application, a tint of marketing maybe?


UX was Human Factors Engineering, Usability research, and library science. UX was the rebranded label after the visual designers took over everything.


I remember it as "human-machine interaction" and "HMI design" or "interaction design". It was mostly about positioning interface elements, clear iconography, and workflows with as few surprises and opportunities for errors as possible. In industrial design, esp. for SCADA, it is often still called HMI.


Yeah, if you wanted to study usability (or what we call UX today), you'd take the ergonomics course, and there'd be usability classes. So you'd learn about how to sit at a desk, how to design a remote control, and where to put the buttons in an application.

It does seem a bit weird, but I feel like this bigger picture is what a lot of today's design lacks.


I have a guy at work who does most of our UI/UX design, and recently one of the screens we needed to implement involved a list where the user needs to select one option and then click "Save". He designed it with checkboxes... some people just have no idea that UX conventions exist.


> some people just have no idea that UX conventions exist.

Because those were (G)UI conventions.

The new "UX" is in the line of "Fuck ICCCM or Style Guide, i'll implement my own".


He committed a clear and factual mistake in design - a [Basic Engineering Defect]. It cannot merely be called not following "convention".

Now if the question was between radio buttons and a drop-down list - that is a designer's choice.
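
For what it's worth, the convention in question is: radio buttons for "pick exactly one", checkboxes only for independent on/off choices. A minimal tkinter sketch with invented option names:

    # Radio buttons for a single exclusive choice; the StringVar holds
    # exactly one value at a time. Option names are invented.
    import tkinter as tk

    root = tk.Tk()
    root.title("Choose one option")

    choice = tk.StringVar(value="medium")
    for option in ("small", "medium", "large"):
        tk.Radiobutton(root, text=option, value=option,
                       variable=choice).pack(anchor="w")

    tk.Button(root, text="Save",
              command=lambda: print(choice.get())).pack()
    root.mainloop()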


The fundamental problem with UI/UX is that it’s so heavily dependent on your audience, and most software caters too disproportionately to one audience.

New users want pop ups, pretty colors, lots of white space, and stuff hidden. Experienced users want to throw the computer through a window when their tab is eaten because of a “did you know?” popup.

Enterprise, professional software is used a lot. Sometimes decades. You need dense UI with a UX that’s almost comically long-lived. Experienced users don’t want to figure out where a new button is, they’ve already optimized their workflow to the max.


My impression was that at some point they went too far with the scientific approach. As in: round up all the last people who had never touched a computer, put them in an experiment, and make their success rate the only metric that counts. Established conventions? "Science says they don't work."

This attack on convention then paved the way for the "just make it pretty" we see today.


The last two companies I worked for had UI/UX teams with knowledgeable directors. It is not dead; it is just that some people don't see the ROI in it.



