Question

I'm a software developer who has a background in usability engineering. When I studied usability engineering in grad school, one of the professors had a mantra: "You are not the user". The idea was that we need to base UI design on actual user research rather than our own ideas as to how the UI should work.

Since then I've seen some good examples that seem to prove that I'm not the user.

  • A user of an e-mail template authoring tool gets stuck trying to enter the pipe (|) character. The problem turns out to be that the pipe printed on her keyboard has a gap in the middle (the broken-bar glyph ¦), so she doesn't recognize it.
  • In a web app, a user doesn't see content below the fold. Not unusual. We tell her to scroll down; she has no idea what we're talking about and isn't familiar with the scroll thumb.
  • I'm listening in on a tech support call. The rep tells the user to close the browser. In the background I hear the Windows shutdown jingle.

What are some other good examples of this?

EDIT: To clarify, I'm looking for examples where developers make assumptions about what users will know or understand that turn out to be horribly false.


Solution

I think one of the biggest examples is that expert users tend to play with an application.

They say, "Okay, I have this tool, what can I do with it?"

Your average user sees the ecosystem of an operating system, filesystem, or application as a big scary place where they are likely to get lost and never return.

For them, everything they want to do on a computer is task-based.

  • "How do I burn a DVD?"
  • "How do I upload a photo from my camera to this website."
  • "How do I send my mom a song?"

They want a starting point and a reproducible workflow, and they want to follow it every time they have to perform the task. They don't care about streamlining the process or finding the best way to do it; they just want one reproducible way to do it.

In building web applications, I long ago learned to make the start page of my application something separate from the menus: task-based links, in a really big font, to the main things the application does. For the average user, this increased usability hugely.
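
To make that concrete, here is a minimal sketch of such a task-based start page, assuming a small Flask application; the task names and routes are hypothetical placeholders for whatever your application's main jobs actually are.

    # Minimal sketch of a task-based start page (assumes Flask is installed).
    # The task names and routes below are hypothetical placeholders.
    from flask import Flask

    app = Flask(__name__)

    # One entry per high-level task users actually come to the application to do.
    TASKS = [
        ("Create a new invoice", "/invoices/new"),
        ("Look up a customer", "/customers/search"),
        ("Print this month's report", "/reports/monthly"),
    ]

    @app.route("/")
    def start_page():
        # A short list of big, plainly worded links instead of a menu tree.
        links = "".join(
            f'<p style="font-size:2em"><a href="{url}">{label}</a></p>'
            for label, url in TASKS
        )
        return f"<h1>What do you want to do?</h1>{links}"

    if __name__ == "__main__":
        app.run()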

So remember this: users don't want to "use your application", they want to get something specific done.

OTHER TIPS

In my mind, the most visible example of "developers are not the user" is the common Confirmation Dialog.

In almost any document-based application, from the most complex (MS Word, Excel, Visual Studio) to the simplest (Notepad, Crimson Editor, UltraEdit), when you close the application with unsaved changes you get a dialog like this:

The text in the Untitled file has changed.
Do you want to save the changes?
[Yes] [No] [Cancel]

Assumption: Users will read the dialog
Reality: With an average reading speed of 2 words per second, this would take 9 seconds. Many users won't read the dialog at all.
Observation: Many developers read much, much faster than typical users.

Assumption: The available options are all equally likely.
Reality: Most (>99%) of the time users will want their changes saved.

Assumption: Users will consider the consequences before clicking a choice
Reality: The true impact of the choice will occur to users a split second after pressing the button.

Assumption: Users will care about the message being displayed.
Reality: Users are focussed on the next task they need to complete, not on the "care and feeding" of their computer.

Assumption: Users will understand that the dialog contains critical information they need to know.
Reality: Users see the dialog as a speedbump in their way and just want to get rid of it in the fastest way possible.
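
To make the pattern concrete, here is a minimal sketch of that prompt using Python's tkinter; it is not taken from any of the applications above, and the one liberty it takes is pre-selecting "Yes", since saving is what users want the overwhelming majority of the time.

    # Minimal sketch of the classic save-changes prompt, using Python's tkinter.
    # Illustrative only; the wording mirrors the dialog quoted above.
    import tkinter as tk
    from tkinter import messagebox

    root = tk.Tk()
    root.withdraw()  # no main window needed for this sketch

    # askyesnocancel returns True (Yes), False (No), or None (Cancel).
    # Pre-selecting "Yes" biases the default toward the choice users
    # almost always want: keeping their changes.
    answer = messagebox.askyesnocancel(
        title="Notepad",
        message="The text in the Untitled file has changed.\n"
                "Do you want to save the changes?",
        default=messagebox.YES,
    )

    if answer is True:
        print("save the file, then close")
    elif answer is False:
        print("discard the changes and close")
    else:
        print("cancelled: stay in the document")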

I definitely agree with the bolded comments in Daniel's response: most real users have a goal they want to accomplish, and they just want to reach that goal as easily and quickly as possible. Speaking from experience, this goes not only for computer novices and non-techie people but also for fairly tech-savvy users who just might not be well-versed in your particular domain or technology stack.

Too frequently I've seen customers faced with a rich set of technologies, tools, utilities, APIs, etc. but no obvious way to accomplish their high-level tasks. Sometimes this could be addressed simply with better documentation (think comprehensive walk-throughs), sometimes with some high-level wizards built on top of command-line scripts/tools, and sometimes only with a fundamental re-prioritization of the software project.


With that said... to throw another concrete example on the pile, there's the Windows Start menu (excerpt from an article on The Old New Thing blog):

Back in the early days, the taskbar didn't have a Start button.

...

But one thing kept getting kicked up by usability tests: People booted up the computer and just sat there, unsure what to do next.

That's when we decided to label the System button "Start".

It says, "You dummy. Click here." And it sent our usability numbers through the roof, because all of a sudden, people knew what to click when they wanted to do something.

As mentioned by others here, we techie folks are used to playing around with an environment, clicking on everything that can be clicked on, poking around in all available menus, etc. Family members of mine who are afraid of their computers, however, are even more afraid that they'll click on something that will "erase" their data, so they'd prefer to be given clear directions on where to click.

Many years ago, in a CMS, I stupidly assumed that no one would ever try to create a directory with a leading space in the name... someone did, and it made many other parts of the system very, very sad.
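
For what it's worth, here is a minimal sketch of the kind of defensive check that would have saved me, assuming the directory name arrives as ordinary user input; the helper name and the exact character rules are made up for illustration.

    import os
    import re

    def create_content_dir(base_path: str, raw_name: str) -> str:
        """Create a directory from user input, rejecting names that tend to
        break downstream code (hypothetical helper, for illustration only)."""
        if raw_name != raw_name.strip():
            # The leading-space case that bit me; trailing spaces are just as bad.
            raise ValueError("Directory names may not start or end with whitespace.")
        if not raw_name:
            raise ValueError("Directory name is empty.")
        if re.search(r'[<>:"/\\|?*\x00-\x1f]', raw_name):
            raise ValueError("Directory name contains unsupported characters.")
        path = os.path.join(base_path, raw_name)
        os.makedirs(path, exist_ok=False)
        return path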

On another note, trying to explain to my mother to click the Start button to turn the computer off is just a world of pain.

How about the apocryphal tech support call about the user with the broken "cup holder" (the CD-ROM tray)?

Actually, one that bit me was cut and paste: I always trim my text inputs now, since some of my users copy/paste text from emails and the like and end up selecting extra whitespace. My tests never considered that people would "type" in extra characters.
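
As a minimal sketch of the trimming I mean (in Python; the field names are hypothetical):

    def normalize_field(value: str) -> str:
        """Strip the stray whitespace users drag along when they copy/paste
        from email clients and web pages."""
        # Non-breaking spaces often come along for the ride from HTML email.
        return value.replace("\u00a0", " ").strip()

    # Hypothetical form data, as it often arrives after a copy/paste:
    submitted = {
        "email": "  alice@example.com \n",
        "order_id": "\u00a012345 ",
    }
    cleaned = {key: normalize_field(val) for key, val in submitted.items()}
    assert cleaned["email"] == "alice@example.com"
    assert cleaned["order_id"] == "12345"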

Today's GUIs do a pretty good job of hiding the underlying OS. But the idiosyncrasies still show through.

Why won't the Mac let me create a folder called "Photos: Christmas 08"?

Why do I have to "eject" a mounted disk image?

Can't I convert a JPEG to TIFF just by changing the file extension?

(The last one actually happened to me some years ago. It took forever to figure out why the TIFF wasn't loading correctly! It was at that moment that I understood why Apple used to use embedded file types (as metadata) and to this day I do not understand why they foolishly went back to file extensions. Oh, right; it's because Unix is a superior OS.)

I've seen this plenty of times; it seems to be something that always comes up. I seem to be the kind of person who can pick up on these kinds of assumptions (in some circumstances), but I've been blown away by what the user was doing many other times.

As I said, it's something I'm quite familiar with. Some of the software I've worked on is used by the general public (as opposed to specially trained people) so we had to be ready for this kind of thing. Yet I've seen it not be taken into account.

A good example is a web form that needs to be completed. We need this form completed; it's important to the process. The user is no good to us if they don't complete the form, but the more information we get out of them the better. Obviously these are two conflicting demands. If we just present the user with a screen of 150 fields (a random large number), they'll run away scared.

These forms had been revised many times in order to improve things, but users weren't asked what they wanted. Decisions were made based on the assumptions or feelings of various people, but how close those feelings were to actual customers wasn't taken into account.

I'm also going to mention the corollary to Bevan's "users will read the dialog" assumption. Operating on the assumption that users don't read anything makes much more sense. Yet people who argue that users don't read anything will often suggest adding bits of long, dry explanatory text to help users who are confused by some random poor design decision (like using checkboxes for something that should be radio buttons because only one option can be selected).

Working any kind of tech support can be very informative on how users do (or do not) think.

Pretty much anything at the OS level in Linux is a good example, from the choice of names ("grep" obviously means "search" to the user!) to the choice of syntax ("rm *" is good for you!).

[I'm not hatin' on Linux; it's just chock full of Unix-legacy un-usability examples.]

How about the desktop and wallpaper metaphors? It's getting better, but 5-10 years ago they were the bane of a lot of remote tech support calls.

There's also the backslash vs. slash issue, the myriad names for the various keyboard symbols, and the antiquated print screen button.

Modern operating systems are great because they all support multiple user profiles, so everyone that uses my application on the same workstation can have their own settings and user data. Only, a good portion of the support requests I get are asking how to have multiple data files under the same user account.

Back in my college days, I used to train people on how to use a computer and the internet. I'd go to their house, set up their internet service, show them email, and everything. Well, there was this old couple (late 60s). I spent about three hours showing them how to use their computer, made sure they could connect to the internet and everything, and left feeling very happy.

That weekend I get a frantic call about them not being able to check their email. Now I'm in the middle of enjoying my weekend, but I decide to help them out and walk through all the things. Thirty minutes later, I ask them if they have two phone lines... "Of course, we only have one." Needless to say, they had forgotten that they needed to connect to the internet first (yes, this was back in the day of modems).

I suppose I should have set up shortcuts like DUN -> Check Email Step 1, Eudora -> Check Email Step 2...

What users don't know, they will make up. They often work with an incorrect theory of how an application works.

Especially for data entry, users tend to type much faster than developers, which can cause a problem if the program is slow to react.

Story: Once upon a time, before the personal computer, there was timesharing. A timesharing company's customer rep told me that once, when he was giving a "how to" class to two or three nice older women, he told them how to stop a program that was running (in case it was started in error or was taking too long). He had one of the students type ^K, and the timesharing terminal responded "Killed!". The lady nearly had a heart attack.

One problem that we have at our company is employees who don't trust the computer. If you computerize a function that they do on paper, they will continue to do it on paper, while entering the results in the computer.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow