Kuro5hin.org: technology and culture, from the trenches

Great UI design lies

By dash2 in Technology
Sat Jan 11, 2003 at 02:46:23 PM EST
Tags: Software (all tags)
Software

Grate Latin lies: all the Roman women were beautiful

Designing a decent user interface is critical for applications and operating systems. Unfortunately, this whole field is beset by quackery. Here's a short list of my favourite UI myths.


1. Get yourself a guru

The gurus are the root of the problem. Computing is a profession that admires experience and expertise, so when we come to UI design we go looking for an expert. We talk about "Fitts's law" and digest the nostrums of various gurus, whose thoughts are handed down from on high - usually via a web site - and given an uncritical reception.

Later on I'll be examining some of these supposedly brilliant ideas in more detail, but oh brothers and sisters, believe me at least in this:

The only User Interface guru is your mum.

Forget the experts - they have little to teach. Go and test your design. If you are Microsoft or Red Hat, fill a hall full of end users and make them try your product. If you're Johnny Freshmeat, use your mum.

I would say that UI design is in its Aristotelian age, where great attention is paid to Great Minds. We now need to move to the age of Galileo.

2. Save and open? Outdated nonsense!

There is a much-repeated meme that the contents of a typical "File" menu are a hangover from the 80s. The argument goes that with modern computers, we can afford to make no differentiation between memory and hard drive. Rather than explicitly saving, our documents should be automatically saved as we make changes, and should reopen automatically when the app is restarted.

Now have you noticed the subtle flaw? What if you wanted to work on more than one document?

Well, I am setting a straw man on fire. But the underlying point is that save and open aren't just an ugly hack to work with limited hard drive space. They are there so that users can decide when to finalize their changes; or to record different alternative versions of a document (without having to wend their way back through a single linear undo history); or to make one birthday card for John and another for Jane.

There are cases when having no save and open makes sense - for example, Konqueror's bookmark editor has a save button, while Galeon's quite sensibly doesn't. But in the main it is an essential part of editing functionality that you can choose what gets stored and when.

Now this leads me on to...

3. Filenames are so passé...

...now let me just enter ten key-value pairs in a database. This is the approach taken, for example, by the recently aired newdocms - a great example of how potentially useful developers are lured into wasting their time by these myths. Any time newdocms becomes the new doc ms, you'll find me back on the oldproprietaryos.

Imagine thinking up a way for users to remember where their data is stored. Hmm. It has to be simple to use. They shouldn't be limited in the amount of info they put in, and this info should be indexable and searchable by intelligent means.

Here's a bright idea. You let the user input an arbitrary-length string (or as near arbitrary as makes no odds) which will be linked to the document. The user can include as much data as they want - or as little - it's totally up to them. And because it's unstructured, your search engine can be as clever as you want in retrieving it.

But you dolts! You've just described a f i l e n a m e. And you can search it with ls | grep, or "find file", or whatever clever searcher you want to use.
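The filename-as-metadata point can be illustrated in a few lines of Python; the file names below are invented for the example, and fnmatch stands in for whatever clever searcher you prefer:

```python
import fnmatch
import os
import tempfile

def find_files(root, pattern):
    """Walk a directory tree and return paths whose names match a
    shell-style pattern -- the same job as `ls | grep` or `find -name`."""
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if fnmatch.fnmatch(name, pattern):
                matches.append(os.path.join(dirpath, name))
    return matches

# Hypothetical documents: each name carries as much metadata as the
# user cared to put in it -- no key-value schema required.
root = tempfile.mkdtemp()
for name in ["birthday-card-john-2003.txt",
             "birthday-card-jane-2003.txt",
             "tax-return-2002.txt"]:
    open(os.path.join(root, name), "w").close()

print(sorted(find_files(root, "birthday-card-*")))
```

Because the string is unstructured, the search side can be as dumb or as clever as you like without the user ever having filled in a form.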

Or, you could make the user type in fifteen values for keys that he has previously defined. That's great, cos his porn mpegs will be indexed using the system he thought up to categorize his powerpoint presentations. It'll only require ten times as much work from him.

There are plenty of ways to improve the filesystem - in fact, I think evolution's vfolders are very cool. But filenames are a great solution already.

As a parallel, look at your home directory or "My Documents" folder. I bet, like most people, it's a big mess of different stuff. Why haven't you organized it into neat little subfolders, you disorganized fool? Well, because whenever you looked for something, you'd have to trek up and down into all these subfolders, remembering how your mind worked 6 months ago. Right now, you put it in a big long list and scroll through the list until something looks familiar. Hurrah. And filenames are like that: everything in one place.

4. WIMP? No, we need something much cleverer

Everyone agrees that the Windows, Icons, Menus and Pointer model is way outdated. It's time for a total rethink! Er... er... virtual reality 3D walk-through file managers! Or a CLI, but like a new CLI, that's new.

WIMP, how I love thee. Thou taught new users how to use a computer, using a simple set of universal metaphors. Thou canst be confusing at first, but after some learning, thou teachest people to use the same techniques to answer all the different questions their computer throws at them. Choose this option from the list. Now hit "OK". (Another bugbear of UI gurus, who want every OK button to say something different: which is like having a different type of handle on every door in your house.) Click the picture of an Internet Explorer to launch the internet explorer.

And get behind me, that wicked command line! That vile excrescence whereby you learn 100 different commands, each with their little switches and syntaxes, and then you can use your computer, and you are immensely proud of your Unix knowledge, just like any other trainspotter who has memorized a vast and meaningless collection of facts.

There's plenty of room for improvements to the UI, and new ideas. And WIMP is, I think, misplaced for mobile devices - one reason why I would back the mobile phone companies against the PDAs. (Have you seen the O2 XDA with Pocket Windows? You can right-click in it, FFS.) But on the PC, WIMP is well-tested and improvements will be built on this basis.

The way forward

I must sound old and grumpy. I am not against all innovation. For example, the web has given us a vast lab of UI experimentation, and integrating some of that with the traditional windowing system will provide lots of new ideas. (Notice, for example, how Windows XP uses lots of blue links on white backgrounds, rather than the traditional buttons on grey "chrome".)

But my point is really this. Don't trust the gurus. They get paid to have clever, radical opinions. That's how they get hits to their website. (Because they don't have real proper jobs, you see.) Trust the end users. It is the only way to find out what works. KDE, Gnome, Red Hat, SuSE... you're up against a company that does a lot of end-user testing, and I think it shows - so nota bene!

Ah yes, Latin. Bonus points for those who spot the quotation at the top.

Poll
What do you think?
o I'm right 36%
o I'm right 63%

Votes: 44
Results | Other Polls

Related Links
o Freshmeat
o recently aired newdocms
o way outdated
o Also by dash2


Great UI design lies | 165 comments (155 topical, 10 editorial, 0 hidden)
My favourite lie (3.66 / 3) (#1)
by Psycho Les on Fri Jan 10, 2003 at 04:45:47 AM EST

"A text editor with black text on bright white background is a good idea."

You've inspired me Les (3.00 / 1) (#5)
by starsky on Fri Jan 10, 2003 at 05:21:12 AM EST

what colours are your faves?

[ Parent ]
Something that doesn't burn holes in your eyes (3.00 / 1) (#6)
by Psycho Les on Fri Jan 10, 2003 at 05:35:07 AM EST

A very pale yellow, lightgray or very light blue.

[ Parent ]
Subjective (4.33 / 3) (#14)
by xL on Fri Jan 10, 2003 at 07:01:04 AM EST

For me, it is the only thing that works when I'm hacking code. Information presents itself better in that form, which is the main reason why light background colors are being preferred on the web. A darker background tends to make me want to focus on specific areas of the screen, which is ideal for command lines. When I need to grasp things in context (which  is generally the case when I work with a text editor), a light background with dark characters helps.

So let's assume that this is not a UI lie, nor a truth, but some basic property that, statistically, seems to favor the viewpoint that dark text on a light background reads better. But not always, and not for everyone.


[ Parent ]

Actually (none / 0) (#32)
by Psycho Les on Fri Jan 10, 2003 at 01:38:05 PM EST

I don't like dark backgrounds either.  It's just #ffffff I have issues with.  Also, usability tests have determined that black on white is best when reading on paper, but a white piece of paper isn't as bright as white on a computer screen.

[ Parent ]
Depends on the monitor (none / 0) (#109)
by zerblat on Sun Jan 12, 2003 at 10:56:07 AM EST

You might want to adjust your monitor settings. Many people have their brightness turned up too high. Bright areas make flicker more noticeable, so using a higher refresh rate makes bright images less annoying to watch. The lighting in the room is also important for your eye ergonomics. If you're outside on a sunny day, white paper can be really bright.

That said, I too prefer a light gray (or similar) background in text editors etc.

[ Parent ]

Printing (none / 0) (#45)
by Teehmar on Fri Jan 10, 2003 at 10:15:34 PM EST

Actually, I think the reason we see mainly dark text/light background colors on the web is because many browsers don't print (to paper) light text/dark background pages well.


[ Parent ]
Interesting; I'm the opposite (none / 0) (#159)
by Gromit on Thu Jan 16, 2003 at 10:30:37 AM EST

When writing and editing code, nothing like grey text on a black background (with blue comments and green literals). And I'm not that old. :-)

I agree with the OP about white backgrounds -- eyestrain city.



--
"The noble art of losing face will one day save the human race." - Hans Blix

[ Parent ]
Hear, hear (none / 0) (#163)
by Rizzen on Wed Jan 29, 2003 at 05:57:29 PM EST

Teal on black is much easier on the eyes, although I've seen studies where a shade of orange on black is easiest (less eye strain).

It would be really nice if *all* applications would let you choose your own foreground/background colours.  There's nothing worse than staring at a bright white light all day to make the eyes ache and the head throb.
----- The years of peak mental activity are undoubtedly those between the ages of 4 and 18. At age four, we know all the questions; at eighteen, we have all
[ Parent ]

Latin quote. (2.00 / 1) (#7)
by ambrosen on Fri Jan 10, 2003 at 05:37:32 AM EST

Do I need an exact source for the Latin quote, or will just the author do?

--
Procrastination does not make you cool. Being cool makes you procrastinate. DesiredUsername.
Galeon (3.00 / 1) (#8)
by Ubiq on Fri Jan 10, 2003 at 05:56:34 AM EST

Galeon has a save item in the menu of the bookmarks editor. It seems to save changes even if I don't use it though.



Fear the google-fu (5.00 / 1) (#9)
by phybre187 on Fri Jan 10, 2003 at 05:59:48 AM EST

I say the Latin came from here

eheu clot... (2.33 / 3) (#58)
by dash2 on Sat Jan 11, 2003 at 07:45:06 AM EST

you are a wet and a weed. i diskard you.
------------------------
If I speak with the tongues of men and of angels, but have not love, I am become sounding brass, or a clanging cymbal.
[ Parent ]
Pleease... (3.50 / 4) (#11)
by mdpye on Fri Jan 10, 2003 at 06:46:16 AM EST

Even the MS Hotmail team concluded that the command line and the scripting capabilities made BSD more suitable for them to run than Win2k. Ever tried to admin with a GUI? Enjoyed it for 100+ machines, did you? I like GUIs in their place, but the CLI has an equally important one; it's saved me much time in the past and still does. After all, which is more natural: some crazy graphical representation using a number of different paradigms, or a language in which you may express yourself to the computer in fluent terms and build complex instructions from simple parts?

MP

Admining GUIs (none / 0) (#29)
by Elkor on Fri Jan 10, 2003 at 12:28:49 PM EST

Have you tried WinVNC? I use it to manage my processing farm of 20+ workstations.

Don't get me wrong, I love CLI as well. It's much easier to script a CLI than a GUI.

But, there are some decent tools for remote GUI management.

Regards,
Elkor


"I won't tell you how to love God if you don't tell me how to love myself."
-Margo Eve
[ Parent ]
ICK! (none / 0) (#33)
by regeya on Fri Jan 10, 2003 at 01:44:01 PM EST

I use VNC to admin 20+ Macs. It sucks.

[ yokelpunk | kuro5hin diary ]
[ Parent ]

it's the automation that's the problem. (none / 0) (#92)
by ethereal on Sat Jan 11, 2003 at 10:32:42 PM EST

Access to the remote machines can be done either way (now). But once you serially login to your twenty GUI-oriented desktops, you have to do all the same clicks over again every time.

Granted, some Windows software does provide some level of scripting interfaces. But in general automating the administration of a bunch of remote GUI-oriented machines is going to be more of a pain than automating the administration of a bunch of remote CLI-oriented machines.
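The scripted-administration point above can be sketched in a few lines of Python; the host names are hypothetical, and in a real script the generated invocations would be handed to ssh rather than just printed:

```python
def commands_for_hosts(hosts, remote_command):
    """Build one `ssh host command` invocation per machine -- the scripted
    equivalent of logging in to each box and typing the same thing,
    minus the repeated clicking."""
    return [["ssh", host, remote_command] for host in hosts]

# Hypothetical fleet; in practice these would come from a config file.
hosts = ["web01", "web02", "web03"]
for cmd in commands_for_hosts(hosts, "uptime"):
    print(" ".join(cmd))
```

The same loop scales from three machines to three hundred, which is exactly the property a click-per-host GUI session lacks.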

--

Stand up for your right to not believe: Americans United for Separation of Church and State
[ Parent ]

So why... (2.00 / 1) (#39)
by magney on Fri Jan 10, 2003 at 09:03:21 PM EST

is Microsoft using Win2K to run the Hotmail web servers now?

Do I look like I speak for my employer?
[ Parent ]

They're eating their own dogfood. (4.00 / 1) (#41)
by Canthros on Fri Jan 10, 2003 at 09:21:22 PM EST

See http://www.joelonsoftware.com/articles/fog0000000012.html

--
It's now obvious you are either A) Gay or B) Female, or possibly both.
RyoCokey
[ Parent ]
I know full well what that means. (3.00 / 1) (#48)
by magney on Sat Jan 11, 2003 at 12:51:01 AM EST

But it doesn't change the fact that if the dogfood were that much less nutritious than FreeBSD, there's no way even Microsoft would've wasted as much money on support staff and hardware costs. Not to mention the development expense of rewriting the front end.

Do I look like I speak for my employer?
[ Parent ]

Still not the point. (5.00 / 1) (#65)
by Canthros on Sat Jan 11, 2003 at 10:46:30 AM EST

They're using Win2k because it is their platform. Not only would using FreeBSD look bad to the entire rest of the world, it does nothing to improve Windows as a server platform (an area in which it sorely needs improvement).

--
It's now obvious you are either A) Gay or B) Female, or possibly both.
RyoCokey
[ Parent ]
Because, as always... (none / 0) (#72)
by mdpye on Sat Jan 11, 2003 at 12:40:09 PM EST

...managers and politics decide such things, not the people in the know or the technical merit of the options involved.

MP

[ Parent ]

Scripting Windows (none / 0) (#96)
by OldCoder on Sun Jan 12, 2003 at 02:10:14 AM EST

Actually, Microsoft has come up with a very powerful network scripting language for System Administration. It ain't easy to learn, but it sure is powerful. It (WMI) is usually used from VBScript, but can be used from JScript. You can learn about general Windows Scripting at this Microsoft site.

The SysAdmin scripting tool, WMI, runs on Win2K, WinXP, and the new 2003 Servers coming out. There's a WMI primer on this Microsoft page, but you'll probably need printed documentation. You can search Amazon for WMI, but I personally recommend this book, once you've mastered basic Windows Scripting.

Internally, WMI is based on DCOM, but you don't need to know DCOM to use it.

Don't read this signature

[ Parent ]
Need a decent GUI (4.00 / 1) (#131)
by drsmithy on Mon Jan 13, 2003 at 01:46:48 AM EST

Adminning a bunch of machines through a *good* GUI is easy (and quick). Using VNC (or equivalents) to remotely login and control 200 servers is *not* a good way to do it. Having a GUI that you make a change in that then goes out and replicates that change amongst the 200 machines *is*.

Despite what you might believe, "adminning through a GUI" does not mean performing the same GUI operation on every host. After all, when you admin through a CLI, do you log in to each machine and do the same thing multiple times, or write a script to do whatever it is you want on each machine?

[ Parent ]

OK buttons (4.63 / 11) (#12)
by zephc on Fri Jan 10, 2003 at 06:49:03 AM EST

The standard MS way - a button labeled "OK" (or sometimes "Okay") next to one labeled "Cancel" - is just bad UI.

bad and requires you to think about the question and what Okay and Cancel mean in this context:

Do you want to save the file?
Okay   Cancel

good and clear:

Do you want to save the file?
Save   Don't Save

You clicked Save from the menu, and it asks you if you're sure; you see the word Save, so virtually no thinking is involved. Why waste cycles on stupid things like deciphering a poorly stated question?

On another note, I think the X Window System, while somewhat clunky by modern standards, is great for testing out new UI designs/ideas (though maybe not enough of that goes on, at least nothing with high visibility).
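As a minimal sketch (not any real toolkit's API - the function and field names here are invented), the principle of naming buttons after the action itself might be modelled like this:

```python
def confirm_dialog(question, action_verb):
    """Model a confirmation dialog whose buttons name the action
    ("Save" / "Don't Save") instead of the ambiguous OK / Cancel pair.
    A real toolkit would render this; here we just build the spec."""
    return {
        "question": question,
        "buttons": [action_verb, "Don't " + action_verb],
    }

print(confirm_dialog("Do you want to save the file?", "Save"))
```

The point is that the labels are derived from the action the user initiated, so reading the question becomes optional.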

YOU KNOW WHAT YOU DOING (4.00 / 2) (#22)
by kichigai on Fri Jan 10, 2003 at 08:50:03 AM EST

That's why the Mac was popular back in the '90s. Their UI was simple.

You are about to exit a program with unsaved information
[Save]    [Don't Save]         [Cancel]

What's hard about that? It took MS until Office 2K to figure this one out. But then again, Linux is about that helpful too. I can speak on that, being a user of both.

But I think the best combination of UI is part GUI, part TUI (Textual User Interface). Sort of like Linux with X on top, or Mac OS X. This way, command-line programs can exist and people can write scripts for mass whatever-ing. Like my friend's script to down the bitrate on MP3s for his Rio. But then you have the addition of a GUI, which allows you to use WYSIWYG editors, audio/video software/editors, games, and other fun things.

But if we're talking about GUIs only, then perhaps the default set-up by Debian Linux for Gnome works best. A simple menu on top, like on the Mac, listing Programs, Favorites (programs, not websites - possibly one of MS's worst ideas: putting your Internet Favorites in EVERYTHING), etc. etc. Once again, Apple came up with a really good idea. But one of the things I like best about X/Gnome/Sawfish is that everything is customizable. You can change every little bar on the windows, to make things more visible for you, add more functions, or just look plain cool.

But as far as developing new UIs, using X is the easiest way.

And by the way, unless it's in a Microsoft program, the [OK] button's text is defined by the software's developer.

"I said I was smart, I never said I was mature!"
-Me

[ Parent ]
Hold on, what's that 'cancel' (none / 0) (#94)
by Ebon Praetor on Sun Jan 12, 2003 at 01:08:21 AM EST

The 'save' and 'don't save' part are easy to understand, but I knew a lot of people who got confused on the cancel.  What's so hard about saying 'don't quit' instead?

[ Parent ]
Cancel (none / 0) (#111)
by pistols on Sun Jan 12, 2003 at 11:29:01 AM EST

I always liked 'Cancel'. When I accidentally click on the wrong menu or button or key accelerator, I can quickly hit cancel without worrying about understanding what the dialog is asking me. Sort of like a ^C under unix, or the escape key under old dos or apple programs.

[ Parent ]
It's not too hard (none / 0) (#157)
by kichigai on Wed Jan 15, 2003 at 09:34:55 PM EST

CANCEL. When you exit or close the file, it pulls up that alert box.

You can save, not save, or CANCEL the operation. CANCEL the closing of the file, CANCEL the exiting of the program.

DON'T QUIT. Too smart.

"I said I was smart, I never said I was mature!"
-Me

[ Parent ]
Save / Don't Save (3.75 / 4) (#52)
by Moooo Cow on Sat Jan 11, 2003 at 03:01:15 AM EST

You clicked Save from the menu, and it asks you if you're sure, you see the word Save, virtually no thinking involved...

Right then, I'll just look for that "Save" bit of text, press that button, and I'm on my way... erm, wait a sec, did that "Save" have a "Don't" in front of it? Dang, can't remember.

I think the article's comment that you don't want a different doorknob on every door in the house is applicable here. I once worked on a project with a designated UI designer, who actually decided that the save confirmation buttons should be "Save" and "Undo To Last Save" (?!?). And that's the way we implemented it, because that's what the UI guru wanted.

[ Parent ]

Good point (none / 0) (#113)
by epepke on Sun Jan 12, 2003 at 11:59:30 AM EST

Right then, I'll just look for that "Save" bit of text, press that button, and I'm on my way... erm, wait a sec, did that "Save" have a "Don't" in front of it? Dang, can't remember.

Yeah, that's a good point that's generally neglected. Processing "don't save" takes the brain longer than just identifying "save," and especially when the user is tired, may cause problems.

I think the article's comment that you don't want a different doorknob on every door in the house is applicable here.

OK, then. Would you acknowledge that your front door has a different kind of doorknob than your regular doors, because it has a lock? How about your sliding glass doors? How about the bathroom doors? That's a bit more apropos than the basic analogy.

I once worked on a project with a designated UI designer, and they actually decided that the save confirmation messages should be "Save" and "Undo To Last Save" (?!?). And that's the way we implemented it, because that's what the UI guru wanted.

That's just mind-bogglingly stupid. Did it actually revert? I hope not.


The truth may be out there, but lies are inside your head.--Terry Pratchett


[ Parent ]
even better (5.00 / 1) (#91)
by ethereal on Sat Jan 11, 2003 at 10:26:42 PM EST

My personal favorites are the dialogs that say "Press OK to do operation X, or Cancel to do unrelated operation Y." It's silly to make the user think about what OK and Cancel mean in this particular context; the buttons should actually have useful labels like "Operation X" and "Operation Y" or whatever. I don't understand why developers won't go to this simple effort; it would make software tremendously easier to use for many people.

--

Stand up for your right to not believe: Americans United for Separation of Church and State
[ Parent ]

Well, in this case (none / 0) (#141)
by Lagged2Death on Mon Jan 13, 2003 at 09:28:32 AM EST

In a case like this, when asking the user a yes/no question, the "Standard MS Way" is to present the user with Yes and No options, not OK and Cancel.

When exiting MS Word 2000 with an unsaved file, for example, the dialog reads "Do you want to save the changes you made to (document name)?" It's a yes/no question, and the dialog includes Yes, No, and Cancel buttons.

I have plenty of gripes about Windows, but I don't find this confusing. The hoary old style guideline is to make the response buttons match the style and phrasing of the question. It's perhaps debatable whether this is really the clearest way to arrange things, but I'd say it's a step up from OK/Cancel, which would be confusing in this situation.

Starfish automatically creates colorful abstract art for your PC desktop!
[ Parent ]

The worst OK button (none / 0) (#153)
by 87C751 on Tue Jan 14, 2003 at 07:46:33 AM EST

A favorite memory of Windows 2.0 (if any memories of Windows 2.0 can be said to be "favorite") is the text of the Unrecoverable Application Error dialog.

An unrecoverable application error has occurred. The application will be closed. All unsaved work will be lost.
And the only button available to acknowledge this disaster? You guessed it... "OK".

I never got how losing all your work through no fault of your own could be OK.

My ranting place.
[ Parent ]

You're not a programmer, are you? (4.25 / 4) (#15)
by L Satyl on Fri Jan 10, 2003 at 08:00:19 AM EST

Apart from disagreeing with most of your rant, I get the feeling that you're ranting about something which you don't understand:
Trust the end users.
After reading that sentence, I cannot shake the feeling that you are not a designer or a coder, because there's only one thing worse than a phb, and that's an end-user :-)

thanks for all comments (4.00 / 1) (#31)
by dash2 on Fri Jan 10, 2003 at 01:16:38 PM EST

I'll add replies here, if only because I want to point out: yes, I am a programmer and yes, I know just how dumb end users can be. (And yes, I have a much loved mum who took several months to figure out that parts of a web page could be hidden, and you needed to scroll down to them.)

Apologies for sloppy editing and lack of links. I was going to correct it earlier, but I had to dash; I was going to correct it now, but apparently my time has run out.

Some people think by "guru" I mean "any UI design professional". No, I wouldn't talk about UI if I didn't think there were useful things to say and learn about it. I am aiming solely at the hotshots of this world; also at a developer culture that tends to trust experts too much (and end users too little? :-) )

This was a rant, but I do hear too much mindless dissing of "what works" in favour of untested brilliant theoretical alternatives. Not usually from UI professionals... more the Slashdot crowd.

xx
dave
------------------------
If I speak with the tongues of men and of angels, but have not love, I am become sounding brass, or a clanging cymbal.
[ Parent ]

If you have edits you want to make... (none / 0) (#55)
by pwhysall on Sat Jan 11, 2003 at 04:35:18 AM EST

...reply to this comment with them and I'll do them for you.
--
Peter
K5 Editors
I'm going to wager that the story keeps getting dumped because it is a steaming pile of badly formatted fool-meme.
CheeseBurgerBrown
[ Parent ]
thanks, here's one (none / 0) (#57)
by dash2 on Sat Jan 11, 2003 at 07:42:22 AM EST

i won't make editorial changes - too late, let it stand - but could you change the 1/2/2/3 list bug?
------------------------
If I speak with the tongues of men and of angels, but have not love, I am become sounding brass, or a clanging cymbal.
[ Parent ]
Done. (none / 0) (#60)
by pwhysall on Sat Jan 11, 2003 at 07:52:49 AM EST


--
Peter
K5 Editors
I'm going to wager that the story keeps getting dumped because it is a steaming pile of badly formatted fool-meme.
CheeseBurgerBrown
[ Parent ]
you are right about Mum (4.33 / 3) (#16)
by mreardon on Fri Jan 10, 2003 at 08:12:15 AM EST

The only truly intuitive UI is the nipple.

Intuitive how? (4.00 / 1) (#35)
by DanTheCat on Fri Jan 10, 2003 at 06:26:49 PM EST

People throw this quote about all the time, but really it's more of a pre-programmed response than intuitive.

Or maybe that's really what intuitive is all about: It stimulates pre-programmed responses in our brains. But then what is considered pre-programmed?

At the CS building at my college, the double front doors were always a bit of a joke because they opened in opposite directions, but the handles on both sides were the same - horizontal. Terrible user interface. Most people 'round here associate horizontal with push, and vertical with pull. But is that really a pre-programmed response, or just a convention in such wide use that it is subconsciously performed without really thinking about it?

What was my point again? Ummm...

Anyway, I think I am trying to question what it means to have a truly intuitive UI. Does that mean my next computer will have nipples?

Dan :)

<--->
I was in need of help
Heading to black out
'Til someone told me 'run on in honey
Before someone blows your god damn brains out'<
[ Parent ]

Handle orientation (4.00 / 1) (#46)
by davidduncanscott on Fri Jan 10, 2003 at 11:22:45 PM EST

Most people 'round here associate horizontal with push, and vertical with pull.
Here's my theory: vertical for pull, because one's hand naturally falls that way, just like shaking hands with the door. Horizontal for push, because that's done half the time with the hip, and a wide target is easier to hit. I suppose that's why panic bars are horizontal -- if the building is on fire and people are stampeding, make the target hard to miss.

[ Parent ]
Half right (5.00 / 3) (#17)
by cestmoi on Fri Jan 10, 2003 at 08:25:18 AM EST

Apple spent a fair amount of money on UI gurus to come up with a consistent interface when they first started out and it showed. Though I disagree with some of their choices, the key contribution Apple's effort made to the UI was to make the interface consistent across applications. Score one for the gurus.

OTOH, one of the key elements of the early Apple developer guides was "test your interface." We did, and discovered that Apple's UI could use some work. Animated cursors, hierarchical menus and popup menus are among the ideas that came out of the developer community - sometimes in direct contravention of Apple's UI guidelines. Right mouse buttons, for example. Score one for testing.

Bottom line you need to do both - listen to the gurus and then test what they're telling you.

Second point is that we're different users with different needs. An advanced user is going to want to resort to a CLI at times. Doing the same thing over and over? A good CLI does it better. It's one of the reasons Unix has survived despite so many predictions that it would not. Those little switches can be very handy. Apple partially recognized that fact when they came out with MPW, after trying to shove a pure GUI down developers' throats and finding GUIs didn't cut it in all circumstances. Different people need, or prefer, different interfaces.

You mean (none / 0) (#27)
by davidduncanscott on Fri Jan 10, 2003 at 11:53:34 AM EST

things like "BRUN filename" and "PR#slotnum" were based on UI guruship?

You people are making me feel old again.

[ Parent ]

Sorry, you're wrong. (5.00 / 4) (#18)
by Canthros on Fri Jan 10, 2003 at 08:25:54 AM EST

Well, not totally wrong, but not totally right, either.
  • People get to be gurus because they have experience, and that means they have some idea what's going on. Just because a design fits $GURU's set of requirements doesn't excuse skipping user testing, as any reputable UI expert would tell you. Gurus are not the problem. Lack of user testing is.
  • Actually, the Open/Save paradigm is outdated — why not use a revision control system and track changes all the way, with the ability to undo specific changes (within reason, obviously: you would have to undelete text before you could edit it, for instance)? This could allow one to avoid the Open/Save paradigm and avoid linear undo.
  • I can't speak for the common user, but the reason I don't keep stuff well organised is that I'm a disorganized, busy sort of slob. Once I get things on the computer organized they tend to stay that way, but anything that makes that process easier for me sounds like a Good Idea™.
  • Actually, WIMP is old and misleading. Computers are not file cabinets, after all. I doubt that it will ever go away (we still have command lines, and, where they're good, they're still very useful), but I don't think that WIMP is the end-all, be-all of the computer UI. The command line wasn't, the desktop won't be, either. What it does become will depend a lot on where computer use goes. I don't think it will be a VR interface anytime soon, but I wouldn't rule it out: the future is notoriously hard to predict.
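The revision-tracking alternative to Open/Save can be sketched minimally, assuming an in-memory document rather than any real VCS: every edit appends a revision, and reverting branches from an old state instead of rewinding a single linear undo stack.

```python
class RevisionedDocument:
    """Keep every saved state, so any revision can be recalled or
    branched from -- no explicit Save, no linear-only undo."""
    def __init__(self, text=""):
        self.revisions = [text]

    def edit(self, new_text):
        self.revisions.append(new_text)   # every change is retained

    @property
    def text(self):
        return self.revisions[-1]         # newest revision is "current"

    def revert_to(self, n):
        """Branch from revision n: it becomes the newest state, but
        nothing in between is thrown away."""
        self.revisions.append(self.revisions[n])

# Hypothetical usage: two birthday cards grown from one starting point.
doc = RevisionedDocument("Dear John")
doc.edit("Dear John, happy birthday")
doc.revert_to(0)
doc.edit("Dear Jane")
print(doc.text)
```

Whether this beats an explicit Save button is exactly the kind of claim the article says should be settled by user testing, not argument.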
"But my point is really this." Take UI predictions and rants with a very large grain of salt. Predictions are wrong more often than right, regardless, and those who rant tend not to be the best informed on the subject in the first place.

HAND, HTH.

--
It's now obvious you are either A) Gay or B) Female, or possibly both.
RyoCokey

Gurus (5.00 / 2) (#37)
by the on Fri Jan 10, 2003 at 08:52:39 PM EST

People get to be gurus because they have experience and that means they have some idea what's going on.
Actually my experience with gurus is that they are gurus because, well, they are gurus. Gurus are people who work full time to publicise themselves. They have glitzy looking web sites, they push themselves at conferences (often merely by speaking the most loudly and confidently), they have PR departments working for them and they often make sure their stuff, whatever it is, looks suitable for publication in popular magazines, and they know how to schmooze.

Sometimes they also know stuff. Very occasionally they're experts. But experts in many fields are ten-a-penny. What differentiates a guru is that extra stuff I just mentioned above.

--
The Definite Article
[ Parent ]

Guess we have different definitions here. (none / 0) (#42)
by Canthros on Fri Jan 10, 2003 at 09:21:23 PM EST

If a person pushes himself as God's gift to $SUBJECT, but doesn't know his shit well enough to back it up, I don't consider him a guru. I consider him an ass. A guru, to me, is someone who is very knowledgeable on the subject, both in terms of practical experience and theoretical understanding.

But maybe that's just me being naïve. I do that.

--
It's now obvious you are either A) Gay or B) Female, or possibly both.
RyoCokey
[ Parent ]

Maybe what you have defined is an actual guru (5.00 / 1) (#68)
by the on Sat Jan 11, 2003 at 11:59:07 AM EST

As opposed to someone who gets called a guru.

--
The Definite Article
[ Parent ]
I'm very interested in this open/save thing. (5.00 / 3) (#40)
by vadim on Fri Jan 10, 2003 at 09:11:10 PM EST

Okay, let's think about this. I take a text editor and remove the 'save' button. Now what?

The editor still has to save internally. So, when does it happen? After every character, word, paragraph? Every 5 minutes? What if the document is on a floppy? I don't want to take the floppy out, hit a key on the keyboard and have the app try to write to the floppy. You'll also need more reliable storage, since with so many saves the chances of a power outage or crash happening during a save operation are greater.

But okay, suppose you solved that. Now, how am I going to do programming? For me writing code consists of editing a file and running it.

Perhaps I take a working version of my program, in Perl, start making changes and then find I have to run the original program to try something. Now what? Do I undo all my changes because they've been saved and the syntax is incorrect? Since this is Perl there's no compiled version I can use.

More problems: Supposing we've got all this working, how do the revisions work? Do we just use CVS, or invent some XML format to keep track of the changes? Do I want everyone I send that document to to be able to see all my previous versions? Would you like your employer to see how you fought with a bug and wrote "why won't this fscking thing work" in a moment of anger? Or your friends to see emails that were initially full of insults?
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

Editing and versioning (5.00 / 1) (#63)
by dufduf on Sat Jan 11, 2003 at 09:51:03 AM EST

Documents should be saved almost constantly. This brings obvious problems: disk I/O is slow, power-consuming and generates heat. The first troubles everyone, though a good (journalling?) filesystem helps some. The second is a big problem for laptop users and the green-minded. The last touches overclockers and AMD users. I think a working solution for the majority of users could be achieved even with the technology available now.

While editing program code, you should make a snapshot (version, revision, whatever) of it every now and then. That way you always have some past version to revert to, if you so desire. Of course you could also undo your changes, but that's a viable alternative only if the application has a good mechanism for undo. Pressing CTRL-Z a few hundred times is not an option. Good undo might be something like an independent undo stack for every function and source file.

I haven't thought very much about the technical implementation of versioning. As a user I shouldn't have to care about application internals. It might be XML, binary, database, flat text file, or something else. But I do want to be able to export my documents in a sensible format without past versions and undo-stack. Or with them, if I so choose.
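A minimal sketch of the snapshot idea above, with an independent history per file as suggested (all names hypothetical; this says nothing about the on-disk format, which is exactly the point):

```python
class SnapshotStore:
    """Toy model: constant autosave plus explicit snapshots,
    with a separate history for every file."""

    def __init__(self):
        self.history = {}            # filename -> list of saved contents

    def snapshot(self, filename, content):
        """Record a snapshot the user can later revert to."""
        self.history.setdefault(filename, []).append(content)

    def revert(self, filename, steps_back=0):
        """Return the snapshot taken `steps_back` snapshots ago."""
        return self.history[filename][-1 - steps_back]
```

Reverting one file's history leaves every other file untouched, which is the property the per-file undo stack is after.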

[ Parent ]

Okay... (4.00 / 1) (#64)
by vadim on Sat Jan 11, 2003 at 10:38:05 AM EST

First, journalled file systems, like ReiserFS in the default configuration, only ensure metadata integrity. That means you can be pretty certain that if there's a power outage in the middle of a save the document will be broken, although the filesystem will be fine. The problem is that few people have filesystems reliable enough for this kind of use. IIRC the only current options are ext3/ReiserFS with data journalling and NTFS.

Okay, snapshots. Suppose I'm writing a book. If it's any decent size, say 1MB, and the program keeps saving revisions, I'm going to end up with a 100MB file at some point. This is "a bit" heavy for current computers. I've got 768MB RAM, but 100MB is still *way* too much for just a text document. And you can imagine the size of a Photoshop picture with revisions.

Now, the technical implementation is where this idea starts to break, IMO. The whole point was to avoid the terrible effort of pressing the unintuitive save button. So now the user doesn't have to deal with that, but instead has to worry about sending the document with or without its edit history, snapshots and branches.
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

Snapshots (none / 0) (#67)
by dufduf on Sat Jan 11, 2003 at 11:57:15 AM EST

I don't know much about how filesystems work, so there must be some holes in my theory. If ext3 or NTFS is sufficient with some configuration, it's not that much of a problem to configure them. I'm not concerned about the default configuration. If it has to be changed, so be it. If we need a totally new filesystem or new disks, then it takes a little longer.

100 MB is too much for memory or disk? I don't think the whole document and history would be in RAM all the time, since it is saved constantly. So you probably meant disk space. If you only save recent changes and snapshots, you'll probably end up using a reasonable amount of disk space, too. How much counts as 'recent' might be adjustable or, preferably, auto-adjusted, so that it doesn't stuff your system.
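Saving only the recent changes rather than full copies can be sketched with deltas; here is an illustration using Python's standard difflib (chosen purely for demonstration, not implied by either comment):

```python
import difflib

def make_delta(old_lines, new_lines):
    """Keep a unified diff between versions instead of a full copy."""
    return list(difflib.unified_diff(old_lines, new_lines, lineterm=""))

# A 1000-line document with a single edited line...
old = ["line %d" % i for i in range(1000)]
new = list(old)
new[500] = "line 500, edited"
delta = make_delta(old, new)
# ...yields a delta of only around a dozen lines, not a second full copy.
```

Applying a delta back to reconstruct an old version would need a patch step, omitted here; the point is just that history storage can grow with the size of the edits, not the size of the document.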

I don't know about images. You could save some resources by not saving the intermediate images, but rather the operations done to the image. I don't know if that's a viable solution, because describing something like a freehand stroke with an airbrush might still take too much space.

I've seen a demo of one partial solution for Word about a year ago. It seemed quite good, but needed work. Unfortunately, it was done by a few of those UI gurus, so I'm not sure they ever thought about the performance issues.

The problem of sending the edit history along is quite real for every Word user. So nothing new there.

[ Parent ]

Filesystems (none / 0) (#69)
by vadim on Sat Jan 11, 2003 at 12:00:36 PM EST

Ext3 and ReiserFS have full data journalling as an option. But this means that all data written to disk is written twice, so it's two times slower. While I guess this is acceptable for a home user I doubt distributions are going to use that configuration by default.
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]
Reliable filesystems (1.00 / 1) (#115)
by phliar on Sun Jan 12, 2003 at 12:26:05 PM EST

[journalled file systems] you can be pretty certain that if there's a power outage in the middle of a save the document will be broken, although the filesystem will be fine. ... the only current options are ext3/ReiserFS with data logging and NTFS.
Journalling refers to a specific technique. IMHO, of course, but the best filesystem implementation with the property of preserving consistency across a crash is Soft Updates on the *BSDs; see the 1999 USENIX paper on Soft Updates by McKusick and Ganger (it's a PDF). Soft Updates requires no extra disk writes.

Faster, faster, until the thrill of...
[ Parent ]

Not so hard (none / 0) (#71)
by epepke on Sat Jan 11, 2003 at 12:30:12 PM EST

The editor still has to save internally. So, when does it happen? After every character, word, paragraph? Every 5 minutes?

This isn't so hard. A properly designed OS would alert the program as to the correct times to save. The Palm OS does this nicely. The only other problem would be upon the crash of an application or an OS. I find it lamentable that so many applications crash at all, but still, they do. Every 5 minutes should be sufficient.

What if the document is on a floppy? I don't want to take the floppy out, hit a key on the keyboard and have the app try to write to the floppy.

Remember that CP/M systems required people to press ctrl-C before removing a floppy. We've somehow managed to get away from that. But anyway, this is why, from day 1, Macintosh systems required an interaction with the operating system to eject a disk. Nevertheless, everyone jumped on them. The Microsoft side needed to use existing hardware, which had manual releases. The geek side, which has become the Linux side, rejected it, basically, because they thought they were 1337 enough to remember to press ctrl-C and considered anything else deeply insulting.


The truth may be out there, but lies are inside your head.--Terry Pratchett


[ Parent ]
How does this apply to reality? (none / 0) (#74)
by vadim on Sat Jan 11, 2003 at 12:51:31 PM EST

It can't be applied to most systems, because removable media can be removed without the operating system having any control over it. It'd require a hardware change to prevent you from removing a floppy at an inconvenient moment.

In Linux there's the mount/umount system, but floppies can still be removed manually. And I'm pretty sure users find it easier to understand when they can take a floppy out, as in Windows, than to deal with the mounting/unmounting concept, especially when unmounting a disk doesn't make the floppy pop out.
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

Absolutely (5.00 / 2) (#77)
by epepke on Sat Jan 11, 2003 at 01:34:53 PM EST

It can't be applied to most systems because removable storage systems can be removed without the operating system having control over it.

Absolutely. You can't. And you can't put a steering wheel on a horse, either. But, personally, I'm glad that modern automobiles don't have reins, simply because some dork thought "people like reins, that's what they're used to, reins are the user interface standard, my mom can't learn how to use anything other than reins."

Unfortunately, due to historical reasons, we're going to have a lot of bad user interfaces for a long time, a lot of filesystems based on glorified magnetic tapes, etc. But that unfortunate fact doesn't make all the entrenched decisions and limitations all of a sudden just great and peachy and ooja-cum-spiff.


The truth may be out there, but lies are inside your head.--Terry Pratchett


[ Parent ]
Revision tracking (none / 0) (#80)
by Canthros on Sat Jan 11, 2003 at 03:45:38 PM EST

If you've got a copy of MS Word, you should be able to turn on revision tracking, which will let you accept or reject changes to a document on a per-change basis. HTML supports <ins> and <del> for exactly that sort of revision control. Saving can be handled in the background on a user-defined interval with a reasonable default, if there have been changes since the previous write to disk. Backups being prudent, the previous version could be renamed to something findable, so that it can be recovered if the disk write should fail. Open documents could be written to disk on close if any changes have been made.

Since you don't want to send the whole document, revision history and all, 90% of the time, you'd have to introduce the idea of draft and final copies: a final (or release) copy would not carry a revision history. If you have problems keeping your interactions civil and professional, I don't see that as a software issue. This could be handled transparently to a certain extent, by having the MUA automatically create a release copy of a document, for instance. Not significantly more complicated than the process of typing and editing documents on real paper.

None of this removes the user's need to know that documents are stored on the drive, and that powering off the machine mid-write will cause problems. It also can't protect them from a catastrophic disk crash. Neither of those is really the point: the idea was to keep them from having to do manually what they should not have to do.

--
It's now obvious you are either A) Gay or B) Female, or possibly both.
RyoCokey
[ Parent ]

Reliable save (none / 0) (#134)
by warrax on Mon Jan 13, 2003 at 03:52:56 AM EST

(Not going to comment on all the other stuff, but...)

Doing a reliable save is not really that difficult:

  1. Create temporary file containing new contents.
  2. Flush temporary file to disk.
  3. Rename over original file.
Note that rename is an atomic operation on most (all?) filesystems. Certainly, this is both atomic and reliable on filesystems that journal metadata.

If you're really paranoid, you can also do this: always keep two copies of every file, each with its own checksum (something cryptographically secure) and an integer version number. When writing a new version, simply overwrite one of them with the new version (and a newly generated checksum, of course). Again, there should be a "flush barrier" between two different versions, but we need not actually flush immediately.
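The three steps above, sketched in Python. `os.replace` provides the atomic rename of step 3, and the temporary file is created in the target's own directory so the rename stays within one filesystem:

```python
import os
import tempfile

def reliable_save(path, data):
    """Reliable save: (1) write new contents to a temporary file,
    (2) flush it to disk, (3) atomically rename it over the original."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)                 # step 1: new contents
            f.flush()
            os.fsync(f.fileno())          # step 2: flush to disk
        os.replace(tmp, path)             # step 3: atomic rename
    except BaseException:
        os.unlink(tmp)                    # never leave the temp file behind
        raise
```

At every instant the original path points at either the complete old contents or the complete new contents, never a half-written file.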

-- "Guns don't kill people. I kill people."
[ Parent ]

That's not a good way (none / 0) (#142)
by vadim on Mon Jan 13, 2003 at 09:32:39 AM EST

You're destroying the creation time of the document in the filesystem unless you set it manually, which many programmers seem not to bother with.

Also, it requires twice the disk space, which could be a problem with floppies. Storing the temporary file on the hard disk wouldn't be a good idea. And if the file is big enough, say about 100MB, this is also slow.

IMHO, this is an issue the filesystem should deal with. And the ReiserFS people seem to be doing exactly that.  
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

Well, maybe not, but still... (none / 0) (#143)
by warrax on Mon Jan 13, 2003 at 12:38:40 PM EST

You're destroying the creation time of the document in the filesystem unless you set it manually, which many programmers seem not to bother with.

This is really something which should be implemented in a VFS-type layer, so application programmers should not have to bother with this.

Also it requires twice the disk space, which could be a problem with floppies.

Agreed, but floppies are so unreliable that multiple copies of a file on the same floppy don't really guarantee anything anyway. :)

And if the file is big enough, say about 100MB this is also slow.

Agreed, hadn't thought about that actually (since I mostly deal with databases where all writes are page-oriented).

IMHO, this is an issue the filesystem should deal with. And the ReiserFS people seem to be doing exactly that.

And thank $DEITY for that. I'm also really looking forward to ReiserFS 4! :) However, I seriously doubt that you will be able to have ReiserFS 4-formatted floppies...

-- "Guns don't kill people. I kill people."
[ Parent ]

gui design philosophies (5.00 / 2) (#20)
by calimehtar on Fri Jan 10, 2003 at 08:38:41 AM EST

I can think of a few:
  1. Make it cool. This is actually more common with graphic designers (think Flash websites) than even with OSS developers. I don't think I have to explain why this is misguided.
  2. Add more options. I understand this is a common pitfall for OSS developers. The result is software that requires hours and hours of configuration to run properly. Think sendmail, and every X11 window manager I have ever tried.
  3. Separate a project into microscopic pieces and test, test, test. This is the school of thought of one particular usability lab I studied with. The problem here is that this methodology works for detecting problems, but not at all for finding solutions to them.
  4. Make it simple. This is the guiding principle of both Apple and, I believe, Mr Useit himself, Jakob Nielsen. Useit.com, IMO, demonstrates that this philosophy definitely has limited usefulness.

I don't think there is a panacea for good interface design. Software needs interface designers to set the direction, graphic designers to make it pretty, usability testing to catch problems, and smart developers to suggest new features and implement them.



Couple of comments (4.00 / 1) (#21)
by dufduf on Fri Jan 10, 2003 at 08:43:18 AM EST

The only User Interface guru is your mum.
I hold my mother in great respect, but she's not a UI guru. If I ever design financial software for a middle-to-large-size company, I'll be asking my mom to test it. That's because my mom uses a computer for a very limited number of tasks, and 90% of them are related to finance. For any other kind of software I'd get someone who's more likely to use it on a regular basis.

Software should be designed for regular or expert users, not beginners, because users don't stay beginners for the rest of their days. I don't mean that we should abandon every attempt to make things easy for the beginner. But sacrificing efficiency for ease of use isn't good either, because that just annoys experienced users.

Save and open? Outdated nonsense!
Actually, it is. Saving is not only about finalizing your work, it's mostly about storing it. The desktop metaphor is flawed because of this. If I write a note with a pen and a piece of paper and leave it on my desktop when I go home, I expect to find the note there when I come back next morning. Further reading: Problems with save.

Another bugbear of UI gurus, who want every OK button to say something different: which is like having a different type of handle on every door in your house.
Your metaphor is flawed. Every door handle serves the same purpose: opening the door. But OK buttons don't. Sometimes OK means 'Save all open files', sometimes 'Discard all changes', sometimes it's 'Send this to your mother-in-law'. Usually it means 'I've read (yeah, right) this totally useless piece of information'.

Don't trust the gurus. They get paid to have clever, radical opinions.
Well, there was a time when WIMP was clever, radical and new. Remember Engelbart's famous mouse demo.

Save/Open outdated? (none / 0) (#38)
by vadim on Fri Jan 10, 2003 at 08:59:54 PM EST

Okay, tell me, why would I want to have all my changes saved? That's actually one of the best features of using a computer, that you decide when you want to save it!

Sometimes I start writing a comment and then decide it's too incoherent or complete nonsense and just close the window. It'd greatly bother me if comments worked like a real conversation, without the ability of taking things back.

Saving documents is completely logical. It's like writing a letter. Just as the pen makes a mark on the sheet of paper, you're storing it in RAM. You can spend a day, two, or a week writing a letter, changing parts, etc. And then you can decide it's done and send it, or throw it into the trash bin, and it'll be as if it never existed.

--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

Save, open, sending and undo (none / 0) (#50)
by dufduf on Sat Jan 11, 2003 at 02:36:29 AM EST

I would like to have all my changes saved. But that feature would be more like a bug if the application didn't have good support for versioning, undo and export. When I want to send my letter, I'll send it, not save it.

I'm thinking about something like constant auto-save plus an integrated versioning system. When I start working on a document, it'll be saved every few seconds. I can take snapshots of it whenever I like. If I choose to, I can export one snapshot and send it to someone else, who can work with it. Or print a snapshot, fold it into an envelope and send it away. The point is that the user shouldn't have to know or care about the difference between RAM and disk.

This would work well with applications like Word or Photoshop. Kuro5hin would be harder. I don't think that any browser would benefit from that kind of an autosave. I guess you and I can still send our comments manually when we are satisfied with them.

[ Parent ]

Fine, what about my other comment? (none / 0) (#62)
by vadim on Sat Jan 11, 2003 at 08:45:05 AM EST

How would you address these issues? For example, how do I write code in a text editor that saves automatically?
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]
The deficiencies of the Reality GUI (none / 0) (#99)
by Znork on Sun Jan 12, 2003 at 05:09:48 AM EST

Leaving a note on your desk will sometimes work. Sometimes the note will be gone, without notification or any indication that it was removed. Furthermore, it is unlikely you have a backup note that can be easily restored.

This is what can happen overnight. Over a longer period of time, physical information objects have a tendency to gather into unlabelled directories known as 'piles'. It is nontrivial to locate information objects within these piles since there is neither convenient date labelling nor any naming visible without looking at the contents of each separate object in the pile. Imagine if your computer saved everything automatically with similar names unrelated to the contents and you had to look at every file to find out what was in it... that's the Reality GUI for you.

This applies to other objects too, of course. I have a bunch of 'music objects', shiny round things which contain music. Unlike on the computer, these music objects tend to randomly move between containers, which results in many of them being mislabelled. This is a result of the Reality GUI not supporting playing them without removing them from their container and their place in the organising system. Imagine if you had to move your .mp3 or .ogg file out of its place to play it? That would be a vast step backwards towards the 'intuitive' Reality GUI.

Etc. The point being, of course, that physical reality is, in many cases, far, _far_ worse than most computer interfaces as far as usability goes. Reality is not a good model for a lot of things, because it is inherently not a good user interface.

[ Parent ]

Piles (none / 0) (#150)
by squigly on Tue Jan 14, 2003 at 05:51:04 AM EST

It is nontrivial to locate information objects within these piles since there is neither convenient date labelling nor any naming visible without looking at the contents of each separate object in the pile.

These are actually generally ordered, into groups (3 or 4 different piles) and by date.  A pile is inherently a stack, ordered by last-looked-at date.

Piles are usually groups of data of the same type or paper size.  At my last job, I had some A3 diagrams in one pile, some A4 docs in another pile, and a pile of handwritten notes in another.  It took a while to sort through, but more important pieces of paper tended to be near the top.
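The pile-as-stack behaviour described here can be modelled in a few lines (a toy model, of course): looking something up rifles down from the top, and the found document goes back on top, so frequently used papers stay near the top.

```python
class Pile:
    """Toy model of a paper pile as a most-recently-used stack."""

    def __init__(self):
        self.docs = []                   # end of the list = top of the pile

    def add(self, doc):
        self.docs.append(doc)

    def find(self, doc):
        """Search from the top down; put the document back on top once found."""
        for i in range(len(self.docs) - 1, -1, -1):
            if self.docs[i] == doc:
                self.docs.append(self.docs.pop(i))
                return doc
        raise KeyError(doc)
```

After a few lookups, the pile orders itself by last-looked-at date, exactly as the comment observes.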

[ Parent ]

Save (none / 0) (#129)
by drsmithy on Mon Jan 13, 2003 at 12:51:51 AM EST

Actually, it is. Saving is not only about finalizing your work, it's mostly storing. Desktop-metaphor is flawed because of this. If I write a note with a pen and a piece of paper and leave it on my desktop when I go home, I expect to find the note there when I come back next morning.

Actually, if you perform the equivalent on your computer - that is, open a document, type stuff into it *and then just leave it* - the same holds. You have an open document on your screen when you return.

"Saving" is the metaphorical equivalent of deciding you want to keep a physical document, so you label it in some way (give it a filename) and put it somewhere where you can find it again (somewhere in your filesystem).

Quite frankly, if you are having difficulty explaining the concept of "saving" and why it is important, then the problem lies in your explanation, not the principle. The complaints in that document you link to have more to do with bad implementations (eg: no autosave, no undo, no revision control), and with reactions to what must have been poor explanations of the functions of the system, than with the concept of "saving" itself.

[ Parent ]

Paper on desk (none / 0) (#161)
by chu on Wed Jan 22, 2003 at 01:01:06 PM EST

Why should the GUI precisely model a physical real-world desktop just because it refers metaphorically to one? There is a trashcan on my computer desktop but I'm rather glad that stuff doesn't fall out of the sides as I'm trying to empty it like the one in my house.

Analogies can be constructive as visualisation aids, but my point is that the software version is different to the thing being symbolised (or why use it at all?), and if you model too closely on principle you will end up missing possibilities in your applications - basically, you will cripple invention.

[ Parent ]

uh (4.71 / 7) (#25)
by tps12 on Fri Jan 10, 2003 at 11:10:15 AM EST

So you've come up with three or four things that "gurus" say but you don't believe. Don't you think it's an overreaction to rail against the entire UI design research community? The problems you've found aren't even major (I noticed that you left Fitt's Law alone, despite your hinting in the intro).

Some self-styled "gurus" do indeed pull stuff out of their asses and try to start flamewars. But there is an established scholarship, in both academic and corporate environments, of user interface design, and it relies heavily on empirical evidence from user tests. Like any form of research, there are good ideas and bad, and they're distinguished through experimentation. This kind of active research is how we got WIMP to begin with, and guess what, assholes like you were bitching about it back then, too.

Current UIs are "good enough" for what we do now, but they will be made obsolete. And whatever they are replaced by will almost certainly not be designed by some crank laying out VB apps so that his "mum" understands them, but by a concentrated and organized effort of creative, dedicated scientists performing research.

In this, as in all else,—
Y'r obd't s'v't.
tps12.—

personal data nobody cares about (4.00 / 2) (#28)
by eudas on Fri Jan 10, 2003 at 12:12:16 PM EST

"As a parallel, look at your home directory or "My Documents" folder. I bet, like most people, it's a big mess of different stuff. Why haven't you organized it into neat little subfolders, you disorganized fool? Well, because whenever you looked for something, you'd have to trek up and down into all these subfolders, remembering how your mind worked 6 months ago. Right now, you put it in a big long list and scroll through the list until something looks familiar. Hurrah. And filenames are like that: everything in one place."

Actually, I'm one of those weirdos who doesn't download everything into My Documents or Desktop or a "Stuff" folder. It's all sorted into appropriate subdirs on the hard drive...

eudas
"Nothing is on fire, but the day is still young" -- Phil the Canuck

..I've been converted.. (none / 0) (#83)
by wvenable on Sat Jan 11, 2003 at 05:04:07 PM EST

I was one of those people who put their files all over the hard drive in neat subdirectories -- but Microsoft has finally converted me.  All my files are now in neat folders under My Documents, and I only have WINDOWS, PROGRAM FILES, and DOCUMENTS AND SETTINGS in the root of my hard drive.

In some ways it's hard to get used to, but Microsoft has done so much to make it easy to get to your My Documents folder, and hard to get to your C: drive, that it seems to be working out.


[ Parent ]

Juvenal (3.00 / 1) (#34)
by artsygeek on Fri Jan 10, 2003 at 04:39:43 PM EST

It was Juvenal who gave us that quote.

LOL (3.00 / 1) (#59)
by dash2 on Sat Jan 11, 2003 at 07:46:18 AM EST

Close, but no banana.
------------------------
If I speak with the tongues of men and of angels, but have not love, I am become sounding brass, or a clanging cymbal.
[ Parent ]
Deep thinkers (3.00 / 1) (#106)
by jkozak on Sun Jan 12, 2003 at 07:48:15 AM EST

It's the goriller of 3B, isn't it?

[ Parent ]
If this is true... (2.25 / 4) (#36)
by Cup O Tea on Fri Jan 10, 2003 at 07:28:51 PM EST


Designing a decent user interface is critical for applications and operating systems.

Then er.. how come I have all these* applications with really shitty UI's installed?

Fucking annoying, but not critical IMHO.

-1 Boring uninteresting crap.

*Excluding the ones for which I've been able to find alternatives with better UI's

It is true. (4.00 / 1) (#43)
by porkchop_d_clown on Fri Jan 10, 2003 at 09:36:06 PM EST

The reason those UI's are so shitty is that designing a good UI is hard.


--
Wouldn't it be a victory for the oppressed people of Iraq, of North Korea, of Iran, if their police-state regimes were overthrown? Even by a cowbo
[ Parent ]

And consistency trumps all. (4.80 / 5) (#44)
by porkchop_d_clown on Fri Jan 10, 2003 at 09:43:46 PM EST

Programmers and web designers feel the overwhelming need to show how cool they are by designing new ways to do old things.

Screw that. No skins. No fancy new 3d interfaces. Find the style guide for your OS and stick to it. Your users will be much happier.


--
Wouldn't it be a victory for the oppressed people of Iraq, of North Korea, of Iran, if their police-state regimes were overthrown? Even by a cowbo

A GUI design approach (4.60 / 5) (#47)
by mdevney on Fri Jan 10, 2003 at 11:33:19 PM EST

I am not a GUI guru.  In fact, I'm not even interested in user interface.  I just don't care enough about the interface.  I care about the work I'm trying to do.  Unfortunately, the UI often prevents me from doing that.  This comment is mostly a list of gripes along that line, since we're on the subject anyway.

Cascading menus.  Click on file -> save and you're done -- that's simple enough.  Click on Start -> Programs -> G.O.D -> Serious Sam -- that's 4 clicks, or 4 wait periods while it realizes the mouse is stopped.  Highly annoying.  Much to my dismay, practically every GUI uses this braindead waste of time these days.  Luckily, I can strip it out of most open source window managers.  (I use windowmaker because it was easy to strip all the cruft out.)

Start menu.  Same problem as the cascading menus: more clicks for the same program.  Screw that.  It is also highly annoying to see this 100% brain-dead, no redeeming qualities whatever idea go into KDE and Gnome, with their K and footprint menus.  If Microsoft has a bad idea, do we in the open source community *really* have to copy it anyway?

Tabs.  More clicks for even less program!  Now, after sorting through desktops to find the right one, and sorting through windows to find the right one, I get yet another layer to sort through!  Yay, just what I never wanted.  

Shortcut buttons.  Yes, I know I just complained about how the start menu wastes valuable time that I could be working, but the correct answer is *not* to put a shortcut button along the side of the screen.  My primary screen is a 14.1" TFT laptop display, and screen real estate is already at a premium.  I do not need a row of buttons that I'll use once or twice per session taking up 5% of my screen.  (Autohide is a nice workaround, but takes some getting used to.)  This is a key point: Nothing should be on screen unless it absolutely has to be.

Confirmation windows.  This falls under the general rubric of "the machine thinks it's smarter than I am."  To date, I have not met one that is.  Never, ever, ever confirm anything for me.  I clicked the button, so obviously I meant it.  (Okay, there are reasonable exceptions.  I would like a confirmation for rm -rf /, but not for rm -rf ./ .)  

Buried options.  If anything is under more than 3 layers of windows, a command line actually is easier.  For example, if I wanted to change my screen resolution in Windows, that would be start -> control panels -> display -> settings -> resolution.  There are worse things hiding under more layers; I just don't have a Windows box here to get a concrete example, so I have to do this from memory.  So to change the screen resolution I have to make at least 5 of what I'll call "gui motions" -- moving the mouse, waiting for the UI to realize I'm waiting for it, clicking, whatever.  This assumes I know where what I'm looking for is, which I almost never do.  I can't believe anyone can tell me with a straight face that that's easier than `edit win.ini`.  (Or `vi /etc/X11/XF86Config`)

Dock.  Okay, back to screen real estate being precious.  Why is there a dock wasting valuable space?  Just get rid of it.  Entirely.  

So now that I've gotten rid of practically every feature on the screen, how do we find, launch, and switch programs?  My personal favorite is a floating menu.  Find an empty spot of screen, right-click, and a menu pops up.  A very simple, non-cascading window; mine has only run, xterm, gaim, opera, nedit, gnutella, gimp, and kword.  It seems I was optimistic in setting that up; xterm and gimp are the only two buttons on there that I ever use.  Everything else I use rarely enough to invoke it from an xterm.  

As an added bonus, I never minimise anything.  Why should I?  I have a screen (workspace) for every task.  If I'm viewing web, Opera is maximised in workspace 1; workspaces 2-4 have a handful of xterms each, 3 has gaim, and 4 has gimp.  alt+1,2,3 etc. does everything I'd need to.  Alt+up or down raises or lowers a particular window, which you'd think would be important, but thinking of it now, I realize I've never once used that feature.  Not one single time in my life.  

Summary: In the world of GUI design, less is more.  I'll not even get into the GUI vs. CLI vs. Something Else debate; each has its place, I'm convinced, and no one is inherently better than another.

Confirmation dialogues (none / 0) (#97)
by HoserHead on Sun Jan 12, 2003 at 02:13:40 AM EST

Confirmation dialogues come in really handy when you accidentally click the "Close" button when you really mean to click "Save," or the "Blow up Earth" when you meant "Spawn Many Kittens."

Also, confirmation dialogues also come in very handy for people who can't easily operate a keyboard & mouse. Think cerebral palsy, or Parkinson's.

It's really good UI design to put in confirmation dialogues, regardless of whether you want them or not. Maybe there should be an environment variable like "CONFIRM_DESTRUCTIVE_TASKS=never" (which you would regret setting one day), but no such thing currently exists.

[ Parent ]

Confirmation dialogues. (none / 0) (#103)
by irrevenant on Sun Jan 12, 2003 at 06:39:19 AM EST

Confirmation dialogues come in really handy when you accidentally click the "Close" button when you really mean to click "Save," or the "Blow up Earth" when you meant "Spawn Many Kittens."

That makes sense, but in practice it generally doesn't work out that way. What happens in practice is that the user learns pretty quickly to just (eg.) press Ctrl-P, <Enter> when they want to print instead of just Ctrl-P.

My humble opinion is that, the majority of the time, a program would be better off providing an undo facility for destructive tasks than providing a warning before they occur. (For example, at least limited versioning when you save would be great - if space is a concern, then you could set the program to delete versions that are more than a couple of days old).
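The versioning-on-save idea could look something like this (a minimal sketch; the naming scheme and age cutoff are invented for illustration): each save stashes a timestamped copy of the old contents, and copies past the cutoff get pruned.

```python
import os
import shutil
import time

VERSION_SUFFIX = ".v"          # hypothetical naming scheme: doc.txt.v<timestamp>
MAX_AGE_SECONDS = 2 * 86400    # "more than a couple of days old"

def save_with_version(path, data):
    """Write data, first copying any existing contents to a stamped version."""
    if os.path.exists(path):
        shutil.copy2(path, "%s%s%d" % (path, VERSION_SUFFIX, int(time.time())))
    with open(path, "w") as f:
        f.write(data)
    prune_old_versions(path)

def prune_old_versions(path):
    """If space is a concern: delete only versions past the age cutoff."""
    folder = os.path.dirname(path) or "."
    prefix = os.path.basename(path) + VERSION_SUFFIX
    for name in os.listdir(folder):
        if not name.startswith(prefix):
            continue
        try:
            stamp = int(name[len(prefix):])
        except ValueError:
            continue  # not one of our version files
        if time.time() - stamp > MAX_AGE_SECONDS:
            os.remove(os.path.join(folder, name))
```

Undo for a destructive save then becomes "copy the newest version back", with no confirmation dialogue in the way.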

[ Parent ]
Changing resolutions (4.00 / 1) (#128)
by drsmithy on Mon Jan 13, 2003 at 12:43:32 AM EST

I can't believe anyone can tell me with a straight face that that's easier than `edit win.ini`. (Or `vi /etc/X11/XF86Config`)

Not only can I tell you that with a straight face, but I'd laugh hysterically at any suggestion it isn't. For starters, to edit the text file, you need to know where it is and what it is called - with little indication or prompting as to what the answers to those two questions are (not to mention knowing about the concepts of "editing", an "editor" and how to use one). After that, you then have to decipher the format of the config file, the name of the variable you want to change and the format of the data to put into it to get the result you want. Finally, you have to be sure you put in valid data and that the directive you give the machine is going to work (ie: trying to run 1600x1200 on an old 10" IBM VGA display won't).

Contrast this to the GUI approach, where you have a constant level of feedback and prompting (where does stuff get changed? "Control panel". What might change how the monitor looks? "Monitors", etc). You have a list of options to choose from (thus avoiding having to worry about how to format the data you want to input). You have an automated test and recovery system, in case you tell the machine to do something it can't (it shouldn't let you try in the first place, but that's another discussion).

The GUI way is _vastly_ easier - just look at the relative amounts of prerequisite knowledge you need to perform either method. If you're an experienced user it may be marginally slower (although you'd have to work hard to convince me - just because you've described the slowest way of changing resolutions doesn't mean there aren't faster ones).

This is the problem with people who argue that command lines and editing text files are easier than GUIs because they can describe them in fewer steps -- they forget the enormous amounts of pre-learned skills and knowledge that must exist for their three line task description to be meaningful.

[ Parent ]

Window Managment (none / 0) (#164)
by David McCabe on Fri Jan 31, 2003 at 01:43:55 PM EST

> I have a screen (workspace) for every task.  If I'm
> viewing web, Opera is maximised in workspace 1.

I did this for a while.

You may want to look into the Ratpoison window manager. It always maximizes everything, automatically, so each window effectively is its own workspace (although you may split-screen).

So, I type 'C-t 1' to get my terminal, 'C-t 2' for Konqueror, etc.

For programs such as the Gimp, which are rendered unusable by this, run another window manager inside Xnest.

So, all windows get to be full-screen, and there is zero screen clutter (except that of the program itself).

[ Parent ]

Don't wimp out (huh huh huh) (4.85 / 7) (#49)
by psicE on Sat Jan 11, 2003 at 01:52:21 AM EST

Look at the Palm interface. What is most notable about it? Despite being generally WIMP-like, it is far more intuitive to the average user than any other interface like it. Let's examine why.
  • Windows: Instead of using movable, Palm uses maximized windows and semi-tabs: along the bottom of the screen are four hardware tabs and four software (silk-screen programmable) tabs, that can switch between programs that automatically save state. If you want to move data from one program to another, you move to the first, copy it, move to the second, paste. There is never a confusion as to what is active. Also, there is none of the useless and confusing min-max-close buttons; to switch applications, simply use the "tabs", or go Home and choose a new application.
  • Icons: Instead of having complete disarray, Palm icons are always neatly sorted. Icons themselves are only displayed when relevant; programs such as text/document editors don't bother using icons, economizing space. Icons are simple, abstract representations of programs, instead of the complex pictures that are modern icons (Mac OS X being the worst offender in this regard).
  • Menus: Most features in a Palm program are accessible right from the program. The menu is only for obscure features, and is always available in the same place at the top of the screen, though hidden when not in use to avoid clutter.
  • Pointers: The worst feature of WIMP, pointers are not even used in Palm OS, in favor of a stylus. No more losing the pointer; no more problems of scale either.
  • Persistency: Palm applications are generally structured such that data persistency makes sense. Many programs do not require documents (DateBook) and simply store data in their own database; programs that do have the user name them ahead of time. There is no version control, but that could easily be added - just keep track of all changes made to the document ever, and allow the user to click on a "mark revision" button in lieu of a "save" button.
  • Filenames: This is one place where no change is needed: filenames are here to stay, if for no other reason than they require minimal user effort.  However, I still kind of like the Palm halfway solution: Identify memos by filename, but that filename is the first line of the memo, whatever it is. Ideally, that would be expandable to include any portion of the memo, with live-queries; so that filenames are the first step, but contents and other attributes can easily be searched as a last resort.
  • Consistency: Skins suck. No, really. There is nothing wrong with skinning a system, like for example GTK skins, but no application should have a skinning mechanism. Consistency is good.
How can this be adapted to desktop computers? Though I prefer Linux, I'll use Windows for my examples. Windows (the WIMP kind): Always maximized, eliminate taskbar in favor of tabs (at the bottom of the screen), with split-screen capability. There is absolutely no point in ever having dead space. Icons: Include on the tabs-bar a Home tab, bringing the user at any time to a page of all applications currently installed on their system. Nothing else needs to be on their desktop; that's what Explorer/My Documents is for. Menus: Put a single menubar at the top of the screen. Allow users the option to hide the menubar (and the tab-bar, for that matter) PalmOS-style until the user moves their mouse to the appropriate area, or clicks.

Pointers: Eliminate the mouse entirely. For most users, their arrow keys should be fine; just add in functionality so that holding a certain key (placed in the middle of the arrow set) while pressing up or down will navigate among the various buttons. Think of the way TV menus work. Then, when placed in text boxes, they work as a cursor. Et cetera. Persistency/Filenames: Require users to name documents beforehand, then use the same technique PalmOS does to autosave every character (if that means using flash memory, so be it :D). Consistency: Make it technically impossible for applications to draw their own widgets, unless they use the same techniques that high-end games do. Implement a feature in the OS to let users skin all applications at once, so that app designers have no incentive to allow local skinning.

... Whew.


Alright! (none / 0) (#54)
by carbon on Sat Jan 11, 2003 at 04:12:32 AM EST

PalmOS is great; a desktop version of it with good hardware recognition would sell like hotcakes. I'd never use it myself, of course, being the sort of person who uses a computer far more often than is healthy, but there's nothing more user friendly for the average user than it. The average user gets confused at the concepts of multiple tasks and directory hierarchies (which is fine; these are only simple concepts once you already understand them, like a lot of mathematics), and never needs more than one window open at a time. Several of my friends have used computers for years without ever figuring out how to, say, move a window around by dragging its title bars.

As for autosaving, the easiest method would be to use Vim's (and I'm sure other editors do this as well) method for saving to the emergency temp file; save whenever they stop typing for more than about 5 seconds.
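That save-on-idle behaviour is just a resettable timer. A rough sketch (the class and its API are invented; this is not how any particular editor implements it): every keystroke restarts a countdown, and the save only fires once the countdown survives untouched.

```python
import threading

class IdleAutosaver:
    """Call save_fn once the user has been idle for `delay` seconds.

    Each keystroke should call .touch(), which cancels any pending save
    and restarts the countdown, so saves only happen during pauses."""

    def __init__(self, save_fn, delay=5.0):
        self.save_fn = save_fn
        self.delay = delay
        self._timer = None

    def touch(self):
        if self._timer is not None:
            self._timer.cancel()          # typing resumed: abort pending save
        self._timer = threading.Timer(self.delay, self.save_fn)
        self._timer.daemon = True
        self._timer.start()
```

Wire .touch() into the editor's keypress handler and the document saves itself whenever the user stops to think.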

As for preventing drawing of your own widgets, the only way to do that is to (a) make sure everyone uses the same widget library and (b) get rid of all the developers who think they're graphic designers (and I speak as a developer who formerly thought he was a graphic designer.) The former is either easy or hard, depending on the OS; the latter is impossible (quite a few freeware and commercial apps for PalmOS are separately skinnable, and generally look pretty crappy).


Wasn't Dr. Claus the bad guy on Inspector Gadget? - dirvish
[ Parent ]
1/2 right (none / 0) (#56)
by dash2 on Sat Jan 11, 2003 at 07:38:27 AM EST

I totally agree that the WIMP interface doesn't work for mobile devices of any sort. I recently tried out an XDA - a much-hyped device running pocket windows. I discovered that you can actually right-click by holding your finger down. ROTFLMAO. How about that for clinging to an inappropriate metaphor: the mouse was designed to be an extension of your hand, but now Microsoft have made your finger (or stylus) into a mouse replacement....

That said, I would be very wary of applying this logic to large-screened traditional PCs where there is space for multiple input devices. I think the mouse is a brilliant device, wonderfully easy to use. Remodelling the PC based on interactive TV is frankly daft. I suspect that the same applies to having windows always maximized. I don't always want all my windows maximised!

Dave
------------------------
If I speak with the tongues of men and of angels, but have not love, I am become sounding brass, or a clanging cymbal.
[ Parent ]

palmos is good for palms (4.50 / 2) (#61)
by jolt rush soon on Sat Jan 11, 2003 at 07:57:01 AM EST

i agree with most of what you're saying but the whole maximized windows always idea is taking it too far. i don't know about you but i often need more than one application open at a time and to be able to watch one while doing something else in the other. i'm sure that this isn't just me. there's also issues with reading very wide paragraphs. i prefer to read k5 in a thinner window for ease of finding which line i'm on when i get to the end of one, although this might be an application consideration rather than a user interface consideration. anyway, my point was that it's not a good idea to have a windowing system on a small handheld or anything with a small interface, but i do like to be able to switch what i'm doing on a pc with a 17 inch monitor or anything larger by just moving my eyes from one window to the other rather than fiddling with tabs or whatever.

and the windows taskbar is quite a good idea. switching applications as easy as changing tv channels. now that's what i call a good idea.

as far as skins go, i'd like to skin the next person i find making a skinable application.
--
Subosc — free electronic music.
[ Parent ]

reply (none / 0) (#79)
by psicE on Sat Jan 11, 2003 at 01:54:05 PM EST

There are four words in my post that may have been overlooked: _with split-screen capability_.

I am not proposing that all windows are maximized all of the time, such that a user can only have one application in the foreground at once. I am simply proposing that there be no dead space. If the user wants to use more than one application, they select the appropriate tab, perhaps holding down an option key or using a different mouse button, and an additional window is opened split-screen. Think Ratpoison or GNU Screen, but much simpler, such that it actually makes sense to the average user. Cascading windows are about the stupidest idea ever created - autotiling windows are much more intuitive and easier to use.
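The autotiling idea sketches out simply enough (this is an invented illustration, not how Ratpoison or Screen actually lay things out): instead of cascading, each new window claims half of the largest existing region, so there is never dead space and never overlap.

```python
def autotile(screen, n):
    """Split `screen` (x, y, w, h) into n non-overlapping regions by
    repeatedly halving the largest region along its longer side."""
    regions = [screen]
    while len(regions) < n:
        # split the largest region so no window gets starved of space
        regions.sort(key=lambda r: r[2] * r[3], reverse=True)
        x, y, w, h = regions.pop(0)
        if w >= h:
            # wide region: split vertically into left and right halves
            regions += [(x, y, w // 2, h), (x + w // 2, y, w - w // 2, h)]
        else:
            # tall region: split horizontally into top and bottom halves
            regions += [(x, y, w, h // 2), (x, y + h // 2, w, h - h // 2)]
    return regions
```

Every pixel stays covered by exactly one window, which is the whole point: no stacking, no hunting for the window underneath.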


[ Parent ]

Ion (5.00 / 1) (#87)
by PurpleBob on Sat Jan 11, 2003 at 07:06:16 PM EST

I started using the Ion window manager when I was setting up a computer that was too slow for GNOME, and actually liked it quite a bit. It does just what you describe.

I'd even be tempted to use it on my main system, if it weren't that some applications don't work especially well with it. For example, Mozilla's download windows would need to be redesigned to work well in Ion.

[ Parent ]

zooming and clustering (none / 0) (#156)
by kubalaa on Tue Jan 14, 2003 at 07:43:35 PM EST

Reading this thread inspired a sort of brainstorm --

You start out looking at a tiled view of all your open applications. Some might be represented by icons, some by actual views of the window contents. You click on a few -- your IM program, your media player, your browser, your email client -- they cluster together. You click on the browser, then hit a zoom key. The view zooms in until only the browser window is showing. You receive an IM, as signified by some sort of alert icon in the corner of the screen. You zoom out, click the browser and IM windows, which pull together, and zoom back in. Now you're working with both simultaneously, so you can chat and browse at the same time. If you were to zoom out a bit, though, you'd still also see the email and media player windows. Then you could zoom out further to see all your applications.

Essentially what's going on is you're creating levels of importance each time you select a group of applications, and zooming in and out allows you to focus on one or many tasks as needed. It's important that windows are designed so that they represent a good unit of work -- you shouldn't ever need more than 2 primary tasks and maybe 3 secondary tasks at hand.

I'm not quite sure about how groups are created. If you zoom out, click an application, then zoom in, do you get all the apps you were looking at plus the new one? Perhaps we maintain a history of clicks at each level, and zooming in removes the application which was clicked longest ago from the field of view. Maybe clicking applications in sequence pulls them together but you must draw a box around them to create an official group. Perhaps with large enough screens we could allow fine-grained zooming and tolerate more looseness in the clustering of windows.
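One answer to the "how are groups created" question, sketched as code (entirely hypothetical, just making the brainstorm concrete): keep a click history per zoom level, and let zooming in show only the most recently clicked windows, dropping the one clicked longest ago.

```python
class ZoomGroup:
    """Track clicked windows at one zoom level.

    The visible set is the most recently clicked windows, capped at
    max_visible, so zooming in drops whatever was clicked longest ago."""

    def __init__(self, max_visible=3):
        self.max_visible = max_visible
        self.clicks = []          # oldest click first

    def click(self, window):
        # re-clicking a window moves it to the most-recent slot
        if window in self.clicks:
            self.clicks.remove(window)
        self.clicks.append(window)

    def visible(self):
        return self.clicks[-self.max_visible:]
```

This matches the "history of clicks at each level" variant: no explicit grouping gesture needed, the cap does the pruning.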

[ Parent ]

User style sheets (none / 0) (#123)
by driptray on Sun Jan 12, 2003 at 10:43:04 PM EST

i prefer to read k5 in a thinner window for ease of finding which line i'm on when i get to the end of one

I have the following in my user style sheet:

p, li, blockquote, dd
{max-width: 35em !important;}

Causes a few hiccups with some bad HTML pages, but generally works real well (if your browser supports it). If you're really keen, you can correct most of that bad HTML by using a local proxy. On Windows I use Proxomitron.


--
We brought the disasters. The alcohol. We committed the murders. - Paul Keating
[ Parent ]
This will definately start a huge discussion (4.50 / 2) (#51)
by Baldwin atomic on Sat Jan 11, 2003 at 02:45:18 AM EST

so i voted +1 fp.

However, I have a few bones to pick:

The gurus are the root of the problem

Says you, who claims to be an expert (aka 'guru') on this topic. Not that I want to detract from your points, it's just ironic - sort of like telling someone to question authority. (to which they should, of course, reply 'why?')

Regarding save and open - sometimes they are good, sometimes they aren't. OK, mostly they are good, but for some applications they aren't needed. Most programs seem to have this right - word processors, spreadsheets etc. have save & open, things like bookmark lists, calendars etc. don't. Everyone's happy.

Filenames = good. I agree, but that's probably because I've been using them since I was 5. We need to ask some people who haven't used computers how _they_ would do it, before such people become extinct. (I suppose that could be quite a while given the situation in third world countries, but that's another story for another day).

You mentioned Fitts's law, but gave no discussion of it..?! Fitts's law, from (unreliable) memory, seems to imply that we shouldn't have to move our mouse big distances accurately. Direction, rather than distance, is much easier to control, so we should use that.
If only the Windows 95 designers had used this, computers would be much better - try out the RadialContext plugin for Mozilla to see an example - the right-click menu is replaced by a round one, which presents further submenus when you move your mouse in one of the 8 directions. A bit different to normal Windows right-click menus, but infinitely superior in terms of intuitiveness (is that a word?) and speed of use.
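For the record, the usual statement of Fitts's law (Shannon form) is MT = a + b * log2(D/W + 1): movement time grows with distance D to the target and shrinks with target width W. Plugging in rough numbers shows why a radial menu wins; the constants below are made up, only the comparison matters.

```python
import math

def fitts_time(distance, width, a=0.1, b=0.1):
    """Shannon form of Fitts's law: MT = a + b * log2(D/W + 1).
    a and b are device-dependent constants; these values are arbitrary."""
    return a + b * math.log2(distance / width + 1)

# A conventional menu item: far away (600px) and narrow (20px tall).
far_menu = fitts_time(distance=600, width=20)

# A radial-menu slice: pops up at the pointer (40px away) and is wide (80px).
radial_slice = fitts_time(distance=40, width=80)
```

Short distance and a fat target both cut the log term, so the radial slice comes out much faster regardless of the exact constants.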


That's enough of a rant from me for now, I'll throw in some more if this gets on the front page.



=+=+=+=+=+=+=+=+=+=+=+=+
Opinions not necessarily those of the author.
Who are these gurus? (4.33 / 3) (#53)
by enterfornone on Sat Jan 11, 2003 at 04:11:59 AM EST

Given that everyone making UIs these days agrees with you.

--
efn 26/m/syd
Will sponsor new accounts for porn.
"Grate Latin lies[...]" (1.00 / 1) (#66)
by 6502 on Sat Jan 11, 2003 at 11:44:13 AM EST

Grate Latin lies: all the Roman women were beautiful
A lie indeed! Have you ever seen an Italian grandmother?!?!?! Mama Mia, that's UGLY!!!

0x7f

Everyone "knows" what's "right" (4.00 / 2) (#70)
by jd on Sat Jan 11, 2003 at 12:23:24 PM EST

...until the next thing comes along.

Let's face it. We all want to be right about something, and we all have opinions on everything, but that doesn't mean that any of those opinions have any objective Truth to them.

Let's take the argument on filenames. Filenames are just one hierarchical model for representing which data the user wishes to use. However, in a GUI environment, the underlying hierarchy is largely obscured. What you see is whatever the GUI designer decided you should see.

This may reflect the underlying hierarchy. It might be a totally different hierarchy, bearing no resemblance to the one underneath. (This is the way Windows has gone, for example.) It may not be a hierarchical model at all. As databases have evolved into relational and object models, you might have a relational or object GUI. Such creatures do exist.

Then, you might dispense with a "classical" filesystem altogether. You don't really need one, since you inevitably have filepaths, manpage paths, javaclass paths, etc, which string all of the directories together. If everything is in one bin anyway, you don't really need to waste space on tracking separate bins.

I'll offer an alternative vision in a second post, so that this one concentrates on the fallacies of opinions.

Databases (none / 0) (#160)
by chu on Wed Jan 22, 2003 at 12:30:54 PM EST

As I vaguely understood it, the relational database is a mathematically perfect model that all databases should aspire to and none have achieved - and the object model was largely discredited. But I'm no expert by any means =)

[ Parent ]
Easy to use v. Easy to learn (4.50 / 2) (#73)
by spcmanspiff on Sat Jan 11, 2003 at 12:50:54 PM EST

This is a distinction everybody forgets to make, but probably one of the most important ones out there.

Examples:

  • The much-hated command line. Very hard to learn. Easy to use, once learned. I'm no unix guru, and only use ten or so commands frequently, but I'm able to do things with find and grep that would take a GUI person ages -- and often they wouldn't even think of doing in the first place.
  • High-end graphics software; I'm thinking of compositing software put out by Discreet. Hard to learn, but an expert who knows the stuff well can make magic look like child's play.
  • MS Paint: Easy to learn. Impossible to use.
  • etc.
Combining the "easy to learn" with the "easy to use" is near-impossible unless the software has a very limited number of uses. iTunes is a good example -- does one very limited thing quite well. Microsoft Access is probably the best counter-example I can think of. It takes a complicated subject/set of uses, tries to make it "easy to learn" via simplifications, metaphors, and little wizards, and ends up with something damned impossible to do anything useful with, and still hard to learn on top of everything else.

Personally, I aim for software that's powerful and expressive at the expense of being harder to learn.

 

Variable complexity & Wizards. (none / 0) (#100)
by irrevenant on Sun Jan 12, 2003 at 05:43:39 AM EST

One idea that doesn't get enough play, IMO, is the ability to toggle the complexity of the interface. For example, with GetRight, you can toggle the interface between Simple and Advanced. Simple and Advanced have basically the same menu options, etc., but the more complex options are hidden in Simple mode.

There is a blatantly obvious button at the bottom of the screen that states something like "Switch to advanced mode".

Similarly is Nero Burning ROM's (amongst others) approach of defaulting to a wizard ("Copy a CD", "Burn an Audio CD", "Burn a Data CD" and "Burn a mixed mode CD", IIRC). Or you can click the button to go to the power-user interface.

IMO, this is not quite as good as the first approach, because the transition from the simple interface to the advanced one is more traumatic. But it's a step in the right direction.

Heck, to get really way out there for a minute, why aren't more programs designed with the UI separated from the functionality so new UIs can be slapped on with relative ease?
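That separation is essentially the model/view idea, and it also answers the simple/advanced toggle above. A minimal sketch (all class and method names invented): the core exposes plain operations with no UI assumptions, and any number of front ends, simple or advanced, call into it.

```python
class DownloadCore:
    """All the real functionality, with no UI assumptions at all."""
    def __init__(self):
        self.queue = []

    def add(self, url):
        self.queue.append(url)

    def pending(self):
        return list(self.queue)

class SimpleUI:
    """A 'simple mode' front end: only the common operations exposed."""
    def __init__(self, core):
        self.core = core

    def commands(self):
        return ["add", "quit"]

class AdvancedUI(SimpleUI):
    """Same core, richer surface; swapping UIs requires no core changes."""
    def commands(self):
        return super().commands() + ["throttle", "schedule", "mirror"]
```

The GetRight-style toggle is then just replacing one front-end object with the other at runtime; the core never notices.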

[ Parent ]
An alternative GUI design (4.50 / 4) (#75)
by jd on Sat Jan 11, 2003 at 12:55:46 PM EST

Ok, you want a radically different GUI design, that -is- new, and yet would be as intuitive as a standard WIMP? Here's one I designed earlier.

Let's split the screen into three sections. A top "bar", and two columns underneath. Similar to how many desks really are arranged, in fact.

On the lower left column, you have data. This data can be collected into groups (ie: folders) and/or kept separate. No restrictions. But it can only be data.

On the lower right column, you have applications. Again, these can be collected into groups, and/or kept separate.

So far, pretty similar to the current setup, except for the explicit distinction between data and applications. Now, we start to add the new ideas.

First, let's have all applications AND all data as drop targets. You can drop data into an application, OR an application onto data, and it has the same effect. In other words, no implicit associations, as with Windows. You define an explicit association at the time you perform the task.

Let's say you want to save/print your work. You pull the data from the application, and drop it into the data space (save) or onto a printer (print). Again, the operation is very clearly and explicitly defined. No ambiguity.
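The symmetric drop rule described here could be dispatched like this (a hypothetical sketch; the tuple-based object tagging is invented purely for illustration): data onto application and application onto data hit the same branch.

```python
def kind(obj):
    # hypothetical tagging: objects are ("app", name) or ("data", name)
    return obj[0]

def on_drop(dragged, target):
    """Symmetric dispatch: data-onto-app and app-onto-data do the same thing,
    so the association is explicit at the moment of the drop, never implicit."""
    if {kind(dragged), kind(target)} == {"app", "data"}:
        app = dragged if kind(dragged) == "app" else target
        data = target if app is dragged else dragged
        return ("open", app[1], data[1])
    raise ValueError("no association defined for this drop")
```

Because both orders reduce to the same (app, data) pair, there is no Windows-style hidden file-association table to get wrong.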

Ah, now where is that printer? That's the top bar. Those are your devices. As in the "real world", where that top bar would contain a phone, an in-box, and out-tray, etc, that top bar in this GUI contains import/export systems. Ways to get the data to/from your computer.

So what's very different? Isn't this just a layout thing? No. Look at that save/print thing again. It's a drag/drop operation. In fact, everything defined so far is a drag/drop operation. I've not mentioned menus. That's because this system doesn't need any.

What about spell checking? Search/replace? All the other stuff that people do? Unix has had those as separate applications since the dawn of civilization. I don't see any reason to change that. We can do all of these by dragging the tool you're using onto the data, or the data onto the tool.

In other words, I'm dispensing with the entire notion of a central system. There is no central application, which does everything. Everything is handled by tiny specialized tools, just as it is in a real office, with the user piping the data from one tool to the next.

Doesn't this make things more complicated? No. In a "modern" GUI, each application can have a different UI. Here, there is no application-level UI, there is only a system UI. The application is merely a backend tool, and the GUI becomes a transport mechanism.

Ok, doesn't that take away freedom? No. It means that everything is skinnable, and that those skins aren't constrained by the limitations of the application. A skin can literally do anything that the GUI can support, up to and including producing the illusion of "office suites", or other integrated packages.

Wouldn't this be slower? No. Users have to spend inordinate amounts of time making sure that they're using single/double/triple clicks with the correct mouse button, according to what software they're using, and in what mode. Here, the mode is "universal", even though it's defined by the user. That means the same method to do the same task, no matter where.

Users also have to play hunt-the-option on the menu bars. Many suites are so complex, users never find/see many of the options. Something like 90% of Microsoft Office is never found by the typical user. Not because it's not useful, but because it's too obscure.

Here, the entire right half of the screen, in effect, becomes a "universal" menu bar, with all of the options the user actually uses visible. They're there, because the user can drag them there. That means, no more million-and-one menu choices to find what you want, and no more desperately searching the manual for what that application writer chose to phrase something as.

I mentioned skins, earlier, and the ability to "compose" super-applications. The same applies to data, in this system. Data can be linked together, using "skins", to create composite documents of any level of complexity you like.

If you like the Windows "association" model, you could even define skins which linked documents to their applications. Only, it could be by any criteria you chose, not merely filename extensions or MIME types.

What about the underlying mechanisms? Surely, they'd be horribly complex! All we're talking about is a GUI representation of a Relational Database which also allows POSIX pipes to be used as tables.

And THAT is where this GUI scores over the "flat-file database" model that conventional GUIs use. Flat-file systems, where each directory and each file is a flat-file database which must be opened and processed, are limited. You can't tie data together that well.

Relational models, where the entire filesystem is a collection of tables and database manipulations, don't suffer that limitation. Any data combination you can define can be used, transparently, by any component, equally.
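The "filesystem as relational tables" idea can be made concrete with SQLite (the table layout here is invented for illustration, not a proposal for an actual schema): files become rows, and "any criteria you chose" becomes a WHERE clause.

```python
import sqlite3

# An in-memory stand-in for a relational file store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (name TEXT, kind TEXT, project TEXT, bytes INT)")
db.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", [
    ("report", "document", "alpha", 12000),
    ("logo",   "image",    "alpha", 48000),
    ("notes",  "document", "beta",   3000),
])

# Any criteria you choose, not merely filename extensions or MIME types:
docs_in_alpha = db.execute(
    "SELECT name FROM files WHERE kind = 'document' AND project = 'alpha'"
).fetchall()
```

A directory is then just a saved query, and the same file can appear in as many "folders" as there are queries that match it.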

Forget CAVE. Forget 3D. Forget WIMP. If we're after a desktop, then the only criterion for success is whether a desktop is what we have.

Drag and drop = wrong tool for so many jobs. (3.66 / 6) (#81)
by NFW on Sat Jan 11, 2003 at 04:02:06 PM EST

Mice are inherently low-bandwidth devices. Keyboards are high bandwidth devices. Count the buttons on each... Mice are useful for positioning things in applications where position matters, but the wrong tool for the job otherwise.

I don't want to drag and drop to save a file when I can Ctrl-S. I don't want to take my hands off home row to navigate a tree control click-by-(reposition-and)-click when I can use typing and command/keyword/path completion to do the same task.


--
Got birds?


[ Parent ]

Mice low bandwidth!? (none / 0) (#101)
by irrevenant on Sun Jan 12, 2003 at 05:55:50 AM EST

Firstly, there's no reason the posited system couldn't also support keyboard shortcuts.

Secondly, mice aren't the right tool for every job, but your dismissal of them as inherently low bandwidth is silly. In a given second, a user can manage ~3 different keypresses from around 100-150 possible keys (~1,000,000 possibilities). In a given second, a user's mouse movement can input a string of co-ordinates each selected from 40,000+ (a 200x200 pixel area) possibilities (1,600,000,000+ possibilities). Add more for button clicks and use of the mousewheel.

If you don't believe in the bandwidth of the mouse, just try playing Quake using only the keyboard against someone using the mouse. (Note: yes, they use the keyboard too, but only a small subset thereof - you can use the whole thing. :).
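For what it's worth, the arithmetic above works out roughly as stated (taking ~100 keys and, to reach the quoted mouse figure, two position samples per second):

```python
# Keyboard: ~3 keypresses per second, each chosen from ~100 keys.
keyboard_states = 100 ** 3            # 1,000,000 possibilities per second

# Mouse: two position samples per second, each from a 200x200 pixel area.
mouse_states = (200 * 200) ** 2       # 1,600,000,000 possibilities per second
```

As the replies below it note, though, raw state count is not effective bandwidth: clickable targets are icons and widgets, not individual pixels.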

[ Parent ]
Pixels (none / 0) (#110)
by pdw on Sun Jan 12, 2003 at 11:25:51 AM EST

In a given second, a user's mouse movement can input a string of co-ordinates each selected from 40,000+ (a 200x200 pixel area) possibilities (1,600,000,000+ possibilities).

I don't know about you, but I can't click a pixel. I usually click icons, which are typically 48x48 pixels. For a 200x200 pixel area, this gives 16 possibilities. Some people would probably want icons that are a little larger, say 96x96. They still have 4 possibilities.
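The icon arithmetic above is easy to verify; the area and icon sizes are the ones the comment uses.

```python
# Toy target count: with icon-sized targets, a 200x200 pixel area offers
# far fewer distinct click possibilities than its raw pixel count suggests.
area_side = 200
targets_48 = (area_side // 48) ** 2   # 48x48 icons that fit across the area
targets_96 = (area_side // 96) ** 2   # 96x96 icons
```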



[ Parent ]
Pixels (none / 0) (#132)
by irrevenant on Mon Jan 13, 2003 at 02:01:54 AM EST

You're right, of course. Mice are higher bandwidth devices than keyboards, but their effective bandwidth is greatly reduced by the user's accuracy. Your average user can't select a single pixel target very quickly (unless they play a lot of Quake). Conversely, I'd suggest that the keyboard is limited by a user's knowledge and memory.

However, mouse use hardly ends with icons. On my current screen, there are 9 menus I could click on, 14 hyperlinks, 4 tabs, 11 toolbar buttons, and a dropdown control.

In current UIs, I don't think either mouse or keyboard can be considered 'better'. So much information is presented 2-dimensionally in modern UIs that often a mouse is faster and easier. E.g. I'd rather use a mouse than a keyboard to click on the links at the side of this page, and I'd rather use a mouse to select a slab of text.

All that said, I agree with NFW's assertion that drag-n-drop - and mice in general - are hardly the tool for everything...

[ Parent ]
Bandwidth (none / 0) (#117)
by NFW on Sun Jan 12, 2003 at 03:16:30 PM EST

pdw covered the pixel-vs-icon issue pretty well, but I'd like to add another big factor: time.

Think of bandwidth in terms of commands per second. Clicking something requires positioning the mouse as well as pressing the button. Mousing commands is about like typing with one-finger hunt-and-peck. Most of your time is spent positioning your finger/pointer over the icon/key, very little is spent actually sending the command. What's the bandwidth of a one-finger hunt-and-peck typist vs. someone with their hands on "home row?"
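The hunt-and-peck comparison can be made rough and numeric with Fitts's law (the "Fitt's law" the article mentions). Every constant below is an illustrative assumption; real values depend on the user and device.

```python
import math

def fitts_ms(distance_px, width_px, a=50.0, b=150.0):
    """Fitts's law, Shannon form: predicted time to move a pointer
    `distance_px` to a target `width_px` wide. The constants a and b
    are illustrative stand-ins, not measured values."""
    return a + b * math.log2(distance_px / width_px + 1)

mouse_click_ms = fitts_ms(400, 48)  # travel 400 px to a 48 px icon
home_row_key_ms = 200               # assumed time for a practised keystroke
```

Under these assumptions a single point-and-click costs several practised keystrokes' worth of time, which is the "one-finger hunt-and-peck" point in numbers.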

And that brings me to another thing I hate about mice - in order to use one, I have to interrupt my use of a high-bandwidth input device. It's like hitting a pothole on an otherwise freshly paved road.

If you don't believe in the bandwidth of the mouse, just try playing Quake using only the keyboard against someone using the mouse.

I refer you back to my previous message. As I said, the mouse (or any pointing device - joystick, tablet, whatever) is the right tool for the job when it comes to positioning things, and that includes crosshairs.


--
Got birds?


[ Parent ]

Sunir, Holloway... (none / 0) (#145)
by NFW on Mon Jan 13, 2003 at 11:03:46 PM EST

Why the low ratings? I'm interested in your thoughts. (And yes, I'll be glad to undo the low ratings I just gave your comments now that I have your attention.)


--
Got birds?


[ Parent ]

Huh? Forget WIMP? (none / 0) (#82)
by vadim on Sat Jan 11, 2003 at 05:00:20 PM EST

In your design you seem to have Windows, you have Icons, a Menu (I don't think it really matters you've got a bar full of icons, it's still a menu IMO) and you've got a Pointer.

As I see it, this is just a WIMP that doesn't look like Windows.
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

You're absolutely correct, so... (none / 0) (#102)
by irrevenant on Sun Jan 12, 2003 at 06:28:50 AM EST

...let's see what we can do to resolve that.

The M & P aspects of WIMP are actually hardware issues. If you've got a typical PC with a keyboard and mouse, then you don't have a great deal of alternative.

The Tablet PC escapes this paradigm by getting rid of the mouse, but is designed for portable solutions. Why not go all out for fixed PCs and posit a desktop tablet that's at least 36"x24"? (Desks could easily be designed to insert them into the work surface directly). No mouse required and your fingertip replaces your pointer (you need no icon to represent what you're pointing at - you can see for yourself).

Doing away with Windows is a lot harder. The desktop cries out for some sort of subdivision to organise it. However, it becomes a lot less necessary if what you need just appears when you call it. I actually do this at work now (WindowsKey-R, "wordpad"). My taskbar command line in GNOME does a similar thing. These options would become a lot more useful if the UI were designed to recognise alternatives to commonly used options, e.g. "write a letter". If I type "36^2", it should automatically know to open the calculator or spreadsheet (whichever is the default for that PC). With this in place, you can just leave the most common default icons on the desktop and call new ones into being as required.
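The "type 36^2 and the right app opens" idea could be sketched as a tiny input dispatcher. The tool names returned here are hypothetical, and a real launcher would need far richer recognition than one regex.

```python
import re

def dispatch(user_input):
    """Toy launcher: guess which (hypothetical) tool should handle
    free-form input, as in the '36^2 opens the calculator' example."""
    if re.fullmatch(r"[0-9+\-*/^(). ]+", user_input):
        return "calculator"
    if user_input.lower().startswith("write a letter"):
        return "word processor"
    return "run: " + user_input
```

So `dispatch("36^2")` routes to the calculator, while a plain program name falls through to being run directly.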

With a 43" working area, the letter would only take up a fraction of the screen, and your other icons can be sitting elsewhere on the screen waiting. You can organise your icons into neat groups if you like, but won't be forced to place them in a hierarchical window structure.

I don't see being able to get rid of icons for as long as we are working with a 2D display. The only real alternative is words, and there are situations in which graphics are just much more useful.

With those few modifications to the interface, JD's ideas can be implemented, without the W, the M and the P. eg. you can still drag and drop your letter to the printer icon as he envisioned, you can still have a cluster of data icons, a cluster of software icons etc. without organising them into neat windows.

However, the bit I don't buy is the complete removal of menus. Just looking at my current application (Galeon, the web browser), it's hard to see how a menuless system would organise such disparate commands as turning Java support on and off, jumping to the 'about' screen, setting the language encoding and telling it where your webserver is.

[ Parent ]
That's still WIMP (none / 0) (#108)
by vadim on Sun Jan 12, 2003 at 10:23:45 AM EST

Getting rid of the mouse doesn't matter. On a touch screen the pen is the pointer. It doesn't matter that it's not visible on screen, as this is just a requirement of the mouse. You can buy a tablet or touch screen right now and use it as a replacement of the mouse. And it will not make Windows stop being a WIMP environment.

You haven't succeeded either at removing windows if I understood correctly. What you suggest sounds like Windows 3.x and OS/2, where minimized programs appeared as icons on the desktop. Enlightenment has a system like that too, IIRC. To remove windows you need to find a way of getting rid of modal dialogs, message boxes, palettes like the one in Photoshop...

On the other hand, menus are the easiest thing to remove, IMO. For example, Vim is a program that can be used in a GUI without a menu.
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

'tis not! : P (none / 0) (#133)
by irrevenant on Mon Jan 13, 2003 at 02:18:38 AM EST

Getting rid of the mouse doesn't matter. On a touch screen the pen is the pointer. It doesn't matter that it's not visible on screen, as this is just a requirement of the mouse.

Getting rid of the mouse absolutely does matter - it's the "M" in "WIMP". Get rid of the mouse and you're 25% of the way there.

The visibility of the pointer matters. The thing that makes it a WIMP interface is that the pointer is not a direct interface. It is a remote-controlled graphical representation of what you are doing with an input device elsewhere. It adds a layer of artificiality between the user and the interface.

You haven't succeeded either at removing windows if I understood correctly. What you suggest sounds like Windows 3.x and OS/2, where minimized programs appeared as icons on the desktop. Enlightenment has a system like that too, IIRC. To remove windows you need to find a way of getting rid of modal dialogs, message boxes, palettes like the one in Photoshop...

As I recall, Windows 3.x partitioned programs into different windows on the desktop (hence the name, I guess). I'm not really familiar with OS/2.

It sounds like, to satisfy your definition, we'd have to get rid of any display of information on the screen. That's not really practical, and IMO goes far beyond what getting rid of the window paradigm entails...

[ Parent ]
Its not the M its the P! (none / 0) (#137)
by forss on Mon Jan 13, 2003 at 08:03:53 AM EST

Getting rid of the mouse absolutely does matter - it's the "M" in "WIMP". Get rid of the mouse and you're 25% of the way there. The visibility of the pointer matters. The thing that makes it a WIMP interface is that the pointer is not a direct interface. It is a remote-controlled graphical representation of what you are doing with an input device elsewhere. It adds a layer of artificiality between the user and the interface.
Well...in WIMP the P is the mouse (Pointer) and the M stands for Menus...
So you haven't lost the 25% just yet..
And regarding pen-on-screen usage, I would not advise looking down at a table for more than an hour at a time. You will get neck problems... Many graphic artists use Wacom tablets (without a built-in screen) for just that reason - they can work longer without getting problems.

[ Parent ]
The viewing angle. (none / 0) (#149)
by irrevenant on Tue Jan 14, 2003 at 02:37:57 AM EST

I hadn't thought of the viewing angle issue. On the other hand, people have been working with paper on desks since long before PCs. How did they deal with this issue, historically?

P.S. If the "P" is the mouse, then removing it still loses you the 25%...

[ Parent ]
Why does it matter? (none / 0) (#140)
by vadim on Mon Jan 13, 2003 at 09:25:16 AM EST

The "M" in WIMP stands for Menu. The mouse is the "P" for Pointer.

It's pretty well established that Windows is a WIMP environment. You can replace the mouse by another pointing device. For example, Windows can let you move the mouse with the keyboard. You could use a tablet, or a touch screen.

If I use a touch screen the only difference is that I don't have to see the pointer. But I can still click, double click, drag and drop, select rectangular regions... it provides absolutely all the mouse can do. The Windows interface doesn't change in any way because of it.

The layer of artificiality you talk about is a direct requirement of the mouse, not of WIMP. For example, at work we've got portable machines for taking orders. Normally you control them with a pen. However, I can run that program under DOS at home using a mouse. Does it suddenly become or stop being WIMP just because I switch from a pen to a mouse?

In the same way, a Unix machine with a Braille terminal is still a Unix machine. It doesn't matter that it uses a different implementation of the terminal device.
--
<@chani> I *cannot* remember names. but I did memorize 214 digits of pi once.
[ Parent ]

Semantics. (none / 0) (#148)
by irrevenant on Tue Jan 14, 2003 at 02:37:11 AM EST

Personally, I think that, yes, a big part of what makes a WIMP, a WIMP is the mouse. The interface changes radically without it.

However, this seems to have devolved into semantic bickering. Perhaps what I suggested is a WIMP, perhaps not. I personally suspect that you could label any interface with a 2D display WIMPlike, depending on how far you were willing to stretch the terms. Regardless of how you label it, my suggestion is a different approach to an interface than what we use now. (Actually, it's a step towards an old interface, in that it largely models a physical desktop).

[ Parent ]
Galeon, et al (none / 0) (#155)
by jd on Tue Jan 14, 2003 at 11:52:59 AM EST

Let's start with Java support, encoding, and other things that are either present or absent. You have a sub-window inside of Galeon (or whatever web browser) which contains all of the abilities that are currently active.

To disable Java, you simply drag the Java engine out of Galeon. To re-enable it, you drag the Java engine back in. The same would be true of the other on/off features.

Fonts would be a case of having a series of panels inside a sub-window, each panel containing a font for a specific type of text. You drag in the font you want, rather than select it from a pick-list. That way, you also do away with static font paths in the GUI. The fonts that are loaded are the fonts currently in the applications and no other paths need to be stored inside the font server.

The "About:" window is really a shortcut to that URL, which merely happens to be an internal page. This could be handled by having it as part of the bookmarks window hierarchy. Drag the URL into the browser, and it displays it.

The reason this isn't menu-based is that menus imply a pre-defined structure, involving a separation of operation and operand. If you've just "objects" which you can drag/drop into other "objects", then no such distinction exists.

What I am describing is a wholly Object Oriented GUI, as opposed to a Procedural (ie: menu-driven) GUI. One of the key characteristics of OO logic, as opposed to procedural logic, is that the data, not the original programmer, defines what is possible and what is not.

In other words, this design is also data-driven, not operation-driven.

An illustration of what I mean by this is word-processing. There are many word-processors, each using their own internal format, and each requiring a bazillion transliterators to import and export data.

Let's say, though, that the system stores all data in a self-describing format. (I've posted a number of such formats on Freshmeat over time.) A self-describing format is one in which the semantics are included, not just the syntax.

This means that you have one universal format for documents, which can be loaded by any wordprocessor, transparently. (This is NOT the same as defining a new "wordprocessor file format", as that can never describe things that the programmer didn't think of. Self-describing data can be used to describe ANYTHING, whether the programmer thought of it or not. As a result, it is infinitely extensible, and will work for any wordprocessor that has existed, exists today, or ever will exist.)

(This is why such formats exist. Scientific research groups don't want the hassle of reformatting their multi-terabit archives every time some utility gets upgraded. Much simpler to have a standard that can be handled by anything.)
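One minimal reading of "self-describing" data is a record whose fields carry their own semantics, so a reader needs no pre-agreed schema. The field names and meanings below are invented for illustration, not taken from any of the formats the comment refers to.

```python
# Sketch of a "self-describing" record: each field carries its own
# name, type, and meaning alongside its value.
document = {
    "fields": [
        {"name": "title", "type": "text",
         "meaning": "document title", "value": "Letter to Mum"},
        {"name": "body", "type": "rich-text",
         "meaning": "main body of the document", "value": "Dear Mum, ..."},
    ]
}

def semantics(doc):
    """Recover what every field means without knowing the schema in advance."""
    return {f["name"]: f["meaning"] for f in doc["fields"]}
```

A wordprocessor that had never seen a "title" field could still display it sensibly, because the description travels with the data.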

[ Parent ]

Heh (none / 0) (#116)
by dash2 on Sun Jan 12, 2003 at 12:37:34 PM EST

I honestly can't tell whether you are serious, or whether this is a clever send-up of exactly the sort of loopy UI ideas I was ranting about... It sounds like a nightmare to me, but hey, write it and test it on your mum. I don't want to discourage wacky ideas, just so long as they don't come near mainstream applications before serious usability testing.
------------------------
If I speak with the tongues of men and of angels, but have not love, I am become sounding brass, or a clanging cymbal.
[ Parent ]
+1, *BUT*... (4.00 / 1) (#76)
by gordonjcp on Sat Jan 11, 2003 at 01:09:06 PM EST

And get behind me, that wicked command line! That vile excrescence whereby you learn 100 different commands each with their little switches and syntaxes, and then you can use your computer, and you are immensely proud of your Unix knowledge, just like any other trainspotter who has memorized a vast and meaningless connection of facts.
What about trying to memorise where things are hidden in menus? What about trying to memorise what all the meaningless little symbols and misleading labels are supposed to mean? Personally, I find the CLI a lot easier to use for most things, and I find GUIs messy and slow for most things. However, I find I need to use both, in much the same way as (and for the same reason as) when I'm working on my car, sometimes I need to use a spanner, and sometimes I need to use a screwdriver.

Give a man a fish, and he'll eat for a day. Teach a man to fish, and he'll bore you rigid with fishing stories for the rest of your life.


Joel on Software's GUI book (4.00 / 1) (#78)
by sien on Sat Jan 11, 2003 at 01:42:57 PM EST

Joel Spolsky has written a great book on UI design called User Interface Design for Programmers.

The first few chapters are also available online. It's well worth at least checking out the online stuff.

The reason this book is so strong is that Joel actually worked for the company that builds the UIs that most of us use and thus has lots of real experience, rather than teaching a Human Factors course and consulting or just broadcasting the way that he uses a GUI.

There are heaps of common-sense ideas for improving things, for example hallway usability testing, keeping things big and simple, and many others.

There are also lists of what goes wrong when people do too much, for instance the story of Wizards, which have their uses but can become a dangerous menace. Sensible use of user testing is also described.

In addition, the text has a lot of nice little anecdotes, such as that MS found that Excel is used a lot as a very primitive database, and that this use should be respected.



Excel (none / 0) (#124)
by Canthros on Sun Jan 12, 2003 at 11:03:22 PM EST

MS found that Excel is used a lot as a very primitive database, and that this use should be respected.
Which really makes me wonder why they bother with Access. But, yeah. That is so, so right it's scary. Really, really scary.

--
It's now obvious you are either A) Gay or B) Female, or possibly both.
RyoCokey
[ Parent ]
Huh? (4.33 / 3) (#84)
by autopr0n on Sat Jan 11, 2003 at 06:57:12 PM EST

What if you wanted to work on more than one document?

Um, open another window? I don't understand your argument at all.


[autopr0n] got pr0n?
autopr0n.com is a categorically searchable database of porn links, updated every day (or so). no popups!
again, huh? (3.00 / 1) (#85)
by autopr0n on Sat Jan 11, 2003 at 07:01:33 PM EST

or "My Documents" folder. I bet, like most people, it's a big mess of different stuff. Why haven't you organized it into neat little subfolders, you disorganized fool? Well, because whenever you looked for something, you'd have to trek up and down into all these subfolders, remembering how your mind worked 6 months ago.

Actually, it's because most people are lazy. I've never had any trouble navigating any folder hierarchies I've set up, but most of the time I just drop files in 'queue' and leave them there. forever.


[autopr0n] got pr0n?
autopr0n.com is a categorically searchable database of porn links, updated every day (or so). no popups!
One of my favorite technical subjects (4.60 / 5) (#86)
by epepke on Sat Jan 11, 2003 at 07:05:36 PM EST

Thanks. And I substantially agree with you, but it's more fun to disagree, so I will.

The only User Interface guru is your mum.

There are three stages that people go through when they are getting good at this. At the first stage, they think they know everything. At the second stage, they think their customers and stakeholders know everything. At the third stage, they realize that it's very important to listen to the customers and stakeholders but also to know when not to do what they ask. Expecting your mum (or mom for the U.S.) to be a User Interface guru is like expecting a film critic to write or direct a film. If you don't see how that isn't necessarily a good idea, see if you can find Myra Breckenridge by Roger Ebert.

Now have you noticed the subtle flaw? What if you wanted to work on more than one document?

It needs to be noted that the biggest proponent of getting rid of save is Alan Cooper. As the "father of Visual Basic" (that's what it says on the blurbs of his books), he is probably indirectly responsible for more bad user interfaces than anyone else on the planet.

That having been said, the problem with Save is not the idea of having a point at which you finalize your changes, experiment with things you don't intend to keep, and so on - it's that Save was never designed to do that. It was designed to solve the problem of RAM versus permanent storage. Nobody sat down, thought about the actual user problem, and came up with a good system and good names. But because Save came with all programs, people hammered the square peg into the round hole and used it for that purpose anyway.

The trouble is that because nobody has really thought about the problem, the alternatives suck. "Make everything undoable" is the usual response. Well, have you tried to remember fifteen levels of undo? Undo is just fine for changes you've just made, but it sucks for anything else. Besides, it doesn't deal with branching. I've often thought that a kind of date-tracking mechanism would work (i.e. show me this document as it looked on June Thirty-Seventh in the year of the Squirrel), but nobody has sat down and worked through the system.
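One way to prototype the date-tracking mechanism being wished for - purely a sketch, with invented names - is a document that records a timestamped snapshot on every edit and can answer "show me this document as it looked on date X":

```python
import copy
from datetime import datetime

class VersionedDoc:
    """Toy 'date-tracking' document: every edit stores a timestamped
    snapshot, so the document can be viewed as of any past moment."""
    def __init__(self):
        self._snapshots = []  # (timestamp, state) pairs, oldest first
        self._state = {}

    def edit(self, key, value, when):
        self._state[key] = value
        self._snapshots.append((when, copy.deepcopy(self._state)))

    def as_of(self, when):
        """Return the latest snapshot at or before `when` (empty if none)."""
        past = [state for t, state in self._snapshots if t <= when]
        return past[-1] if past else {}
```

Unlike a linear undo stack, asking by date sidesteps "remember fifteen levels of undo", though branching would still need more machinery than this sketch has.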

HyperCard, which was enormously successful in spite of the fact that hardly anybody remembers what it was, used a saveless paradigm, where one could save snapshots and then later go back to a snapshot. The weakness of this is that you seldom know that you're going to have to go back to a snapshot before you mess up. Also, a game called System's Twilight (an excellent puzzle-solving game, BTW) was saveless, but then again it had to be or the game wouldn't work.

Filenames are so passe... ...now let me just enter ten key-value pairs in a database.

False dichotomy, really. But again, file names weren't designed to solve the problem of finding files in a huge system. They were designed as unique keys for relatively wimpy file systems, which were extended for technical reasons to hierarchies. There are a lot of ways of doing things better than filenames or relational databases. The Be OS metadata concept was very good in this regard. There's even one simple change that could be made to existing file systems that I think would be better than the status quo: allow a file to reside in more than one folder. Yes, you can kludge this with aliases and links, but that's the save problem all over again.
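Whether or not they escape the "kludge" charge above, POSIX hard links do give one file two equal directory entries - both names are equally "the file", unlike an alias or symlink. A quick demonstration (paths invented, temp directory used so it is self-contained):

```python
import os
import tempfile

# A hard link gives the same file a second, equally real directory entry.
base = tempfile.mkdtemp()
os.mkdir(os.path.join(base, "projects"))
original = os.path.join(base, "report.txt")
linked = os.path.join(base, "projects", "report.txt")

with open(original, "w") as f:
    f.write("quarterly report")

os.link(original, linked)  # same inode, two names in two folders
same_file = os.path.samefile(original, linked)
```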

Well, because whenever you looked for something, you'd have to trek up and down into all these subfolders, remembering how your mind worked 6 months ago.

Hard as that is, it's not as hard as predicting how your mind is going to work 6 months in the future, which is what the filenames ueber alles approach requires you to do. I'd like to be able to find files that I don't remember much about, but I remember it was before last year's Siggraph, and I worked on the Flebeski account around the same time. I could do this, but I don't, because no file finder makes it easy. This will take some work, but it isn't hopeless.
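A toy of the attribute-based search being wished for, using only modification time and a name substring as stand-ins for richer metadata ("Flebeski" is the comment's own example; everything else is invented):

```python
import os

def find_files(root, modified_before=None, name_contains=None):
    """Toy attribute search: files under `root` modified before a Unix
    timestamp whose name contains a substring."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if modified_before is not None and \
               os.path.getmtime(path) >= modified_before:
                continue
            if name_contains and name_contains.lower() not in name.lower():
                continue
            hits.append(path)
    return hits
```

A real version would want to query arbitrary metadata ("before last year's Siggraph") rather than raw timestamps, but the query shape is the same.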

It's time for a total rethink! Er... er... virtual reality 3D walk through file managers! Or a CLI, but like a new CLI, that's new.

It may surprise you that the uncontested Uebermensch of UI gurus, the great Donald Norman himself, considers 3D file managers ridiculous. Or not, as you acknowledge that you're setting fire to straw men.

There are plenty of good rethinks that have been done and could be done. I recently bought a Wacom tablet. The stylus has an eraser on the end. Great stuff. And there are certainly good pieces of research out there. The Teddy interface and SketchPad seem to have hit the sweet spot for some kinds of 3-D gesture interfaces.

The big problem with the WIMP interface, seldom mentioned, is that it's really good when you have small screens and simple programs. However, with several hundred menu items and dozens of windows and millions of pixels, all of a sudden the basic idea of having everything visible at once doesn't seem so hot. I'd like to see more interfaces that emphasize certain kinds of information, such as the zooming data path and cylindrical menus researched in the 1990s. As far as I can tell, the Mac's zooming Dock is the only common use of this, and I like it a lot. (I mean zooming while running the pointer across, not the hateful Genie effect.)

Another bugbear of UI gurus, who want every OK button to say something different: which is like having a different type of handle on every door in your house.

Again, it may surprise you that Apple has always told people never to change the name of the OK button. Or, again, maybe not. Apple, for all of their faults, spent an awful lot of money hiring people to work out a lot of stuff, and they did a pretty good job. They actually added value to the WIMP interface. However, one of the many things that Microsoft and the Free Software/Open Source people have in common is the idea that it's good enough to say, "Uh-huh. Look, Beavis. It's a widget. Cool."

For myself, I say that there are instances where it is appropriate to get away from OK, but they're rare, and they should be thought out very carefully. Often, rewording the text works better.

And WIMP is, I think, misplaced for mobile devices - one reason why I would back the mobile phone companies against the PDAs. (Have you seen the O2 XDA with Pocket Windows? You can right-click in it, FFS.)

Pocket Windows, Windows CE, and all their bastard brethren just basically suck. The Palm OS is much better. Of course, that's why Palm isn't doing so well these days.


The truth may be out there, but lies are inside your head.--Terry Pratchett


Well... (none / 0) (#119)
by gilrain on Sun Jan 12, 2003 at 05:54:05 PM EST

I'd like to be able to find files that I don't remember much about, but I remember it was before last year's Siggraph, and I worked on the Flebeski account around the same time. I could do this, but I don't because no file finder makes it easy. This will take some work, but it isn't hopeless.
Well, certain unrespected OSes make it pretty easy. Of course, maybe that doesn't count.

[ Parent ]
System's Twilight, zooming, etc (none / 0) (#125)
by andfarm on Sun Jan 12, 2003 at 11:38:27 PM EST

First of all, System's Twilight actually does have saving. It just saves every time you quit. I just played it, so I should know :-)

Really, I think Dock zooming is pretty stupid, though. It makes it so that clicking on things is more difficult. ("Click on that... NO, GET BACK HERE!") The click zones are the same, but it can get visually confusing.

[ Parent ]

Way cool! (none / 0) (#158)
by epepke on Thu Jan 16, 2003 at 01:35:17 AM EST

First of all, System's Twilight actually does have saving. It just saves every time you quit. I just played it, so I should know :-)

I'm glad that someone still plays that. I think it's a great game, of a very small class of games I'd like to see more of.

I could never figure out how to use the W in the last transformational grammar puzzle, though, so I solved it without it.

Really, I think Dock zooming is pretty stupid, though.

So don't use it. I like it, though.


The truth may be out there, but lies are inside your head.--Terry Pratchett


[ Parent ]
Save ? (4.00 / 1) (#127)
by drsmithy on Mon Jan 13, 2003 at 12:24:16 AM EST

That having been said, the problem with Save [...]

Just what *is* the problem with a Save dialog? To me it makes perfect sense - just as with a physical document, when I'm finished working with it I label it in some fashion and put it away, and a Save dialog allows me to do exactly that: give it a meaningful label and put it somewhere in the filesystem. What is the problem?

[ Parent ]

The problem is that (3.00 / 2) (#144)
by Holloway on Mon Jan 13, 2003 at 09:25:34 PM EST

with a physical document when I'm finished working with it I label it in some fashion I put it away
You're much tidier than most people. Some people even have unlabeled papers on their desk over night!


== Human's wear pants, if they don't wear pants they stand out in a crowd. But if a monkey didn't wear pants it would be anonymous

[ Parent ]
consistency (4.00 / 1) (#88)
by ryochiji on Sat Jan 11, 2003 at 07:28:26 PM EST

I think one thing a lot of people forget when talking about UI theories is consistency and predictability.  I think "ease of use" is relative to what users are used to and what they have come to expect.  You can have a wizbang new UI element that does magic, but if people don't know what to do with it, it really isn't intuitive.

Case in point:  images on Macs.  Whether you're using a browser, MS Word, AIM, or iPhoto, if you see an image, you know you can drag and drop it onto the desktop and it'll be saved there as a file.  You just know.  And nothing's easier to learn than something you already know.

---
IlohaMail: Webmail that works.

Bingo. (4.00 / 1) (#90)
by porkchop_d_clown on Sat Jan 11, 2003 at 10:19:47 PM EST

I recently saw the first brand new (to me) UI element I've seen in a long time. iPulse, from the IconFactory, uses a kind of nested analog meter concept to show all major system stats in an icon-sized chart. It's incredibly efficient and amazing - and yet after weeks I still can't remember what some of the little colored arcs mean.

Multiply that by every widget in a large application and you have.... a complete inability to get your job done.


--
Wouldn't it be a victory for the oppressed people of Iraq, of North Korea, of Iran, if their police-state regimes were overthrown? Even by a cowbo
[ Parent ]

Untrue (3.00 / 1) (#126)
by drsmithy on Mon Jan 13, 2003 at 12:18:21 AM EST

Case in point: images on Macs. [...]

Web browsers are the biggest exception to this rule: they nearly always save images that are also links as links, rather than images. Mainly because - just like on Windows - the behaviour has to be decided by the developer, and they might have a different idea of how it should function than you do.

[ Parent ]

User interface gurus are useful (4.75 / 4) (#89)
by izogi on Sat Jan 11, 2003 at 09:16:30 PM EST

The only User Interface guru is your mum.

My mum probably knows what she doesn't like, but that doesn't mean that what she does like will work better. Often she doesn't realise that something's not working as well as it could be. If she does recognise something wrong, she often won't know how to fix it, or will assume that it's unfixable.

A few weeks ago I was commenting on the support list of a popular shareware software package used by amateur astronomers. The author sent out a broadcast asking about what people would like to see in the next version, and immediately everyone jumped in with all of the new features they wanted. It's been around for a long time and by now it has a lot of features.

I also posted a request, essentially asking that the user interface be re-assessed from a usability perspective. Although it's a very powerful application, its interface is a mess. Basically there are toolbars everywhere and it's not clear how to do simple things, and so on. The whole thing is feature-centric instead of task-centric, and it's done in such a way that it's still often difficult to find whatever feature you want. I gave several examples of bad interface design with some quite easy improvements, although ideally I was suggesting that a revolutionary redesign was needed.

The author didn't respond, although I hope he read it. The immediate polite reaction from a couple of other users on the list was that the user interface was far better than similar competing products that were full of annoying dialogue boxes popping up everywhere, and they were right. My point was that there aren't any powerful astronomy apps out there that have anything but hopeless interfaces, i.e. nobody's ever known what it's like to have a good interface in this domain. Users either don't comprehend the concept of having something that's easy to use, or, because the tasks are already hard, they don't realise that those tasks can be made easier.

The problem seems to be that all of the users are so used to it, they assume it's their responsibility to adjust to whatever the software provides. Most expert users have done just that. They've trained themselves to go through countless numbers of steps that should really be unnecessary, and they treat it as part of doing what they do. Never mind that it's horrendous to learn in the first place, and they're probably throwing away hours of time without realising it. This is not a beginner's product, but it could quite easily be one.

When it comes down to it, this is what interface gurus do... the good ones, at least. They pick up on interface problems that nobody else does, users and developers alike. They study the problems, and they develop ways to fix them and make use more efficient. One of the biggest problems is getting people (developers and users) to realise that there's a problem, and that it could be made much better.


- izogi


#1 Lie: let your mom design the UI (4.60 / 5) (#93)
by NFW on Sun Jan 12, 2003 at 12:41:10 AM EST

There's something to be said for making the software easy to learn, but not if it comes at the expense of usability later on. If your software is good enough that your users stick with it for a while, the learning curve will account for only a small percentage of the overall experience.

Don't let novices and non-users alone design the UI. Novices are only novices for a little while; with experience, they become experienced users. With a long-lived product, the time they spend as novices will be insignificant compared to the time they spend as experienced users. By designing for novices alone, you shortchange the users over the long run.


--
Got birds?


Sticking around (3.00 / 1) (#98)
by Znork on Sun Jan 12, 2003 at 04:24:23 AM EST

Agreed, as long as you take care to ensure that the novice can use the application enough to get them hooked. Following standard paradigms usually works for that; a user should be able to do basic work by using the File menu in combination with a toolbar for the most essential operations.

Also take care with toolbars... the saying 'a picture says more than a thousand words' is proven rather drastically faulty by most toolbar icons. Usually those icons can't even convey a complete sentence. Write eloquent explanations.

[ Parent ]

Grate? (4.00 / 1) (#95)
by Silent Chris on Sun Jan 12, 2003 at 01:38:08 AM EST

"Ah yes, Latin. Bonus points for those who spot the quotation at the top."

Maybe I missed it, but I've never seen it referred to as "Grate Latin lies".  Bonus points indeed.  If anybody can find it with that spelling, I'd be impressed.

Grate (4.00 / 2) (#112)
by aeolist on Sun Jan 12, 2003 at 11:29:53 AM EST

A British, maybe English, maybe even English Public (=private) School thing. It's from 'How to Be Topp' by Geoffrey Willans and Ronald Searle, the first of the Molesworth books. Molesworth describes his school (St. Custard's), masters, friends, enemies and fantasies. They're written in bad schoolboy English, and fantastically illustrated by Searle. Irresistible to many.

It's here [www.stcustards.free-online.co.uk] in much of its glory.

I like WiMP interfaces, and CLIs. Don't really have much to add to the body of the debate.

[ Parent ]

I didn't need those fingers, anyway. (4.75 / 4) (#104)
by selkirk on Sun Jan 12, 2003 at 07:21:31 AM EST

"Another bugbear of UI gurus, who want every OK button to say something different: which is like having a different type of handle on every door in your house."
No. It's really more like making the emergency stop button for a piece of heavy factory machinery substantially different from the "take my careless fingers off" button.

At a glance, you can tell which button you need to push (big red one) without putting too much thought into it.

Which dialog would you rather see:

Engage the "finger widow 2000"?
[start machine] [cancel]

Engage the "finger widow 2000"?
[OK] [Cancel]

Now consider that the finger widow 2000 software might also have some other sort of similar but safe and routine alert like this:

Reset the "finger widow 2000"?
[OK] [Cancel]

Which style of naming buttons would you rather have, lefty?

Apple human interface guidelines for Alerts
Apple human interface guidelines for push buttons
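The labeling scheme argued for above can be sketched in a few lines; this is a purely illustrative Python sketch (the `confirm_buttons` helper is hypothetical, not part of any real dialog toolkit), showing the idea that a destructive confirmation gets its own verb label while routine alerts keep the generic [OK]:

```python
# Illustrative sketch: destructive confirmations get a distinct verb label
# so they can't be confused, at a glance, with routine alerts.
# The confirm_buttons helper is made up for this example.

def confirm_buttons(action, destructive):
    """Return (confirm label, dismiss label) for a confirmation alert."""
    if destructive:
        # e.g. [Start machine] [Cancel] rather than [OK] [Cancel]
        return (action.capitalize(), "Cancel")
    # safe, routine operation: the generic label is fine
    return ("OK", "Cancel")

print(confirm_buttons("start machine", destructive=True))   # ('Start machine', 'Cancel')
print(confirm_buttons("reset", destructive=False))          # ('OK', 'Cancel')
```

The design point is simply that the label carries information the user reads even when skimming, which a generic [OK] never can.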

Doors do the same thing, software alerts do not (4.00 / 2) (#105)
by selkirk on Sun Jan 12, 2003 at 07:30:07 AM EST

Doh. Where is the "edit post cuz I forgot to say something" button?

I meant to add that the doors in your house all do substantially the same thing and have the same doorknobs.

However, alerts in software generally have different reasons behind them, and so customizing the buttons helps distinguish between them and allows you to progress through them faster with fewer errors.

[ Parent ]

true up to a point (3.00 / 1) (#114)
by dash2 on Sun Jan 12, 2003 at 12:16:16 PM EST


It is true that not all dialogs do the same thing. And it is also true that the "OK" text does have an ambiguity. Sometimes it's used in alerts:

You have 3 new messages!
[OK]

and sometimes it's used in dialog boxes:

Configure your cookie policy
[various options]
[OK] [Cancel]

I think that is a bad ambiguity.

However, I think that using a single button name to confirm choices, all or most of the time on dialog boxes is a good idea, despite your point about the different functions these dialogs carry out. The reason is that the dialog box itself can and should make clear what the results of hitting the button should be. To extend my door analogy: if there is a dangerous monster behind the door, you don't change the door handle design. You put a sign on the door saying "Beware of the shibboleth". This tells the user what is going to happen when the handle is turned, but it still conveys the idea that this is a door you can go through.

And I do think the idea of a "dialog" with choices the user can make and then confirm or cancel is comprehensible enough to have a single interface, despite the many different things the dialog can control. Just as a door can lead to many different rooms.

(Digression: what that single interface should be is a different question. I think [OK] [Cancel] was not bad, but the introduction of [Apply] has totally screwed it up - once you've hit Apply, the [Cancel] button is still used to close the dialog, even though it won't cancel your changes.

The Gnome control center, with [Try] and [Revert] is groping towards a different model, but I find it confusing. Maybe [OK] [Try] [Cancel], and if you hit Try followed by cancel, your changes really are rolled back - unlike with [Apply].)
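The [OK] [Try] [Cancel] scheme proposed just above can be sketched as a tiny bit of bookkeeping; a rough Python illustration, where the class and method names are made up for this example and don't come from any real toolkit:

```python
# Sketch of the proposed [OK] [Try] [Cancel] semantics: Try applies
# changes tentatively while the dialog stays open, Cancel rolls
# everything back (even after Try), and OK commits.

class SettingsDialog:
    def __init__(self, live_settings):
        self.live = live_settings           # the settings the app is using
        self.saved = dict(live_settings)    # snapshot taken when dialog opens
        self.pending = dict(live_settings)  # what the user has edited so far

    def edit(self, key, value):
        self.pending[key] = value           # user changes a control

    def try_(self):
        self.live.update(self.pending)      # preview: apply tentatively

    def cancel(self):
        self.live.clear()
        self.live.update(self.saved)        # real rollback, unlike [Apply]

    def ok(self):
        self.live.update(self.pending)      # commit; caller closes the dialog

settings = {"theme": "light"}
dlg = SettingsDialog(settings)
dlg.edit("theme", "dark")
dlg.try_()                # user previews the change
print(settings["theme"])  # dark
dlg.cancel()              # changes really are rolled back
print(settings["theme"])  # light
```

The key difference from [Apply] is the snapshot taken when the dialog opens: Cancel always restores it, no matter how many times Try was pressed.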

Anyway, what I was going to say was: don't trust my opinion - ask your mum!
------------------------
If I speak with the tongues of men and of angels, but have not love, I am become sounding brass, or a clanging cymbal.
[ Parent ]

Are you sure you don't want to cancel the missiles (3.00 / 1) (#154)
by lordpixel on Tue Jan 14, 2003 at 09:59:21 AM EST

To take this thread to its logical extreme

Are you sure you don't want to cancel the missile launch?

      [Cancel] [OK]

or

Are you sure you don't want to cancel the missile launch?

      [Cancel] [Launch]

Or to take a real world example. ATM from an UK bank asks:

Do you want a receipt?

                        [YES]
                        [NO]

every time before it gives you the money. I always press NO.

Now, one day I type in my usual amount and then I see:

Blah blah blah blah receipt?

                         [YES]
                         [NO]

I press NO. Out comes my card... but no money.
Log back in and check balance... <phew> everything's OK... but what happened?

Well, about the 3rd time this happened to me, I actually stopped to read the message, which said, more or less:

This machine has no paper and so can't print a receipt. Continue anyway?
                           [YES]
                           [NO]

But, of course, I didn't actually read the text, because I was doing something I'd done 1000 times before. This is particularly insidious, because the options are reversed from what they normally are. Usually pressing NO gets you cash and no receipt. Now pressing NO gets you no cash, and leaves you worrying that the bank debited your account.

Perhaps for the regular question:
                         [With Receipt]
                         [No Receipt]

and for the out of paper:
                         [Continue]
                         [Cancel]

You're right. There's usually nothing wrong with using [OK] when the action is the only button, or if it's a normal, safe operation.

However, you're lumping way, way too many things together. You see, as we've just demonstrated, there are many cases where a little thought about interaction design will mean you get it right before you do user testing (which is time-consuming and expensive). And it doesn't matter whether you read it on a guru's website, learned it in a lecture, or read a book on HCI (many of which are based on papers, studies, and exactly those same user tests you recommend).

Maybe you just plain learned the hard way by screwing up in the past. Fine, but part of advancing the state of the art is to learn from other people's mistakes as well as your own. It's certainly worthwhile evaluating who is doing the talking, though. Some of these gurus have been doing real live user testing for 20 years, some haven't. It would be interesting to hear who you think is a quack and who you think has a clue.

As ever, the truth lies somewhere in between believing everything the guru in the ivory tower says and completely re-inventing the wheel for yourself. There is good knowledge out there, but yes, you do have to filter. Dismissing all outside knowledge and fixating on user testing as "the one true way" is no more reasonable than the reverse (learning it all from a book). If nothing else, you likely don't have time to test every little thing (eventually, your mother gets bored).

I'm with you on the [Apply] thing though. 99% of preferences and settings should be live update. See the way Apple does preferences and control panels (and pretty much always has). One thing I'd like to see someone try is to implement "Undo" on such a live preferences dialog. Once you move away from having to "OK" the prefs dialog, which closes it, then you hit new issues. The trouble with "Revert" in a live dialog is the user might leave the dialog open for 30 minutes and make several changes.. where to revert to? A problem which might be addressable with multi-stage Undo (a metaphor that has the advantage of being familiar to most users).
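The multi-stage Undo idea floated above for a live preferences panel can be sketched with a plain undo stack; an illustrative Python sketch under the assumptions in that paragraph (the `LivePrefs` class is made up here, not any existing toolkit's API):

```python
# Sketch: multi-stage Undo for a "live update" preferences panel.
# Every change takes effect immediately but is also pushed onto an
# undo stack, so the user can step back one change at a time instead
# of guessing what a single "Revert" would revert to.

class LivePrefs:
    def __init__(self, **initial):
        self.values = dict(initial)
        self.undo_stack = []                 # (key, previous value) pairs

    def set(self, key, value):
        self.undo_stack.append((key, self.values.get(key)))
        self.values[key] = value             # live update: applied now

    def undo(self):
        if self.undo_stack:
            key, previous = self.undo_stack.pop()
            self.values[key] = previous      # step back one change

prefs = LivePrefs(font_size=12, theme="light")
prefs.set("font_size", 14)
prefs.set("theme", "dark")
prefs.undo()                     # undoes the theme change only
print(prefs.values["theme"])     # light
print(prefs.values["font_size"]) # 14
```

Because each undo step is a single (key, old value) pair, the "dialog left open for 30 minutes" problem goes away: there is no one magic revert point, just a history the user can walk back as far as they like.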

I could go on about the rest of your article, but this is already the longest K5 comment I ever wrote.

I am the cat who walks through walls, all places and all times are alike to me.
[ Parent ]

You need better gurus. (4.50 / 2) (#107)
by eann on Sun Jan 12, 2003 at 09:55:29 AM EST

There are plenty of well-respected people in the industry who say similar things. To name a few: Donald Norman, Edward Tufte, web-centric Jared Spool, Jeffrey Veen, and Jakob Nielsen. Some of them carry things to extremes, but often that's just to make a point.

In general, you're right. It's probably wise to avoid advice to make major structural changes just because some self-proclaimed expert doesn't like the way we've been doing it. However, thanks to the pervasiveness of computers, software (and web) usability is fast becoming a major research focus, and we should be paying attention to the results.

As for the suggestion to use our mothers, well, that's valid sometimes. I generally don't write apps for my mom, though; I find I get better feedback from my target audience. Novices are easy to come by in most fields.


Our scientific power has outrun our spiritual power. We have guided missiles and misguided men. —MLK

$email =~ s/0/o/; # The K5 cabal is out to get you.


Voice control (4.00 / 1) (#118)
by phuzz on Sun Jan 12, 2003 at 05:02:57 PM EST

Whilst everyone's arguing about mouse vs keyboard, or CLI vs WIMP, I thought I'd throw another one in.
Personally I can't wait 'til I can shout at the box in the corner,
"Oi, computer, show me those photos from the party we had up Staton bank last year" (I wonder, would you ever say please?),
and the computer sticks them up on screen. Ok, this is kind of far-fetched (in the sense that it's going to be years before it happens), and it wouldn't work very well in a traditional cube farm. (Everyone shouting at their computers - there's a Dilbert cartoon about that, now I remember.) But it would be the easiest way of making the box do what you wanted. Of course you would have to put in loads of meta info about each file, but with voice input that would be quite easy, and you might expect your digital camera in a few years to remember a bit about the circumstances of each pic.
Ok, so voice control isn't the best tool for all circumstances, but it's one we use every day between ourselves; maybe it has a place in our computers.

OS X has voice control built in (4.00 / 1) (#120)
by Mindcrym on Sun Jan 12, 2003 at 08:19:29 PM EST

Your idea isn't as futuristic as you may think. Apple's OS X has voice control capabilities already built in. I don't use it on a regular basis, but it's kind of fun to play around with. You can say, "Computer, what time is it," and a voice synthesizer will say the time aloud to you. You can set it up to do more complex things than that, though.
-Mindcrym

[ Parent ]
OS/2 had Voice Control in 1996 (4.00 / 1) (#135)
by Wildgoose on Mon Jan 13, 2003 at 05:42:44 AM EST

A quick google for example:

OS/2 e-zine

It's a real shame IBM couldn't market their way out of a paper bag. I knew that the dancing elephants spinning plates on poles were referring to pre-emptive multi-tasking. But did the general public?

[ Parent ]

Gah. (4.00 / 1) (#139)
by porkchop_d_clown on Mon Jan 13, 2003 at 09:07:52 AM EST

OS 9 had it, other OS's have had it - OS 9 even supports voice authentication as a login mechanism.

Unfortunately, my kids managed to convince it that their "voices" sounded just like the theme to Pokemon - which was playing on the TV next to the computer.....


--
Wouldn't it be a victory for the oppressed people of Iraq, of North Korea, of Iran, if their police-state regimes were overthrown? Even by a cowbo
[ Parent ]

the problem with Voice Control ... (4.00 / 1) (#136)
by l0st3d on Mon Jan 13, 2003 at 06:03:10 AM EST

... is that you can only hold so many things in your head at once (usually 7 or 8, I think, depending on how clever you are) and speaking takes up some of these registers (for want of a better word), so you can't perform the same complex tasks when speaking that you can when typing or writing ... so while it may be fine to use this interface to retrieve your pics, it's rubbish for writing your Theoretical Physics PhD ... a combination of interfaces would be more effective - voice, keyboard, mouse, etc., each for its appropriate task ...

[ Parent ]
Never. Never. Never. (4.00 / 1) (#138)
by porkchop_d_clown on Mon Jan 13, 2003 at 09:06:16 AM EST

5 seconds after you enable it in your office, your computer hears someone else yelling at their computer "reformat /y c:" and pfft....


--
Wouldn't it be a victory for the oppressed people of Iraq, of North Korea, of Iran, if their police-state regimes were overthrown? Even by a cowbo
[ Parent ]

Dilbert (3.00 / 1) (#152)
by phuzz on Tue Jan 14, 2003 at 07:17:21 AM EST

There's a Dilbert cartoon with that exact joke; I've tried googling for it but no luck. Anyone else know a link to it?
Of course that sort of thing is why I said it wouldn't work in an office. I know it's not practical, but it would be cool.

[ Parent ]
Perhaps not Dilbert? (none / 0) (#162)
by screwdriver on Wed Jan 22, 2003 at 04:25:29 PM EST

I don't remember that in Dilbert, but that situation was shown in PvP. I don't have a link to the specific comic, however.
The situation was like this: Francis is showing the voice recognition capabilities of his PC to Cole, and Cole shouts: "File, Exit, Run, format c, Enter, Y, Enter!".

[ Parent ]
book recommendation (3.00 / 1) (#121)
by Mindcrym on Sun Jan 12, 2003 at 08:39:11 PM EST

For anyone interested in the problems of interaction design the book The Inmates are Running the Asylum by Alan Cooper is a must read.
-Mindcrym

Terrible book (3.00 / 1) (#122)
by epepke on Sun Jan 12, 2003 at 09:43:13 PM EST

In an oblique way, though, it did make me realize what was really wrong with user interfaces. Also, your post inspires me to rework a piece that I had written a while ago as an Op-Ed


The truth may be out there, but lies are inside your head.--Terry Pratchett


[ Parent ]
what was terrible about it? (3.00 / 1) (#130)
by Mindcrym on Mon Jan 13, 2003 at 12:52:54 AM EST

Or is that what you're going to write about in your article?
I thought the overall message of the book was pretty good: don't let programmers design the interface that the customer will eventually have to use.
-Mindcrym

[ Parent ]
Yep, in the article [n/t] (2.00 / 1) (#146)
by epepke on Mon Jan 13, 2003 at 11:21:22 PM EST


The truth may be out there, but lies are inside your head.--Terry Pratchett


[ Parent ]
Re: Terrible Book (3.00 / 1) (#147)
by akehurst on Tue Jan 14, 2003 at 01:49:26 AM EST

Could you please explain why you think 'Inmates...' is a terrible book? I found it to be quite a wake-up call for the industry. -Justin

[ Parent ]
No solution (2.50 / 2) (#151)
by e8johan on Tue Jan 14, 2003 at 07:14:13 AM EST

One problem with most HCI classes and such is that they primarily show you how to detect a bad design, but not how to make a good one (even though they try to supply tips).

Yet another thing is that they want us to leave all the baggage behind and do it all in a "better way". My mother has problems with WinXP, as it has changed the look and the way one has to approach certain things. This shows that sometimes habits are stronger than the new "better way" approach.

Gurus. (none / 0) (#165)
by Michael Moser on Mon Feb 17, 2003 at 09:45:06 AM EST

I don't know about _your_ gurus, but Mr. Bruce Tognazzini is certainly a different case:

http://www.asktog.com/

and

http://www.asktog.com/menus/designMenu.html

- if only because he has a web site where he posts lots of stuff that teaches how to ask _questions_ about UI.

Great UI design lies | 165 comments (155 topical, 10 editorial, 0 hidden)