Kuro5hin.org: technology and culture, from the trenches

The unchanging GUI

By Apuleius in Technology
Mon Feb 19, 2001 at 02:54:46 AM EST
Tags: Software

Periodically someone on K5 or in other social circles will ask why the graphical user interface is changing so little nowadays (that is, on the desktops of the world at large). I have two unoriginal reasons why, and a little rant on why the freezing of the GUI is not necessarily a bad thing.

The question of why the computer interface isn't changing is regularly asked on IRC, often on Kuro5hin, and over on Edge.Org, where Jaron "Virtual Reality" Lanier practically despairs over it.

The first reason I assert is that the physical devices used for the human-computer interface sharply limit the range of motifs, metaphors, and ideas that will work well with them. The computer interface nowadays is constrained to a pointing device, a keyboard, a pair of speakers, and a two-dimensional, usually flat, multicolored screen. Throw in a microphone if you feel like it.

The pointing device comes in several versions, but the rolling mouse seems to be the only version most people will tolerate for long periods of time. Some prefer a trackball, but the remaining options are even less likely to be adopted. The nipple is approaching extinction, and the finger pad is for limited use only. The stakes here are no longer academic; they are ergonomic. That leaves little room for physical creativity.

The same goes for the keyboard. The layout will be tweaked for years to come to maximize the tolerance people show for the keyboard, and there is an ongoing redesign of the layout of speed keys, but the interface is defined and frozen.

Other input devices are unlikely to make much headway. Speech recognition's only use is to let you keep working when you take that carpal tunnel-mandated break. Prolonged talking into the computer is a pitiful way to wreck your vocal cords. Foot pedals are similarly unattractive. We may learn to like enhanced mice, like the roller-for-a-middle-button type that Logitech sells, or maybe mice with tactile feedback, but these changes would not be revolutionary.

As for the output devices, the speakers are limited in the interactivity they give. We use them mostly for music, partly because canned sounds as computer messages can get very tiresome after a while, and partly for privacy. The screen itself may get more versatile as it becomes lighter, or as LCD technology allows us to look more closely into it, but once again, these changes will not be earthshaking.

This physical configuration allows for huge amounts of creativity, but creativity doesn't pay well under the concept I am hawking here, which is mundane computing (MC). MC is what requires optimization in the graphical user interface, and what limits innovation. MC does not reward clever ideas. It rewards clever ideas that people will use 8 hours a day. Vive la différence. I want the term 'mundane computing' to catch on as an anti-hype agent.

Jaron Lanier's work pioneering virtual reality shows that he is a creative genius. But his ideas simply do not work on the interface front because of the limits of the devices involved. People do not like head-mounted displays. People do not like motion sickness. And when it comes to organizing the information they use in conjunction with computers, people do not need fancy 3D viewing programs (unless they do CAD/CAM).

So, with tweaks here and there, the 'desktop' metaphors for the GUI will prevail (though they are already moving outside what was once obviously a 'desktop' motif). This is not a bad thing. The user interface for the automobile is not changing much nowadays, either. However, there is much to be done in the optimization arena, and for the same reasons.

Because computing has established its place in the modern office, there is a realm of computing that should be segregated from the rest of the field, which I am calling "mundane computing." MC is the second reason why the GUI is not about to change much, nor should it. By 'mundane' I do not mean 'boring'; I mean, first of all, 'no longer novel'. MC is every use of computing that has been in the public view long enough, that people employ for long parts of their day, and in which the computer and software are obviously a means to an end, not an end in themselves. Word processing, spreadsheets, graphic design, email, and many other uses of computers fall in this realm.

In MC, a user interface must work well, and this 'works well' attribute is enhanced by previous familiarity on the part of the user. So, in MC, there is a barrier to entry for new ideas (which is of course the other reason), and progress largely takes the form of evolution, not revolution. Hackers should rejoice, however, because dammitall, evolution remains necessary in the graphical user interface. Enough evolution has already happened to make my point. The desktop motif is no longer so close to a real desktop. Remember when you discovered that you could mark a rectangle on the background of a Macintosh and then move all the icons within it in unison? Remember when it was that you realized that was an utterly useless feature? I forget when I realized that last one.

There are other ideas (motifs, widgets, et cetera) that have sprung up in various programs, which are creative but unlikely to be used in mainstream programs. A nifty thing I discovered (and tried to use) is Open Data Explorer. Take the pipe motif from the Unix shells, use simple widgets to represent it graphically, and then make the widgets perform Matlabesque tasks. Nice idea, but I'll stick to Matlab and Octave. This is just one example out of many things that are creative and clever, but unusable for long periods (for most people, anyway). The creative programmer is going to find resistance to his ideas entering mundane computing.

Since the need for optimal usability will drive the GUI to change in the coming years, let's take a look at where it will go. The bone-rattler example of a GUI was something I encountered in a physiology class: a simulation written with Xlib in which there were quite a few labeled buttons to 'press'. The target for each button, however, was a 6-by-6-pixel bullet next to the label, rather than a rectangle surrounding both. Mousing needs to be minimized, and therefore targets enlarged. The new Mac trick of treating the mouse cursor like a magnifying glass when appropriate is a step exactly in that direction. Another is the ability to have sloppy focus in most X window managers. (Note that Microsoft is looking at what X window managers do well.) What will ultimately limit changes in the GUI is the need to use only those widgets that represent two-dimensional physical objects. Three-dimensional ones won't work on a CRT.
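The point about enlarging targets can be made quantitative: Fitts's law predicts pointing time from the distance to a target and its width. A minimal Python sketch; the constants a and b below are illustrative stand-ins, not measured values:

```python
import math

def fitts_time(distance_px, width_px, a=0.1, b=0.15):
    """Fitts's law: predicted time to acquire a target of a given
    width at a given distance. The constants a and b depend on the
    device and the user; the defaults here are illustrative only."""
    return a + b * math.log2(distance_px / width_px + 1)

# A 6-by-6-pixel bullet versus a button ten times as wide,
# both 400 pixels away from the cursor:
print(fitts_time(400, 6) > fitts_time(400, 60))  # -> True: the bullet is slower
```

Whatever the exact constants, the logarithm makes the trade-off plain: shrinking the target to a tiny bullet costs every user on every click.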

But with a slowdown in the evolution of most software, (knock wood, cross fingers, throw salt over left shoulder) the average user will come to expect software to be more stable, more reliable, faster, and contain fewer nasty security surprises. Wouldn't maturity in the software industry be nice? Just a thought.




Poll
o Will change radically 7%
o Should change radically 12%
o Is fine. Gimme a damn xterm. 50%
o Will be replaced by direct neural interfaces 29%

Votes: 57

Related Links
o Kuro5hin
o Edge.Org
o Open Data Explorer

The unchanging GUI | 16 comments (14 topical, 2 editorial, 0 hidden)
What will make the gui change... (4.33 / 6) (#1)
by theboz on Sun Feb 18, 2001 at 10:24:13 PM EST

We have to look at the computer as a whole in order to understand why the gui hasn't changed much. You are correct in your statements about current hardware being a limiting factor, but it will change.

We are starting, although slowly, to move away from the desktop computer. Look at how versatile PDAs are now. You can do anything on them. Microsoft is making the mistake of porting their desktop OS and software over to Windows CE, which uses alternative input devices on PDAs (I know there are some very small notebook-type computers running CE as well). If you look at the Palm interface, it is much cleaner and easier to use than WinCE because it is designed to be used on a small screen with a pen. It is very minimalistic in nature because the input device is more prone to error than a keyboard. Also, the screen is much smaller, so you can't fit as much on there.

Also take a look at WAP-enabled phones. Then, throw them all away. Just kidding. As they stand now, the phones are difficult to use. Even for simple things like SMS, I am annoyed by needing to press the numbers multiple times just to send a short message. However, eventually someone will find a solution, such as giving us a pen or voice recognition, so we don't have to type the message. This too would radically change how we use computers.

Don't think of computers as only your desktop and server. Think also of your DVD player, your Palm Pilot, cell phone, ATM, etc. All these things are types of computers, each with its own GUI specific to the input device you are using and the purpose of the machine. The GUI is alive and changing, even if the paradigm of the desktop computer isn't.


Evolution not revolution (4.33 / 6) (#2)
by MoxFulder on Sun Feb 18, 2001 at 10:26:28 PM EST

Thanks, Apuleius, for the thoughtful and interesting writeup.

I agree that change in the basic human-computer interface should be slow and evolutionary, rather than sudden and revolutionary. Even the development of today's Graphical User Interface did not materialize from nowhere to replace the Command Line Interface ... first you had X Windows with a few terminal windows open, then some basic GUI elements like menus, then dialog boxes, tabs, taskbars, etc. etc.

And many of us, especially in the Linux community, still enjoy or even prefer the power and flexibility of the older Command Line Interface. I, for one, find Windows and MacOS frustrating because it's hard to pull up a command line to do things like pipes, scripting, wildcards, etc. The GUI has been able to replace wildcards adequately with multiple selection and the mouse, but pipes and scripting are still the exclusive domain of the CLI.
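The split the commenter describes can be made concrete with a small Python sketch (the file names are stand-in data, and the transformation is an arbitrary example):

```python
import fnmatch

files = ["report.txt", "notes.txt", "photo.jpg"]

# Wildcards: a GUI can replicate this step with multiple selection.
selected = fnmatch.filter(files, "*.txt")

# Pipes: chaining arbitrary transformations is the part with no
# common GUI equivalent -- here, filter, then transform, then sort.
result = sorted(name.upper() for name in selected)
print(result)  # -> ['NOTES.TXT', 'REPORT.TXT']
```

Click-and-drag selection covers the first step; it is the open-ended composition of the later steps that GUIs have never absorbed.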

I don't believe that interface designers have become unresponsive to the changing needs of users. For example, consider the development of the wheel mouse: Before the World Wide Web, I rarely needed to scroll through text unless I was writing with a word processor. In that case, I would have my hands on the keyboard and it was easy to scroll by hitting the PageDown key. But with the advent of the Web, I now spend more of my time on the computer reading and less time writing. When I'm reading a web page, I have my hand on the mouse to click on links ... so it would be nice to have a very quick way to scroll through a long document without repeatedly moving my mouse pointer (and my eyes!) over to the scroll bar! Hence the wheel mouse ...

Do other K5 readers think that there's a good reason that the Graphical User Interface should be completely overhauled? Are there any major problems with it that can't be solved with an evolutionary approach (or by reverting to a good ol' command line ;-) ?

"If good things lasted forever, would we realize how special they are?"
--Calvin and Hobbes

CLI on Windows and Mac; graphical script creation (4.50 / 2) (#11)
by pin0cchio on Mon Feb 19, 2001 at 10:28:56 AM EST

I, for one, find Windows and MacOS frustrating because it's hard to pull up a command line

Under Windows, there's Red Hat Cygwin, a Win32 port of the GNU userland. Classic Mac OS has MPW (a decent CLI) and AppleScript (a somewhat verbose scripting language akin to VBS). Mac OS X apparently has full CLI support (through its Terminal) and a BSD/GNU userland.

to do things like pipes, scripting, wildcards, etc. etc. The GUI interface has been able to replace wildcards adequately with multiple selection and the mouse, but pipes and scripting are still the exclusive domain of the CLI.

Ever played around with Widget Workshop? This Rube Goldberg-esque software toy from Maxis has some good ideas about how pipes could work in a graphical environment: Create a new "widget," or script. Place a ps aux object, lay a pipe to a grep object with a search string coming in through the top, pipe that to the 'stdout' connector on the right side. Shell scripts can be built graphically; somebody just has to sit down and implement it.
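The widget chain described here maps directly onto function composition. A toy Python sketch of the idea; the names and the stand-in process data are mine, not Widget Workshop's:

```python
from functools import reduce

def pipe(*stages):
    """Connect stages left to right, like laying pipes between widgets."""
    return lambda data: reduce(lambda acc, stage: stage(acc), stages, data)

# Three toy 'widgets': a data source, a grep-like filter, and a counter.
processes = ["init 1 root", "sshd 42 root", "bash 101 alice"]
grep = lambda needle: (lambda lines: [l for l in lines if needle in l])
count = len

script = pipe(grep("root"), count)  # the assembled graphical 'script'
print(script(processes))  # -> 2
```

A graphical front end would only need to draw the boxes and record the connections; the execution model underneath is exactly this.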

[ Parent ]
Good point (none / 0) (#13)
by MoxFulder on Mon Feb 19, 2001 at 07:00:16 PM EST

You're absolutely right about the possibilities of using Cygwin and AppleScript (to say nothing of MacOS X's BSD core ;-) to take advantage of a powerful command-line interface under Windows and MacOS. I think the decision by Apple to port their GUI over to *nix was a brilliant stroke that will hopefully win them a bigger chunk of the operating systems market.

I personally use Cygwin at work. It's a good development environment, but it's somewhat hampered by the underlying Windows operating system. Setting it up was a real pain the first time, it has some weird filesystem inconsistencies, and you can't just compile any old Unix tarball on it ... Nonetheless, it's a pretty good system.

I haven't used either AppleScript or any graphical scripting tools. I find that I have a very hard time drawing charts or graphics to express what my programs do. This might just be because I haven't practiced it enough ... but I think that for now, I'll stick with Bash, Perl, grep and other good ol' command line utilities.

"If good things lasted forever, would we realize how special they are?"
--Calvin and Hobbes

[ Parent ]
what about standardizing application usability (3.40 / 5) (#5)
by nickp on Sun Feb 18, 2001 at 11:39:05 PM EST

I wish the article went into more detail about differences of interface across applications. To me, that's what a "frozen" GUI is about. Being a Linux user, I constantly have to grapple with inconsistencies among interfaces in different programs. The look-and-feel of X, KDE, and GNOME applications is so different, but the main problem is that there is no standard way of setting user interface preferences (or other program options). KDE has made huge progress in this direction, but the consistency it provides applies only to KDE apps.

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein

Huh? (3.00 / 1) (#7)
by joto on Mon Feb 19, 2001 at 12:49:58 AM EST

What is an X app? How does it differ from Gnome and KDE apps? Aren't those X apps as well?

Of course X apps differ. They are written with different toolkits, or without a toolkit at all. That is mainly a piece of history. Nowadays we have Gnome, KDE and Motif (or lesstif, if you prefer). All of them are free and quite functional. And quite similar. It's buttons, scrollbars, menus, everything you are used to from Windows.

Ok, there are some small differences. You may have to wrestle with them sometimes. But most of the time, I wrestle with bugs, not toolkit differences. I still don't think most Gnome or KDE apps are worth using; there are a lot of bugs that need to be sorted out. Due to random bugs in most apps, my usage of X11 is mainly limited to xterm, emacs and sawfish. Everything else is just too annoying.

[ Parent ]

Shifting Metaphors (4.20 / 5) (#6)
by zephiros on Mon Feb 19, 2001 at 12:22:20 AM EST

When the Next Great Challenge for computers was gaining general public acceptance, it made a lot of sense to build an interface based on real world metaphors. When you're dealing with someone who has never used a computer before, familiar paradigms help reduce the delta between what the user knows and what the user needs to know.

That said, I think the window for "UIs for the computer illiterate" is closing. Most users no longer come to the table expecting an application to work like any real world object. The delta is now between the new application and other applications the user has used in the past. IMO, this sort of enforced self-referentialism is what is slowing the pace of UI innovation. A new UI no longer needs to be better than any other UI, it needs to be so much better that it justifies the cognitive dissonance associated with learning it. As computers become more a part of daily living, this particular barrier is going to get larger and larger.

OTOH, I think there's still a lot of opportunity for incremental change. Since using a computer isn't a potentially life-threatening experience (unlike driving), there's much less risk associated with tweaking the interface. Further, since computers tend to be modular, one doesn't need to redesign the entire machine in order to make changes to the UI. Replacing a car's steering wheel with a joystick is a pricey proposition. Buying a mouse with a wheel is a $30 investment, and one can always take it back to CompUSA if it's unusable.

Finally, while the mouse/keyboard/windows paradigm is the de facto UI, it's only been dominant for six or seven years now. I think it might be a bit premature to sound the death knell for innovation. Someone is always building a better mousetrap.
Kuro5hin is full of mostly freaks and hostile lunatics - KTB

Quick thoughts (4.00 / 4) (#8)
by Miniluv on Mon Feb 19, 2001 at 02:10:06 AM EST

I think the focus on the overall group of elements that make up a desktop environment is somewhat shortsighted, because in reality the entire group is limited by its most limiting component. For example, my video subsystem comprises my CPU, my RAM, my video card (processor and RAM), and my monitor. Whatever has the least capability determines the complete outcome. If my video card is incapable of resolution above 800x600, then that's all I will get.

What does that have to do with anything? It means that your most limiting piece of desktop technology is currently your two-dimensional CRT/LCD display. Yes, you can play some tricks with shading, occlusion, and other nifty OpenGL-ish techniques to make things appear 3D, but they aren't really. They provide some semblance of depth perception, but that doesn't convey very well on a highly artificial-looking surface such as a glass-screened monitor.

LCD flat panels are somewhat better: the LCD provides richer colors and crisper images, and thus enhances the illusion of depth, but it still falls short. The only technology I've seen that is both functional and potentially practical is something like SGI's visualization desk. I think this may evolve into a sort of three-dimensional holographic workspace, which would really be the ideal next step in the desktop's evolution. It would be far better to represent file trees as three-dimensional relationships, and it has the potential to really change the way we look at a lot of data structures and their relationships.

"Its like someone opened my mouth and stuck a fistful of herbs in it." - Tamio Kageyama, Iron Chef 'Battle Eggplant'

VR et al (3.00 / 2) (#9)
by fluffy grue on Mon Feb 19, 2001 at 05:10:49 AM EST

Incidentally, my research is leaning towards VR interface issues. But not the VR which was over-glamorized in the movies to the point of stupidity (which, sadly, has caused VR to be somewhat of a "dirty word" since it was buzzworded and hyped to death, much like AI was about 25 years ago; it seems that Jaron Lanier has yet to outgrow that phase, too).

Specifically, I am working on how to go about doing a semi-immersive 3D environment using standard hardware - mouse, keyboard, monitor. None of that goggles shit, no forceballs, or the like.

Quake is a good start, but its interface is incredibly single-minded, and doesn't exactly lend itself to productivity with multiple applications.

I do not expect to replace office applications or the like. Rather, I want to open up whole new application spaces. I dunno what, though, aside from the obvious collaborative virtual environments (MUCK-type systems etc.). I'm sure someone else will be able to think of something, though. ;)
"Is not a quine" is not a quine.
I have a master's degree in science!

[ Hug Your Trikuare ]

Read "The Diamond Age"... (none / 0) (#15)
by Hernan Laffitte on Fri Mar 02, 2001 at 09:30:51 PM EST

... for some suggested uses of the kind of interface you are mentioning.

[ Parent ]
Do we skip over the graphical interface? (4.50 / 2) (#10)
by slaytanic killer on Mon Feb 19, 2001 at 07:10:08 AM EST

Will pervasive computing beat VR in terms of timeframe? If we do have pervasive computing, then it is very possible that everything will have the tiniest possible interface, nothing so strong as a GUI, and the entire user experience will be about manipulating lots of little things intuitively.

Does computing want to be invisible, or immersive? And what kind of overlap do we think is natural between the two?

GUIs should be designed for children, not adults (4.00 / 1) (#12)
by SIGFPE on Mon Feb 19, 2001 at 12:38:26 PM EST

When we think about language we often think about adults speaking languages. But people learn languages as children, and so the key requirement for a language to exist is that it be learnable by children. Learnability by a child is very different from learnability by an adult; certainly in the case of language we know that there are linguistic features that children easily learn but that as adults we can almost never grasp. When people try to design artificial languages, they produce dumbed-down languages like Esperanto, with completely regular grammars to make them easy for adults to learn (I don't necessarily mean dumbed down in terms of what can be expressed in them).

I've a feeling something similar might hold for GUIs. GUIs are dumbed down because they are marketed towards adults. If it requires any intelligence to learn, it's hard to sell, and hence we have interfaces like Windows. However, if a GUI were directed at children, I think it could be far more sophisticated. If kids are brought up with a GUI almost from birth, it could have far more complexity than those of today, and children would just learn it like all of the other millions of things that kids have to learn as a part of growing up. But as long as GUIs are designed for adults who are not prepared to actually make any effort to learn anything (or are simply incapable of learning), I think the GUI will stagnate.

Minor detail. (none / 0) (#14)
by Apuleius on Mon Feb 19, 2001 at 09:59:12 PM EST

An evolvable version of this essay is to be found here.

There is a time and a place for everything, and it's called college. (The South Park chef)
Web applications (none / 0) (#16)
by Hernan Laffitte on Fri Mar 02, 2001 at 09:44:08 PM EST

I think that Web-based applications are the (pardon the expression) "wave of the future". They have the potential, if implemented well, of being as disruptive as the GUI was in the '80s.

(This is, of course, not an original thought. The network is the computer, dot net, everybody pretty much is hyping this concept.)

The only problem I see with Web-based apps is that sometimes the naive user can't tell the difference between the browser's widgets and the site's. I actually saw it happen with Hotmail. In my opinion, it's the browser that gets in the way, with too many buttons and menus. It should have fewer controls visible all the time and occupying precious screen real estate. Kiosk mode should be the default browser configuration :-)

Useit has lots of useful information on the dismal state of the Web interface design.
