Recall the frustration you feel as a user when a familiar interface has been rearranged badly, or even at all. It's like having the keys on your keyboard shuffled, or the nerve for your big toe wired to your index finger. Interface rage is fairly well documented (I think Norman discusses it in The Design of Everyday Things), and established interfaces have a lot of inertia. Conventional interfaces are often kept over newer, more usable ones for this reason: a new interface has to demonstrate radical improvements before people will change.
When programmers, the most expert users, are confronted with a new expert interface, you get interface rage to the power of ten.
Let me illustrate with some examples, helpfully provided by those who initially rejected this article :)
Java is an interesting piece of design for programmer usability. The designers wanted to write a very modern, Smalltalkish (simple) language with a cleaned-up C++ object model. Smalltalk itself breaks enough imperative language conventions that it never took off as a mainstream application language. (I know! It has a strong and loyal community that has used it for everything from building nanotech assemblers to piloting the space shuttle, and it can be learnt by even the sleepiest koala. The last thing I want to do in this article is start a language flamewar.)
The creators of Java desperately wanted it to become a mainstream programming language. So they adopted many existing conventions for writing individual statements, while leaving the object model intact. They were so worried about legacy programmers that they even broke Java's own naming convention to accommodate them: the Pascal-flavoured println() was used instead of a convention-following printLine(). They worried even more about C programmers, whose everyday operators and conventions were swallowed whole.
This superficial familiarity delayed the first flush of C-programmer rage until they actually tried to use the language, rather than provoking hatred on sight. Given that Java minus the method implementations is basically an OO-focused, less baroque Ada95, this was a valuable hurdle to clear.
Python is a widely used scripting and teaching language with a clean and simple style. Yet wherever it is mentioned, someone must also curse a critical language element: whitespace as a block delimiter. This concept disorients established programmers.
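A minimal sketch of the convention being broken here: where C-family languages mark a block with braces, in Python the indented suite below the colon is the block, full stop.

```python
# Indentation alone delimits the block -- no braces, no 'end' keyword.
def classify(n):
    if n % 2 == 0:
        label = "even"   # these lines belong to the if-branch
    else:
        label = "odd"    # purely by their indentation level
    return label         # dedenting closes both branches

print(classify(4))  # even
print(classify(7))  # odd
```

Re-indenting a line doesn't just offend the reader's taste, it changes which block the statement belongs to, which is precisely what makes the interpreter feel like a tyrant to brace-war veterans.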
After all, indentation has been the cause of many flamewars and has a clear effect on the texture of code. To a veteran of such wars, the Python interpreter is an unexpected tyrant, and more effort is required to evaluate it as a tool. Interestingly, augmented assignment operators such as += were added to a recent release of Python (2.0) to increase its utility and usability.
Perl killed my brother and ran off with my wife
Despite my best intentions, I am a language bigot, and Perl code sends involuntary shivers down my spine. The clever one-line hacks the Perl community loves give me the screaming heebie-jeebies.
This is a shame. Perl has a wealth of developed code, and good tutorials and references. It appears to be an excellent tool for writing quick scripts, as well as for determining the semantics of transmission-line noise.
Experts invest a lot of time in learning expert interfaces, much of it at a level below conscious thought. As such, immediate distaste is a reasonable first response to radical interface changes, such as a change of programming language. Exposure to many programming languages and paradigms early in a programmer's career, such as at university, may combat this. Simple awareness of the instinct would also help.
Ultimately, computing is about solving resource-optimisation problems in time and space, with one of the resources being programmer time. If a better tool exists to solve a problem, it should be used. Evaluating a programming language fairly as a tool requires an internal psychological battle between enforced scientific rigour and a sub/unconscious screaming that your technological prosthesis no longer works properly. Rigour should win.