The rise of Stupid Everything

By NotZen in Technology
Wed Jul 09, 2003 at 06:53:23 PM EST
Tags: Technology

The rise of open-source technology and the associated competition between software projects for developers, combined with the constant search for 'killer app' functionality, is leading inexorably towards the rise of 'stupid software' - software that doesn't try to be everything to everyone, but does a simple job well and, vitally, allows extensions to be written trivially.


There's an ancient, but still thoughtful and correct, article called The Rise of the Stupid Networks which details why stupid networks are better than smart ones.

It basically says:

If you build intelligence into your networks you are optimising them for a single purpose at the expense of their flexibility.  Any gains you make from this optimisation will be overtaken by general gains in the medium term, leaving you behind in the long term as unexpected uses are found for more general networks.

The examples in the article are the phone networks (optimised for voice) and the data networks (not as good with voice at the time; nowadays just as good at voice and far more useful, in ways that were never originally imagined).  However, if we change 'network' to 'conduit', and think of it as anything that carries data between two places, it becomes more generally applicable.

I was reading Clay Shirky's article Given enough eyeballs, are features shallow too?, which says that if enough people use a piece of software, one of them will have an amazing idea that will make the software indispensable - and that thinking of the idea is frequently harder than implementing it. That's true in many situations: take Bayesian filtering as an example. Someone published an article on using Bayesian filtering to spot spam, and six months later there were at least a dozen completed projects implementing it.
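To make the point concrete, here's a toy sketch of the Bayesian idea in Python - hypothetical training data and crude smoothing; the real filters are considerably more careful about tokenisation and probability estimates:

    # Toy Bayesian spam filter, in the spirit of the published approach.
    # Illustration only: real filters handle tokenisation, smoothing
    # and corpus management far more carefully.
    import math
    from collections import Counter

    class NaiveBayes:
        def __init__(self):
            self.counts = {"spam": Counter(), "ham": Counter()}
            self.totals = {"spam": 0, "ham": 0}

        def train(self, label, text):
            for word in text.lower().split():
                self.counts[label][word] += 1
                self.totals[label] += 1

        def classify(self, text):
            # Sum log-probabilities of each word under each class,
            # with crude add-one smoothing so unseen words don't zero us out.
            scores = {}
            for label in ("spam", "ham"):
                total = self.totals[label] or 1
                scores[label] = sum(
                    math.log((self.counts[label][w] + 1) / (total + 1))
                    for w in text.lower().split()
                )
            return max(scores, key=scores.get)

    nb = NaiveBayes()
    nb.train("spam", "buy cheap pills now")
    nb.train("ham", "meeting notes for tuesday")
    print(nb.classify("cheap pills"))   # -> spam

The implementation really is that small; the hard part was the idea of pointing Bayes' rule at your mailbox in the first place.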

Putting these two ideas together, I realised that what we need is stupid software - software that's incredibly simple, that basically gets the data from A to B, but allows other people to slot their ideas on top of it.  Some of the first programs to do this were games - Doom had editors that allowed people to modify some parts of it, but Quake allowed people to modify the game beyond recognition.  Nowadays many game companies sell what are basically engines and tech demos (see Quake 3 for a perfect example) for others to build their own games on top of.

But what really clinched it for me was the decision by the Mozilla creators to take their massive, monolithic web-browser/mail client/chat program/kitchen sink and split it into much smaller programs that each focus on a single task.  Not only that, but they simplified the interfaces significantly and hid everything possible from the users except what they needed to see.  But that isn't the important bit.  The important bit is that they made it supremely easy to write extensions.  This means that you've got a stupid program (actually, a very smart program, but it needs to be in order to appear as basic as web browsers seem to be) that acts as a lowest common denominator for anyone to add their cool idea on top of.
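To illustrate what 'supremely easy to write extensions' might look like, here's a hypothetical sketch in Python - not Mozilla's actual extension API, just the general shape of a stupid host that publishes hooks and lets extensions supply the cleverness:

    # Hypothetical sketch of a "stupid" host program: it does one thing
    # (load a page), but publishes hooks so extensions can add the smarts.
    # This is an illustration, not Mozilla's actual extension mechanism.

    class Host:
        def __init__(self):
            self.hooks = {"page_loaded": []}

        def register(self, event, callback):
            self.hooks[event].append(callback)

        def load_page(self, url, content):
            # Core job: get the data from A to B, then hand it over.
            for callback in self.hooks["page_loaded"]:
                callback(url, content)

    browser = Host()
    browser.register("page_loaded", lambda url, c: print("history:", url))
    browser.register("page_loaded", lambda url, c: print(len(c), "bytes"))
    browser.load_page("http://www.kuro5hin.org/", "<html>...</html>")

The host stays stupid: it moves data and hands it over. Everything interesting lives in the extensions.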

In these days of increasingly open-source software, the programs that get the most support will be the ones that let programmers scratch their itch most easily, the ones that are designed from the ground up to allow you to build on them.  The ones which are, if I may, the stupidest.

I emailed Clay Shirky with this idea and he responded that the idea was even more flexible: protocols are another concept that can be thought of as a conduit.  Thinking about it, a simple, extendable protocol is worth a thousand complex, specialised ones.  There's a big fuss about weblogs and syndication at the moment - the idea is finally taking hold, largely because with RSS it's fairly easy to publish journal entries, news stories, comics, etc. in a manner that anyone can read with their choice of reader.  The Echo project plans to supersede RSS, and they are doing this because they have found the protocol hard to extend and not simple enough (or rather, so loosely specified that it can be interpreted in many ways, so that interoperability suffers).
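Part of what makes RSS easy to build on is that, being XML, a feed can carry extension elements in their own namespace, and a reader that only knows core RSS simply ignores them. A sketch - hypothetical feed and namespace, parsed with Python's standard library:

    # A reader that knows only core RSS quietly skips namespaced
    # extensions it doesn't understand. The feed and the "ex"
    # namespace below are made up for illustration.
    import xml.etree.ElementTree as ET

    feed = """<rss version="2.0" xmlns:ex="http://example.org/extension">
      <channel>
        <title>Example Journal</title>
        <item>
          <title>First entry</title>
          <ex:mood>cheerful</ex:mood>
        </item>
      </channel>
    </rss>"""

    root = ET.fromstring(feed)
    for item in root.iter("item"):
        print(item.findtext("title"))   # core element: "First entry"
        # An extension-aware reader can opt in to the extra vocabulary:
        print(item.findtext("{http://example.org/extension}mood"))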

Of course, some systems are externally extendable - most of us use SMTP and POP3 for our email, but the range of different programs used to access it is huge.  By having a standard transmission protocol, we've allowed developers to concentrate their itch-scratching on the way the results are displayed and organised.  A complex system that defined the way email should be treated at all levels would be much harder to extend, and so much less likely to be widely taken up.
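You can see the split in Python's standard library: poplib speaks the dumb protocol and just moves messages from A to B, while everything clever - threading, searching, display - lives above it in the client. A sketch (the hostname and credentials are placeholders):

    # The protocol layer is dumb: poplib just fetches the messages.
    # How to organise and display them is entirely the client's business.
    import poplib
    from email.parser import BytesParser

    conn = poplib.POP3("pop.example.org")   # placeholder server
    conn.user("alice")                      # placeholder credentials
    conn.pass_("secret")

    num_messages = len(conn.list()[1])
    for i in range(1, num_messages + 1):
        raw = b"\r\n".join(conn.retr(i)[1])
        msg = BytesParser().parsebytes(raw)
        print(msg["Subject"])               # the rest is up to us

    conn.quit()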

There are many examples of simple, generalised, extendable systems being massively successful, from HTML to TCP/IP to PCs themselves.  In almost every case, the system that does a simple job and leaves the door open for later extension and improvement has been the more successful one.

The rise of Stupid Everything | 66 comments (40 topical, 26 editorial, 0 hidden)
+1 FP: not new, but good to be aware of (4.20 / 5) (#1)
by Confusion on Wed Jul 09, 2003 at 06:17:24 AM EST

I like these kinds of articles: they tell you something you already know, but in a way that suddenly makes it crystal clear, so you can actually propagate the meme.
--
Any resemblance between the above and reality is purely coincidental.
Me too :-> (4.66 / 3) (#2)
by NotZen on Wed Jul 09, 2003 at 06:21:10 AM EST

A lot of 'wisdom' is really obvious, but you'd never have thought of it.

I read the "Rise of Stupid Networks" piece and felt the same way.

[ Parent ]

Hm, not very new... (4.33 / 3) (#11)
by megid on Wed Jul 09, 2003 at 08:15:33 AM EST

Well, it's not entirely surprising. Next you'll be telling us that pipes and streams are great ideas.

Anyway, you mix two ideas:

  • Component system (any good shell)
  • Plugin model (Winamp, Eclipse).

Separating those two is important -- a plugin model is a lot more specific, but it streamlines the development process greatly (i.e. it takes work off your shoulders). I personally prefer the Eclipse platform over bash programming.

The problem with this component model when building enterprise-size programs is the glue between them -- and those problems don't differ a bit from clever "traditional" "monolithic" programming. Modularization wasn't invented with "dumb software".

--
"think first, write second, speak third."

as others have mentioned... (4.00 / 1) (#23)
by pb on Wed Jul 09, 2003 at 10:19:05 AM EST

Not a new idea, to such an extent that I feel the need to post a few more relevant links:

Many people have something to say about the Unix philosophy, and therefore much has been written about it, all of which seems to encompass your basic ideas, and more.

Another classic is The Rise of ``Worse is Better'', which explains the success of Unix from another perspective.
---
"See what the drooling, ravening, flesh-eating hordes^W^W^W^WKuro5hin.org readers have to say."
-- pwhysall

..or to put it another way.. (none / 0) (#61)
by andr0meda on Sat Jul 12, 2003 at 11:22:55 AM EST


"the cathedral & the bazaar"

Do not be afraid of the void my friend, is it not merely the logical next step?
[ Parent ]
Stupid article (3.50 / 2) (#26)
by delmoi on Wed Jul 09, 2003 at 12:41:23 PM EST

There are only a few huge-ass networks like the internet or the telephone network. However, there are hundreds of thousands, if not millions or even tens of millions, of programs. Each one of those programs was written with different needs and different purposes and in different environments (I mean social environments, i.e. a bank compared to an IRC chatroom).

It's ridiculous to say software should be "stupid", and for god's sake the Quake engine is not stupid in any sense of the word. Yes, it can be customized, but it's still a huge code base upon which the game sits. I mean, the guy wrote his own ANSI C interpreter for that project! And it is 'optimized' - for 3D games. Making a 2D project in it would be painful.

Making everything 'stupid' and 'infinitely configurable' means that you end up with nothing. I mean, a bare CPU with no OS is the ultimate configurable thing. The whole point of software is to take that infinite set and pare it down until you get something that can do what you want.

And you totally miss the point that sometimes customization ability comes from "smartness", not "stupidity" - like the Quake example I gave. I'm sure JC put a lot of thought into his ANSI C interpreter. There are lots and lots of examples of over-configuration. Every option you add to your software is another line in a config file or another script someone else has to write. In the end, you're just creating more code. And if hardly anyone ever needs to change it, it's just a waste of developer time.
--
"'argumentation' is not a word, idiot." -- thelizman
Apparently (4.33 / 3) (#27)
by NotZen on Wed Jul 09, 2003 at 01:45:18 PM EST

You missed the bit where I said:

This means that you've got a stupid program (actually, a very smart program, but it needs to be to pretend to be as basic as web browsers seem to be) that acts as a lowest common denominator for anyone to add their cool idea on top of.

Quake's a fairly unspecialised FPS - it doesn't require the plots, stealth or dialogue that make Deus Ex hard to mod (see this month's Edge magazine's mod article for more details).  Instead, it gives you lots of basic behaviours which were then open for other people to take advantage of.  Which isn't to say that Quake/2/3 weren't good, fun games, but they were much less specialised than many other FPS titles that came after.

[ Parent ]

Deus Ex (none / 0) (#41)
by Jenner on Wed Jul 09, 2003 at 11:28:24 PM EST

Deus Ex isn't as easily modifiable as Quake for the simple reason that Quake is a generalized system while Deus Ex is a specialization built on top of a similar generalized system, that is, the Unreal engine. Now, arguably Deus Ex is a complete game and not a mod, but these days that distinction gets pretty fuzzy. Rather than being an example of something that doesn't adhere to the "building stupid" rule I'd say Deus Ex is an example of how stupid-built systems can and should be used to build specialized functions. If you want to mod something without a plot... get Quake. Not Deus Ex. On the other hand, Deus Ex probably remains the right choice if you want to create something as specialized or more specialized than Deus Ex.

[ Parent ]
definition of "stupid" (4.50 / 2) (#29)
by Arkaein on Wed Jul 09, 2003 at 02:20:30 PM EST

I think this article implies a slightly different meaning of "stupid" than what we'd traditionally see. It's not stupid as in poor or incompetent, but more as in unplanned. It explains how the most successful technologies are not built to meet the exact needs of particular applications, which may change drastically over time, but are excellent frameworks for further development.

You use the examples of banking systems and telephone networks as counterexamples to this paper's argument. I think these types of systems are actually excellent examples of where "stupid" systems can rule.

Say I wanted to create a new telephone network from scratch. I can see two fundamentally different methods of doing this. One way is to completely spec out the needs of the network starting from basic ideas and working toward more specific details, and then implement the applications needed according to these specifications. This is essentially a top-down approach. Another way is to look at the basic needs of my new system, and find existing systems that I can use in component fashion, either as is or with some modifications. Some genuinely new systems will likely need to be developed, but the majority of what is needed is probably already available as existing database applications and networking protocols and code. This is a bottom-up approach.

Proprietary systems tend to be more top-down, while open systems tend to be more bottom-up, I would say. I think that fundamentally this article is saying that as the body of available software and libraries grows, the bottom-up approach will get stronger and stronger, because people who insist on reinventing the wheel are giving the competition (and I say this to mean any project within the same sphere) a big head start.

In light of this view, even Quake is a "stupid" system, though a brilliantly done one :-) It started with a custom framework (the custom C interpreter, along with the 3D graphics core) which allowed a huge variety of 3D shooter based games to be developed, with much less work than starting from scratch. The actual Quake game itself (where I would say the "smarts" are, going along with the article's theme) is just one application built upon the framework, though the most important one.

----
The ultimate plays for Madden NFL 2003
[ Parent ]

Stupid == information hiding (none / 0) (#65)
by p3d0 on Sat Jul 19, 2003 at 07:45:19 AM EST

I think "stupid" here means that a good program does its best to remain ignorant of anything that's none of its business.

It certainly doesn't mean the resulting programs are small or simple, or didn't require any thought. It means their interaction with the outside world is simple. The user's mental model of how they work is simple.

In fact, the software that seems simplest from the outside is often exactly the software that required the most insight from its programmer to make it that simple. Being an excellent programmer is a thankless job. :-)
--
Patrick Doyle
My comments do not reflect the opinions of my employer.
[ Parent ]

This isn't new (5.00 / 3) (#30)
by omghax on Wed Jul 09, 2003 at 03:03:02 PM EST

Look at all of Unix's command line tools. Same philosophy.

Although X and anything built for it break this rather horribly.

Not saying it's new (none / 0) (#34)
by NotZen on Wed Jul 09, 2003 at 06:08:43 PM EST

It's an observation of types of processes and some thoughts about them.

It doesn't have to be new, lots of people still haven't heard of it, or applied it so generally.

Although, I have to say, the feedback I've had so far has inspired me to go back and pretty much double the size of the article to cover Unix tools explicitly, and show the different kinds of extension relations that tools can have with each other.

[ Parent ]

Well known phenomenon with simple explanation (2.25 / 4) (#31)
by demi on Wed Jul 09, 2003 at 04:11:48 PM EST

This is an inherent property of American culture, and possibly even a geographic artifact of the North American continent itself. Pretty much all American-developed models of organizational and theoretical inquiry are prone to degeneracy, imbecility, and monstrosity. This is detailed in the "degeneracy thesis", advanced for centuries by leading European intellectuals, and described here by James W. Caesar:
The thesis held that, due chiefly to atmospheric conditions, in particular excessive humidity, all living things in the Americas were not only inferior to those found in Europe but also in a condition of decline. An excellent summary of this position appears, quite unexpectedly, in The Federalist Papers. In the midst of a political discussion, Publius (Alexander Hamilton) suddenly breaks in with the comment: "Men admired as profound philosophers gravely asserted that all animals, and with them the human species, degenerate in America -- that even dogs cease to bark after having breathed awhile in our atmosphere."
It really goes far to explain .dll hell, feature creep, the vi editor, memory bloat, and so forth. Better that we all accept this fact and learn to live with it.

i just wanted to congratulate you. (none / 0) (#37)
by rmg on Wed Jul 09, 2003 at 07:52:30 PM EST

this is truly a job well done.

_____ intellectual tiddlywinks
[ Parent ]

the converse of the thesis in article you cite... (none / 0) (#44)
by alizard on Thu Jul 10, 2003 at 02:28:58 AM EST

America über alles is an even more dangerous delusion. For everybody.


"The horse is dead. Fuck it or walk away, but stop beating it." Juan Rico
[ Parent ]

Needs more research (5.00 / 1) (#35)
by aido on Wed Jul 09, 2003 at 07:26:14 PM EST

You give the impression that somebody (I guess you're thinking of Paul Graham) suddenly had the bright idea of using Bayes' rule to filter spam, and 6 months later lots of projects are implementing it. But this approach has been around since at least 1998.

My understanding of Mozilla is that it is a cross-platform application development framework. The Mozilla web browser is a technology demonstration for the framework. There are lots of other applications that use the framework. So Mozilla kinda already is "designed from the ground up to allow you to build on it". It was designed to be that way. Firebird is just another application that uses the framework. The ease of writing extensions is also tied into the way the framework is designed.

And of course UNIX has been around a lot longer than Doom...

1998? Bah (none / 0) (#50)
by awgsilyari on Thu Jul 10, 2003 at 02:59:07 PM EST

Bayesian classification has been the mainstay of document classification for decades. The fact that it took so long for people to start using it to filter spam says something about stupidity, not intelligence.

BTW, as far as serious document classification goes, Bayesian is an extremely ineffective technique. You get some guy like Paul Graham to mention it and everyone goes apeshit, even though it's been studied for 30+ years and everyone (well, anyone who's bothered to actually study the subject) knows how poorly it does compared to "real" classifiers.

Bayesian classifiers suck. Ever tried using them to filter mail into 30 different mailboxes? It's pathetic.


--------
Please direct SPAM to john@neuralnw.com
[ Parent ]

It works for 5 (none / 0) (#53)
by NotZen on Fri Jul 11, 2003 at 05:03:28 AM EST

I'm using it to classify documents into 5 boxes (spam, friends, lists, commercial and other) and that seems to work.

What systems work better?

[ Parent ]

LSA (5.00 / 2) (#54)
by aido on Fri Jul 11, 2003 at 07:59:45 AM EST

The mail classifier that ships with Apple's mail client works pretty well. I don't have any figures to prove it, but anecdotally it feels to me like it works better than the Bayesian approaches that I've tried.

Apple's mail client uses latent semantic analysis to recognise spam.

There are a number of approaches that have been shown to be better than naive Bayes for text classification, e.g. support vector machines.

Anyway, I think that feature selection is much more important than the learning algorithm. On most classification tasks, with a fixed feature set, naive Bayes usually performs poorly in comparison with other learning algorithms (e.g. Ripper, C4.5, SVM). I think the reason that the Bayesian classifiers used by Mozilla and SpamBayes perform so well is that they have an extensive feature set that is well tuned for the task of spam filtering.
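For what it's worth, the shape of the comparison is easy to sketch with the scikit-learn Python library - toy data only, so this shows the experiment, not any real result:

    # Naive Bayes vs a linear SVM on the same bag-of-words features.
    # Sketch only: a fair test needs a real corpus and real features.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.svm import LinearSVC
    from sklearn.pipeline import make_pipeline

    docs = ["cheap pills online", "poker night tuesday",
            "win money now", "fraternity poker game notes"]
    labels = ["spam", "poker", "spam", "poker"]

    for clf in (MultinomialNB(), LinearSVC()):
        model = make_pipeline(CountVectorizer(), clf)
        model.fit(docs, labels)
        print(type(clf).__name__, model.predict(["poker tonight"]))

Note that both classifiers see the identical feature set here, which is exactly the point: swap the vectorizer and you change the results far more than by swapping the algorithm.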

[ Parent ]

I have 13 mailboxes.... (none / 0) (#64)
by gte910h on Tue Jul 15, 2003 at 03:17:08 PM EST

And it can even discern between automated mail not about online poker, online poker mail, mail about my fraternity poker game, and mail about my Tuesday night poker game. And spam. And subscriptions. And mailing lists.

  --Michael

popfile is beautiful

[ Parent ]

Thanks (none / 0) (#38)
by NFW on Wed Jul 09, 2003 at 08:20:25 PM EST

That was interesting reading.


--
Got birds?


Favourite quote: (5.00 / 2) (#39)
by tzanger on Wed Jul 09, 2003 at 08:53:28 PM EST

My mentor, Donald Shepherd, has a quote that has served me well over the years:

"If you design something smarter than it needs to be it will use the reserve to fuck you."

Amen, Don, Amen.

You have a mentor? (none / 0) (#42)
by skim123 on Thu Jul 10, 2003 at 01:59:33 AM EST

Ever see that Seinfeld episode?

Money is in some respects like fire; it is a very excellent servant but a terrible master.
PT Barnum


[ Parent ]
I watched it occasionally (none / 0) (#51)
by tzanger on Thu Jul 10, 2003 at 08:51:40 PM EST

But yes, I really do (or rather did, I don't quite do that same kind of work anymore) have a mentor.  Actually I have had a few over the years.

[ Parent ]
Just out of curiosity... (none / 0) (#55)
by skim123 on Fri Jul 11, 2003 at 02:00:32 PM EST

What's the process for obtaining a mentor? Is there a standard procedure?

Money is in some respects like fire; it is a very excellent servant but a terrible master.
PT Barnum


[ Parent ]
Um... (none / 0) (#56)
by tzanger on Fri Jul 11, 2003 at 04:17:11 PM EST

I'm not aware of any procedure per se...  I knew the basics of what I was doing and had some skills beyond that, and Don (and Kevin, another guy I learned an awful lot from) taught me a lot of the stuff that you just don't learn until you've done things a bazillion times or been through other kinds of experiences.

I'd liken it to an apprenticeship, only much less official.

[ Parent ]

i agree, but why stop at applications? (1.00 / 1) (#40)
by rmg on Wed Jul 09, 2003 at 09:54:23 PM EST

your title implies it: stupid everything...

i'm thinking, we need to stop making such smart articles. for example, it would take substantial retooling to make this article into one about rototillers, but the need for such an article is a historical inevitability. a few articles stick out in my mind as stupid articles we should keep around for future use:

  • the stephen king obituary.

  • StormShadow's article about health care or whatever it was.

  • Netcraft confirms: *BSD is dying

it is my belief that we at k5 spend too much time reinventing the wheel in worship of the ideal of originality (see Larry Wall's cult of originality). to become a truly flexible and versatile news site, and finally hold its own against slashdot, we must look for ways to make our articles stupider. much stupider.

_____ intellectual tiddlywinks

Bravo (4.00 / 1) (#43)
by mishmash on Thu Jul 10, 2003 at 02:13:57 AM EST

I did like this article, and thought it was well-written, with some cogently-delivered arguments.  

Perhaps this all harks back to the original Unix philosophy:  that a system of small, simple and robust parts can take on tasks far more complex than the sum of its parts.

Cool thinking, but... (none / 0) (#46)
by amarodeeps on Thu Jul 10, 2003 at 09:15:19 AM EST

I'd like to understand this more formally. As many have now pointed out, UNIX was framed with this sort of idea in mind. However, I'd like to know at what level this should be implemented, as the piece pointed out a few different instances -- networks, Mozilla, protocols, and operating systems are all examples of the loose idea explored by the article. Still, I think it is clear that at some point specialization is necessary. At what point should the specialization take over (and is this codified in any way, and is there any writing/research on this already -- I'd guess yes)?

It seems really to reduce down to what gives you the greatest benefit in terms of time saved vs. flexibility. But it seems that as technology improves, the specifics of this relationship change as we attempt to maintain the balance.

Just throwing stuff out there, it's early...interesting stuff though...



i dont agree at all (3.00 / 2) (#47)
by turmeric on Thu Jul 10, 2003 at 10:25:42 AM EST

the mozilla source code is about as far from 'simple' as you can get. it is a gigantic mess. just like unix. sure 'everything is a file', except that you cant use the same functions on files that you use on sockets, depending on which unix platform you are on, and you cant write/read to a tty or pty the same way you can to a stream, or maybe you can, it depends on the system, but oh yeah. 'everything is a file'.

except actual files. which can be fake links to files possibly over a network connection that died 5 hours ago. (NFS). and except for, oh little stupid details, like video cards, monitors, network devices, keyboards, etc. and since 'non blocking' io is some kind of mystery meat the unix people dont even like to talk about, you have to perform magic to even get a meaningful 'file' for a mouse (buffered mouse input would be loads of fun wouldnt it.)

and 'interprocess communication', it works so well that... oh wait actually it doesnt work at all. the most portable IPC is thru sockets, which came from the internet, not unix. so for one unix program to talk to another, frequently instead of using unix, they use sockets because unix so completely failed to be 'simple' that it was impossible to depend on IPC for anything.

way to go, 'open source'. given enough eyeballs, all emperors have no clothes.

You're losing your touch Turmeric. (none / 0) (#49)
by amarodeeps on Thu Jul 10, 2003 at 02:04:15 PM EST

Everyone knows that UNIX was invented by Bill Gates, not those open source hippies.



[ Parent ]
Abstraction (none / 0) (#48)
by tarsi210 on Thu Jul 10, 2003 at 01:39:02 PM EST

This is just a case of abstraction -- removing yourself from the details and raising yourself to a higher level.

Just like in programming, when you want to quickly prototype or make some program that involves just a bunch of windows and buttons and such, you use a language and an IDE that abstract away the details of the underlying structure. That's the whole basis behind the MFC, the PFC, standard C libs, etc. It allows the developer to write at a level of understanding that isn't tied closely to the hardware.

I just see Mozilla and such as an abstraction of a concept. They provide the underlying nasty bits to do all the hard, close-to-the-hardware/OS work, and you just plunk whatever little enhancements you want on top of it all. You don't need to know exactly how they did it, because as long as it works well... well, who cares.

Philosyphia
Communications, Complexity and Tradeoffs (5.00 / 1) (#52)
by OzJuggler on Fri Jul 11, 2003 at 01:21:36 AM EST

This is computer communications theory.
The Service is the functionality that takes place.
The Interface is the point through which the Service is accessed: the syntax of the API and the operations that API exposes.
The Protocol is the grammar of the stream of symbols that represent requests or responses or general messages.

In this framework, the desire for networks of stupid nodes might be achieved as follows.

  1. Start with a task for which you need to create new software.
  2. Functionally decompose the task, but express each subtask as a Service node with an Interface representing the operations it performs, and connected to the parent task by a Protocol.
  3. Write and test the code for the APIs.
  4. Repeat from step 1).
People, this is plain old stock-standard software development. The focus in the parent article has just been on the fact that the functional decomposition does not have to be limited to the module level, and consequently the protocol does not have to be limited to function calls.
To say that the protocol between all nodes everywhere should be the same protocol (like XML, for example) is a double-edged sword. It gets bonus points because, as long as the data is in a common format, someone can drop in a node that does the kind of semantic mapping needed to get a software Quake3 Service to communicate with a real-life Coke Vending Machine Service. The down side is that the latency between stimulus and response of the system becomes progressively larger as more levels of subdivision occur and as slow and clumsy protocols are introduced where something nimbler might be more practical.
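A minimal sketch of the decomposition in Python, with a shared message shape standing in for the common Protocol (every name here is a hypothetical illustration):

    # Each subtask becomes a Service node with a small Interface,
    # all speaking one dict-based message Protocol.

    def request(operation, payload):
        # The Protocol: every node speaks this one message format.
        return {"op": operation, "payload": payload}

    class WordCountService:
        # The Interface: a single handle() operation per Service node.
        def handle(self, message):
            if message["op"] != "count":
                raise ValueError("unknown operation")
            return len(message["payload"].split())

    class UpperCaseService:
        def handle(self, message):
            if message["op"] != "upper":
                raise ValueError("unknown operation")
            return message["payload"].upper()

    # The parent task delegates subtasks to dumb nodes over the Protocol.
    text = "stupid nodes, simple protocols"
    print(UpperCaseService().handle(request("upper", text)))
    print(WordCountService().handle(request("count", text)))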

I'll go out on a limb and say that the reason why bloatware is so bloated and slow is mostly because people are practicing exactly the kind of software strategy you are advocating. A holistic approach reassures us that no matter how much we rant about 'keeping it simple', we will never escape the cost/quality/speed tradeoff. Sometimes a tightly bound chunk of complex functionality may be preferred...or even unavoidable.

Whether the network is simple or complicated depends only on the threshold at which your "packages" or "nodes" are viewed. Seen in its entirety, the world's information technology system is composed of transistor "on/off" Services with an input/gate/drain API and a DC electron Protocol. Very dumb nodes connected together in a complex way, to produce the desired system functionality. So we already have what you are asking for; it's just that the picture is too complex to understand when viewed that way. Ultimately, if there is any problem here, it is only a question of perception and deliberate ignorance.

The only thing we can definitely strive for is every step of the way to implement our design decisions in a manner that lets us remake those design decisions differently later with minimal impact and cost.
"And I will not rest until every year families gather to spend December 25th together
at Osama's homo abortion pot and commie jizzporium." - Jon Stewart's gift to Bill O'Reilly, 7 Dec 2005.

There are a few other points to be made. (5.00 / 1) (#60)
by Alhazred on Sat Jul 12, 2003 at 08:56:02 AM EST

One is closure. The utility of a system or subsystem is enhanced when it attains closure. At that point a system can decompose tasks and delegate them to 'lower level' copies of itself. This is almost always the most efficient form of decomposition at non-trivial levels of complexity.

Another is simply having a unified namespace. This is necessary for composition. The more things which can share a single system of referring to each other, the more power the overall system has at that particular level.

RAM is more powerful than blocks on disk because every address is equally accessible at every processing step.

The point of all of this debate is 'how do you best provide modularity?' The issues are not data-processing-theoretic, they are human-centered. Is it easy to compose a new service, or do you need to be a software engineer to do so? Are composed services efficient? Are they reliable? Can they be easily supported, documented, tested? What is the scope of accessibility of a unit of functionality? Security. Etc.

Granted, your basic point is true - FORTRAN subroutines made software modular around 1959. The debate really amounts to 'is there a way to make modules at higher levels of abstraction that have desirable characteristics with respect to the points mentioned above?'
That is not dead which may eternal lie And with strange aeons death itself may die.
[ Parent ]

Software Legos (5.00 / 2) (#57)
by jpjerkins on Fri Jul 11, 2003 at 05:56:41 PM EST

omghax said it right - just look at the wealth of Linux/Unix command-line tools, and you'll see a great example of this concept.

I see the text tools of Linux as software Legos. Focused, single-purpose tools (which may yet be internally complex) which are useful individually, and yet may be easily combined to solve many complicated problems.

Focus

One thing these software Legos share with the plastic kind is their individual focus on a particular problem domain. None of these tools can do more than a small handful of related functions. grep by itself isn't any good at anything at all, other than finding lines that match the given regular expression. ls is useless except for listing files on the hard disk, with only the most basic of filename filtering. However, together, they provide a great filesystem query system. Their simplicity makes them easy to write, easy to maintain, and easy to comprehend and remember - crucial if they are to be of any utility in an environment based on combinatorial interaction.

Interaction

The OTHER thing that these tools share with Legos - and this CAN NOT be overlooked - is the open and SIMPLE manner in which they can interact. All Lego bricks have plastic dots on the top, and allow those same plastic dots to be snapped into the bottom. Every Lego brick, regardless of shape, color, or obvious intended use (I really liked those clear cockpit pieces) shared this interface. It allowed me to build buildings, bulldozers, and airplanes with those simple bricks. Cockpit pieces and wing pieces and ordinary 2x4 bricks could be snapped together to make any shape my imagination produced.

And the funny thing was, the more exotic the part, the fewer uses I could find for it. Slanted parts were good for decorating my masterpieces; they made the blocks look less blocky. Cockpit parts were great for cockpits, and the occasional window... but not much else. But regardless of whether it was an ordinary building or a fighter with forward-swept wings (still sitting in my parents' house), my most-used parts were the rectangular bricks. Simple, boring, but infinitely useful.

The Linux Model

Command-line tools in Linux all have these interfaces:

  • Command-line arguments
  • File streams, always including STDIN and STDOUT
Command-line arguments allow you to specify details of how the tool is to behave - in human-readable text. And STDIN and STDOUT resulted in I/O that was individually useful, yet (with shell support for piping) infinitely combinable.
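To make that concrete, here's a minimal brick that honours the contract - options on the command line, data on STDIN and STDOUT. A Python sketch; the filename match.py and the lack of error handling mark it as illustration only:

    #!/usr/bin/env python
    # A minimal "Lego brick": behaviour comes from the command line,
    # data flows through stdin/stdout, so it composes in any pipeline.
    # Sketch only; real tools add error handling and more options.
    import sys

    def main():
        pattern = sys.argv[1] if len(sys.argv) > 1 else ""
        for line in sys.stdin:
            if pattern in line:       # plain substring match, not regex
                sys.stdout.write(line)

    if __name__ == "__main__":
        main()

Saved as match.py, it drops straight into a pipeline alongside the existing bricks: ls | python match.py .txt | sort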

I'm not saying that "everything's a file" is necessarily the best way to do this. I'm just saying that it is one way to do this, and it's popular, and it works. Windows, while supporting this mechanism, seems largely built on the 'plugin' model (drivers vs. services vs. applications vs. shell extensions vs...). This, too, works, but there is very little reuse of programs in that environment. A piece of code is written to do that one thing, and it does only that thing, and nothing else. Combinatorial interaction is almost unheard-of in Windows.

Going Forward

GUI applications are great for several reasons:

  • Learn-at-a-glance: GUI apps can be much easier to learn than command-line apps, simply because their options are presented instantly and in an organized fashion.
  • Idiot-proofing: The interface layer of graphical applications can be written to disallow ill-advised actions, and guide the user in the best direction.
  • Optimal display of arbitrary data: GUI apps have the advantage of being able to display any kind of data with the most optimal representation - tables, formatted text, pictures, and 3D scenes can all be displayed and edited intuitively.

The clipboard concept helps the situation, but doesn't completely solve it:

  • Distinct apps: Apps are still annoyingly distinct, needing to be opened and closed separately.
  • The event model: The event-driven programming model (a huge advance) complicates matters, in that GUI apps stay open, waiting on user commands to take action.
  • Arbitrary data formats: There's still the huge issue of data type and format - it's not easy to paste text into MSPaint, and impossible to paste an image into Notepad.
I think we're stuck with the data format issue. MSPaint and Notepad will never work together as smoothly as grep and ls. But I think there are some things we can do:

Vive le Piping: STDIN and STDOUT should be revived, clipboard-style. It should be possible to copy all active data to the clipboard, send it to a newly-started app, wait for that app to close, then process the results, also stored in the clipboard (it might be a good idea to have a second, hidden clipboard for this purpose). Spell-checkers, grammar-checkers, Find and Replace, File Open, and File Close could all work that way. Graphics filters, video and audio effects, email filtering, even >gasp< script languages like Perl, Python, and Ruby.

Simplify, Simplify, Simplify: GUI applications could then focus on a small problem, just like command-line utilities. MDI (multiple-document interface) should go away; having multiple documents open and limiting programs to one running instance would horribly complicate things.

Unleash the User: I'm not proposing that the "Spell-check" button launch a new app. I'm proposing a new "system" button that sits up by the Close and Minimize buttons. It would bring up a list of GUI utilities that accept the active data's format. You click the app, launching it. Then the new app is in control - asking you what to search for, how to handle a misspelled word, the particular Ruby script you want to run.

Utopia!

All the developer has to do is make sure the system knows what data types his app will accept and generate (/etc/appdata/, anyone?). Instantly his Notepad clone has spell-checking, find, replace, email, publish to web, and any other features already installed on the client's machine. And if your word processor doesn't have the feature you want, you can download the appropriate utility. You want edge-detect in Paint? Download it! Upgrade it! You want a spam filter for your email client? Buy one! Write it in Ruby!

Let the flames begin...

The Humane Interface (none / 0) (#58)
by kubalaa on Sat Jul 12, 2003 at 12:10:19 AM EST

Jef Raskin has said basically the same thing for a while. Instead of apps, have commands which operate on certain data types. Have a document-centric environment. Provide automatic conversions between data types. http://humane.sourceforge.net/home/index.html

[ Parent ]
Some thoughts (4.66 / 3) (#59)
by Alhazred on Sat Jul 12, 2003 at 08:38:53 AM EST

Personally I think Mozilla/Netscape is a horrible example of 'modular software'. It's not really modular at all; it's just a bunch of unrelated apps that have been wedded at the hip! The fact that they could separate them out into distinct apps is evidence that they never should have been lumped together in the first place. And as a matter of fact they are still basically lumped: try to figure out how to set up Mozilla so you can have 'mailto' URLs handled by an external program, or so that 'view source' puts the source into a text editor and not The Abominable Grey Window From Hell(tm).

Yes, theoretically Mozilla is a modular piece of software in the sense that it uses a component architecture and you CAN write plug-ins, plus there are APIs (or at least languages) that let you control the GUI etc., but NOTHING could be considered more complex or difficult to program. Creating Mozilla extensions is probably the most difficult programming task imaginable. I can personally attest to the fact that it's about 900 times harder than writing a Linux hardware driver.

In a more general vein, the Achilles heel of all the kind of thinking you're engaged in is just the sheer complexity and diversity of data. Even where common standards exist, we find that any particular way of representing information is good for some tasks and bad for others. In other words, an XML contact database might be ideal for easy editing and display, highly portable, etc., but when you want to search through 50,000 contacts or have 40 people use a big CMS system, then an RDBMS is the right representation. Obviously you CAN bridge the gap, but THAT BASICALLY IS WHAT 99.99% OF ALL ENTERPRISE SOFTWARE DOES. Millions of IT people make their livings all day every day crunching one format of data into another.

Progress in IT is coming not from modularity, but from standardization of representations and toolsets for performing conversions and other data processing operations.

I'm all for extensible software, but by the same token maybe what we need is less extensibility and more simplicity of composition. XML is a primary example. Its utility is in the fact that I can process any XML-based syntax without having to worry about ANY of the details of accessing, parsing, character representation, or a dozen other smelly little details. I can just write code that is focused on NOTHING but the problem domain. Plus the format itself introduces a beneficial level of regularity into what people do. CSV files were OK, but there are endless variations on escaping, line endings, separator characters, quoting, etc. XML abolishes these issues.

Consider the much-vaunted Unix command line. Extensibility is non-existent in the sense that grep is grep is grep; you can't 'add a feature' to grep. Its power is manifested in the ability to easily compose. There are two sources for this power. One is a common namespace: all programs and all data reside in a common filesystem hierarchy where they can easily refer to each other. The second was the regularization of the mechanisms for presenting data to a piece of software: the so-called 'pipe' and the use of byte-oriented I/O streams.

Note however that the weaknesses of that system are apparent as well, or at least its limitations. grep really isn't usable if your data isn't newline-separated records. In fact 99% of the Unix command-line tools are useless on binary files. Additionally, the entire paradigm was limited to the command line. GUI apps have no equivalent concept.

The reason why GUI apps cannot be composed the way command-line apps could is inherent in the nature of the beast. A Unix pipeline processes data in a linear fashion from one end to the other. GUI apps (generally) tend to provide random access to data. There is no 'data flow' to organize your composition around. Moreover, when you are displaying data and operating randomly on it, the software is much more sensitive to internal data representation.

Take a word processor as our example. There is an on-disk file format, and it's not too hard to add file reading and writing plug-ins to handle other file formats. In fact you can fairly easily envisage composing complex file filters - Unix users do it all the time, they simply do it external to their GUI apps in a shell environment! It is very hard, however, to integrate with features that operate directly on the data in the course of the use of the program. For instance, how do you integrate a standard spell-checking application?

The spell-checker is obviously NOT general-purpose if you are passing it a pointer to a data structure in memory that represents your document.  Nobody else is likely to use that data structure, and it's even unlikely you can cross a language boundary in any case (i.e. C and Python need glue code to let them even work with each other's data at all). The solution has been to transform the data into a standard format, often a standard file format intended for disk-based storage, let the external function operate on that, and import the result back into the main program.

Obviously such a technique is not terribly extensible. The only general extension utilities I know of for GUI apps are Unix spell-checkers. No doubt there are a few others, but in any case the main 'framework' still has to be able to display whatever data the plugins work with. Microsoft did manage to create their OLE/ActiveX system, which allows for this in a modular fashion, but the solution is highly complex and personally I always found the results clunky at best (though usable and handy at times).

To finally get back to Mozilla for a moment: what Mozilla DOES represent is perhaps the path to the future, in the sense that with XML (especially with namespaces), CSS, the DOM, XSLT, and scripting built into one 'framework' app, it is now conceivable to compose a single data set which you can display and work on as a unitary piece and which contains virtually arbitrary content and structure.

Personally I think all that remains to be done is for someone to build a dirt-simple toolkit for building a Mozilla plugin. No, I am not learning XPCOM, sorry... I want a DOM tree pointer and a way to call back to internal Mozilla functions like the XSLT processor, the network layer, and JavaScript. I want it to be as easy as pie. THEN maybe I will feel like we have serious modularity in the GUI world.
That is not dead which may eternal lie And with strange aeons death itself may die.

Firebird (none / 0) (#63)
by NotZen on Mon Jul 14, 2003 at 10:39:43 AM EST

I was under the impression that the Firebird plugins were deliberately made easier to create than the standard Mozilla ones...

[ Parent ]
Quite possible (none / 0) (#66)
by Alhazred on Sun Jul 20, 2003 at 04:10:44 PM EST

I don't know zip about Firebird. Certainly willing to learn. I just see the data sitting in that DOM tree in your Mozilla document - nice and displayable with some CSS, and all segmentable into different vocabularies with namespaces - as about the best standardized data representation we're likely to get in the foreseeable future.

Mostly at this point my attention is focused on working with that data using scripts since they are the quickest and easiest thing to create, with SOAP and tossing things back and forth to a server via URLs or XSLT document() calls thrown in.

It's an evolving environment, but it's getting to a point where some real interesting things are possible.
That is not dead which may eternal lie And with strange aeons death itself may die.
[ Parent ]

Even more thoughts (none / 0) (#62)
by ascension on Sun Jul 13, 2003 at 04:22:08 PM EST

Interesting attempt to extend an idea, but I have several issues with your essay and the one on which it is based.

1. The choice of the word 'stupid' is an unfortunate or poor one, because (as evidenced by other comments) it does not capture the sense of your idea. Perhaps 'modular', 'interoperable', 'extensible', or 'implementable' would be more accurate. It is hard to evaluate the validity of your argument if I am not sure what is actually being argued.

2. I think you should decouple your article from being so tightly bound to David Isenberg's "Rise of the Stupid Network". Use the general idea (which he never coherently states) rather than relying on the specifics. I say this because I think his proposition is flawed and contradictory. His article is more of a rant against the monopoly the Baby Bells have over basic communications infrastructure than it is a reasoned analysis. He does not like the assumptions that the telecom companies use to manage their monopoly position, and is personally frustrated with trying to engineer new services to work under the assumptions and constraints of a mature system (fair enough). From his perspective, extending the system to accommodate new services would be much easier and would make much more sense if the network were of a different type. So he proposes a different kind of network that would be based on "dumb transport in the middle, and intelligent user-controlled endpoints". This "network would be engineered to `Deliver the Bits, Stupid', not for fancy network routing." This network would "stuff bits in one end and get them out the other without getting tangled up in cobwebs of legacy assumptions." And from this he postulates that the "age of centralized control is ending."

This proposition trips over itself in its internal contradictions. The stupid network he envisions could only be accomplished by a homogeneous network where everyone would have to connect using the same physical medium (heterogeneity implies routing), and this homogeneity could only come about through centralized control (starting to look like Telecom at this point). His proposed solution is odd given that he believes that the heterogeneous nature of the internet (a network of networks) is its strength. He also looks to IPv6 and its enhancements to support advanced services and to address the current limitations of IPv4, but this also runs counter to just delivering the bits. In reality, supporting a multiplicity of data types and uses requires serious intelligence in the network nodes, and all of this has to be built into the system as a set of assumptions. The nodes of this network will be doing anything but just shipping bits.

The phone networks and data networks are also treated as monolithic entities, which they aren't. They are both networks of nodes that interconnect using different physical media requiring different protocols, some of which underlie both networks (fiber optic backbones). What is distinct is how the user connects to the communications infrastructure, and how the last node and physical transmission medium affect the possible applications, and future applications, the user can put that connection to. Even the twisted pair between my house and the phone company is now being used for unexpected uses like ADSL.

3. You seem to advocate modular, extensible software, and argue that this type of software, or software developed in this fashion, will, from a survival-of-the-fittest perspective, gain ascendancy and become dominant. Fine, but you need to show why this is so. You also imply that this is somehow due to the advent of the open-source software movement and the search for killer applications in general. I think you would have a hard time proving this and would do better to drop the implication. It does not add anything to the argument anyway.

4. I agree with the previous poster on the importance of standards as a defining influence on what does and does not get created, but that is an essay in itself.

sincerely,
ascension


