Kuro5hin.org: technology and culture, from the trenches
Bill Joy fears the future

By bobsquatch in News
Sat Mar 18, 2000 at 03:11:15 PM EST
Tags: Technology (all tags)
Technology

Bill Joy, Chief Scientist at Sun and a creator of Jini, managed to sound almost like a Luddite when he wrote an article for the latest edition of Wired. "Why the future doesn't need us" talks about how the imminent availability of genetic and nano tech may put "knowledge-enabled" mass destruction capabilities into the hands of small groups. As technology enables individuals to do more, it obviously allows individuals to do more harm -- how do we deal with that?


Here's an excerpt where he states his thesis:
The 21st-century technologies - genetics, nanotechnology, and robotics (GNR) - are so powerful that they can spawn whole new classes of accidents and abuses. Most dangerously, for the first time, these accidents and abuses are widely within the reach of individuals or small groups. They will not require large facilities or rare raw materials. Knowledge alone will enable the use of them.

Thus we have the possibility not just of weapons of mass destruction but of knowledge-enabled mass destruction (KMD), this destructiveness hugely amplified by the power of self-replication.

I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.

In some sense, we've already seen what technology in the hands of individuals can do: look at the effects of the internet on knowledge-based economies like music. With a music-copying-and-distributing device available to anyone of moderate means, we're left with two choices: either accept that music distribution will not be artificially blocked and develop a new business plan around that fact, or accept increasingly draconian efforts like the DMCA to monitor and control the use of computer technology.

Now, consider the situation when everybody can buy a gene-splicing-and-virus-distributing device. If general-purpose biotech becomes as free and open as general-purpose infotech, don't we face the same disagreement, with much greater consequences?

Related Links
o "Why the future doesn't need us"
o Also by bobsquatch


Bill Joy fears the future | 14 comments (14 topical, 0 editorial, 0 hidden)
You know... people have been "empow... (5.00 / 1) (#2)
by shepd on Sat Mar 18, 2000 at 01:45:02 AM EST

shepd voted 0 on this story.

You know... people have been "empowered" by cheap, accessible, killing weapons ever since a person picked up a rock and crushed someone with it. These new weapons just allow more destruction. Just like the human race didn't end in the movie "The Quiet Earth", it isn't going to end because of some moron releasing deadly diseases and such. (Just my 2 cents...).

Re: You know... people have been "empow... (none / 0) (#6)
by 348 on Sat Mar 18, 2000 at 07:08:41 PM EST

Agreed, but the sad part is that as a society, the more technology we have, the dumber we become. Who knows in 50 or 100 years... He's probably right. Pretty sad.

Logic is a systematic method of coming to the wrong conclusion with confidence.
[ Parent ]

Very interesting... Actualy there's... (none / 0) (#1)
by kraant on Sat Mar 18, 2000 at 07:06:14 AM EST

kraant voted 1 on this story.

Very interesting... Actually, there's been a guy on Usenet making these kinds of statements for a while... called "Tiny Human Ferret" -- go do a search on him on Deja or the like and you'll be pretty impressed...
--
"kraant, open source guru" -- tumeric
Never In Our Names...

Not sure where to begin (none / 0) (#3)
by fluffy grue on Sat Mar 18, 2000 at 05:03:11 PM EST

To me, these seem more like the paranoid ramblings of someone who assumes the definite worst than of a supposed visionary who wants to Internet-enable every refrigerator and toaster on the planet. I particularly have an issue with this passage:

Much of my work over the past 25 years has been on computer networking, where the sending and receiving of messages creates the opportunity for out-of-control replication. But while replication in a computer or a computer network can be a nuisance, at worst it disables a machine or takes down a network or network service. Uncontrolled self-replication in these newer technologies runs a much greater risk: a risk of substantial damage in the physical world.
Out-of-control replication, even if it involves "only" computers and networks, does cause substantial damage in the physical world. How many businesses lost incredible amounts of money in lost time and productivity due to the Melissa virus?

As far as his anti-progress arguments go, I agree that there may be some problems, but the potential good vastly outweighs the potential evil. Your run-of-the-mill terrorist doesn't have access to (or the ability to properly use) nuclear weapons, and your run-of-the-mill terrorist won't have access to destructive nanites or the GM technology needed to target certain genetic signatures. In addition, his arguments only look at the extremes, namely Kurzweil's desire to basically be a Borg vs. Star Trek's campy writing showing how bad a Borg species would be. I don't see any reason why a cybernetic species would immediately go out and try to assimilate other species into their cybernetic ranks.

As far as GM and "playing God" goes, see my ramble in the cloned pigs article.

Personally, I don't see what's wrong with the "worst case" scenario happening. The dinosaurs' random extinction after millions of years of niche-induced evolutionary stagnation led to something arguably better in the future; I've yet to see any proof that the dinosaurs had the Internet. Any given point in time isn't the be-all end-all, and it's very selfish and long-term defeating to consider only the current local maxima rather than the overall good of the universe across all spacetime. We're just a temporary fluke, a blip, a blink of an eye. All that his arguments lead to is a desire to be like the dinosaurs - not changing, not allowing for any growth, or the growth of future species. It's very species-centric to believe we're the pinnacle of evolution.

I'd hate to say this, but as much as Bill Joy tries to make himself out as a visionary and not a luddite, he seems to only be a visionary when it comes to his lifetime and a luddite when it comes to anything afterwards.
--
"Is not a quine" is not a quine.
I have a master's degree in science!

[ Hug Your Trikuare ]

Re: Not sure where to begin (none / 0) (#9)
by nascent on Sun Mar 19, 2000 at 12:06:08 AM EST

To me, these seem more like the paranoid ramblings of someone who assumes the definite worst than of a supposed visionary who wants to Internet-enable every refrigerator and toaster on the planet.

If you don't understand the inevitability of his predictions - or at least the inevitability of facing these scenarios - I can see how it would look like fear mongering. Those that don't heed the past are condemned to repeat it.

Out-of-control replication, even if it involves "only" computers and networks, does cause substantial damage in the physical world. How many businesses lost incredible amounts of money in lost time and productivity due to the Melissa virus?

Do you not see that what he's talking about is many orders of magnitude greater than Melissa?

...and your run-of-the-mill terrorist won't have access to destructive nanites or the GM technology needed to target certain genetic signatures.

This illustrates how little you're understanding the risk here. Moore's law stipulates a doubling of computer speed every 18 months (actually the density of the chip, but...), so if this holds through to 2030, then we will have computers about a million times (1.048 million) faster than what IBM released this month. That's about a thousand terahertz. On your desktop. As he explicitly points out, this is a crime that will be "knowledge-based", i.e., the tools and raw materials will no longer be requisite for exploiting this power. When this occurs will be purely a function of who can get enough bright minds together willing to do the coding (whatever form that might take).
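The back-of-the-envelope Moore's-law projection above can be checked in a few lines; a sketch, where the ~1 GHz desktop baseline for 2000 is an assumption (the comment doesn't name one):

```python
# Moore's-law sanity check for the comment above: speed doubling every
# 18 months, projected from 2000 to 2030.

def moores_law_factor(years, doubling_period_years=1.5):
    """Projected speedup multiple after `years` of doubling every period."""
    return 2 ** (years / doubling_period_years)

factor = moores_law_factor(2030 - 2000)  # 30 years -> 20 doublings -> 2**20
print(f"speedup: {factor:,.0f}x")        # 1,048,576x, i.e. "about a million times"

baseline_ghz = 1.0                       # assumed ~1 GHz desktop chip circa 2000
projected_thz = baseline_ghz * factor / 1000
print(f"projected clock: ~{projected_thz:,.0f} THz")
```

Twenty doublings is 2^20, roughly 1.05 million, which is where the "about a thousand terahertz on your desktop" figure comes from.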

In addition, his arguments only look at the extremes, namely Kurzweil's desire to basically be a Borg vs. Star Trek's campy writing showing how bad a Borg species would be. I don't see any reason why a cybernetic species would immediately go out and try to assimilate other species into their cybernetic ranks.

First, if you look at the distant future (which 30 years is with computers), then you will not only see the extreme but you will surpass it. He's being overly cautious, actually. Secondly, he explains quite lucidly why a species would want to destroy us. We're not useful. Termites hog resources that we'd rather have for ourselves. Namely, our houses. We don't dislike termites just because they're there, but we have no problem eradicating them when doing so becomes less costly than not.

nascent
http://www.intap.net/~j/
[ Parent ]

Re: Not sure where to begin (none / 0) (#10)
by fluffy grue on Sun Mar 19, 2000 at 12:14:25 AM EST

Granted, you raise some very good points, and do a good job of debunking my initial ramblings. However, my points regarding the worst-case scenario - the extinction of humanity and replacement by some other species not necessarily being a bad thing - still stand. Your other post indicates to me that you agree with me on that point.
--
"Is not a quine" is not a quine.
I have a master's degree in science!

[ Hug Your Trikuare ]
[ Parent ]

Re: Not sure where to begin (none / 0) (#13)
by nascent on Sun Mar 19, 2000 at 03:01:26 AM EST

[nodding]

That's why I didn't take umbrage to it. =)

nascent
http://www.intap.net/~j/
[ Parent ]

Re: Not sure where to begin (none / 0) (#14)
by CodeWright on Mon Mar 20, 2000 at 11:19:54 AM EST

nascent writes:

As he explicitly points out, this is a crime that will be "knowledge-based". ie, the tools and raw materials will no longer be requisite for exploiting this power. When this occurs will be purely a function of who can get enough bright minds together willing to do the coding (whatever form that might take).

When the technological hurdles between the present state of the art and widespread availability of molecular engineering (read: nanotech) are overcome, the "bright minds" capable of utilizing this technology first (i.e., the tech-geeks who create it) will (and already do) have very clear ideas of the risks involved -- before "self-reproducing" plagues would be technologically feasible, HIS-2 (Human Immune System v2.0) will already have been designed (and work started on the 2.1 hack) -- an active nanobot immune system would be more than capable of stopping nano plagues (after all, HIS v1.0 has protected us from biologically evolved self-reproducing molecular plagues for all two million years of our existence).

Of course, Bill Joy (the Luddite) is right -- homo sapiens WILL be obsolete -- but not humanity as a whole -- the new versions (human immuno-enhanced, etc.) will be far better survivors than their ancestors. Not long after HIS-2, it will be possible to have nanobots map ALL the neural connections in someone's brain and "upload" them to other avatars (i.e., pure silicon existence, enhanced/modular bio-robotic rovers, sentient nanobot societies, etc.).

But what's wrong with that world?

There's nothing to fear -- the transcendent humans of that (not-so-far-future) era will live lives whose scope and depth we can only dream of today.

The only people who could possibly be afraid of a world like that are the ones who don't want to participate in the bounties it offers (i.e., shoe-tossers) -- but why should their reticence (as backed by force to prevent technological advancement) stop those who desire to participate?

The people who want to increase the store of knowledge will get around to achieving that level of technological proficiency someday, unless the Luddites get ambitious enough to self-destruct the entire species. It's not a question of "IF" homo sapiens will become extinct, but "when".



--
A: Because it destroys the flow of conversation.
Q: Why is top posting dumb? --clover_kicker

[ Parent ]
Bill Joy is a luddite moron (2.00 / 1) (#4)
by Nyarlathotep on Sat Mar 18, 2000 at 05:35:06 PM EST

"Fear of the future" should be counted among the irrational fears, like fear of small cute bunny rabbits (except for bun-bun), or fear of English pastries. What a complete moron. I think it's partially the result of the low-quality sci-fi we are producing today. Maybe we need to pass a law that you are not allowed to watch stupid shit like Star Trek until you have read every Asimov story. I'm saying this tongue-in-cheek, but it does highlight the problem. Hollywood only seems interested in talking about killer robots and deadly diseases. The unknown is cool, evolution is good (even when it means replacing humanity with smarter genetically engineered humans or smarter robots), and the only people who will have problems when the future happens are those who don't want it to happen.

I would like to see a few good sci-fi movies / popular books with the following themes:

1) Aliens hear our radio, predict that we should be interesting by the time they could get here, and decide to come visit, but we have failed to evolve since we have so many stupid Luddites, so they get bored and leave... telling us that our starving populations are not worth the effort for them to give us the technology to feed them. I would like to see this told from the aliens' point of view, making it crystal clear that the aliens are the good guys and humanity is the bad guys. Alternatively, it could show two possible futures based on whether a Luddite is elected president. The future without the Luddite president would have the aliens arriving as close to technological equals, due to the relativistic effects of their trip, and humanity would be better able to cope with the aliens' limited technological superiority, i.e. we would learn faster. We would end up sending a joint mission out to explore another system, or something equally cool. The future with the Luddite president would entail the aliens arriving with vast technological superiority, and humanity being psychologically unable to handle it. This future would involve lots of Luddite humans trying to murder non-Luddite humans and aliens... and generally getting their asses kicked.

2) We create artificial intelligence which is smarter than us and proceeds to replace us, but the artificial intelligence is making the world a nice place to live, and humans are continually trying to kill it because they feel threatened and want to control everything. (The general sort of "AI replacing humanity == good, humanity == obsolete and bad".)

Anywho, today's sci-fi might have been meaningful 50 years ago, but today it is just the ramblings of morons who don't understand the most basic things about physics and psychology... and I would not expect to see a significant drop in the number of nuts like Bill Joy until we have decent literature.
Campus Crusade for Cthulhu -- it found me!
Re: Bill Joy fears the future (none / 0) (#5)
by ramses0 on Sat Mar 18, 2000 at 06:28:14 PM EST

First of all, Kuro5hin is different from slashdot.  In the same way that
Macintosh is different from Microsoft, K5 has a different focus, a different
audience, and different types of users than slash.

One of the biggest differences you might not have noticed is that K5 users
decide what stories get posted.  If you log in you'll see a button on the
right-hand side called "Moderate Submissions".	You (as a regular user) get to
decide which stories that have been submitted to K5 will be posted to the front
page.  Try it, it's neat.

That means that any "old regurgitated slashdot stuff" is here because users
voted to see it here.  Take a look at the <a
href="http://www.kuro5hin.org/?op=displaystory&sid=2000/3/15/83821/1916">"100M
free university"</a> story, and look at the comments from the guy in
Sweden.  That's not old slashdot stuff.

While I'm pretty sure you were trying to troll, you'll find that it usually
doesn't work here at K5.  Yet another reason that K5 is better than slashdot
;^)=

Anyway, good luck with your life, and if you see anything related to
"Technology and Culture, from the trenches" go ahead and submit up a story.
See what happens.

--Robert

Confused? Explanation here. (4.00 / 1) (#7)
by rusty on Sat Mar 18, 2000 at 08:47:50 PM EST

Heh. Bit of a mixup with this comment. It's a reply to a rather rude "K5 sucks, you're a cheap /. ripoff, blah blah blah" comment which I deleted, 'cause the AH who posted it was spamming it all around the site, and frankly, it's in the FAQ, and we've heard that before.

Anyway, ramses0 had a nice answer, so maybe next time I'll leave that crap around for you all to refute in your irreproachably self-possessed way. :-)

____
Not the real rusty
[ Parent ]

Re: Confused? Explanation here. (none / 0) (#11)
by Inoshiro on Sun Mar 19, 2000 at 12:32:27 AM EST

Yes, but could we fix the dangling hyperlink? :-)

--
[ イノシロ ]
[ Parent ]
Re: Confused? Explanation here. (none / 0) (#12)
by rusty on Sun Mar 19, 2000 at 02:54:11 AM EST

Hm. Short answer: no. Long answer: yes, but it'd be more trouble than it's worth, really. I never said I'd go around fixing up other people's formatting mistakes. :-)

____
Not the real rusty
[ Parent ]
Anti-Luddite (none / 0) (#8)
by nascent on Sat Mar 18, 2000 at 11:25:36 PM EST

If offered the option of replacing my left eye and attached cording with a mechanical version, I would in a heartbeat - need or not. So a Luddite, I'm not.

In that light I not only tolerate Joy's conclusions but accept them as a foregone conclusion. We will get to the point of intelligent machines. We will become quite proficient with nanotech. Since the outlined bad results are simply a matter of proof-of-concept, the idea that we won't damage or destroy ourselves is pretty damned funny.

Call me a prick, but I'm also perfectly willing to let humans run their course, REGARDLESS of what that might be. Remember, extinction is a facet of evolution.

nascent
http://www.intap.net/~j/
