Dilution of Information in Distributed Filesharing Systems

By bgp4 in News
Mon Jul 17, 2000 at 04:30:12 PM EST
Tags: Internet

I've written a rant on distributed filesharing mechanisms (such as Gnutella) and how their usefulness will decline as more systems are developed. The basic idea is that as more and more separate networks arise, the same pool of information gets scattered across many incompatible systems. That decreases the value of each individual network, and of all the networks taken together. Having to fire up 5 or 6 clients to find what you need will be too much work for most people. What I'm looking for are ideas on how to combat this problem, whether that's a client that interfaces with all the networks or simply a formalization of each network's protocol as XML documents.
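For concreteness, here is a rough sketch of the metaclient idea in Python. The adapter classes and their search() methods are hypothetical - neither Gnutella nor Napster exposes any such common API - so treat this as the shape of the idea rather than working protocol code:

    # Sketch of a "metaclient" that fans a single search out to several
    # filesharing networks and merges the hits. Every adapter and method
    # here is hypothetical; each real network needs its own protocol code.

    class NetworkAdapter:
        """Interface each network-specific adapter implements."""
        name = "abstract"

        def search(self, query):
            """Return a list of (filename, location) hits."""
            raise NotImplementedError

    class GnutellaAdapter(NetworkAdapter):
        name = "gnutella"

        def search(self, query):
            # Real code would flood a Query descriptor to connected
            # peers and collect QueryHit replies; stubbed out here.
            return []

    class NapsterAdapter(NetworkAdapter):
        name = "napster"

        def search(self, query):
            # Real code would ask the central index server; stubbed out.
            return []

    def metasearch(query, adapters):
        """Query every network and merge results into one list."""
        results = []
        for adapter in adapters:
            for filename, location in adapter.search(query):
                results.append((adapter.name, filename, location))
        return results

    hits = metasearch("freebsd iso", [GnutellaAdapter(), NapsterAdapter()])

The dilution problem then reduces to writing and maintaining one adapter per network - which is exactly the burden a formalized, XML-style protocol description would ease.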


Dilution of Information in Distributed Filesharing Systems | 27 comments (17 topical, 10 editorial, 0 hidden)
same with instant messaging (2.00 / 1) (#1)
by ramses0 on Mon Jul 17, 2000 at 02:43:01 PM EST

This is also closely related to the problems with instant messaging software. When there are 10 different IM clients, IM just isn't useful anymore.

--Robert
[ rate all comments, for great justice | sell.com ]

Re: same with instant messaging (none / 0) (#15)
by baka_boy on Mon Jul 17, 2000 at 07:39:03 PM EST

IM only fails to be useful if each separate client/protocol tries to pass itself off as the be-all, end-all solution to every messaging need. I think that if a player in any Internet protocol battle wants to have a chance, they need to find a patron group.

Napster has the college kids downloading tons of MP3s, AOL has their product bundled with a softer, gentler ISP package, etc. Find a niche, and guard it ferociously.

[ Parent ]

Same attrition (none / 0) (#26)
by detroit on Tue Jul 18, 2000 at 10:11:47 AM EST

I think, like IM, a few of the clients will deteriorate or merge, and it'll end up a natural oligopoly. You'll never have one IM client or one distributed system, and that's probably a good thing (it gives Jabber-like projects and Gnutella a chance). I can see how people are averse to universal clients, but when there are so many competing ideas out there, they're just about the only convenient option if all you want is the connectivity and not the fancy 'check your portfolio' features of specific proprietary clients. A modular, universal filesharing client/server would be really interesting to implement, but would prolly eat up whatever bandwidth I had left.

Jeff

[ Parent ]
Supplies of information (2.00 / 2) (#3)
by baka_boy on Mon Jul 17, 2000 at 03:03:51 PM EST

Okay, to everyone browsing with editorials turned on, I apologize... this should have been topical the first time. You're assuming that information is some sort of linearly-increasing (or constant) quantity. Look at what happened with the explosion of the Web, though: not only did massive amounts of existing material go online - the availability of that data, plus the communication tools that came with it, created an explosion of new information, ideas, and interaction.

I think that having the ability to near-instantly share something you have found, created, or improved with a sizeable fraction of the rest of the human population will do (and is already doing) nothing but increase the need for more channels to distribute it. People will jump to those new avenues as soon as their current tools are overburdened, outdated, or lacking the information they seek. I think we'll see a sort of evolution as tools grow and change, rather than rapid, fatal mutations or a population explosion.

Worry more about the infrastructure of the Net than about those client tools and protocols. Look at the panic Napster caused at so many offices and college campuses, as the trading of fat MP3s through narrow pipes choked them. When everything digital can be swapped with a few hundred thousand random people, we're going to see just how much Cisco, Qwest, et al. deserve their big fat stock valuations.

Re: Supplies of information (none / 0) (#8)
by bgp4 on Mon Jul 17, 2000 at 04:08:29 PM EST

Here's the way I think of it. Think of the beginning days of the web. Things were really taking off; everyone was using HTTP on port 80 with Mosaic. Now imagine Netscape decides instead to develop its own protocol (Netscape Random Protocol = NRP), runs it on port 112, and releases a browser that can only look at that information. Then MS jumps in, makes MSRP, and it's the only thing IE can speak. Search engines only search for things in their respective network (port 80/HTTP, port 112/NRP, port ?/MSRP) and don't cross-reference anything.

Where would the net be now if that had happened? That's the state of affairs for distributed filesharing systems.
May all your salads be eaten out of black hats
[ Parent ]
Re: Supplies of information (3.00 / 1) (#11)
by baka_boy on Mon Jul 17, 2000 at 05:21:28 PM EST

You're over-simplifying things, though. HTTP is just one protocol out of many that happened to become extremely popular. Whatever happened to gopher? How many end users have nice graphical LDAP or NIS clients? How many ports are open on your average workgroup server? Each of these shows a protocol that served or serves a valuable purpose for some group, but may or may not have the visibility or popularity of the big three (HTTP, POP, SMTP).

Networked systems show many of the same behavioral trends that complex biological ones do. First, something catastrophic sets off a period of quick change, mutation, and seeming chaos; then, each of those new designs is tested against its competitors. Some die out completely, others become relegated to a niche, and a few prove flexible and powerful enough to thrive in many environments.

Your assumption that search engines can't cross-reference also seems difficult to justify. Just because current distributed system search tools are weak and inefficient doesn't mean they won't improve. Current web search tools can index Usenet, static HTML, databases, and RSS/RDF data feeds.

Gnutella and friends, on the other hand, are practically newborns. The absolute best thing that can happen to them, if distributed file sharing and networking tools are ever going to mature, is for them to bump around in the dark for a while in as many strange and inefficient forms as possible. Each generation will be progressively more useful, until we finally have something great.

[ Parent ]

Re: Supplies of information (none / 0) (#21)
by Anonymous Hero on Tue Jul 18, 2000 at 08:46:15 AM EST

Worry more about the infrastructure of the Net than about those client tools and protocols. Look at the panic Napster caused at so many offices and college campuses, as the trading of fat MP3s through narrow pipes choked them. When everything digital can be swapped with a few hundred thousand random people, we're going to see just how much Cisco, Qwest, et al. deserve their big fat stock valuations.

What the heck are you talking about? One minute you seem to be making sense then all of a sudden you just diverge into an anti-Cisco, anti-Qwest rant. WTF?

[ Parent ]
Crosspost aggressively! (4.30 / 3) (#12)
by marlowe on Mon Jul 17, 2000 at 06:11:04 PM EST

Grab files you really like off of each network and post them on all the others you know of. If enough people do this, it will be as if there were only one network.

-- The Americans are the Jews of the 21st century. Only we won't go as quietly to the gas chambers. --
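Mechanically, the crossposting idea above is a tiny loop. A sketch, with hypothetical fetch() and publish() calls standing in for each network's real download and share machinery:

    # Sketch of "crossposting": fetch a file from the network where it was
    # found, then re-share it on every other network you participate in.
    # fetch() and publish() are hypothetical stand-ins for real protocol code.

    def crosspost(filename, source, others):
        data = source.fetch(filename)        # download from the network that has it
        for network in others:
            network.publish(filename, data)  # offer it everywhere else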
The unstoppable decadence of intellectual property (2.00 / 4) (#13)
by Pac on Mon Jul 17, 2000 at 06:15:54 PM EST

[Sorry, I wrongly posted this as editorial. Again:]

<rant subject="note on piracy">
How do we justify the price of a CD? Distribution costs? Furthermore, how do we justify the fact that a music CD, a medium capable of packing some orders of magnitude more information than a vinyl record, comes with more or less the same quantity of music? Protecting our profits, that's how we justify it!

And so we are all pirates now. As RMS would say, we come to your cities, kill your men, rape your women, kidnap your children, and take away anything of value we can find, right? Well, I don't think so.

Please tell me, who is profiting when you download a song from Gnutella? The person who offered the download? No. Gnutella's authors and maintainers? No. Some vaporous entity (Gnutella.org, Gnutella.com)? Try again.

I agree the author of said song should be rewarded in some way. Maybe some cents for each download. Maybe a dollar for each download. But the fact remains that it will always be an opt-in situation. You can't really control the duplication of digital content. We will have to live with it.
</rant>

As for the creation of an inter-network protocol, it would probably be a good thing. But I doubt we can really avoid the Darwinian battle. To make things worse, even as we speak the Ogg Vorbis codec project is advancing toward a new format (unencumbered by patent and copyright issues) to replace MP3.

Evolution doesn't take prisoners


Not much in the argument as to why... (4.00 / 1) (#14)
by Wah on Mon Jul 17, 2000 at 07:23:19 PM EST

...the quality of networks would erode.

Sure, having 5 or 6 distributed networks will mean a lot of redundant info, but that doesn't mean the one copy you want won't be hanging out on the network you like. As long as at least one copy exists on any network, you would have access to it (and all of a sudden there are two).

I think Napster has an advantage by specializing in one type of file. While I couldn't provide technical commentary on why this would be, I also think one filetype makes security easier.

But even considering the fact that many programs will *exist*, that doesn't necessarily mean they will all be used.

There will be dominant players, and those, almost by definition, will be the most useful. Much of this discussion will be dependent on what the law considers to be OK (mainly whether or not commercial entities are allowed to offer the service; there shouldn't be a question for non-commercial ones).

--
Fail to Obey?
Why use XML as glue? Go native. (4.00 / 1) (#16)
by mebreathing on Mon Jul 17, 2000 at 08:00:10 PM EST

I think the notion of writing a metaclient that integrates with Napster, Gnutella, and all the others is fine and dandy if that's really important to you. But XML is here to stay. We should just be designing apps that natively speak XML. By doing so, we contribute to a framework of interoperating applications that speak the same language. And once we start embracing XML, all our apps will be able to interoperate in ways we can barely imagine right now.
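As an illustration (the <filesearch>/<hit> vocabulary below is invented for the example - no such shared schema exists), a search response in a common XML format could be consumed with a few lines of Python:

    # Sketch: parsing a search response written in a made-up shared XML
    # vocabulary. The element and attribute names are invented; the point
    # is that any client speaking the same XML could consume the results.

    import xml.etree.ElementTree as ET

    RESPONSE = """\
    <filesearch query="freebsd iso">
      <hit network="gnutella" file="freebsd-4.0.iso" host="10.0.0.7:6346"/>
      <hit network="napster" file="freebsd-4.0.iso" host="10.0.0.9:8888"/>
    </filesearch>
    """

    def parse_hits(xml_text):
        """Return (network, file, host) tuples from a search response."""
        root = ET.fromstring(xml_text)
        return [(h.get("network"), h.get("file"), h.get("host"))
                for h in root.findall("hit")]

    for hit in parse_hits(RESPONSE):
        print(hit)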

This has been discussed on ShouldExist, and people have posted some great resources about the subject.

Re: Why use XML as glue? Go native. (none / 0) (#17)
by bgp4 on Mon Jul 17, 2000 at 08:31:52 PM EST

Ah! holy chicken monkeys...

I have never seen ShouldExist before, let alone read that article. It pretty much states what I've stated, with a pile more technical content. Geez, I feel like a goof ;)

Seriously, they raise some good points about XML and its pros and cons in this situation. People are pretty quick to wield the XML sword thinking it will solve all their problems... and it may, if you wield it right.
May all your salads be eaten out of black hats
[ Parent ]
Sigh... (4.00 / 1) (#22)
by Alhazred on Tue Jul 18, 2000 at 08:49:50 AM EST

This is no different from any of the operating system wars, PC hardware platform wars, mini-computer system wars, etc., that have been "fought" over the last 50 years.

The primary value of communications technology IS the network effect: the more people who use it, the better it works for everyone. This was just as true of operating systems, and it's why MSDOS/Windows eventually took over the desktop.

The fact that all these filesharing networks are OSS-based (for the most part) does not change those economics one bit. Gnutella, FreeNet, and the rest will battle it out. Eventually they will either learn to interoperate, or one or another of them will, for whatever reason, descend into obscurity just as AmigaDOS, GEM, CP/M-86, and the rest have gone the way of the woolly mammoth.

The only question is "how good will what we end up with be?" Are we going to get stuck with a crappy technical solution, like we did on the PC desktop, or not? Keep your vision clear, never stop looking around you, and see the patterns forming. Personally I doubt that ANY of the existing technologies will survive.
That is not dead which may eternal lie And with strange aeons death itself may die.
Re: AmigaDOS (none / 0) (#28)
by Anonymous Hero on Tue Jul 18, 2000 at 05:31:30 PM EST

AmigaDOS did not fade because of lack of interoperability (in fact, it still has an enthusiastic and active userbase). AmigaDOS happily works with all manner of networks, even to the point of having AmigaDOS device handlers that allow you to open a shell window and type "cd ftp://somehost.somewhere.com" and have it appear as if part of the local filesystem, or "cd arc:work:myarchive.tar.gz" - depending on what you install in your SYS:Devs/DOSDrivers directory, and provided you've got a TCP stack installed (which itself allows low-level access through a TCP: device handler, as well as the normal bsdsocket.library API).

There are similar handlers for Netware, SMB, and a load of others.

There are several Java virtual machine clones, in varying states of bugginess, that work reasonably well with one of the several web browsers available on the platform.

The entire GNU suite of software, and the X Window System, have also been ported to AmigaDOS (see www.ninemoons.com).

There are thousands of open-source and closed-source Amiga applications available on Aminet (www.aminet.org) - please, please, please take a look if you are looking for ideas/source code to port to Linux - there's plenty of stuff that is both very cool and still unavailable on Linux, yet open source. MUI (www.sasg.com) is *still* easier to program for than either Qt or Gtk...

As an aside, a new company (www.amiga.com) now owns the Amiga trademark and intellectual property, and has just released an SDK for its new architecture, based on a generalised Virtual Machine concept from www.tao-group.com - like Java, but for any language which can compile to its virtual architecture (currently Java, (GNU) C, (GNU) C++, and straight Tao Assembler). The JIT compilers themselves are ported to the virtual machine, and the system can therefore dynamically adapt to heterogeneous multiprocessing environments. While this is a fascinating and possibly wondrous architecture in itself, it has little to do with the original Amiga architecture except the name. www.aros.org is a project to produce an open-source Amiga-compatible OS that runs natively on the x86 architecture. It already runs Doom and Quake :-). There's still fresh Amiga news at www.amiga.org

AmigaDOS did a lot of things right - sometimes better than UNIX, in fact, although it was largely a very similar system.

[digression: except, of course, that it was pretty much single-session, although it was pre-emptive multitasking (i.e. multiple users were possible through a filesystem called MuFS, but only one could be logged in and using the system to run multiple programs at a time), and it lacked true memory protection or proper resource-tracking (it used semaphore locking, and called it "cooperative memory protection", but since that required applications to honour the semaphores, that bombed...)

Through its assign/logical device mechanism, it had a filesystem/interface abstraction mixed with a UNIX-style LVM (it's difficult to explain this architecture without demonstrating it, but it's what allows the easy addition of filesystems to handle really weird things - e.g. cd'ing into windows in the window manager and having directory listings return the gadgets they contain - like Linux's /proc, but generalised to everything in the system, and everything the system can access across any network for which Amiga drivers exist. Think proto-Plan 9).

It had an extremely efficient message-passing, microkernel-like architecture, and used message-passing by reference; thus interprocess communication of multi-megabyte-sized chunks of data with extremely low latency was incredibly easy, a matter of passing a pointer rather than all the data (this had the downside of making memory protection extremely difficult). That gave incredibly high data throughput for the time and, combined with the Amiga's co-processors, meant that near-broadcast-quality (for the time) soft-realtime audio and video work was possible on a 7 MHz (count 'em!) computer.

It had dynamic shared libraries, all of which were reentrant (i.e. only one copy needed memory at any time, even if in use by multiple programs). It was also possible to dynamically patch individual function calls within shared libraries at runtime, allowing every bit of the system to be updated/customised by third parties bit by bit, even if ostensibly closed-source.]
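To make the by-reference point concrete in a modern language (plain Python threading here, nothing Amiga-specific), putting an object on a queue hands the consumer a reference to the very same buffer, so even a 10 MB message crosses between threads without being copied:

    # Illustration only (not Amiga code): message passing by reference.
    # The consumer receives the *same* object the producer created;
    # the 10 MB buffer is never copied, only a reference changes hands.

    import queue
    import threading

    q = queue.Queue()

    def producer():
        buf = bytearray(10 * 1024 * 1024)  # a 10 MB "message"
        q.put(buf)                         # enqueue a reference, O(1)
        print("producer sent object", id(buf))

    def consumer():
        buf = q.get()                      # receive the same object
        print("consumer got object", id(buf))

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()

Both threads print the same object id, confirming no copy was made.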

What killed the Amiga was incredibly bad management and marketing by the parent company, CBM. The Amiga division NEVER MADE A LOSS, and the Amiga was an incredible computer, but all the money was drained out of Amiga R&D and promotion and into marketing over-hyped, under-specced CBM PCs, by idiot PHBs.

[ Parent ]

Metcalfe's Law (3.00 / 1) (#23)
by kris on Tue Jul 18, 2000 at 09:30:36 AM EST

What you are talking about is publicly known as "Metcalfe's Law". See Jakob Nielsen for more information about this law and its effects. The law states that

the value of a network grows by the square of the size of the network
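Taken at face value, that makes the story's dilution worry precise. A back-of-the-envelope sketch (assuming the square law holds exactly, the users split evenly, and the k networks don't interoperate):

    V(n) = c\,n^{2}
    \qquad\Longrightarrow\qquad
    \sum_{i=1}^{k} V\!\left(\frac{N}{k}\right)
        = k\,c\left(\frac{N}{k}\right)^{2}
        = \frac{c\,N^{2}}{k}
        = \frac{V(N)}{k}

Splitting the same N users across k disjoint networks leaves only a 1/k fraction of the single network's value.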




Re: Metcalfe's Law (none / 0) (#25)
by kris on Tue Jul 18, 2000 at 09:32:03 AM EST

I am sorry for posting this twice. I got an error from my browser after the first submission and hit reload. Is there a way I can delete such a duplicate posting of mine on this site?

[ Parent ]