Kuro5hin.org: technology and culture, from the trenches

Microsoft's Latest Ploy to Stifle Innovation

By ry2me in Op-Ed
Mon Oct 29, 2001 at 10:35:40 AM EST
Tags: Freedom

As Signal 11 pointed out several days ago in this article, Scott Culp, Manager of the Microsoft Security Response Center, used this article to announce Microsoft's request to end 'information anarchy': the free exchange of known security vulnerabilities on public forums such as BugTraq.

Please note, this is a resubmitted article (just in case you're feeling some deja vu).


Corporations will try nearly anything to gain even the most minute advantage over their competition. Like an animal chewing its leg off to escape a trap, companies may resort to all sorts of dirty tricks and innovation-stifling approaches to ensure their paychecks keep rolling in every two weeks. Microsoft is resorting to just such an approach, as evidenced by its recent TechNet bulletin entitled "It's Time to End Information Anarchy."

As a computer software developer, I feel that writing the code that produces today's modern applications is something of an art. Unfortunately, the future of modern applications from individuals and start-up businesses is in serious jeopardy from behemoths such as Microsoft. By creating legal boundaries and insurmountable business hurdles for anyone trying to bring a new piece of art to market, Microsoft is ensuring its future, and ensuring that we independent programmers remain "starving artists."

Free Exchange of Ideas
Scott Culp, Manager of the Microsoft Security Response Center, claims that the free sharing of information pertaining to security holes in current operating systems (which he terms `Information Anarchy') is the root cause of the recent spate of worms (malicious, self-replicating code spread to unsuspecting computers), such as Ramen, NIMDA, Code Red and others. Furthermore, he states:


"For its part, Microsoft will be working with other industry leaders over the course of the coming months, to build an industry-wide consensus on this issue. We'll provide additional information as this effort moves forward, and will ask for our customers' support in encouraging its adoption."


Mr. Culp seems to be unaware of how the `security community,' as he calls the users who report such vulnerabilities, operates. A popular online mailing list named BugTraq, to which this author has subscribed for some time, and which is arguably the most widely read online forum for such issues, operates under a pre-established set of guidelines. All users who post to the forum are expected to follow them: a user who finds a security flaw in a piece of software published by an individual or commercial entity is expected to notify the publisher directly of the flaw. The BugTraq FAQ identifies the proper protocol for reporting security vulnerabilities:

1. Contact the product's vendor or maintainer and give them a one week period to respond. If they don't respond post to the list.
2. If you do hear from the vendor give them what you consider appropriate time to fix the vulnerability. This will depend on the vulnerability and the product. It's up to you to make an estimate. If they don't respond in time post to the list.
3. If they contact you asking for more time consider extending the deadline in good faith. If they continually fail to meet the deadline post to the list.
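
The three rules above amount to a simple decision procedure. As a rough sketch in C (the struct and its field names are purely illustrative; the one-week initial window is the only number taken from the FAQ text):

```c
#include <stdbool.h>

/* Illustrative model of one vulnerability report's state. */
struct report {
    int  days_since_contact;   /* days since the vendor was first notified */
    bool vendor_responded;     /* any acknowledgement from the vendor?     */
    int  agreed_deadline_days; /* fix window negotiated with the vendor    */
    int  missed_deadlines;     /* extended deadlines the vendor has blown  */
};

/* Returns true when, under the guidelines, posting to the list is the
   appropriate next step. */
bool should_post(const struct report *r)
{
    if (!r->vendor_responded)            /* rule 1: one week of silence */
        return r->days_since_contact > 7;
    if (r->missed_deadlines > 1)         /* rule 3: repeated slippage   */
        return true;
    /* rule 2: vendor engaged, but the agreed fix window has elapsed */
    return r->days_since_contact > r->agreed_deadline_days;
}
```

The point of spelling it out this way is that disclosure is the last resort at every step, not the first, which is precisely what Mr. Culp's article glosses over.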

Notice how item number 1 above, which states BugTraq's basic operating premise of giving a vendor ample time to fix a security flaw, undermines the main argument of Mr. Culp's paper (namely, that resources such as BugTraq exist only to disseminate security holes, not to fix them). Mr. Culp is essentially stating that corporations such as Microsoft are unwilling or unable to provide a security fix in a "reasonable" amount of time, and that instead of working to fix this deficit, they are asking that users refrain from publishing their findings in the first place. If we are expected to relinquish our right to publish our findings of insecure programming, how can we reasonably expect such a company to provide security fixes in a "reasonable" amount of time at all? Should Microsoft have its way, I would be surprised if we ever saw a security fix released.

Microsoft's Unfair Competition Ploy
If Mr. Culp's article can be taken as a prelude to future Microsoft policy, then he appears to be stating that the company plans to move on this issue by generating industry-wide support (no doubt through various propaganda campaigns). Envision for a moment the next step.

Should Microsoft be successful in garnering industry support (at least by spreading misinformation in the form of FUD), it would most likely push for a Congressional bill making it illegal to disseminate knowledge of security vulnerabilities. Such a bill would make it illegal for a person to post to a public forum, or any other publicly accessible information site, about how to create malicious software that exploits a newly found vulnerability. The bill would be all-encompassing, meaning that an individual could not post information about vulnerabilities in any operating system, whether from Microsoft or an open-source project. Open source users would thus be unable to actively develop software such as the Linux operating system, which relies on countless users poring over its source code. It would give Microsoft the advantage of being able to hire teams of individuals to find security vulnerabilities in its software, while the Linux community, a non-profit effort, would be unable to let users report any security problems they find.

Microsoft would be gaining an enormous advantage over not just open source software in general, but over Linux, the most difficult competitor Microsoft has ever had to face.

Why Peer Review is So Important
As many security experts will agree, the best way to locate new bugs is to allow users to pore over a program's code, letting people with proper training find the bugs and flaws that every non-trivial software system has. These users, of whom there are thousands, have been instrumental in the success of the free operating system Linux, which Linus Torvalds began developing in the early `90s. Eliminating the possibility of such peer review would strip Linux of a competitive edge that Microsoft currently lacks. It would stifle innovation in the open source world by making it impossible for developers to fix new security problems in existing software (they cannot fix problems they don't know about).

Such an anti-competitive, innovation-stifling move must not be allowed to pass in America. Microsoft has already been accused of monopolistic business practices; this is only one more piece of evidence that the company plays by its own set of rules.

What To Do
Many users rely upon Microsoft to provide secure software to meet business and personal demands. With such outlandish, anti-competitive and innovation-stifling proposals emanating from such positions of decision-making power as Mr. Culp's, it is time we demand his resignation, not the end of `information anarchy.' Only through the free exchange of ideas and code will our mission-critical systems software become more secure.

Poll
Should 'Information Anarchy' be regulated?
o Yes: let the government handle it 0%
o Yes: let the users practice self-restraint 13%
o No 86%

Votes: 38
Results | Other Polls



Microsoft's Latest Ploy to Stifle Innovation | 28 comments (22 topical, 6 editorial, 0 hidden)
Entire premise of article is flawed (3.54 / 11) (#2)
by Carnage4Life on Fri Oct 26, 2001 at 06:37:00 PM EST

Did you bother to read Scott Culp's article on MSDN or did you do what the rest of your Slashdot brethren do and just take the word of CmdrTaco or whoever else it was on Slashdot who submitted the article? Your article claims Microsoft is against releasing vulnerabilities, which is untrue. Scott Culp specifically said he wasn't against releasing vulnerabilities, but instead believes that the line should be drawn at releasing exploit code.
Quotes from Scott Culp's article:
Providing a recipe for exploiting a vulnerability doesn't aid administrators in protecting their networks. In the vast majority of cases, the only way to protect against a security vulnerability is to apply a fix that changes the system behavior and eliminates the vulnerability; in other cases, systems can be protected through administrative procedures. But regardless of whether the remediation takes the form of a patch or a workaround, an administrator doesn't need to know how a vulnerability works in order to understand how to protect against it, any more than a person needs to know how to cause a headache in order to take an aspirin.

This is not a call to stop discussing vulnerabilities. Instead, it is a call for security professionals to draw a line beyond which we recognize that we are simply putting other people at risk. By analogy, this isn't a call for people to give up freedom of speech; only that they stop yelling "fire" in a crowded movie house. Most of the security community already follows common-sense rules that ensure that security vulnerabilities are handled appropriately. When they find a security vulnerability, they inform the vendor and work with it while the patch is being developed. When the patch is complete, they publish information discussing what products are affected by the vulnerability, what the effect of the vulnerability is - that is, the type and extent of damage that an attacker could cause through it - and what users can do to protect their systems. This type of information protects users by giving them the information they need to decide whether to apply the fix, but it doesn't put them at risk.
So basically your entire article is based on a false premise and puts words in Scott Culp's mouth. If you rewrite your article with justification for why exploits should be exposed as soon as a vulnerability is found, I'll be all ears. Until then, -1 is what you get from me.

PS: As for the stuff you mentioned about Open Source being the way to solve security problems, I refer you to The Myth of Open Source Security by John Viega, the author of GNU Mailman, or this hole that went unnoticed for years in the Linux kernel.

Releasing exploits as a "stick" approach (4.16 / 6) (#3)
by khym on Fri Oct 26, 2001 at 06:58:08 PM EST

Though this might not be the actual intent of releasing exploits, I thought that the reason for doing so was a stick (versus carrot) approach to get software vendors to fix the bugs: "If you don't fix this security bug in a reasonable amount of time, we'll tell the whole world how to exploit it".

--
Give a man a match, and he'll be warm for a minute, but set him on fire, and he'll be warm for the rest of his life.
[ Parent ]
So . . . (2.14 / 7) (#6)
by regeya on Fri Oct 26, 2001 at 10:43:12 PM EST

. . . checking your system against the 'sploit is double plus ungood think?

And taking exception to this line of thought (mainly because it was the stance Slashdot took) is a bad thing?

Here we go again--kuro5hin is not Slashdot, therefore the only position to take is whatever happens to be the opposite of Slashdot.

I mean, I can see taking exception to the article at hand, but to say, "I disagree, so I'm going to vote it down," is probably one of the best examples of why the democratic process that kuro5hin uses just doesn't work.

Thanks for being part of the problem. Care to be part of the solution?

[ yokelpunk | kuro5hin diary ]
[ Parent ]

What the heck are you talking about? (2.83 / 6) (#7)
by Carnage4Life on Fri Oct 26, 2001 at 10:53:20 PM EST

. . . checking your system against the 'sploit is double plus ungood think?

And taking exception to this line of thought (mainly because it was the stance Slashdot took) is a bad thing?


I don't know what post you read but I'm pretty sure it's not mine, since nowhere in my post do I say this. I claim that the author hasn't read the article, since his opinions match those of someone who only read the Slashdot blurb without reading the article.

I mean, I can see taking exception to the article at hand, but to say, "I disagree, so I'm going to vote it down,"

Nope, again you must have read someone else's article. I voted it down because the entire premise of the article is BOGUS. It presents arguments against things that the Microsoft employee did not say. This is like me writing an article attacking Bruce Perens for saying that Open Source is not a successful way to create software, when he said no such thing, then expecting people to vote it up.

is probably one of the best examples of why the democratic process that kuro5hin uses just doesn't work.

Oh, it doesn't work because people vote down the articles you like? Tough Shit. If you want to read a site where only the posts you like are viewed, download scoop, install it on your machine and you'll be good to go.

[ Parent ]
hmmm (3.66 / 6) (#13)
by montjoy on Sun Oct 28, 2001 at 03:49:07 AM EST

"an administrator doesn't need to know how a vulnerability works in order to understand how to protect against it, any more than a person needs to know how to cause a headache in order to take an aspirin. "

As an administrator this scares me. If I know what causes a headache, I know how to prevent it.

[ Parent ]

Open Source != Peer Review (3.85 / 7) (#8)
by stuartf on Fri Oct 26, 2001 at 11:44:11 PM EST

Why do people believe that open source automatically means peer review? And is there any evidence that the informal peer review style of open source is better than a formal peer review of closed source code? Frankly, I wouldn't trust 99.99% of Linux users to even be able to audit the Linux kernel code, and the same applies to the Windows source. We're talking fantastically complicated pieces of software here, just because thousands of people use it doesn't mean thousands could even attempt to review the code.

Some of lots of more than some of none (4.00 / 4) (#11)
by wnight on Sat Oct 27, 2001 at 03:03:36 AM EST

Well, there are two things here. How many people CAN hack the kernel (even just comprehend it when reading), and how do people learn to program.

As for the first... I expect that most "real" programmers (anyone who knows a couple of languages including C and has either studied formally or at least has the mindset to read Knuth, etc.) could debug kernel issues if they ran into a specific problem. I mean, I might not be able to *rewrite* the thread scheduler, or the VM. However, if I noticed a specific problem and could reproduce it consistently and with a small chunk of code, I could probably find the problem spot even if I couldn't think of a good clean way to fix it.

Even if nobody ever patched a line of kernel source, though, providing the source helps new programmers (OK, simpler tutorials help the VERY new) get good enough that they could one day contribute.

When I started using the Apple // it came with the equivalent of a compiler (a BASIC interpreter and an assembler) built into ROM. You could use it with pre-written programs, as most of my friends did, but you had the tools to do more. Today most OSes don't come with compilers, and the free ones you can get are a pain to set up (especially for a newbie). The open mentality of open source is more important than having any specific piece of source or having anyone patch it.


[ Parent ]
And how long would it take you? (3.66 / 3) (#15)
by stuartf on Sun Oct 28, 2001 at 03:42:44 PM EST

However, if I noticed a specific problem and could reproduce it consistently and with a small chunk of code, I could probably find the problem spot even if I couldn't think of a good clean way to fix it.

That's exactly the point. You're never dealing with a small chunk of code, you're dealing with thousands and thousands of lines of it. You'd have to figure out where to start before you'd even have a hope of finding a bug. And then you have all the other problems associated with reading other peoples code. To be effective at doing this, you'd probably have to spend a few months familiarising yourself with the code. Debugging code you're not familiar with is hard - unless it's Hello World.

[ Parent ]

Elaboration (4.00 / 2) (#17)
by wnight on Sun Oct 28, 2001 at 07:22:57 PM EST

Actually, I meant a small chunk of my own code with which to reproduce it. If you can write a very short test program which exhibits the problem, then you probably have a very good idea of what is going wrong.
As to how hard it is to fix... that depends on what sort of problem you found. If you know ASM you can probably check whether it's a compiler bug or a kernel bug. If it's a kernel bug, most of the code is broken into small source files that each handle a single function, so it shouldn't be hard to find. And even if you can't fix it, you can give the maintainer of that code a really good idea of what's going wrong and why.


[ Parent ]
History? (4.00 / 2) (#18)
by Surial on Mon Oct 29, 2001 at 04:28:49 AM EST

While I certainly agree that Open Source implies Peer Review, and that there is no reason why every bit of Open Source is automatically secure, history sort of proves that Open Source tends towards more security.
The other big advantage of Open Source in regards to security is that an entity with sufficient financial gain can always pay the cash to hire a developer to audit the code. The costs of such are more or less within control of the entity. Not so with Closed Source.
--
"is a signature" is a signature.

[ Parent ]
Re: History? (none / 0) (#19)
by abdera on Mon Oct 29, 2001 at 07:23:25 AM EST

history sort of proves that Open Source tends towards more security.

Like telnetd? The FreeBSD telnetd is used quite extensively beyond FreeBSD, including in NetBSD and several Linux distributions. Sure, you can use SSH, but OpenSSH was not integrated into OpenBSD until December 1, 1999, and only later ported to FreeBSD, Linux, and others. Buffer overflows have been a frequently-used exploit since 1988, and this one was not discovered (by developers/users, at least) until this year.
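
For readers unfamiliar with the mechanics, the bug class at issue is the unbounded copy into a fixed-size buffer. A minimal C sketch of the general pattern (illustrative only; this is the shape of the bug class, NOT the actual telnetd code):

```c
#include <stdio.h>
#include <string.h>

/* The classic overflow pattern: strcpy() copies until the source's
   terminating NUL, writing past dst if the input is longer than the
   destination buffer.  With attacker-controlled input this can
   overwrite adjacent memory, including the return address. */
void copy_unsafe(char *dst, const char *src)
{
    strcpy(dst, src);  /* no length check whatsoever */
}

/* A bounded alternative: truncates instead of overflowing and always
   NUL-terminates.  Returns the length actually stored. */
size_t copy_bounded(char *dst, size_t cap, const char *src)
{
    snprintf(dst, cap, "%s", src);
    return strlen(dst);
}
```

That a pattern this simple sat in widely shared daemon code for years is exactly the point about how little review it was actually getting.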

#224 [deft-:deft@98A9C369.ipt.aol.com] at least i don't go on aol
[ Parent ]

General Tendency != always (none / 0) (#20)
by Surial on Mon Oct 29, 2001 at 09:38:25 AM EST

Yes, there are some notable exceptions which all serve as excellent examples proving that you can't rely on userbase alone.

This particular example is mistargeted though; anybody using telnet is approaching braindead as far as security goes. The number of people genuinely interested in having telnet void of security holes is probably small, as I expect the majority of them to switch to things like SSH instead.

I think the more security-related a product is, the more it will benefit from Open Source style Peer Review. I don't have any proof for this, but I think it's obvious; the more security-related a product, the more people who use the product that are concerned about security, and ostensibly, the more Peer Review it gets. In addition, any peer review it gets will probably have more focus on the security aspect vs. other things like extra features, or speed/memory efficiency.

In particular, I think OpenSSH, Apache, and the (TCP/UDP)/IP layer in Open source environments are getting a fair share of security attention by volunteer auditors.

Irrelevant to all this, I do believe that any negative effects (security wise) of opening the source to a product are more than offset by the positive effects.
--
"is a signature" is a signature.

[ Parent ]

Peer review no excuse for lack of in-house review (none / 0) (#22)
by abdera on Tue Oct 30, 2001 at 08:32:26 AM EST

This particular example is mistargetted though; anybody using telnet is approaching braindead as far as security goes. The number of people genuinely interested in having telnet void of security holes are probably small, as I expect the majority of them to switch to things like SSH instead.

OpenSSH was not available until December 1, 1999. I'm not entirely sure how long the buffer overflow has been in the telnetd program, but I would expect that it has been there all along. At the very least, it would be safe to assume that it has existed since July 1995, since HP-UX 10.x is affected. So then, were all of the sysadmins using telnet between 1995 and 2000 "approaching braindead?"

I think that the best option would be to open the source to peer review, but still maintain strict in-house review à la OpenBSD.

#224 [deft-:deft@98A9C369.ipt.aol.com] at least i don't go on aol
[ Parent ]

Security a 'recent' thing? (4.00 / 1) (#23)
by Surial on Tue Oct 30, 2001 at 12:26:36 PM EST

Hmm, those are interesting facts.

I for one have only begun caring about security (partly because of increased interest in Open Source code, and partly because I am the most knowledgeable person about encryption and protocol security within my company; not that I pretend to know more than an absolute amateur on the subject).

Combined with the number of exploits recently, and the increase in script-kiddie presence on the internet, perhaps the peer-reviewing public is becoming more concerned about security? I certainly hope so.

I'd definitely agree that In-House review can't just be replaced with Open Source and hoping for some good Peer Review. I would suggest though that security-slant products such as OpenSSH will have more benefit from Peer Review than In-House review.

That still doesn't fully explain why BIND has suffered from continued security problems, or why this telnetd overflow has been around for so many years. Perhaps the 'in-house' authors of BIND were not very concerned with security, or perhaps they weren't very knowledgeable in that particular subject.

One thing I am sure of: It's so easy to make a security screwup that any extra eyes on your source are a Good Thing(TM).
--
"is a signature" is a signature.

[ Parent ]

Two incorrect assertions in your post (5.00 / 1) (#21)
by Carnage4Life on Mon Oct 29, 2001 at 09:51:07 AM EST

While I certainly agree that Open Source implies Peer Review

Open Source does not imply peer review. Peer review is the thorough examination of a product by competent developers. Nothing about Open Source implies that a.) the people looking at the source will be competent or b.) the people looking at the source will examine it thoroughly.

I went into more depth about this in my article currently on the front page of K5. There are also more links provided in my comments on that article.

history sort of proves that Open Source tends towards more security

History does not show any such thing unless your perception of history is limited to web servers and then a subset of them (IIS and Apache). If anything history has shown that there's almost no correlation between the source licensing of a product and how secure it is. For every Apache/IIS comparison, one can bring up BIND, sendmail, Linux distros (especially Redhat), wu-ftpd, and a bunch more projects that show that glaring security flaws are as much a hallmark of Open Source as they are of proprietary software.

[ Parent ]
You're smart enough to find it, just send a patch (5.00 / 1) (#25)
by Golden Hawk on Tue Oct 30, 2001 at 04:32:27 PM EST

This poster has a good point, this article only applies to obscure bugs that are right on the surface, that a general user can find.
These kinds of bugs are rare in open source code, because open source coders (and volunteers in general) really care about their work, so the number of bugs the public will see is small, and they don't qualify as 'security' bugs anyway.

So we've established that all bugs found in open source code that need to be reported will involve examining and auditing the code. If you're good enough to do that, why not just fix it yourself and send in a patch? Bug tracking lists don't mean much at all to someone who has the talent.

It seems, if these lists go down, the script kiddie will suffer, and the true hacker will grow monumentally in strength :) On top of that, Nimda's brethren will die in the womb, so people will stop caring about security and be caught blissfully unaware when the real deal decides to pay them a visit. All the while open source will be no more affected than it is now.
-- Daniel Benoy
[ Parent ]
Blah. (1.60 / 10) (#9)
by rebelcool on Fri Oct 26, 2001 at 11:47:41 PM EST

Yet another rehash of open source good/microsoft bad. News at 11.

-1.

COG. Build your own community. Free, easy, powerful. Demo site

I'm not so sure. (4.00 / 5) (#12)
by Kasreyn on Sat Oct 27, 2001 at 02:12:18 PM EST

You raise a very interesting point about how legislation may have a blanket effect on open source software in addition to corporateware, and stifle bug hunting in, for example, Linux.

But I for one am uncertain as to whether a law being passed to stop what Culp calls "information anarchy" would even touch Linux. The congresscritters who'd be voting on that would probably be doing so because they'd see it as an intellectual-property issue à la DMCA and circumvention. I can see how MS might be able to convince them that their bugs and exploits are their intellectual property, or some such garbage. I doubt they'd see it quite the same way as Culp unless they were well and truly bought off. Therefore, it would probably read along the lines of, "no disseminating bugs or sploits in commercial/closed source software". If so, then it would mean MS's standards would continue to decline while the Open Source community's would continue to go up. In the end (we hope), market forces will cause the better product to win out.

(crosses fingers and hopes hard)

So, I agree with your analysis of the current situation - partly - but not your estimation of what will happen. And by the way, Linus had help. If you (like many seem to) hate RMS too much to mention his part, then mention the thousands of sweaty geeks who programmed the rest of the Linux OS. No, I am not one of them, but I know one of them. ;-P

Also, phrases like "stifling innovation" and calling for Mr. Culp to lose his job seem a bit extreme to me, seeing as how all MS has done is more of what they usually do (so, how many years have they been calling the OSS crowd "un-American"?). We've seen this out of them before; Culp is nothing new, nor is this idee fixe of his. (pardon lack of accents, I'm not sure how to type those in.)


-Kasreyn


"Extenuating circumstance to be mentioned on Judgement Day:
We never asked to be born in the first place."

R.I.P. Kurt. You will be missed.
Tee hee. (3.00 / 2) (#16)
by quartz on Sun Oct 28, 2001 at 05:14:07 PM EST

In the end (we hope), market forces will cause the better product to win out.

You mean, the same market forces that made Bill Gates the richest man in America because he had "the better product"?



--
Fuck 'em if they can't take a joke, and fuck 'em even if they can.
[ Parent ]
Un-american? (4.00 / 1) (#26)
by fiffilinus on Wed Oct 31, 2001 at 05:49:12 AM EST

The premise seems to be that US law is valid globally. Not so. Many Open Source developers are not US citizens, so even a congressional bill as envisioned would not stifle innovation as such; it would merely hinder innovation in the USA. If mid-term to long-term economic disadvantage to the entire US nation is acceptable in the interest of a company or an industry association, so be it. The rest of the globe will be happy to watch a serious competitor bind itself.

Full disclosure exists for a good reason, folks. (none / 0) (#27)
by bediger on Fri Nov 02, 2001 at 01:55:24 PM EST

You know, full disclosure policies for lists like "Bugtraq" exist for a really good reason, despite what Scott Culp and other MSFT shill-boys say.

Back in the day, when every OS, including rubbish like CDC's NOS and NOS/VE, had a rabid, dedicated following, all the Good Systems Programmers at client sites would dutifully report security problems to the vendor. Just as regularly, Good Systems Programmers were utterly and completely ignored. Only if and when a security problem rose to the level of a Vice President at an important client site would the security problem get fixed.

The reason why "full disclosure" policies exist is the lesson of history: OS vendors do not respond to reports of security bugs without the threat of disclosing an exploit to the public. This situation is analogous to why the USA didn't initially have secret ballots ("Australian ballots"), and why the FBI and CIA didn't initially have congressional oversight - ideally, a society doesn't need secrecy around balloting, and the USA doesn't need to guard against its guards.


-- I am Spartacus.