Kuro5hin.org: technology and culture, from the trenches

Hack reporting

By Signal 11 in Op-Ed
Sat Dec 16, 2000 at 06:08:27 PM EST
Tags: Round Table (all tags)
Round Table

I've seen quite a few articles on the topic of people finding holes in websites, or default passwords (or no password at all) left on critical routers, xDSL customer equipment, etc. A lot of people want to do the right thing but are afraid of getting in trouble for trying to help by reporting these things. I have a few proposals that might help; read on for the details.

First off, I'd like to share my own opinion on the issue. I feel that right now there is a lot of paranoia in the world about computers and computer security, and with it a lot of misinformation. Even in the industry, among competent people, the same problems exist - paranoia and misinformation, although to a lesser extent. People are afraid to advance their knowledge of systems and network security because they fear they will become the suspect in the next (inevitable) security compromise. I believe that, like myself, many people who read Kuro5hin have seen flawed security setups during normal use of a system, and would fix and/or report them, if not for the personal risk to themselves.

In a previous article, I outlined a somewhat different approach: using government intervention to help secure critical parts of the network. While that may very well be a viable option, I have another one as well - a private organization with no ties to the government.

This organization would maintain a contact database for a variety of ISPs, and would be able to verify vulnerabilities as they are reported by people who know what to look for. Any such test would, of course, be logged for legal purposes. These individuals would contact the appropriate people, informing them both of the problem and of a solution (if available). Anonymity of the person reporting the vulnerability would be assured both by policy and by safeguards built into the reporting system. Although based on trust, it would be expected that the reporter keep the information to him- or herself for a few weeks while the issue is addressed.

Reputation would be critically important to such an organization, as would having several lawyers on tap to deal with the frequently oblivious and belligerent administrators (and law enforcement!). The opportunity to fix their systems on their own would, of course, be made available to them. In any event, a publicly accessible database would note vulnerabilities that have been discovered and their current status. To keep script kiddie activity to a minimum, only organizations which have refused to acknowledge or fix a problem would be listed in the database - this is so that the person making the initial report could obtain a status update without revealing his/her identity.

I believe that an organization which could rapidly respond to these problems and ensure an anonymous reporting system would be invaluable. Provided such an organization could maintain good standing in the security community and not engage in "black hat" activity, I suspect law enforcement would be willing to work with it, or at least cut it some slack, in dealing with and responding to possible situations before they become front-page news. Similar to "crime hotlines", this would act as a computer security hotline. Maybe at some point a reward might be offered for people locating vulnerabilities in critical systems - who knows?

There is ample evidence to suggest that this is workable from a legal angle. Again, looking at "crime hotlines" run by private organizations, these places have obviously been able to maintain the anonymity of the reporter. Hotlines have also been put to some less stellar uses. In addition, organizations like Dance Safe have shown up as well. In brief, they offer to test the ecstasy you may get at many raves to see if any impurities have been added, as such things can pose a health risk. Law enforcement in many areas has been receptive to this, preferring to "go easy" on ecstasy users in exchange for lowering the number of hospitalizations - an equitable tradeoff, I think.

Conclusion: I think a private organization with a good reputation would be able to help fix these problems and bring attention to a major issue, and do so without having people take personal risks - thus improving both the number and the substance of reports made about security vulnerabilities in the wild.

Your thoughts?




Would you use such a service?
o Yes 40%
o No 14%
o Inoshiro 44%

Votes: 61
Results | Other Polls

Related Links
o Kuro5hin
o previous article
o less stellar
o Dance Safe
o Also by Signal 11

Hack reporting | 24 comments (24 topical, 0 editorial, 0 hidden)
Funding will be an issue. (4.00 / 4) (#1)
by NicGCotton on Sat Dec 16, 2000 at 03:14:08 PM EST

Few governments would give such a body funding without it having a proven track record, not to mention outlined reporting and security standards. And the organization couldn't get these without funding. How could this circle be broken?

Also, how would such a group be able to operate without hiring anyone with a less than clean record, and how would they guarantee the behaviour of said persons?
<i don't like sigs>
reputation (3.00 / 8) (#2)
by Signal 11 on Sat Dec 16, 2000 at 03:26:23 PM EST

How could this circle be broken?

For the same reasons people write code for the Apache project or submit patches for the Linux kernel: reputation and recognition. If nothing else, it would look good on your resume. That, and hanging around like-minded people is always a plus. Sometimes just having the right environment and attitude goes a long way towards recruiting people for something like this early on. If I had the funds (which I may have in about 6 months) I would happily start up such a group in the Minnesota area, get a couple of servers, and start working - if nothing else, it's an excellent opportunity to meet other cool people.

The other incentive is that people who did a good job on this project would be more appealing to private companies like uu.net and QWest, because they need talented security people. In addition to private-sector recruitment, those people would likely be in demand for government positions as well, such as at the DoD or in local law enforcement. Obviously the people working inside this organization would be well known to law enforcement, given the material, so I'm not terribly concerned about having a "cozy" relationship with them. So long as they stay on "their" side of the fence, there would be no issues.

Also, how would such a group be able to operate without hiring anyone with a less than clean record, and how would they guarentee the behaviour of said persons?

You can't, but you can limit it: put procedures in place within the organization such as logs, packet sniffers, peer review, etc. Same way law enforcement does it, but a little more relaxed - we keep records. Obviously there will be a few people who push the envelope, and they would be dealt with as they would at any other company or non-profit organization. Can it be prevented? Of course not; everything depends on some level of trust. But I think the opinion of your peers and meeting in meatspace would go a long way towards preventing such activity.

As far as a "clean record" goes, I couldn't care less about that, and an organization like this need not dismiss those people out of hand either. I can't speak for everyone else, or for anyone who might create such an organization, but my opinion is that everyone in security has done some things that maybe they regret. Just about all of us have gone places we shouldn't have been, etc. The important thing is that the person is mature and recognizes the responsibility they have, both to the organization and to the 'net at large. It's a matter of trust - we've all made mistakes, so let's try to move forward instead.

Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

Re: reputation (4.25 / 4) (#3)
by forgey on Sat Dec 16, 2000 at 04:03:08 PM EST

Well, you may not care about someone's clean record, but other people are going to care about it.

If someone in this organization has been caught (or convicted of) hacking, and they happen to be involved in a report of a security vulnerability at an overly paranoid company, this may worsen the negative reaction that company is going to have towards you.

Quite a few top security companies refuse to hire anyone with a background in hacking. They simply cannot afford the negative light that would cast upon them in their clients' eyes. I have read interviews with some prominent security figures who acknowledge that their employees without records were probably doing the same illegal things in their youth as the few hackers they refused to hire - but that didn't change the fact that they wouldn't hire anyone with a record.

Sure it sucks, but that is the way it goes.


[ Parent ]
Sad, but true... (4.33 / 3) (#6)
by Miniluv on Sat Dec 16, 2000 at 04:40:22 PM EST

Several of the more prominent "hacker think tanks" have made their stands on this issue, but most of them have also waffled on it. There are also highly reputable companies, Security Focus comes to mind, who hire on a skills and maturity basis, rather than purely one of criminal record.

Sure, past history is one of the best indicators of future conduct, but sometimes you have to look past it. There are circumstances where it's an irrelevant consideration during the hiring process. For example, imagine a security equivalent of Underwriters Labs. All they do is take a product and poke, prod, fold, spindle, and mutilate it until they determine what its relative level of security is. No penetration testing of a network, no access to "secret" information - pure research. This is a situation in which hiring a convicted "hacker" isn't a downside, and may in fact be a positive: you're hiring somebody with a provable (beyond a reasonable doubt, to a jury of their peers, no less) track record of ingenuity and knowledge.

"Its like someone opened my mouth and stuck a fistful of herbs in it." - Tamio Kageyama, Iron Chef 'Battle Eggplant'
[ Parent ]

Re: Sad, but true... (3.00 / 2) (#21)
by tabish on Sat Dec 16, 2000 at 11:44:26 PM EST

The problem with that sort of setup is that there's a big difference between a tightly built and secure product and a company effectively securing their installations of that product.

OpenBSD hasn't had a remote hole in three years, but that doesn't stop the end administrator from using "password" or "password1" as their root password. (Well, maybe it does prevent you from doing that, but you know what I mean)

Besides, many of these security holes that people talk about are in custom, proprietary code... I can think of few shopping sites that would be willing to turn over their boxen (or source code) to an organization like this before they go live with a new setup, especially considering the amount of time an audit of this sort would take.

[ Parent ]
Gonna start it? (3.50 / 4) (#4)
by driph on Sat Dec 16, 2000 at 04:11:37 PM EST

Interesting idea that someone should jump on.

Anyone interested in the opportunity? I think it's something that could start small and grow as the ability and integrity of the group is proven.

How about a Scoop site? Build a set of guidelines and rules, and allow anyone to participate.. perhaps the moderation queue would be managed a bit differently than Kuro5hin's, as issues of a sensitive nature to the involved companies would spring up... It'd have to be less hands-off on the administrative end, I believe.. Maybe story submissions would be limited to trusted members, to ensure that the project holds strong to the "white hat" ideal.. That's for you to decide.. But I think it could work..

Vegas isn't a liberal stronghold. It's the place where the rich and powerful gamble away their company's pension fund and strangle call girls in their hotel rooms. - Psycho Dave
*cough* (2.00 / 4) (#9)
by Signal 11 on Sat Dec 16, 2000 at 05:10:53 PM EST

Maybe me. :) Just maybe. hehehe.

Society needs therapy. It's having
trouble accepting itself.
[ Parent ]
Dude! (2.33 / 3) (#15)
by pb on Sat Dec 16, 2000 at 05:54:35 PM EST

...like you need to do anything else that's high-profile for the rest of your life.

Don't get me wrong, I think it's a great idea, and a great cause, but... couldn't you use a break?

Study your geometry, pay your fines (or slog through your legal crud) and get back to school, and best of luck to you.
"See what the drooling, ravening, flesh-eating hordes^W^W^W^WKuro5hin.org readers have to say."
-- pwhysall
[ Parent ]
Bah! (2.00 / 5) (#18)
by Signal 11 on Sat Dec 16, 2000 at 06:46:57 PM EST

Bah, who cares what everyone else thinks? I'm not going to slow down because others can't keep up.

Society needs therapy. It's having
trouble accepting itself.
[ Parent ]
well... (3.00 / 3) (#16)
by 31: on Sat Dec 16, 2000 at 06:16:50 PM EST

As long as you don't do it as Sig11 :) ... imagine all the flames directed at you now, directed instead at an organization tracking security stuff... I wonder how many more attacks that would draw...

[ Parent ]
Look on the bright side... (2.00 / 4) (#19)
by Signal 11 on Sat Dec 16, 2000 at 06:47:22 PM EST

... we wouldn't need a honeypot for our servers.

Society needs therapy. It's having
trouble accepting itself.
[ Parent ]
A step beyond CERT? (4.40 / 5) (#5)
by Miniluv on Sat Dec 16, 2000 at 04:33:30 PM EST

The only problem I see with this idea is that it'll be very hard for this organization not to fall prey to the politics involved in this sort of situation. It's a very tough line to walk, especially with a larger ISP or hosting company, which may need a week or two to follow internal change management procedures when implementing a security fix. When is that acceptable, and when is it heel-dragging?

From my perspective, the reason CERT has failed so miserably is that it went too far towards appeasing the vendors, often at the apparent expense of the community as a whole. Bugtraq regularly sees arguments about the timing of exploit releases, and this is an issue of a similar nature. I honestly do not see any ISP or hosting provider paying for a service like this, and that's the market they'd need to sell to. Reputable hosting providers, and ISPs as well, regularly pay for external security audits, and probably feel that's all they really need to do. Honestly, it's a hard argument to refute or to agree with. Sure, security is a process, not a snapshot, and they ought to be conducting internal audits on a regular basis. On the other hand, they're a business, and thus should be striving to make the greatest profit possible without compromising service.

The other hurdle is who certifies that the employees of this third-party organization have a clue? Do we require them all to have CISSP certs? Perhaps we disallow the hiring of MCSEs? Obviously ISPs are going to be uncomfortable with this organization probing their network to verify vulnerabilities if they don't know who's running the probe, so that's one portion I'd put on the far back burner.

On the whole it's a good idea, a great idea actually, but it's gonna be a cast iron bitch for someone to try and put into practice. There're a LOT of angles to consider, and an ever-changing landscape to which they'll have to adapt. I'd be interested in being part of something like this though, so if you think it's pursuable, please keep us informed.

"Its like someone opened my mouth and stuck a fistful of herbs in it." - Tamio Kageyama, Iron Chef 'Battle Eggplant'

Yes, it's a step past CERT (2.60 / 5) (#8)
by Signal 11 on Sat Dec 16, 2000 at 05:10:04 PM EST

I think you're right that CERT failed. I also agree with you 100% that there are no hard and fast rules about what is a "good faith effort" and what is not. It's a judgement call. A company like Microsoft (hotmail.com) ought to be held to a higher standard than, say, Joe's Hometown Hardware Shop if their website has a security problem. But how much higher? I don't know.

This organization wouldn't have a blueprint to follow, and no safety net. But I would also put forth that the issue goes largely unaddressed with the current solutions. We need an organization like this. The question is mostly that of resources and reputation, not of feasibility.

Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

Intimately linked... (4.50 / 2) (#12)
by Miniluv on Sat Dec 16, 2000 at 05:25:20 PM EST

This is definitely wandering into a dark alley without a flashlight, and every lesson an organization like this can learn is going to be pretty harsh. There is some precedent for this; a model like Underwriters Labs is a good example. I think one of the best ways to get around the double standard problem is by soliciting these companies to be something akin to partners in the effort. Ask them to communicate their internal change management guidelines, at least as far as timelines are concerned, and formulate a joint procedure for notification and monitoring that allows them the best chance to fix the problem and the monitoring agency the best chance to put pressure on them with a fairly timed public release. The big thing would be to never, ever waver from this policy. No matter how extenuating the circumstances, if the public notification is to go out at 120 hours past initial notice, then it absolutely must.
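The fixed-deadline policy above fits in a few lines; the 120-hour figure is the example from this comment, not any established standard, and the function name is made up for illustration.

```python
from datetime import datetime, timedelta, timezone

# Example figure from the discussion above, not an industry standard.
DISCLOSURE_WINDOW = timedelta(hours=120)

def public_release_time(initial_notice: datetime) -> datetime:
    """Public notification goes out at exactly notice + 120 hours,
    regardless of circumstances - the policy never wavers."""
    return initial_notice + DISCLOSURE_WINDOW
```

The value of hard-coding the window is exactly the point being made: if the deadline is a constant rather than a negotiation, no vendor can argue its way to an extension.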

I think the most precious quantity in this equation is trust from the public. Bugtraq started out as virtually nothing, especially in the eyes of companies like Microsoft or Sun. These days it's a real force in the security community because of Aleph1's impartial handling of everything. If the public is involved in this project, and they see their vendors not responding, they will become a force for change. If the third party can establish quickly that it can be trusted in every instance not to waver from its public guidelines, then the vendors will concede its usefulness as a resource.

I would say active testing of the vulnerability is something to be negotiated on a situational basis. Vendors like MS, who regularly believe vulnerabilities to be purely hypothetical, will be hard pressed to argue against a public, independent review of the vulnerability. If they believe it's a real problem and are working on it, there's no need for people to go port scanning and running exploits against them.

"Its like someone opened my mouth and stuck a fistful of herbs in it." - Tamio Kageyama, Iron Chef 'Battle Eggplant'
[ Parent ]

Absolutely. (2.71 / 7) (#13)
by Signal 11 on Sat Dec 16, 2000 at 05:36:34 PM EST

This is definitely wandering into a dark alley without a flashlight, and every lesson an organization like this can learn is going to be pretty harsh.

Absolutely, I agree. But somebody needs to lay the groundwork. Even if this organization were to fail, another one could rise to take its place and learn from the mistakes of the former. This latest iteration would build off of the examples of CERT, Bugtraq, and the government's "infrastructure taskforce". Will it succeed? Maybe not, but we should try anyway.

I would say active testing of the vulnerability would be something to be negotiated on a situational basis.

I think confirmation should be required, not negotiated. If somebody reports it to you, you ought to verify it before sounding the alarm - otherwise you may fall into the trap of reporting vulnerabilities that don't exist. This organization needs a reputation of "When we report something - it DOES exist, and action NEEDS to be taken". Otherwise, just note it as an "unconfirmed report" and pass it on as a low priority item.

Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

One sticky point... (3.50 / 2) (#14)
by Miniluv on Sat Dec 16, 2000 at 05:45:40 PM EST

In theory, I agree 100% that confirmation should be mandatory, no circumstances considered. The problem is that that's not entirely realistic as the situation stands now. For product vulnerabilities, verification is a snap - you can usually set the situation up in-house and verify - but with someone else's server it's a more difficult question.

While an organization of this nature is important, it's also highly important that it operate 100% within the confines of the law, and within the boundaries of reasonability. There's no easy solution, but it's definitely something important to figure out before attempting to get this sort of thing off the ground.

"Its like someone opened my mouth and stuck a fistful of herbs in it." - Tamio Kageyama, Iron Chef 'Battle Eggplant'
[ Parent ]

Reputation would be extremely important... (4.33 / 3) (#7)
by Khedak on Sat Dec 16, 2000 at 04:44:35 PM EST

... and difficult to maintain. The suggestion of having good lawyers on tap is of course laudable, but good lawyers mean lots of money, and money means either donations or sponsorship. Nothing inherently wrong with this, but whenever money comes into play there is a potential for corruption, and since, as you said, this organization would be very dependent on its reputation, this is a serious issue.

Another important thing to note is that although as administrators all we want is for our networks to function optimally, many companies take the 'deterrent' stance. Namely, they know they're going to be compromised, so rather than focus on securing their networks, they simply threaten to call the FBI on, or sue, anyone who compromises them. Thus far, it doesn't seem as if companies are particularly concerned with keeping a lid on dangerous geeks so much as with vilifying them. If some 16-year-old kid from Norway wants to pull some shit, they'll just use it as a precedent to prove they really can extradite minors from other countries (as ludicrous as that seems).

Law enforcement in many areas has been receptive to this, preferring to "go easy" on ecstasy users in exchange for lowering the number of hospitalizations - an equitable tradeoff, I think.

Maybe, but law enforcement in many areas also fiercely rejects this type of activity, and there has in fact been legislation proposed to make information of this sort (intended to allow people to use drugs more safely) illegal, by claiming it is an exception to the First Amendment right to free speech. Similar trends have appeared in computer security, notably exemplified by Microsoft's recent decision to consider its bug information its intellectual property and to use IP laws to keep other sites from distributing it. The logical conclusion is that unless someone is licensed by Microsoft to have access to their bug reports, that someone will not have the same resources as (and thus be less effective than) someone who does. So we get into the issue of this company not only being accepted by law enforcement, but also endorsed by large software corporations. These entanglements make keeping a clean reputation very difficult indeed.

But, if someone could somehow make a reputable organization such as this, that would be cool. :)

i voted it +1 to section (3.25 / 4) (#10)
by el_guapo on Sat Dec 16, 2000 at 05:16:01 PM EST

even though i don't think it would work. it may for a while, but any such endeavors will ultimately fall to political bs. say, MS stuff gets publicly flogged while open source stuff gets preferential treatment. or vice versa - however you designed it, it would still have to be run by Very Real People(tm)....
mas cerveza, por favor mirrors, manifestos, etc.
True, but... (2.50 / 4) (#11)
by Signal 11 on Sat Dec 16, 2000 at 05:24:48 PM EST

True, but if we had a board which was in charge instead of a single person, that would tend to limit that type of preferential treatment...

Besides, in all honesty, OSS/free software has an excellent reputation for patching bugs, with a few exceptions. Microsoft, on the other hand, has been very reactive and openly hostile towards the security community. This is documented, and not the subject of much dispute. Reputation in this industry is a two-way street, for vendors, advisory groups, and response teams alike.

Society needs therapy. It's having
trouble accepting itself.
[ Parent ]

The postal system (4.12 / 8) (#17)
by jesterzog on Sat Dec 16, 2000 at 06:21:20 PM EST

One of the most irritating problems with trying to (anonymously) tell a company that their website or network or whatever is insecure is that it's traceable.

Send an email? They know who you are, or they can work it out. If it's from an anonymous address, they can bully the anonymous-address holder into giving up more information. If it's through an organisation such as Signal 11 suggests, it's only a matter of time before someone tries to bully that organisation into revealing information about the "thieving hacker" who attacked their sophisticated ROT13 encryption security measures. Either that, or takes the organisation to court, with expensive lawyers, for not collecting that information.
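For what it's worth, the ROT13 jab is apt: ROT13 is its own inverse, so "decrypting" it takes one line. A quick Python illustration (the message text is made up):

```python
import codecs

message = "the router password is still the factory default"
scrambled = codecs.encode(message, "rot13")

# Applying ROT13 twice returns the original text - it is its own
# inverse, which is why calling it "encryption" is a running joke.
assert codecs.encode(scrambled, "rot13") == message
```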

Really though, if you want to be anonymous then what's the problem with just writing it (or printing it) in a polite letter on a bit of paper, putting it into an envelope with a stamp on it and dropping it into a post box? If they're smart they have a postal address advertised somewhere, and if they're not it's their problem. (Just say you're too ignorant to understand how to use email.)

jesterzog Fight the light

Anonymous email. (4.00 / 1) (#23)
by Holloway on Sun Dec 17, 2000 at 08:09:34 PM EST

- If you're going to reply to this, please also rate it.

Very well. Sending anonymous emails to a company is a possibility; just do a search for remailers. Or do as I do and set up about two dozen freemail forwarders to bounce the email about, making sure to go via the occasional one that - due to size constraints - strips the oldest headers.

Once I've done that, I usually head for Anonymizer (or preferably a less well-known anonymous proxy thang).
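The bounce-and-strip trick can be sketched like this (hop names and the three-header limit are made up for illustration; real freemail services vary):

```python
def forward(message: dict, hop: str, max_headers: int = 3) -> dict:
    """Each forwarder prepends its own Received-style header; a
    size-constrained hop keeps only the newest few, so headers
    pointing back at the origin eventually fall off the end."""
    received = [hop] + message.get("received", [])
    return {**message, "received": received[:max_headers]}

msg = {"body": "your site is vulnerable", "received": []}
for hop in ["freemail-a", "freemail-b", "freemail-c", "freemail-d"]:
    msg = forward(msg, hop)

# After four hops with a three-header limit, the first hop
# (the one nearest the sender) is no longer in the chain.
assert "freemail-a" not in msg["received"]
```

The anonymity here rests entirely on at least one lossy hop in the chain, which is the same "hope one remailer is good" argument made elsewhere in this thread.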

== Human's wear pants, if they don't wear pants they stand out in a crowd. But if a monkey didn't wear pants it would be anonymous

[ Parent ]

Hack Reporting (3.33 / 3) (#20)
by Pertelote on Sat Dec 16, 2000 at 10:31:44 PM EST

Good idea. It may not be feasible today, but from ideas, and seeds of ideas and dreams, many things do happen. It only seems reasonable that this will have to be done soon. And it is always better for private industry to undertake monitoring than the government. (Good idea, indeed, Bo)
"Only the heir to the throne of fools fights a battle on 12 fronts!"
feasibility (3.66 / 3) (#22)
by swr on Sun Dec 17, 2000 at 07:10:41 PM EST

If a private organization does it, who will pay to keep them operating?

People aren't going to want to pay to report a vulnerability. People aren't going to want to pay to hear that they have a vulnerability. There's always banner ads, but I don't think we're talking about a high-traffic web site.

It would have to be sponsored by some large organization - probably either the government or a computer security company. Or it would have to be an all-volunteer effort.

No matter how it is done, how can anyone be certain that the organization is free of...

  1. black-hats who would use the information for illegal purposes
  2. agents from various TLAs there to keep a list of security-knowledgeable people "just in case"

Personally, I think I'd rather use the remailer network and hope that at least one remailer in the chain is good, or (if the issue is of wide enough interest) just post to Bugtraq and hope that there is safety in numbers.

Conflicts of interest (4.00 / 1) (#24)
by micco on Mon Dec 18, 2000 at 11:30:42 AM EST

One word: TRUSTe

While this organization is different from what you suggest, it should serve as a lesson in how to set up this kind of thing (or, more pointedly, how not to). They have a long history of hiding violations of their policies, shifting blame for them, or downright ignoring them when committed by companies which happen to be major contributors. They are supposed to audit a company's procedures to ensure that it does not violate certain privacy standards, but there is a long list of companies who continue to carry the TRUSTe seal of approval despite egregious privacy violations.

The point is that any organization of this type is vulnerable to conflicts of interest. It requires funding to operate, and that funding is most likely to come from high-tech companies who also comprise the subject of most of the monitoring. I'm loath to propose a governmental solution, but that seems to be one of the "cleanest" funding sources from a conflict of interest standpoint. Crime hotlines and organizations like Dance Safe run very well on donations from the community, but by their very nature they are not open to the conflicts of interest I'm discussing (except in as much as criminals are unlikely to donate to Crime Watchers, which doesn't hurt Crime Watchers very much).

It's possible that a project like this could run with minimal funding, but somebody somewhere is going to have to provide servers and bandwidth. Whether that's a business, a university, a private endowment, or a government grant, you're going to get pressure to give your benefactors the benefit of the doubt rather than publicize embarrassing information.



All trademarks and copyrights on this page are owned by their respective companies. The Rest © 2000 - Present Kuro5hin.org Inc.
See our legalese page for copyright policies. Please also read our Privacy Policy.
Kuro5hin.org is powered by Free Software, including Apache, Perl, and Linux. The Scoop engine that runs this site is freely available under the terms of the GPL.
Need some help? Email help@kuro5hin.org.
