Kuro5hin.org: technology and culture, from the trenches
Censorware - changing the debate from "filtering"

By Seth Finkelstein in Technology
Mon Mar 25, 2002 at 12:11:27 PM EST
Tags: Freedom (all tags)
Freedom

As already mentioned in another story, a Federal censorware law is now being challenged in court. For the past few months, I've been focusing on trying to change some of the ways people think about censorware. Censorware is not a "filter"; it's a blinder-box.

I've changed the way people think about censorware before. In 2001, the Electronic Frontier Foundation (EFF) honored me with a Pioneer Award for my ground-breaking work in exposing what was actually banned by censorware. I was the first person to decrypt censorware blacklists, in 1995, and I've been studying censorware in all the years since then.


By now, it's fairly common knowledge that censorware blacklists are far from the magical Artificial Intelligence claimed by the manufacturers. But that often leads only to an argument over the blacklist's accuracy.

I've devoted much recent effort to examining censorware from another perspective: the imperatives of control. Note I don't use the common term "filtering". This is not mere partisanship. I believe "filtering" is an inaccurate description, and it leads to a very misleading way of thinking about the issues. It was an excellent public-relations achievement of the censorware-makers to get much discussion to use the term "filtering", because that term focuses attention on presumably toxic material, with the censorware program viewed as some type of purification system.

But the issue is about controlling what people are permitted to read. This is a profoundly different problem. The person to be controlled must be placed into an escape-proof blinder-box. And that inevitably leads to a requirement that any site which might allow them to peek out of the box must be banned, as a threat to the control.

I've been working to publicize this fundamental control requirement in some of my anticensorware investigations:

When people think of censorware as a "filter", they think of it in terms of banning worthless or near-worthless sites. But making sure readers can't escape leads to an immediate imperative to prohibit anonymity, privacy, language translation sites, digital libraries, and so on. People don't think of censorware in those terms. And they should. Censorware is about control, not "filtering".


Poll
The most dangerous site to censorware is
o Privacy/Anonymity 18%
o Language Translators 1%
o The Wayback Machine 6%
o The Google Cache 30%
o Image Search Engines 3%
o peacefire.org 25%
o censorware.net 8%
o sethf.com 5%

Votes: 59

Related Links
o Google
o another story
o Electronic Frontier Foundation (EFF)
o Pioneer Award
o ground-breaking work
o anticensorware investigations
o The Pre-Slipped Slope - censorware vs the Wayback Machine web archive
o BESS's Secret LOOPHOLE (censorware vs. privacy & anonymity)
o SmartFilter's Greatest Evils
o BESS vs The Google Search Engine (Cache, Groups, Images)
o BESS vs Image Search Engines
o language translation sites
o digital libraries
o Also by Seth Finkelstein


Censorware - changing the debate from "filtering" | 50 comments (32 topical, 18 editorial, 0 hidden)
Write in vote (2.11 / 9) (#3)
by juahonen on Mon Mar 25, 2002 at 09:01:46 AM EST

Kuro5hin

China (4.00 / 5) (#6)
by marx on Mon Mar 25, 2002 at 09:20:07 AM EST

Maybe an interesting angle could be to make a comparison between the commercial censorware and what China has implemented. Their task is very similar, so it would be interesting to see if their approach is similar as well.

It would be quite a good PR move for the anti-censorware movement to create an association between censorware and the Internet censorship methods implemented in China.

Join me in the War on Torture: help eradicate torture from the world by holding torturers accountable.

Re: China (4.25 / 4) (#9)
by juahonen on Mon Mar 25, 2002 at 09:44:22 AM EST

Digital Freedom Network has some interesting stories on "Attacks on the Internet in China".

According to one such story, dated more than a year back, their system resembled the Western software. The blocking program, however, allowed system administrators to set their own keyword lists for blocking. The situation has changed since then. Now ISPs are required to monitor user activity such as time spent on the Internet, personal information, and the like. An interesting point here is that their new software is based on Linux, forcing ISPs to use Linux; earlier the software was also available for Windows.

There's more to censorship than banning "inappropriate" sites: While China has often been criticized for using laws and technology to censor the Net, the lack of Internet access is a far more serious form of censorship.

[ Parent ]

Governments use commercial censorware (5.00 / 1) (#39)
by cyberformer on Mon Mar 25, 2002 at 09:27:45 PM EST

While China's might be a bit different, some repressive regimes, such as Saudi Arabia, simply buy the same censorware tools that Western companies market to consumers and force on U.S. public libraries.

[ Parent ]
Censorware is a tool... not inherently evil. (3.71 / 7) (#21)
by Torgos Pizza on Mon Mar 25, 2002 at 12:08:22 PM EST

First, I admire your research into censorware and thought that cracking the encryption was pure genius. However, in this article you make it seem that filters are tools of the devil, akin to all mankind being doomed by Adam's transgression.

Censorware is a filter. It's just like any other tool out there. Be it a hammer, a chainsaw or an ice pick, they all have constructive uses in the world. They can help people have a better life. But they can also be misused in various ways, as seen in various horror movies. Censorware can be a great tool for individuals, families and yes, even communities of all types.

As you must surely know, not all censorware is made the same. There are lots of brands and products to choose from. Not all of them filter sites the same way. But it seems that your beef with them is that they either filter too much or lend themselves to "Big Brother" controlling our lives. Not all censorware is doing this. Where's the X-Files conspiracy that all censorware is working together to brainwash us into not going to goatse.cx?

Most censorware is going to decide to err on the side of caution. I wouldn't buy firewall software or hardware that didn't do the same thing. Are some things going to sneak through or be blocked accidentally? Sure. No software is perfect. But you seem to ignore that most libraries and public systems use censorware to either uphold community standards or reduce liability from lawsuits.

As I understand it, most libraries have an unrestricted computer in the back for adults doing research on topics that could be filtered. So I just don't understand what the big deal is. People want filtering at home and in the libraries. There's obviously a market for these products, or these companies would not exist. You point out that the system isn't perfect and needs correcting, which is on the money. But the issue of control puts you way out in left field when you don't even consider that there are other venues available. If people really want "privacy, language translation sites, digital libraries," as you put it, there are plenty of other ways to achieve this without censoring censorware.

I intend to live forever, or die trying.

Well... (none / 0) (#31)
by Danse on Mon Mar 25, 2002 at 03:49:24 PM EST

I think he makes a lot of legitimate points, considering that Congress would like nothing more than to mandate that all libraries filter net access (see the article on that currently on the front page). They aren't offering the libraries an easy out by letting them have unfiltered machines accessible to the public. They aren't letting the libraries decide to block based on community standards. They are trying to impose filters that are politically and religiously biased through extortionary tactics. The founders did a decent job of setting up the government, but for all their paranoia, they still underestimated the ingenuity of those who would seek to control the lives of others (aka congressmen).

An honest debate between Bush and Kerry
[ Parent ]
Control device is a tool, but is a control device (4.80 / 5) (#36)
by Seth Finkelstein on Mon Mar 25, 2002 at 07:46:59 PM EST

Handcuffs are a tool. Straitjackets are a tool. But handcuffs and straitjackets are not "peace-and-calmness filters". They are restraint devices. Consider the PR difference between "Should cops 'cuff nonviolent people?" versus "Should cops put nonviolent people into peace-and-calmness filters?"

All censorware is designed and optimized for controlling what people are allowed to read - not for "filtering" material. This is my point. And people just don't get it. They constantly reply, in a kind of non sequitur:
"But straitjackets can be used on violent criminals as well as nonviolent protesters, they are a tool. So that means straitjackets are peace-and-calmness filters, not restraint devices."
That's the power of the way the debate has been framed.

Note CIPA does not permit a single uncensorware'd computer in a library. Not one. It only allows permission to turn off the censorware if one can convince the authority of "bona fide research or other lawful purpose". But the default is censorware for all ages, on every machine.
-- Seth Finkelstein
[ Parent ]

Swipe your State ID (none / 0) (#42)
by pin0cchio on Tue Mar 26, 2002 at 09:04:49 AM EST

It only allows permission to turn off the censorware if one can convince the authority of "bona fide research or other lawful purpose". But the default is censorware for all ages, on every machine.

Define research. I could be doing research on the adult entertainment industry.

How's this for a policy? Swipe your State ID card (driver's license, etc.) through the reader to prove that an adult is present. Then an alert box pops up: "Censorware disabled for 15 minutes. [OK]" Then the adult can censor the children's research instead of having a far-from-perfect program do it for them.


lj65
[ Parent ]
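The swipe-to-disable policy pin0cchio sketches above is simple enough to model directly. This is a toy sketch, not any real censorware product's API: an override timestamp is set on swipe, and filtering resumes once the 15-minute window expires.

```python
import time

OVERRIDE_SECONDS = 15 * 60  # 15 minutes, per the policy sketched above


class TerminalFilter:
    """Toy model of the swipe-your-ID override policy (hypothetical)."""

    def __init__(self):
        # No override yet: filtering is on by default.
        self.override_until = 0.0

    def swipe_adult_id(self, now=None):
        # An adult's swipe opens a fixed 15-minute unfiltered window.
        now = time.time() if now is None else now
        self.override_until = now + OVERRIDE_SECONDS

    def filtering_active(self, now=None):
        # Filtering is active whenever the override window has lapsed.
        now = time.time() if now is None else now
        return now >= self.override_until


term = TerminalFilter()
print(term.filtering_active(now=0.0))    # True: censorware on by default
term.swipe_adult_id(now=0.0)
print(term.filtering_active(now=60.0))   # False: disabled during the window
print(term.filtering_active(now=901.0))  # True: re-enabled after 15 minutes
```

The point of the design is the default: as with CIPA as described above, the machine censors for all ages unless an adult takes explicit, time-limited action.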
filtering software problems (2.50 / 2) (#22)
by karb on Mon Mar 25, 2002 at 12:30:15 PM EST

Aren't most of the arguments against 'censorware' related to either inaccuracies of the software, or malevolence on the part of the implementors?

Is there any good reason open source filtering software couldn't solve all of these problems?
--
Who is the geek who would risk his neck for his brother geek?

No (3.33 / 3) (#24)
by greenrd on Mon Mar 25, 2002 at 12:45:58 PM EST

Did you even read the article? Inaccuracy cannot be 'solved'. That problem is not going to go away. You can't just wave your magic open source wand and create an artificial human-level intelligence. How are you going to deal with automatic translation sites, for example? How are you going to deal with Javascript-based web proxies? How are you going to protect kids from undesirable material on groups.google.com without blacklisting that entire domain?

Inaccuracy can certainly be reduced - for individual cases at least - by having an open process. However, filtering in e.g. schools almost inevitably creates an atmosphere where students (and possibly teachers as well) are afraid to ask for certain material to be unblocked. The question is, do the benefits of filtering outweigh the disadvantages caused by the inevitable blocking of appropriate material? I would say realistically probably yes, because they provide some defence against parental allegations of negligence. At least the school can point to the filters (and their policies) and say "We did our best", if some kind of scandal blows up concerning web browsing in school.


"Capitalism is the absurd belief that the worst of men, for the worst of reasons, will somehow work for the benefit of us all." -- John Maynard Keynes
[ Parent ]

problems with problems (2.00 / 1) (#25)
by karb on Mon Mar 25, 2002 at 01:40:31 PM EST

I am aware of the technical limitations of existing web-filters. However, I have yet to see any reasonable explanation as to why any of the technical hurdles facing web filtering software are insurmountable.

I really doubt that accurate filtering software requires human-level intelligence. Perhaps near-human-level image recognition in a limited domain, or the ability to recognize objectionable texts.

However, near-human abilities within a limited scope have been achieved for many AI problems. I see no reason why these particular problems are more difficult. Your web filter doesn't need to be able to play baseball, just discern whether a website is objectionable or not. Recognizing naughty pictures/words/stories just doesn't appear, at first glance, to be a horribly difficult AI problem.

In other words, I think you're too quick to assume that filtering is computationally infeasible. All the examples you cite presuppose that programmatic filtering doesn't exist.
--
Who is the geek who would risk his neck for his brother geek?
[ Parent ]

What about (4.33 / 3) (#26)
by greenrd on Mon Mar 25, 2002 at 01:50:59 PM EST

Consider the word "pussy". Understanding whether this is meant in a sexual or non-sexual sense requires knowledge of context. This requires understanding what the text actually means, hence human-level AI.


"Capitalism is the absurd belief that the worst of men, for the worst of reasons, will somehow work for the benefit of us all." -- John Maynard Keynes
[ Parent ]

youch (none / 0) (#27)
by karb on Mon Mar 25, 2002 at 02:19:52 PM EST

I'm at lunch at work ;)

No, that really doesn't involve human-level AI. It involves a subfield of a subfield of AI, usually referred to as natural language understanding (which is, in turn, part of natural language processing).

Full-out comprehension of any text is very difficult. I think that's what the Cyc project is doing, and they have had some successes. However, it is very easy to solve a small subset of the natural language understanding problem. I believe that identifying 'inappropriate' texts is fairly easy, and doesn't require even human-level understanding capability.
--
Who is the geek who would risk his neck for his brother geek?
[ Parent ]

Hmm.. (4.00 / 2) (#29)
by Danse on Mon Mar 25, 2002 at 03:05:10 PM EST

If all of this is as easy as you seem to think it is, why are there so many failures at it, and no successes at all so far?

An honest debate between Bush and Kerry
[ Parent ]
Freh (5.00 / 2) (#32)
by _cbj on Mon Mar 25, 2002 at 04:27:08 PM EST

Understanding natural language is widely believed to require full-strength artificial intelligence of the kind only dreamed about; it's not so much a subfield as an easily explainable (easily fundable) angle of attack. If you're talking present-day technology, you just can't beat 'pussy'.

[ Parent ]
The problem is what is objectionable (4.80 / 5) (#28)
by a humble lich on Mon Mar 25, 2002 at 02:26:55 PM EST

Is a naughty picture one which shows a woman's (or man's) crotch? Breasts? Ankles? Face?

Are stories about sex objectionable? What about trashy romance novels (the ones with clear descriptions of sex but nice clean language)? What about certain parts of the Bible?

Is violence objectionable? What about sites about homosexuality? Medical sites with "naughty" pictures?

Is the KKK's homepage objectionable? What about Alternative Tentacles (Jello Biafra's label)? What about the US army's home page?

My point is that even with perfect human intelligence, you could not build a filter that would block out all "objectionable" material, because there is no agreement on what "objectionable" means.

[ Parent ]

sigh (none / 0) (#33)
by karb on Mon Mar 25, 2002 at 05:23:25 PM EST

Since everybody seems to be saying that the things I'm proposing are "impossible" and "would require full-strength AI", here is a five-year-old paper on genre recognition from Xerox PARC.
--
Who is the geek who would risk his neck for his brother geek?
[ Parent ]
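The kind of statistical text classification karb is pointing at can be sketched in miniature (the training examples here are hypothetical and far too small to be a real classifier): a bag-of-words score over a handful of labeled documents, which shows both how such systems work and how crude their judgments are.

```python
from collections import Counter

# Hypothetical labeled examples - illustration only, not a real corpus.
OBJECTIONABLE = ["hot xxx pics", "xxx adult videos"]
ACCEPTABLE = ["kitten adoption pictures", "science homework help"]


def word_counts(docs):
    # Tally how often each word appears across a set of documents.
    c = Counter()
    for d in docs:
        c.update(d.split())
    return c


BAD, GOOD = word_counts(OBJECTIONABLE), word_counts(ACCEPTABLE)


def score(text):
    # Positive score leans "objectionable"; unknown words count as zero.
    return sum(BAD[w] - GOOD[w] for w in text.split())


print(score("xxx pics") > 0)                 # True: flagged
print(score("adoption help") > 0)            # False: passed
print(score("adult education videos") > 0)   # True: a false positive
```

The last line is the crux of the thread above: "adult education videos" gets flagged purely on word statistics, with no understanding of what the phrase means.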
Question: (3.50 / 4) (#23)
by FredBloggs on Mon Mar 25, 2002 at 12:34:49 PM EST

"Censorware is not a "filter", it's a blinder-box"

What's a blinder-box?

blinder-box (none / 0) (#35)
by Seth Finkelstein on Mon Mar 25, 2002 at 06:50:57 PM EST

http://www.dictionary.com/search?q=blinder

blinder
1. blinders A pair of leather flaps attached to a horse's bridle to curtail side vision. Also called blinkers.
2. Something that serves to obscure clear perception and discernment.

blinder
1. One who, or that which, blinds.
2. (Saddlery) One of the leather screens on a bridle, to hinder a horse from seeing objects at the side; a blinker.

blinder
n : a leather eye-patch sewn to the side of the halter that prevents a horse from seeing something on either side
-- Seth Finkelstein
[ Parent ]

N2H2 (none / 0) (#30)
by afree87 on Mon Mar 25, 2002 at 03:20:19 PM EST

I don't think you'll need to worry about them much longer... they've been delisted.
--
Ha... yeah.
I was a teenage N2H2 employee (4.25 / 4) (#34)
by chromag on Mon Mar 25, 2002 at 05:38:27 PM EST

Actually, I wasn't teenaged, but I did work there for nearly four years until laid off a year or so ago. I worked in IT, and as such had no input into the company's direction. I ridiculed many of the company's decisions and was fairly well-known internally as an outspoken opponent of many things N2H2 did.

I appreciate what Seth and Bennett at Peacefire are trying to do in one sense; when they're using the knowledge they gather to challenge idiotic laws like CIPA and its precedents, I'm all for it. What I detest about them in many other cases is this sort of self-aggrandizement that seems so common to them and their ilk. Bennett Haselton showed up at a couple of N2H2 shareholder meetings simply to raise hell and try to get noticed. Seth posting this topic really smacks of "look at me" instead of "look at the issues."

Look guys. You're doing good things. But you're going about them in the wrong way in many cases. I've read just about everything you've written on the subject of N2H2 specifically and censorware in general, and the one thing that you consistently get completely wrong is the motivations of the companies involved. These are freedom-loving people for the most part. They're in business to make money, not to censor information. The biggest battles N2H2 fought during my time there were with customers who had the exact same concerns that you do and who were convinced that there was some moral component to what we were doing. Nothing could be further from the truth - from the beginning of my employment there, everything was strictly aboveboard to employees and to customers. N2H2 did not categorize a site as pornography without a rigid standard being applied, a definable standard that's publicly available and that you yourself would agree with. Same with their other 30-whatever categories.

The problem, of course, is one of scale. You and Bennett liked to point to various sites that are blocked by Bess or not blocked by Bess and cry foul - that because Peacefire was blocked, or because Google image searches are blocked somewhere, it meant a decision was made that those sites were evil or bad or injurious to someone or to N2H2. The truth is that the sites fit into one of N2H2's categories, and that's all. Peacefire had instructions for circumventing filters, so it was blocked. What censorware company wouldn't block it? Be serious. The fact that it was initially marked as pornography was because that's the most-blocked category by N2H2's customers, not because it was mis-categorized or there was an evil plot to make people think it had goatse images on it.

N2H2 at its height had 100 people working 24/7 categorizing sites. They did categorize millions of sites. The multi-millions of sites that did not get categorized, or were categorized incorrectly, were, and are, the problem - not the motivations of the company or the reviewers. The reviewers were doing their best at an impossible, thankless job, and the fact that any of them lasted more than a few weeks is amazing. The managers of the review department were doing their best; the people writing the software and building the hardware were doing their best. They weren't there to censor anything; none of us were. Hell, we were all doing our best to find a way to build and maintain a categorization system that was fair and usable and reliable. There are just too many websites for that to be a realistic goal, as you well know. The only way to deal with the volume is to institute a keyword system of some sort, which N2H2 never did, for obvious reasons. Every site categorized was done by a human being, whether you believe me or not.

As long as there's a market for their software, these companies will continue making and selling it, regardless of your efforts. You should be spending your time and your research finding ways to make the market shrink, to kill CIPA, to change the attitudes of the parents and librarians and school boards who buy the products and who actually do believe that censorship is a good thing. Spending it in pursuit of the companies making the products is quixotic at best - they are NOT your enemy. My former co-workers and bosses at N2H2 don't waste any of their time worrying about you or Peacefire, believe me.

N2H2 being delisted last week is a good thing for them and for their market; they'll get out of the Internet Boom Company mode that they've been stuck in for 3 years, they'll stop worrying so much about appearing solvent for press releases and fund managers, and they'll get back to their core business of signing up educational customers as fast as they can write.

Notice I've used "censorware" throughout instead of Seth's touchstone word "filtering", even though the latter is much more descriptive. I agree that using the former word tends to polarize people more, and that's a good thing for Seth and for the filtering companies.

I apologize for my rambling. Before you ask, I have zero financial interest in N2H2 or any censorware company - I speak only as one who may have information others here do not.
--

-c
dump the zeros


Bah, I say (4.50 / 2) (#40)
by scruffyMark on Mon Mar 25, 2002 at 10:03:21 PM EST

These are freedom-loving people for the most part. They're in business to make money, not to censor information.

They're in the business of censoring information to make money.

The biggest battles N2H2 fought during my time there were with customers who had the exact same concerns that you do and who were convinced that there was some moral component to what we were doing.

Of course there was a moral component. You, and everyone at N2H2 (like all humans except perhaps the insane and the extremely retarded), are moral agents. By definition, every decision you make must involve a moral component.

If you did not do any moral thinking before making your decision, then you were simply shirking your duty as moral agents, i.e. humans. This does not eliminate the moral weight of your actions, nor does it absolve you of the morality or immorality of those actions.

While I'm at it, relentlessly contradicting you ;)

Spending it in the pursuit of the companies making the products is Quixotic at best - they are NOT your enemy

Non sequitur. The fact that you will not be able to defeat them directly does not make them any less your enemy, it merely means your tactics are inefficient. Granted, the latter statement may still be correct, but it does not follow from the former.

[ Parent ]

Bah bah I say in return (none / 0) (#46)
by chromag on Tue Mar 26, 2002 at 04:23:16 PM EST

Well put. However, I think your responses have more to do with my poor choice of words than with the points I raise.

First: N2H2 categorizes websites and provides the means by which a school administrator (for instance) can determine which of those categories their students can and cannot access. They don't censor anything directly, except in the sense that they know that most schools (again, for instance) will have the "porn" category blocked.

Second: I read what you say as an admonishment that a moral component exists in every human action or exclusion of action. True, of course. But that's not what I was referring to by "moral component to what we [N2H2's employees] were doing." I was referring to the belief of the various anti-censorware folks, as well as many customers, that N2H2's employees were categorizing a porn site as porn via some moral belief, and not via rigidly defined categories. No one at N2H2 looked at a site and said "that offends me, therefore it goes in the 'obscene' category." Instead, they looked at a site and said "that fits in the 'obscene' category by the definition written down here in front of me." Their own beliefs did not enter into it. That detachment was enforced, believe me.

Third: You're absolutely correct. The first sentence was from a different thought that I never finished writing. I was tired.
--

-c
dump the zeros


[ Parent ]
OK (none / 0) (#47)
by scruffyMark on Wed Mar 27, 2002 at 01:10:27 AM EST

Consider that put into my pipe and smoked. All perfectly valid points. Note that I'm not even sure if I'm for or against censorware, in general or in the particular context of libraries, I'm just picking nits in the discussion of it...

They don't censor anything directly, except in the sense that they know that most schools (again, for instance) will have the "porn" category blocked.

Arguably - perhaps very easily arguably - even that action of providing a tool that one knows will likely be put to use almost exclusively for the purpose of censorship is a non-trivial action of promoting censorship. Not that you were doing the censoring yourselves, as you point out. But making yourselves part of the whole industrial complex surrounding censorship, choosing to devote your efforts to extending the set of technologies devoted to censorship rather than any other human endeavour, is not ethically neutral.

This is by the same logic (although the accusation is of course nowhere near as grave) by which it is often argued that companies that manufacture handguns and market them for home use, for 'self-protection', etc., bear some of the responsibility for the rate of handgun deaths, both accidental and deliberate, where their products are sold. Similarly, they deserve some of the credit for any real reductions in crime effected by having an armed civilian populace. I personally don't believe the NRA when they claim that there is such a reduction, but that's entirely beside the point.

[ Parent ]

"Filtering" vs "Control" (4.00 / 1) (#37)
by driptray on Mon Mar 25, 2002 at 08:21:55 PM EST

Censorware is about control, not "filtering".

You seem to think that, from a strategic perspective, the word "control" will turn people off censorware more than the word "filtering" will. I don't agree.

The people that want censorware are not ashamed to admit that they want control. They want to control what people read and see - that's the whole point! And if they want control, and if there is a market for a product that gives the perception of control, then the word "control" doesn't give opponents of censorware much room to argue.

I think "filtering" is a far better word because it allows opponents of censorware to focus on how the filter is ineffective, has adverse effects (image searches, translation engines etc) and is typically politically loaded. By focussing on the "filtering", you can say how the filter lets through most of the porn, but blocks a lot of innocent and useful sites. That's gotta be a more effective argument, especially to the censorware market, ie, the people who want control. And they're the people you really need to influence.


--
We brought the disasters. The alcohol. We committed the murders. - Paul Keating
Re: "Filtering" vs "Control" (5.00 / 2) (#38)
by Seth Finkelstein on Mon Mar 25, 2002 at 08:35:13 PM EST

The people who want control are a lost cause. I've gone through many rounds of argument. To oversimplify, they'll take anything, ineffectiveness is no object.

What I'm trying to do in this instance, is to reach people who aren't so blindly enamored of control. A problem is that a debate over the accuracy of the blacklist is vulnerable to people giving the censorware the benefit of the doubt, simply on the basis that the targets are assumed worthless (even toxic) sites.

Pointing out that it's a censorware feature - an intrinsic requirement of control - to ban e.g. language-translation sites changes this debate.

The idea is that there's a population of people who are willing to accept collateral damage (as long as it isn't their own site), but balk at the idea of banning highly worthy sites merely because these sites provide services which could be escapes from the necessary control of censorware.
-- Seth Finkelstein
[ Parent ]

Loopholes are very easy to find (4.00 / 2) (#41)
by Blarney on Tue Mar 26, 2002 at 02:11:57 AM EST

While this is a pointless anecdote, I was once waiting at the airport for about 4 hours while a friend attempted to reschedule a flight. There was a Sharper Image there, running a PC with Internet Explorer that had been hacked to disallow things like the URL bar, the menus, CTRL-ALT-DEL, ALT-TAB, and ALT-F4. Basically, you could push the Home button and click on links so that you could purchase things from an array of catalogs. Naturally, the Sharper Image catalog linked to Amazon, which linked to an Amazon Affiliate program, which linked to a Macintosh magazine, which linked to their IRC channel, which linked to a list of IRC hosts, which linked to their respective service providers, including AOL, which gave me a nice search engine box. It wasn't long before I was surfing to my heart's content, checking webmail and whatnot. Best of all, the machine had a loudspeaker inside the cabinet perfectly suited to reproducing the wit and wisdom of Eric Cartman.

If this story has any point, the point would be that securing yourself against "loopholes" is a fruitless task. All it takes is one "trusted" content provider who leaves a link to someone who shouldn't be trusted, and things aren't secure anymore. Like the Akamai bust a while back - all it takes is one such site out of many thousands and the whole thing falls apart.
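The anecdote above is really a graph-reachability observation: a kiosk that "allows" one page actually allows everything reachable from it by clicking links. A minimal sketch (the link graph below is hypothetical, loosely following the chain described in the anecdote) shows the transitive closure escaping the walled garden.

```python
from collections import deque

# Hypothetical link graph: each site lists the sites it links to.
LINKS = {
    "kiosk-home": ["catalog"],
    "catalog": ["amazon"],
    "amazon": ["affiliate"],
    "affiliate": ["mac-magazine"],
    "mac-magazine": ["irc-channel"],
    "irc-channel": ["aol"],
    "aol": ["search-engine"],  # and from a search engine, anywhere
}


def reachable(start):
    # Breadth-first search over the link graph: everything a user can
    # reach by clicking, starting from the only page the kiosk allows.
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in LINKS.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen


# The kiosk permits one page, but its transitive closure is the whole chain.
print("search-engine" in reachable("kiosk-home"))  # True
```

One edge to an open-ended site (here, a search engine) is enough: securing the start page says nothing about what is reachable from it.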

So technically... (none / 0) (#49)
by Dyolf Knip on Fri Apr 05, 2002 at 03:45:04 AM EST

AOL violates the DMCA by providing a workaround for the lockdown program!

---
If you can't learn to do something well, learn to enjoy doing it poorly.

Dyolf Knip
[ Parent ]

Just wanted to congratulate you on your great work! (2.00 / 1) (#43)
by Walter Scherer on Tue Mar 26, 2002 at 09:20:22 AM EST

Moin, Seth!

I appreciate your work. Well done! Don't get out of the game!

Tschau
--
Walter

Re: Just wanted to congratulate you on your great work! (3.00 / 1) (#44)
by Seth Finkelstein on Tue Mar 26, 2002 at 10:47:38 AM EST

Thank you very much. It's good to know the effort is appreciated.
-- Seth Finkelstein
[ Parent ]
Who Watches The Watchers? (4.50 / 4) (#45)
by MrMikey on Tue Mar 26, 2002 at 11:03:50 AM EST

When you see products like Cybersitter, which "filters" dangerous pornography like, say, the National Organization for Women website, sites that discuss safe sex or homosexuality, and, oh yes, any site critical of Cybersitter or its manufacturer, it isn't hard to get cynical about filtering software or its manufacturers.

Even if you had some sort of Open Source filtering software in place, wherein the database of forbidden sites was open to examination and debate, two questions still remain: Why are you trying to keep people from reading/seeing whatever it is they want to read and see? Who is going to have oversight over your decisions?

Some people don't want their children to see pictures of penises. Some people don't want their children to read that homosexuality is OK. Some people don't want their children to read that evolutionary theory explains how life changes over time. Are all of these desires legitimate? Perhaps, if it's a parent and their children. What about a librarian and their patrons? A principal and their school? A federal official and all federally-funded libraries? Who gets to decide, and who gets to appeal a decision they don't like?

IMO, there should be no filtering at all. None. If you don't want your children seeing certain things, then you need to control your child's access, not all children's access. If you are talking about adults, then I can think of no material which could be more obscene than the obscenity that is censorship.

A menace to basic rights (none / 0) (#50)
by Truck Sa on Mon Apr 22, 2002 at 07:33:15 AM EST

The freedom of thought, the freedom of speech and the right to information are, among others, at the core of our civil rights. Without these rights a fair and free society can't exist.

The use of filtering/censoring software on the Internet jeopardizes all of those. Information is arbitrarily kept from flowing. Both the "speaker" and the "audience" are deprived.

On a private scale all this still seems bearable, trusting the benevolent dictator (parents, boss). I think it isn't. But the great threat lies in the use of such technologies by governments. Most governments - the West included - see many parts of the net as beyond their grip and try to extend their influence. One way will be the use of filter software, whether it's working properly (which IMHO is impossible) or not.

And such national censorship is of a totally new quality and will aim directly at our civil rights.



