
Tragedy of the Commons

April 4, 2005

Last week, I got a call from someone investigating inappropriate filtering. He was seeking examples of filters blocking access to Internet sites they shouldn't.

I couldn’t help him, since we don’t have a filter yet. We abandoned our first effort and are starting over. We’ll find something; I simply didn’t want to spend a lot of money on a Ferrari just to drive around the block.

I’m sympathetic to this investigator’s position, although I think his search will lead him to questions of application rather than technology. Internet filters can work pretty well, depending on the goals of their implementation.

For our part, we will filter Internet use through a blacklist of pornographic sites, a list updated daily by human editors.

Inappropriate blocking happens most with automated filtering, such as filtering by keywords, where the decision is made by a machine rather than a human. Humans can’t understand one another; imagine the mistakes a machine would make.
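To make that failure mode concrete, here is a minimal, hypothetical sketch of keyword filtering. The keyword list and function are purely illustrative, not drawn from any real product; the point is how substring matching blocks innocent pages along with the intended ones.

```python
# A toy illustration (not any vendor's actual filter) of naive
# keyword-based filtering: the machine blocks any page that merely
# contains a flagged substring, with no human judgment involved.

BLOCKED_KEYWORDS = ["sex", "xxx"]  # hypothetical list, for illustration only

def is_blocked(page_text: str) -> bool:
    """Return True if any flagged keyword appears anywhere in the text."""
    text = page_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

# The intended target is caught...
print(is_blocked("XXX adult content"))           # True
# ...but so is an innocent page: "Essex" contains "sex".
print(is_blocked("Essex county library hours"))  # True, a false positive
print(is_blocked("Local history archive"))       # False
```

A human editor reviewing sites one by one would never mistake a page about Essex for pornography; a substring match cannot tell the difference.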

Some people are content to live with such mistakes for the security they feel working behind a filter.

Of course, man-made blacklists are fallible, too, not least by virtue of malice, such as was exposed years ago when the secret blacklists of some popular filters were first cracked by hackers.

One overtly conservative filter permitted access to the Republican National Committee website but denied access to the Democratic site. It was a “mistake.”

But all the lists showed failings. Mistakes are inevitable, and the opportunity for a little thought control is just too seductive. That's why I remain pleased about the degree of distrust that has historically haunted the filtering industry. I think it's a healthy skepticism.

I’ve been opposed to filtering less on philosophical grounds and more on particulars, such as that early flavor of thought control, or the politicizing of the issue by the filtering industry itself, which stood to gain mightily from filtering laws, or our relatively untroubled experience here at the library.

The Internet has evolved and grown enormously in the last decade, and I find arguments in favor of filtering more reasonable now. I think the early opposition to filtering was healthy, and necessary if we were ever to find some middle ground.

Although there are many more rude sites today than just a few years ago, there are also many more sites in general. More challenging than the mere quantity of pornography are the refined technologies that defeat your every move to avoid it.

These same technologies are also used in advertising. Commercial exploitation of the Internet is as invasive as anything else.

This is a good example of the “Tragedy of the Commons.” The tragedy plays out because of this: Everyone must agree to take care of the commons, but anyone can destroy it.

The question for many people has not been “if” to filter but “when.” When does it become necessary? As a society we feel our way through a tangle of risks. Highway deaths dropped during the years when the national speed limit was 55, but we happily began driving 75 again, accepting the resulting carnage as reasonable so that most of us could get somewhere faster.

I’ll be happy to filter Internet pornography from the World Wide Web, because it’s rude stuff, but it is not the most troublesome part of the Internet. There’s still chat and email and fraud and deceit, and spyware and viruses and …

Yet, this is the Information Superhighway, and we want cheap gas, no speed limit, and no-fault insurance.

