24 Hours of Democracy

What the CDA is About

In the essay you've come from, I state that the CDA (the
Communications Decency Act) is not about "protecting the
children." While I do not want to detract from the other ideas
of that essay by jumbling too many messages together, I do
realize that such a statement needs to be backed up; hence this
analysis. It centers on two areas of the CDA: the "indecency"
provision and the "patently offensive" provision.
The "indecency" provision is the first objectionable part of the
CDA, and it bans "communication" to a minor of "filthy, lewd,
lascivious or indecent" material with intent to "annoy, etc."
But this passage once read "harmful to minors"; "harmful to
minors" is an explicitly defined legal standard, and its definition
contains exceptions for works of artistic, political or social value.
"Indecent" (and the rest of the colorful adjectives above) is not
defined, and carries no such exceptions.
Why is that important? Because the substitution was intentional,
slipped in during committee after the rest of the compromise
wording had been approved. This is a dead giveaway that "protecting
the children" is a blatant smokescreen, for if it were the reason
then "harmful to minors" would be sufficient. But "indecent" was
desired. Why? Because it's vague and undefined, and would curb
speech that, though valuable, is anathema to certain people. It
is about controlling speech.
The Patently Offensive Provision

This one is the biggie: it bans communication that is
"patently offensive" according to "community standards"
containing references to "sexual or excretory activities
or organs," from being placed anywhere on the Internet
where minors may see it. This means Usenet. This means
HotWired threads. This means IRC. Anywhere.
This means that, for example, the Library of Congress
would not be allowed to make "Tropic of Cancer" accessible
on the web. There are no exceptions for "serious literary,
political, or social value." None. And then there's the
enormous trap of "community standards" on a medium whose
technology renders location immaterial: the most restrictive
community in the country could end up setting the standard
for the entire Net.
This one is about controlling the terms of public discourse.
Period. This is the provision that would make people's
expression on the Net's public spaces rated PG.
Why it Won't Work

It won't work because the Net is an international system, and
just like there are tax havens today there will be data havens
tomorrow. If you want to publish hard-core porn, or, for that
matter, anything that might be found objectionable as per the
CDA, you'll move your server to Belize. If you want to publish
a sex story on Usenet, you'll encode it with unbreakable encryption,
bounce it off of five or six computers around the world, and
have it injected into Usenet from, say, the Netherlands.
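(For the technically curious, here is a minimal sketch, in modern
terms, of the layered-encryption trick just described: the sender
wraps the message in one layer of encryption per relay, so each
relay can peel off only its own layer, and the plaintext emerges
only at the last hop. The relay names are hypothetical, the keys
sit in one dictionary purely for the demo, and it uses the
third-party Python "cryptography" package.)

    # Toy sketch of layered ("onion") encryption for a remailer chain.
    # Relay names are hypothetical; in practice each relay would hold
    # only its own key, not a shared dictionary.
    from cryptography.fernet import Fernet

    relays = ["relay-nl", "relay-fi", "relay-mx"]   # hypothetical hops
    keys = {name: Fernet.generate_key() for name in relays}

    def wrap(message: bytes, chain) -> bytes:
        # Encrypt for the last hop first, then wrap outward, so the
        # first relay in the chain peels the outermost layer.
        for name in reversed(chain):
            message = Fernet(keys[name]).encrypt(message)
        return message

    def unwrap(blob: bytes, chain) -> bytes:
        # Each relay, in order, strips one layer with its own key;
        # only the final hop ever sees the plaintext article.
        for name in chain:
            blob = Fernet(keys[name]).decrypt(blob)
        return blob

    onion = wrap(b"the article text", relays)
    assert unwrap(onion, relays) == b"the article text"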
Another solution must be found.
The Real Solution

Filtering. Period. People say that filtering won't work (and
it is true that the current crop of systems is not foolproof),
but the truth is that technology can make for very solid systems,
even for families in which the children are the computer-savvy ones.
One can use strong cryptography to create unforgeable
ratings, for example, and block access to anything that is not
rated in a specific way. This is similar technology to what
banks use to move money around (and likely just as trustworthy).
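(To illustrate, here is a minimal sketch of what such unforgeable
ratings might look like, again in modern terms: a rating authority
signs a label for each page, the filter ships with the authority's
public key, and anything unrated, mislabeled, or forged is simply
blocked. The "G" label and the url|label format are made up for
illustration, and it uses the third-party Python "cryptography"
package.)

    # Sketch of cryptographically signed content ratings. The "G"
    # label and the url|label format are made up for illustration.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    authority_key = Ed25519PrivateKey.generate()  # authority's secret key
    authority_pub = authority_key.public_key()    # ships with the filter

    def rate(url: str, label: str) -> bytes:
        # Authority side: sign "url|label" so a rating can't be forged.
        return authority_key.sign(f"{url}|{label}".encode())

    def allowed(url: str, label: str, signature: bytes) -> bool:
        # Filter side: admit a page only if its rating verifies and
        # the label is one the parents permit.
        try:
            authority_pub.verify(signature, f"{url}|{label}".encode())
        except InvalidSignature:
            return False            # forged or tampered rating
        return label == "G"         # block anything not rated for kids

    sig = rate("http://example.com/", "G")
    assert allowed("http://example.com/", "G", sig)
    assert not allowed("http://example.com/", "G", b"\x00" * 64)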
Will it work 100 percent of the time? Likely not; an
enterprising-enough kid can always get around restrictions,
typically by finding places beyond home or school to get what
he wants. But it will work vastly better than the approach taken
by the CDA, and it is vastly less intrusive on personal
freedoms than the current law. Yet the law chooses to pay
lip service to filtering systems and use the big CDA hammer.
Why? Because it is, once again, not about "protecting
the children." That is a smokescreen, and the real deal is to
control what people say and how they say it. Keep that in mind.
Copyright © 1995-1996 Iván Cavero Belaúnde.
All Rights Reserved.