| 25 new of 78 responses total. |
rcurl
|
|
response 37 of 78:
|
Nov 12 16:15 UTC 1998 |
Since parents can control (up to a point) what books and magazines are
in their homes (but not in the library), one "fix" would be software
that (even imperfectly) scans for (levels of) "objectionable material"
(using the "contemporary community standards", for example), and then
sends a message to the parents to look at it and decide if they wish
to filter it. [I'm not yet sure that I think this is a *good* idea, but
it is an idea that meets current parental responsibility and rights.]
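A rough sketch of what such a notify-the-parents scanner might look like, assuming a crude keyword count. The word list, threshold, and page data below are purely illustrative; real "contemporary community standards" are far harder to encode than this:

```python
# Crude sketch of a parental-notification scanner: count hits against a
# placeholder keyword list and, past a threshold, queue the page for the
# parents to review and decide, rather than blocking it automatically.
OBJECTIONABLE = {"xxx", "porn", "explicit"}  # illustrative word list only
THRESHOLD = 2                                # hits before parents are told

def scan_page(text):
    """Return the number of 'objectionable' words found on a page."""
    return sum(1 for w in text.lower().split()
               if w.strip(".,!?") in OBJECTIONABLE)

def review_queue(pages):
    """Pages the parents should look at; (url, text) pairs in, urls out."""
    return [url for url, text in pages if scan_page(text) >= THRESHOLD]

pages = [("example.org/recipes", "Flour, eggs, and sugar."),
         ("example.org/banner", "XXX explicit porn pics XXX")]
print(review_queue(pages))  # -> ['example.org/banner']
```

Even this toy version shows the design point of the idea: the software only flags, and the decision stays with the parents.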
|
senna
|
|
response 38 of 78:
|
Nov 12 17:08 UTC 1998 |
CDA-type legislation all deals with drawing lines between decency and
indecency, lines that become extremely complicated, jagged, and undefinable.
The lines shouldn't be drawn in the first place.
|
rcurl
|
|
response 39 of 78:
|
Nov 12 17:13 UTC 1998 |
Does it matter if lines are drawn so long as acting on the lines is
optional? If we can choose to accept or reject any lines, then they
just become an opinion (and others can offer them too). It is the
enforcement of opinions that is objectionable.
|
senna
|
|
response 40 of 78:
|
Nov 12 17:18 UTC 1998 |
Rane, I'm referring to lines drawn by the law. The country is not in
the position to say "this is decent but this isn't." I've written
extensive thoughts on this before. The minute the government becomes a
"value judge," it has gone over the line. Legislation like the CDA
proposals draws such lines and makes it the government's responsibility
to enforce them. That is blatantly wrong.
|
rcurl
|
|
response 41 of 78:
|
Nov 12 17:21 UTC 1998 |
Of course I agree with that, but I've just suggested drawing lines anyway
but making them optional. That is midway between having no lines and
legally enforcing lines. Everyone should be happy with that, right?
|
remmers
|
|
response 42 of 78:
|
Nov 12 17:31 UTC 1998 |
Who draws these optional lines?
|
remmers
|
|
response 43 of 78:
|
Nov 12 17:40 UTC 1998 |
Who draws these optional lines?
And re the *technical* feasibility of doing the scanning for
"objectionable material" suggested in resp:37 , I see big problems. For
instance, how do you decipher a graphic automatically and determine
whether it is a photo of Shirley Temple or a Playboy centerfold? A
website with sexually explicit graphics could get past the filters by
having completely innocuous text.
|
jep
|
|
response 44 of 78:
|
Nov 12 17:49 UTC 1998 |
re #27, 32: The CDA bills are large in part because they are popular
enough to carry other legislation. The press keeps reporting it as
pro-child protection for children, because that's how most people see
these things, despite the press's own considerable self-interest in
warding off any barriers to anyone publishing anything.
re #33: 'Exploitation of children' isn't just pictures of nude
children, or even access to sexual content on the WWW. Teenagers can
easily be encouraged to call sex chat lines, pay lots of money for
access to on-line pornography sites, and have sexually explicit
materials sent to their homes.
also: The CDA II was passed and signed into law. I don't know when it
takes effect, but it's probably around the start of the year.
I agree that it is a better solution to take charge of your kids
yourself, but many people have more important things to do with their
time, such as make money and watch TV.
|
scg
|
|
response 45 of 78:
|
Nov 12 20:44 UTC 1998 |
re 31:
Those ISPs had just their news servers seized, not their entire
operations. I'm not at all convinced that that was a reasonable action by
the prosecutors, given that ISPs generally just take a full news feed,
accepting whatever is in it. The argument used by the prosecutors in that
case was that the ISPs had been notified that the child porn being
temporarily stored on their news servers was illegal, and hadn't taken any
action to remove it.
That's a little different from the CDA II. Child porn is illegal whether or
not the CDA II is upheld. The question is whether knowing that your news
server is picking up child porn automatically, and providing it to your
customers, and not doing anything to stop the automatic processes from
happening, is knowingly possessing child porn. I've heard a lot of arguments
on both sides. I'm not a lawyer, and it probably is a somewhat tricky legal
question, so I have no idea whether that argument makes legal sense or not.
One thing that action has done is give lots of news server operators a good
excuse to get rid of newsgroups such as
alt.binaries.pictures.erotica.pre-teen and the like, groups that had enough
traffic to be quite noticeable from the resource-utilization standpoint, and
which I imagine very few news server administrators felt comfortable
spending their resources on.
|
i
|
|
response 46 of 78:
|
Nov 13 03:12 UTC 1998 |
Given an easy-to-filter .SEX domain (or something equivalent), how much
incentive is there for porn providers to bypass the filter? Given a
bypass problem, does it have enough well-financed enemies to create a
healthy market for the moral equivalent of anti-virus software? (Most
employers don't want their workers doing any smut surfing on company
time - what will they pay for effective blocking?)
|
drew
|
|
response 47 of 78:
|
Nov 13 03:20 UTC 1998 |
I can't help but wonder how the "children" whom the CDA-II bill seeks to
protect feel about this. Do they feel endangered by the presence of porn,
or a need for such "protection"?
And I especially oppose bans on *possession* of just about anything,
especially patterns of magnetic force on a disk platter! Excessive do's and
don'ts.
|
mcnally
|
|
response 48 of 78:
|
Nov 13 07:36 UTC 1998 |
re #46: Some sex sites deliberately masquerade as popular sites,
choosing names like those sites' to catch people who slip up when
entering their URLs. (For example, there is (or was) a site called
"www.whitehouse.com" which was set up to catch people headed for
"www.whitehouse.gov". Since many people are conditioned to assume URLs
end in ".com", I'm sure it got a lot of accidental hits.) Such a site
is not going to want to clearly distinguish itself as a sex site.
I'm not very fond of most of the filtering options proposed so far, but
I think what we'll eventually wind up with is some sort of self-
rating system where material "harmful to children" can be posted freely
on the net so long as those providing it mark it as adult material,
with substantial penalties for deliberately misrepresenting harmful
material. Sites wanting to avoid the issue completely could just mark
everything on their site as "adult", making it unavailable to children
but keeping themselves safe from legal difficulties, at least those of
the CDA sort. I suspect there will be a problem with a "chilling effect",
where some sites with material appropriate for children might
nevertheless mark it off limits just to be on the safe side, but such a
scheme would at least be better than the ham-handed regulatory solutions
we've been offered so far.
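A sketch of how such a self-rating check might work on the client side, assuming a hypothetical "Rating:" label field (the real-world candidate for this role at the time was the W3C's PICS labeling standard):

```python
# Sketch of a client-side filter honoring self-applied ratings. The
# "Rating" label convention is invented for illustration; under the
# scheme described above, unlabeled material passes, which is exactly
# where the "chilling effect" worry comes in.
def is_blocked(labels, viewer_is_minor):
    """True if this viewer should be kept away from this material."""
    label = labels.get("Rating", "").lower()
    return viewer_is_minor and label == "adult"

print(is_blocked({"Rating": "adult"}, viewer_is_minor=True))   # True
print(is_blocked({"Rating": "adult"}, viewer_is_minor=False))  # False
print(is_blocked({}, viewer_is_minor=True))                    # False
```

Note that a site marking its entire contents "adult" to stay clear of liability needs only the one label, which is what makes blanket over-labeling so cheap.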
|
remmers
|
|
response 49 of 78:
|
Nov 13 12:10 UTC 1998 |
If the scenario in the last paragraph of resp:48 comes to pass, I can
foresee real problems for open systems like Grex that don't pre-screen
content and don't *want* to pre-screen content. Anybody can come along,
post something, and it becomes immediately web-accessible. That's a big
part of how we build our community. Do we make users self-rate the
material they enter? What happens if they don't?
That's one of the reasons I support Grex taking a visible stand now on
the whole business of content regulation.
|
eieio
|
|
response 50 of 78:
|
Nov 13 14:02 UTC 1998 |
Re 48: Yeah, right when there was a lot of interest in the Mars Pathfinder,
a lot of people mistakenly hit "http://www.nasa.com". Which, yes, was
exploration of *sorts*...
I'm still unconvinced about the feasibility of web filtering. There was a
school system in England that put up one of the popular smut traps. It looked
for various naughty combinations of letters, and if it found any of them, it
wouldn't let the page through.
Apparently people in Scunthorpe only made the connection after their kids
couldn't find a single bit of information about their town.
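The failure is easy to reproduce: a filter that matches letter combinations anywhere in the text will block "Scunthorpe" along with its actual targets, while a whole-word check would not. A minimal illustration (the block list here is hypothetical):

```python
import re

BLOCKED = {"sex", "cunt"}  # stand-in for the trap's word list

def naive_filter(text):
    """Block if any bad string appears anywhere, even inside a word."""
    low = text.lower()
    return any(bad in low for bad in BLOCKED)

def word_filter(text):
    """Block only on whole words, letting place names through."""
    return any(w in BLOCKED for w in re.findall(r"[a-z]+", text.lower()))

page = "Scunthorpe town council information"
print(naive_filter(page))  # True: the town name is swallowed whole
print(word_filter(page))   # False
```

Whole-word matching fixes Scunthorpe (and Essex, and Sussex) but still does nothing about graphics, misspellings, or innocuous text on explicit sites, which is the deeper feasibility problem raised in resp:43.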
|
morpheus
|
|
response 51 of 78:
|
Nov 14 06:38 UTC 1998 |
re #50: *snicker* how amusing... almost as amusing as whitehouse.com...
re Jed's response #44 to *my* entry 33: the obvious solution to prevent
children from being "exploited" into *buying* pornography or calling a
phonesex line (oh, gee, horror of horrors, do you think mommy and daddy
might realize that THEY didn't call the 900 number and actually do
something about it, instead of passing bullshit legislation that
filters MY THOUGHTS?) is to give away pornographic materials for free.
So, I am glad that Jed has pointed out to me that the obvious solution
to this whole debate is to do nothing at all, so that children can't be
exploited in this horrid manner, and can continue to receive free fuck
photos (oh, the alliteration in that last sentence!). Damn those
capitalistic bastards for trying to corrupt the kids by making them BUY
PORN!
Of course, last time I checked, most if not all of the commercial sites
on the internet require a credit card to verify age, so the logical
conclusion is that the little brats will have to STEAL a CREDIT CARD or
commit CREDIT CARD FRAUD, in which case whacking off to pictures of
Pamela Lee Anderson isn't going to corrupt them a lot further.
In any event, censoring me isn't something that the government should
be wasting my tax dollars to do. I don't like getting a bad thing for
free, let alone paying one fifth of my income for it.
Christ. Each time I see this kind of thing, I am reminded that this
country was founded by religious nut-jobs, and that they haven't
stopped trying to "convert the savages" since they landed here. Since
all of the injuns are off on reservations and in casinos, however,
they have taken it upon themselves to see me as a savage and convert me
and my kind.
Well, I am done ranting now, so I am going to go have sex with someone
and take photos of it. Have I got any bidders? (btw, I am a minor!)
|
bru
|
|
response 52 of 78:
|
Nov 14 21:03 UTC 1998 |
Obviously you have no idea of what you speak.
You don't need to spend any money to find sex sites on the net, very graphic
sites as a matter of fact. And while they may sell time on their sites, they
do offer a wide variety of pictures to lure you in for free.
|
morpheus
|
|
response 53 of 78:
|
Nov 15 09:38 UTC 1998 |
yeah, true, but jep said that one form of "exploiting" youth is getting
them to *buy* (yes, he did say buy) pornographic materials.
So, I was responding to that ;-)
|
jep
|
|
response 54 of 78:
|
Nov 16 15:24 UTC 1998 |
re #51: I think you misread my message, just as you did my loginid. (-:
|
gregb
|
|
response 55 of 78:
|
Nov 18 04:18 UTC 1998 |
After reading all this, I keep asking myself, "What are these folks /really/
trying to defend? Free speech, or the right to look at porn/erotic (semantics)
material online?" No one has said a word about how this affects other genres
of literature, art, etc. Because it doesn't. And AFAIC, that's fine. If your
interests are centered on sexually-oriented newsgroups, Web sites, etc., then
I'd say you have a pretty narrow view of the world. There's more than enough
other kinds of resources out there that you shouldn't even care if this
particular segment is eliminated. 'Sides, if you're that hard up (pun intended)
for stories, pics, what-have-you, there's still all the books, mags, flicks,
and um, other goodies available via various legal channels.
|
scg
|
|
response 56 of 78:
|
Nov 18 04:49 UTC 1998 |
The original CDA certainly did spill over into areas it wasn't "intended" to
cover, in that it both used such a vague definition of what it banned that
it banned almost anything that could possibly be objectionable, and it also
held operators of systems responsible for content posted to those systems by
their users. To take a system like Grex, for example, as long as all the
discussion stayed "decent", we would have been fine. However, if somebody
posted something objectionable, Grex would have been liable for what was
posted. The problem with that is that, even if we were to accept that
everything the law said was bad was stuff we didn't want on our system, we
still couldn't enforce that without getting rid of our open access policies.
The choices would have been to shut Grex down, or to have the Grex staff read
every response before it gets posted to the conferences, and preread
everything said in party, or to ban minors from the system. That wouldn't
just have affected Grex; it would have affected everywhere on the Net that
allows the general public to post content. It was worse, too, at least in
some versions. I'm
not sure what finally got passed, but early versions of the bill would have
held operators of mail servers liable for the content of their users' e-mail,
even though it's generally not legal for a service provider to read a
customer's e-mail, and some early versions of it would have even held Internet
backbone providers responsible for data that flowed through their networks,
even if the originator of the data wasn't even a customer of theirs.
The current bill is vastly different, and as such probably should not be
referred to as CDA II. It holds only originators of content liable for
content, and has a much stricter definition of what content it bans. The
content that it bans follows what, if I understand correctly, is pretty much
the usual definition of what constitutes obscene material, except that they've
said it has to be obscene with respect to minors seeing it, instead of taking
the straight definition of obscenity, which is illegal anyway. The legal
issue, then, is whether adults on the Net wanting anonymous access to content
should be allowed to access content that it's not legal for minors to be given
access to, or whether restricting what adults can do anonymously to the same
level as what kids are allowed to do is legal. In the case of the original
CDA, one of the Court's objections to the bill was that it was restricting
minors in ways that it was not legal to restrict adults, and then holding
adults to the same regulations, so it seems likely that the courts will have
the same objection to this one as well.
As a practical matter of what content this law bans, yes, it's mostly
porn. As such, while I agree that this is a free speech issue and it's
probably therefore bad legislation, I find it somewhat hard to get extremely
worked up against it. However, given a still somewhat vague definition of
what got banned, overzealous prosecutors could probably use this against some
stuff that isn't porn as well.
|
mcnally
|
|
response 57 of 78:
|
Nov 18 07:45 UTC 1998 |
re #55: probably one of the reasons people are talking mostly about
sexually explicit material is that it's one of the few classes of
information that the law presumes is automatically "harmful to children"
(which is the criterion of the proposed bill.)
There are certainly other things on the Internet that could be judged
"harmful to children" but it isn't completely clear that they would be.
Off the top of my head, I'd expect that if the law stands up we'll
eventually see cases in which someone tries to use the law to control
some of the following kinds of information:
o bomb-making instructions
o drug-making instructions or drug-legalization literature
o racist hate literature
o non-graphic information on sex-related topics such as abortion
or birth control.
o 'satanic' or similar literature
|
mdw
|
|
response 58 of 78:
|
Nov 18 08:43 UTC 1998 |
Perhaps the people who drafted CDA II thought it was only aimed at
"originators", but the language of the bill is worded much broader and
isn't nearly so specific. I'm sure someone who merely forwarded
material deemed harmful would be considered just as liable under the
law. In usenet, the moderators of certain newsgroups could almost
certainly also be held liable. If these interpretations of the law
survive the courts, then I would expect the prosecutors will next try to
go after some adult BBS operator (if there are any left that have any
sort of open registration system, or if the prosecutors are able to show
that whatever registration system is in use is somehow inadequate.)
If *those* court cases pass muster, then I would expect grex to be at
serious and direct risk. When CDA passed, about half of the grex board
expressed *very* cold feet about assuming such a risk on behalf of grex.
It would be a very good thing to ask the current board candidates how
they would feel about grex if this should come to pass.
I would also expect prosecutors will also be interested in somehow
"controlling" the unmoderated newsgroups that contain material that was
found "harmful" in other court cases. I don't know who the prosecutors
will go after there, but I suspect ISPs that offer unrestricted news
access will find it harder to keep these newsgroups.
|
rcurl
|
|
response 59 of 78:
|
Nov 18 16:19 UTC 1998 |
All Greg tells me in #55 is that he doesn't like porn. He is, of course,
entitled to that opinion. However, he is not the only person in the world,
and those who like the genre have as much right of access to it as
they (and Greg) have to any other form of speech. All I read in #55 is
the voice of a censor of free speech.
|
drew
|
|
response 60 of 78:
|
Nov 19 03:22 UTC 1998 |
Gotta agree with Rane on this one. I don't much care for porn myself -
B-O-R-i-n-g! But I oppose such stuff as CDA I and II on general principles
of minimum do's and don'ts.
What is so damn special about sex and "porn", anyways? And just how does
looking at a picture of a naked woman, or even people screwing, manage to
cause harm to a "minor" or anyone else? I note that at least one "minor" -
responded above - does not seem to feel the need for such "protection". And
that a lot of legislation passed in the name of "protecting children" pushes
children around more than anyone else.
|
rcurl
|
|
response 61 of 78:
|
Nov 19 05:06 UTC 1998 |
I keep asking that too, but get no answers. I see no harm to minors from
porn *if they are brought up with a full understanding of human biology
and behavior*. Well, we know that seldom happens, so all this fuss about
porn is because of the failures of adults. Pretty lousy reason for
censorship.
|