mdw
|
|
response 179 of 183:
|
Jul 31 03:08 UTC 1995 |
Ok, so let us suppose that this is in fact a problem that deserves to
be fixed. This is a social engineering problem, so it's most
appropriate to analyze solutions in terms of their social
consequences.
The problem here seems to be that we have different groups of people who
have shared different amounts of information, and we have some people
who treat other people differently based on that differing amount of
information.
One possible solution would be to make all people share the same amount
of information. There are a couple of possible variations here. For
instance, we could arrange to neither collect, store, nor share *any*
information. This is the mininformation solution. At the opposite
extreme, we could insist everyone provide complete and accurate
information, and institute a comprehensive verification program to
guarantee the integrity of the information we've collected. This is the
maxiverification solution. Yet another possibility is to fill in any
information not provided by the user, with fake information invented by
the system. This is the fakeinformation solution.
All of these solutions are technically feasible. Most of these
solutions have, in fact, been implemented at other locations, so we are
in an excellent position to predict the consequences of implementing any
of them on grex. For instance, with the mininformation solution, we can
predict a greater rate of problem behaviors. People with common
interests are less likely to find each other, and less likely to stay
around. People who are out merely to harass others are likely to
enjoy the perceived lack of "accountability". In the real
world, such systems are generally used when there is little actual
interaction between people, the numbers of people are large, & the
people will be together for a relatively short while.
The maxiverification technique is more popular when a small group of
people are expected to spend a considerable amount of time interacting -
job applications are a typical example. As with the mininformation
solution, we can predict its effects here: we would lose a certain
percentage of users right off who are paranoid of "big brother". We
would accumulate a new percentage of users who would be less tolerant of
"problem behaviors", and would demand that access be cut off to people
who are perceived to exhibit "problem behaviors". "One strike and
you're out" type thinking. This is in fact the traditional manner in
which most computer timesharing systems have been run, so there is ample
precedent here. Indeed Grex, as currently operated, is very much an
anomaly.
The fakeinformation solution is perhaps the most challenging to find a
good example of, because in real life, most people find it very
uncomfortable to deal with other people who have not provided real
information. Nevertheless, there is a classic example in everyone's
livingroom - the television, where, behind the protective barrier of
thick glass and complete aseptic nothingness, we have a complex mixture
of actors, real people, drama, fantasy, fact, and fiction all wrapped up
into one attractive package.
All of these solutions suffer from one further defect: they dehumanize
or depersonalize the users by removing an element of choice from the
user. Instead of letting users decide how much they want to share with
other users, we're telling them how much they can share.
As long as we're willing to tell people how they should behave, why not
go one step further, and make people behave the way we expect them to?
That is, instead of removing the trigger that causes the problem, why
not address the problem head on--how people treat one another? Here
too, there are solutions that are technically feasible. Essentially, we
are talking about behavior modification, or various forms of
selectionism. For instance, before we let people onto grex, we might
give them a psychological profile, analyze if their behavior would be
affected by the contents of another person's .plan, and if it would
be, refuse to give them a grex account. This would ensure we'd only
have people on grex who wouldn't care, and thereby eliminates the
problem. Another solution would be to brainwash people. We'd send
people off to an intensive 24-week "boot camp", where we would classify
people according to various undesirable traits, "break" the person's old
personality, and replace it with a new more compatible grexian
personality. This would be an expensive process, so we should probably
demand a considerable amount of money from the user up front. Also, if
the person is exposed to the real world again, they might revert, so it
might be best to keep them in boot camp permanently. This might seem a
silly solution, but in fact, it's quite feasible, and there are
organizations all over the world that use only minor variations on what
I've described here.
Any of these solutions would represent a fairly dramatic change in the
way grex operates, so it is reasonable to suppose that most of the
current grex user base & staff will depart. That would produce a
temporary cash flow problem that might be difficult to resolve. I can't
think of any other solutions to "the problem" that wouldn't produce
effects similar to those of the solutions I proposed above. I'd
regret losing the present grex if any of these were to happen, and all
in all, I think the problem is not nearly as bad as the probable effects
of any of these solutions. Therefore, I agree with srw's nutshell
analysis, "it's human nature", and that this is one of those things
we'll just have to learn to live with.
|
mdw
|
|
response 182 of 183:
|
Aug 2 07:54 UTC 1995 |
We have the "official" notion of membership, and user verification, and
we have the "real" notion of being a participant on grex, and of
"publishing" information provided via newuser. It's easy to get
confused between the two; but the former is merely a nuisance artifact
of external reality and is, to the extent that it's possible to do so,
best ignored. The latter is certainly an important aspect of grex, and
by no means trivial.
Sidhe claims that there is a "reversal" of system philosophy between the
user-driven openness of newuser, & the fact that answers given in
newuser might have later consequences as that user is integrated into
grex society. I believe it's not a reversal. It's a continuation.
Each person here is free to choose the level & meaning of "belonging" -
it starts when that person decides how much and what information they
choose to share about themselves, and it continues as they choose to use
and participate in the part(s) of grex that appeal the most to them. It's
not reasonable to expect everyone to be "just the same". Different
people will be different, and will choose to associate with different
people, and that diversity of difference is the real reason for grex's
existence. You see srw's admission that he would treat people
differently based on the amount and kind of information they "publish"
as an admission of a terrible sin. I think srw is being commendably
honest, and admitting a bias that exists in nearly all of us, conscious
or unconscious.
|