183 responses total.
srw
response 175 of 183:   Jul 24 04:34 UTC 1995

I cannot agree at all. I would be very hesitant to share with a 
person who has nothing at stake in comparison with a person who does.
What on earth is wrong with that?

As there is nothing wrong with that position of mine
(which I will insist is the case), one concludes that, through no
wrongdoing whatsoever, those who remain anonymous and thus have nothing
at risk in their behavior will be at a disadvantage, just as Marcus said.

It is inevitable. It is group dynamics. You are tilting at windmills
trying to change that.
sidhe
response 176 of 183:   Jul 29 18:29 UTC 1995

        Calling the problem insurmountable does nothing. Facing it, and
finding a new approach, is the only way to overcome it.
davel
response 177 of 183:   Jul 30 00:11 UTC 1995

OTOH, complaining endlessly about minutiae doesn't make them problems worth
doing a lot of work to fix, either.
srw
response 178 of 183:   Jul 30 06:25 UTC 1995

Christopher, I am not just saying it is insurmountable, I am saying it
is inevitable, natural, immutable. You are wasting your time.
It is not a problem to be faced, it is a fact of life to be handled.
mdw
response 179 of 183:   Jul 31 03:08 UTC 1995

Ok, so let us suppose that this is in fact a problem that deserves to
be fixed.  This is a social engineering
problem, so it's most appropriate to analyze solutions in terms of the
social consequences.

The problem here seems to be that we have different groups of people who
have shared different amounts of information, and we have some people
who treat other people differently based on that differing amount of
information.

One possible solution would be to make all people share the same amount
of information.  There are a couple of possible variations here.  For
instance, we could arrange to neither collect, store, nor share *any*
information.  This is the mininformation solution.  At the opposite
extreme, we could insist everyone provide complete and accurate
information, and institute a comprehensive verification program to
guarantee the integrity of the information we've collected.  This is the
maxiverification solution.  Yet another possibility is to fill in any
information not provided by the user, with fake information invented by
the system.  This is the fakeinformation solution.
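As a toy sketch of the three policies just named (purely illustrative -
all the function and field names here are hypothetical, and this is not
Grex's actual newuser code), they might be contrasted like so:

```python
# Hypothetical sketch of the three profile policies described above.
# Field names and fake values are invented for illustration only.

FIELDS = ["name", "city", "interests"]

def mininformation(answers):
    """Collect, store, and share nothing at all."""
    return {}

def maxiverification(answers):
    """Insist on complete information; reject any incomplete profile."""
    missing = [f for f in FIELDS if not answers.get(f)]
    if missing:
        raise ValueError("missing required fields: %s" % missing)
    return dict(answers)

def fakeinformation(answers):
    """Fill any blank field with system-invented information."""
    fakes = {"name": "A. Nonymous", "city": "Nowhere", "interests": "everything"}
    return {f: answers.get(f) or fakes[f] for f in FIELDS}
```

For example, a user who supplies only a name would pass through
`fakeinformation` with the remaining fields invented by the system,
would be rejected outright by `maxiverification`, and would have even
the name discarded by `mininformation`.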

All of these solutions are technically feasible.  Most of these
solutions have, in fact, been implemented at other locations, so we are
in an excellent position to predict the consequences of implementing any
of them on grex.  For instance, with the mininformation solution, we can
predict a greater rate of problem behaviors.  People with common
interests are less likely to find each other, and less likely to stay
around.  Other people who are out merely to harass other people are
likely to enjoy the perceived lack of "accountability".  In the real
world, such systems are generally used when there is little actual
interaction between people, the numbers of people are large, & the
people will be together for a relatively short while.

The maxiverification technique is more popular when a small group of
people are expected to spend a considerable amount of time interacting -
job applications are a typical example.  As with the mininformation
solution, we can predict its effects here: we would lose a certain
percentage of users right off who are paranoid of "big brother".  We
would accumulate a new percentage of users who would be less tolerant of
"problem behaviors", and would demand that access be cut off to people
who are perceived to exhibit "problem behaviors".  "One strike and
you're out" type thinking.  This is in fact the traditional manner in
which most computer timesharing systems have been run, so there is ample
precedent here.  Indeed Grex, as currently operated, is very much an
anomaly.

The fakeinformation solution is perhaps the most challenging to find a
good example of, because in real life, most people find it very
uncomfortable to deal with other people who have not provided real
information.  Nevertheless, there is a classic example in everyone's
livingroom - the television, where, behind the protective barrier of
thick glass and complete aseptic nothingness, we have a complex mixture
of actors, real people, drama, fantasy, fact, and fiction all wrapped up
into one attractive package.

All of these solutions suffer from one further defect: they dehumanize
or depersonalize the users, by removing an element of choice from the
user.  Instead of letting users decide how much they want to share with
other users, we're telling them how much they can share.

As long as we're willing to tell people how they should behave, why not
go one step further, and make people behave the way we expect them to?
That is, instead of removing the trigger that causes the problem, why
not address the problem head on--how people treat one another?  Here
too, there are solutions that are technically feasible.  Essentially, we
are talking about behavior modification, or various forms of
selectionism.  For instance, before we let people onto grex, we might
give them a psychological profile, analyze if their behavior would be
affected by the contents of another person's .plan, and if it would
be, refuse to give them a grex account.  This would ensure we'd only
have people on grex who wouldn't care, and thereby eliminates the
problem.  Another solution would be to brainwash people.  We'd send
people off to an intensive 24 week "boot camp", where we would classify
people according to various undesirable traits, "break" the person's old
personality, and replace it with a new more compatible grexian
personality.  This would be an expensive process, so we should probably
demand a considerable amount of money from the user up front.  Also, if
the person is exposed to the real world again, they might revert, so it
might be best to keep them in boot camp permanently.  This might seem a
silly solution, but in fact, it's quite feasible, and there are
organizations all over the world that use only minor variations on what
I've described here.

Any of these solutions would represent a fairly dramatic change in the
way grex operates, so it is reasonable to suppose that most of the
current grex user base & staff will depart.  That would produce a
temporary cash flow problem that might be difficult to resolve.  I can't
think of any other solutions to "the problem" that wouldn't produce
effects similar to the effects of the solutions I proposed above, I'd
regret losing the present grex if any of these were to happen, and all
in all, I think the problem is not nearly as bad as the probable effects
of any of these solutions.  Therefore, I agree with srw's nutshell
analysis, "it's human nature", and that this is one of those things
we'll just have to learn to live with.
popcorn
response 180 of 183:   Jul 31 11:34 UTC 1995

Er, but this item is about verifying those users who want to become
members, not about how much info to collect from brand new users.
sidhe
response 181 of 183:   Jul 31 21:52 UTC 1995

        Indeed, this is totally off-track.

        Grex sets a fine precedent with its "_you_ fill in the blanks"
newuser, that allows anyone to have the level of disclosure they
desire. The fact that there is a total reversal when one actually
wants to "belong" to this place is very odd.
mdw
response 182 of 183:   Aug 2 07:54 UTC 1995

We have the "official" notion of membership, and user verification, and
we have the "real" notion of being a participant on grex, and of
"publishing" information provided via newuser.  It's easy to get
confused between the two; but the former is merely a nuisance artifact
of external reality and is, to the extent that it's possible to do so,
best ignored.  The latter is certainly an important aspect of grex,
and by no means trivial.

Sidhe claims that there is a "reversal" of system philosophy between the
user-driven openness of newuser, & the fact that answers given in
newuser might have later consequences as that user is integrated into
grex society.  I believe it's not a reversal.  It's a continuation.
Each person here is free to choose the level & meaning of "belonging" -
it starts when that person decides how much and what information they
choose to share about themselves, and it continues as they choose to use
and participate in the part(s) of grex that appeal the most to them.  It's
not reasonable to expect everyone to be "just the same".  Different
people will be different, and will choose to associate with different
people, and that diversity of difference is the real reason for grex's
existence.  You see srw's admission that he would treat people
differently based on the amount and kind of information they "publish"
as an admission of a terrible sin.  I think srw is being commendably
honest, and admitting a bias that exists in nearly all of us, conscious
or unconscious.
tsty
response 183 of 183:   Aug 14 18:54 UTC 1995

It still seems to be "one strike and you're out (except for certain
situations/people)", as well as prohibiting some people from getting
past the on-deck circle.
  
I frankly acknowledge a paucity of direct information from me, about
me, until and unless i feel comfortable with the recipient. And, then
again, i've found my comfort level to be more "wishful thinking" than
"reality." So, shit happens again. A couple times i've been quite
ready to call it quits and shuffle off. You just can't trust some
stuff to some people - which is true everywhere, but i would have
thought less so in the tighter computer community that encompasses
Ann Arbor and electronic environs.
 

- Backtalk version 1.3.30 - Copyright 1996-2006, Jan Wolter and Steve Weiss