ajax
Human hardware   Oct 18 01:50 UTC 1995

From a Q&A blurb in Popular Science, answering how much memory
capacity, in computer terms, the human brain has.
 
"Most Scientific research puts the number at between one and ten
terabytes, according to Robert Birge, a chemistry professor at
Syracuse University. ... How can researchers even hazard a guess?
Generally, it's done by taking a cross section of the brain and
counting the number of neurons.  Each neuron is assumed to hold
one bit. ... Of course, it's not that easy.  For one thing, the
brain stores information in more complex ways than a computer does.
Moreover, much of the 'data' stored in the brain is in the form
of images, which humans compress far more efficiently than
computers.  And some scientists believe neurons are capable of
storing more than one bit apiece.  These factors account for the
estimates ranging as high as ten terabytes.  Birge favors three
terabytes, 'but this is really rough,' he admits."
 
Interesting question...what do you think of those estimates?  My
gut reaction is "what, just a thousand gigs?"  It seems almost
insulting for it to be a number attainable with hard drives in
the foreseeable future.  But I have no basis for a better guess,
and that is a honking lot of data.
26 responses total.
gregc
response 1 of 26: Oct 18 08:30 UTC 1995

1.) I think your slight revulsion at the concept of human brain storage
    being obtainable with current computer hardware is based more on
    human conceit and arrogance than on actual scientific evidence.
2.) Humans have a problem with conceptualizing quantities once you
    get past "1, 2, 3, many....". 10 terabytes is a tremendous amount
    of data. OTOH, the fact that our brains have that "little" storage,
    might account for why people are always misplacing their car keys. :-)
3.) I suspect that the brain's methodology of storing information is so
    different from our current computers' paradigm that a direct comparison
    is meaningless. Even if we can directly count the smallest quanta of
    brain storage and call that a "brain bit", I suspect that 1 megabit
    of brain bits is capable of storing a great deal more *information*
    than 1 megabit of computer storage, due to the storage methodologies
    employed in the brain.
ajax
response 2 of 26: Oct 18 15:17 UTC 1995

1.) Absolutely!  If I were a computer, I wouldn't be irked to hear humans
    had more memory than me, but I'm human!  :-)

Our brains do seem to employ tremendously lossy compression (we can recall
a scene in a movie, but it's so much less detailed than a VHS tape).  I 
agree that the storage method is too different to do any close comparison.
One day I expect we or computers will understand fairly accurately how our
memory works.
rcurl
response 3 of 26: Oct 18 19:01 UTC 1995

Those ten (or whatever) teraneurons each have dendrites making contacts
with the dendrites of hundreds if not thousands of other neurons, each
synapse of which is capable of being in several states. It is not a
counting but a combinatorial question. If A neurons each have B dendritic
synapses with C possible states, how many different states can the system
have? I'll think about that another time.... 

scg
response 4 of 26: Oct 19 06:55 UTC 1995

re 2:
        if HumanMemory > MyMemory then be irked
rcurl
response 5 of 26: Oct 19 18:35 UTC 1995

My estimate is that there are C^(A choose B+1) states, where (A choose B+1)
is the binomial coefficient. If A=10^12, B=10, and C=3, the number of
possible states is about 10^(10^124). A lot of these are hardwired, but there's enough to go
around. The wonder is that we are so dumb. 
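
For anyone who wants to check that arithmetic, here is a rough sketch in
Python. It is only a back-of-envelope aid using the illustrative values
above; nothing about it is a claim about real neurons:

    # Check: states = C ** binomial(A, B+1), with the illustrative values.
    from math import comb, log10

    A = 10**12   # neurons
    B = 10       # dendrites per neuron
    C = 3        # possible states per synapse

    exponent = comb(A, B + 1)         # the binomial coefficient, ~2.5*10^124
    log_states = exponent * log10(C)  # log10 of C ** exponent
    print(f"states ~ 10^(10^{log10(log_states):.0f})")  # -> 10^(10^124)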

gregc
response 6 of 26: Oct 19 21:25 UTC 1995

Another thing that is missing here is that if the brain is a giant inter-
connected mass of memory cells, then where is the CPU? Oh, the brain is the
CPU? Then where is the memory?

We have a massively inter-connected network of undifferentiated cells that
perform *both* the jobs of memory storage *and* processing. I think it's
a major mistake to count every neuron as just a single bit. As Rane pointed
out, every neuron is many bits *and* it's a small piece of the processor too.

The differences between a human brain and a modern von Neumann style digital
computer are like comparing apples and carburetors; no direct comparison
is possible.
ajax
response 7 of 26: Oct 20 05:14 UTC 1995

Interesting, Greg.  I was thinking along those lines: our brains hold
programs and memory in a way RAM doesn't, but I couldn't put my finger
on the difference.  It's what you said, that it's also the CPU.
 
Neurons seem perhaps more like logic gates than simple bits of memory.
They're "firing" or "not firing," but doesn't that on/off output result
from a variable number of inputs (dendrites)?  As you said, they're
both memory and logic circuits.
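
To make the gate picture concrete, here is a minimal sketch of a
threshold unit, the standard textbook abstraction of that idea (the
weights and thresholds are illustrative numbers, not anything measured):

    # A neuron as a gate: the on/off output comes from a variable number
    # of weighted inputs crossing (or not crossing) a threshold.
    def neuron(inputs, weights, threshold):
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    # With these illustrative weights the unit acts as an AND gate...
    print(neuron([1, 1], [0.6, 0.6], threshold=1.0))  # -> 1
    print(neuron([1, 0], [0.6, 0.6], threshold=1.0))  # -> 0
    # ...and lowering the threshold turns the same unit into an OR gate.
    print(neuron([1, 0], [0.6, 0.6], threshold=0.5))  # -> 1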
 
Rane, interesting approach.  Considering neural connections helps figure out
how useful our neurons are, but maybe isn't relevant in computing just
the raw "uncompressed" storage capacity of the brain.  To elaborate:
 
 If you had a normalized relational database on an auto-compressed hard
 drive, the capacity on the manufacturer's spec sheet wouldn't indicate how
 much "info" it *really* stored.  The related fields and compression are
 important to the useful storage capacity, but there is still a raw,
 physical bit capacity of the hard drive.
 
 Counting what our neurons can store is similar - it may give a raw data
 capacity, but the dendrites, "data formats," etc. are what really matter.
 
Looking at *raw* capacities, I don't think bit/brain comparisons are as
far apart as apples to carburetors...more like apples to corn :-).
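
The raw-versus-useful distinction is easy to demonstrate with any
general-purpose compressor standing in for the drive's auto-compression;
the sample data below is made up:

    # Same raw capacity, very different amounts of stored "info".
    import os, zlib

    redundant = b"the cat sat on the mat. " * 1000   # 24,000 repetitive bytes
    noise = os.urandom(24000)                        # 24,000 incompressible bytes

    print(len(zlib.compress(redundant)))   # tiny: the redundancy squeezes out
    print(len(zlib.compress(noise)))       # ~24,000: nothing to squeeze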
rcurl
response 8 of 26: Oct 20 19:07 UTC 1995

There are implications of my calculation that I am only beginning to
realize, in part because of Greg's and your observations. I calculated the
total number of different possible *states*. The number is unimaginable.
It also says nothing about where state combinations are localized for any
function, or how memory and processing states are located and interact. To
create a "brain", the total available states must be highly ordered and
constrained so that they work together. This will constrain the *usable*
memory and processing states dramatically. The notion of a "bit" may be
oversimplistic, as a "bit" from the state space perspective is just one
state *of the whole brain*. A "memory" would be made up of many bits, but
the related states could be suffused all over the brain (which appears to
be the case, and sure starts sounding like "parallel" computation). Now my
brain is sore, and I must rest it for the next cognition ;->. 

srw
response 9 of 26: Oct 21 06:27 UTC 1995

You're definitely onto something. 

Each neuron acts like a gate, but a much more complicated gate than an
AND, OR, NAND, or NOR.  Some areas excite, others suppress.  Over time,
the degree, and even the sign, of the effect at any synapse is modified.
I think the "learning" and "memory" are somehow stored in these
"coefficients" at the synapses.  That's about all the clues I have.
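
One way to picture those modifiable coefficients is a Hebbian-style
update rule, sketched below. The rule and its learning rate are
illustrative assumptions, not a claim about real synapses:

    # "Learning" as slow modification of a synaptic coefficient:
    # correlated firing strengthens the synapse, and repetition can
    # even flip its sign from suppressive to excitatory.
    def hebbian_step(weight, pre, post, rate=0.1):
        return weight + rate * pre * post

    w = -0.2                        # starts out suppressive
    for _ in range(5):
        w = hebbian_step(w, pre=1.0, post=1.0)
    print(round(w, 2))              # -> 0.3: degree and sign both changed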
scott
response 10 of 26: Oct 21 12:14 UTC 1995

Sort of like bubble memory, where the "contents" are constantly being moved
around the brain as part of normal processing?  
mdw
response 11 of 26: Oct 22 01:27 UTC 1995

Well, no, not really - because one of the properties the brain has that
bubble memory does not is that it's highly redundant.  Bubble memory,
even more than most computer components, is highly serialized and
non-redundant.  You can, in fact, only look at one bit at a time, and if
any bit of the hardware goes, the whole chip goes.  Human memory is a
very different thing, and probably much more similar to a hologram than
anything else.  In a hologram, the whole image is stored "everywhere",
and if you break a hologram in two, each half still has the whole
picture; it's just coarser in each half.  In fact, humans can suffer
fairly massive amounts of brain damage and still have only minimal
effects on memory.  (It *is* possible to have damage to small areas that
have a massive effect on memory, but that's probably because it's
damaged the storage & retrieval parts, not the actual memory itself.)

Another interesting feature of human memory is that it's actually a
two-stage process.  There's a "fast memory" and a "slow
memory" storage system.  When we first learn something, it goes into
short-term memory which is probably an electro-chemical dynamic process.
Several hours after we learn something, it goes into a purely chemical
static long-term memory process.  There are several ways to show this
must be happening - people who suffer some massive disruption of brain
processing, such as a shock, will forget everything they learned just
before the shock, but not older stuff.  There are also ways to shut down
the brain's electrical processing, and if this is done, again, the short
term memory goes, but the long term does not.

Even though we can't recall our memories in detail, that does not mean
they're not there.  In fact, the evidence is that all of that detail
*is* recorded, and what's lost is not the memory, but the ability to
retrieve it.  In certain kinds of brain operations, it's
possible to stimulate certain parts of the brain, and if this is done,
the patient will recall some random long lost experience, complete in
every detail, including sound, scent, and all.  If another nearby spot is
stimulated, a different random memory will be retrieved.
ajax
response 12 of 26: Oct 25 17:47 UTC 1995

Rane, I was thinking about your calculation of the number of states
a brain can hold.  Since a byte can represent 256 states, and two
bytes can represent 65,536 states, to convert your calculations
of states to bytes of storage, would you take log base 256 of your
huge number (10^[10^124])?  I realize it could be many orders of
magnitude off due to the limitations you later noted, and states
don't convert to bytes quite so precisely for arbitrary data, but it
might be an interesting number to see anyway.  Though I don't think
I have a calculator that will do log256(10^(10^124)).
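
Actually, no calculator needs to hold the number itself, because only
the exponent matters: log256(10^(10^124)) = 10^124 / log10(256). As a
scratch-pad sketch in Python:

    # bytes = log256(10^(10^124)) = 10^124 / log10(256)
    from math import log10

    log10_states = 10.0 ** 124          # the top exponent in 10^(10^124)
    bytes_needed = log10_states / log10(256)
    print(f"{bytes_needed:.2e} bytes")  # -> about 4.15e+123 bytes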
 
Re 11, I think when I took Cognitive Psychology (intro), they
divided human memory into three types (short-, medium-, long-term?),
like remembering a phone number a few seconds until you dial it,
or a conversation you can repeat a minute later but not an hour
later, and long-term permanent stuff.  Though perhaps the medium-
term stuff does go into long term, and is just never indexed for
likely retrievability.
 
It does have some interesting parallels to computer
architecture, where storage is divided into different sections of
varying speed and permanence (registers, processor cache, ram, hard
drives, tape archives).
 
I am aware of my "memory refresh rate" when trying to remember a
phone number long enough to dial it - repeating it to myself every
couple seconds definitely keeps the memory active longer.  And it
seems like some people have something like a "direction" register,
while I can barely find north on a map!
anecdote
response 13 of 26: Oct 30 00:44 UTC 1995

I quickly read through all the previous responses and it seems no one has
yet taken into consideration one's feelings, emotions, and urges.  How can
these things be quantified?  There is more to the human brain than just
different types of memory.
srw
response 14 of 26: Oct 31 06:51 UTC 1995

I am not convinced that there is anything special about those. They are
much higher levels of processing. What is to keep us from quantifying them?
ajax
response 15 of 26: Nov 1 04:40 UTC 1995

  I don't see feelings, emotions, or urges as particularly special,
but I don't count them as part of memory, either (unless remembering
feelings/emotions/urges).  They get handled in other parts of the
brain.
 
  That borders more on the "can machines think" debate, a classic
philosophical/AI question.  It's an interesting issue.  As "smarter"
machines are created and evolve to argue the question for themselves,
I'm sure it will become an important theological/political question.
Unfortunately, I don't think that will happen in our lifetimes.
rcurl
response 16 of 26: Nov 1 07:40 UTC 1995

Re #12: it doesn't seem right to equate a "bit" with a "state". Consider
a state space of all possible ways to place a set of chess pieces on
a chessboard. That's a big number. Is one such "state" a "bit"? Is
a set of 256 of the possible states a "byte"? The chessboard can, of
course, have only *one* state at any instant, but it can change. Likewise,
the brain has only *one* state at an instant (and the state changes very
rapidly). Each state could be assigned a number, but in no sense is
there a sequence of consecutive numbers. The analogy seems flawed.
ajax
response 17 of 26: Nov 1 10:28 UTC 1995

  I do not equate a bit to a state.  A particular state, out of a
domain of N states, can be represented with a number from 1 to N.
If N=2 (maybe for a light switch, being on or off), *then* I think
a bit is comparable to a state.  But for N>2 (as in chess, where N
is a jiggagazillion), it would require a larger number of bits (in
chess, log base 2 of a jiggagazillion) to represent a given state.
 
  If it could be determined that human memory had exactly N states
(as you were estimating in #5), taking the log base 256 of N seems
a not-unmeaningful way of defining the answer to the question "how
much memory can the brain hold in computer terms [bytes]?"
 
  It's an odd question, but suppose someone asked "how much memory
does a chess board hold in computer terms?"  (Interpreted in a
canonical way - ignore "rotation" of pieces and such).  I think
the number of bytes needed to represent the layouts (or states) is
a reasonable way of answering.
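
And that turns out to be a small number.  With the simplest encoding,
each of the 64 squares is either empty or holds one of 12 piece kinds,
so there are at most 13^64 layouts; back-of-envelope arithmetic only:

    # Upper bound: 64 squares, each empty or one of 12 piece kinds
    # (6 types x 2 colors), ignoring legality -> 13**64 layouts.
    from math import ceil, log2

    layouts = 13 ** 64
    bits = log2(layouts)   # log base 2 of the "jiggagazillion"
    print(ceil(bits), "bits =", ceil(bits / 8), "bytes")  # -> 237 bits = 30 bytes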
rcurl
response 18 of 26: Nov 2 06:24 UTC 1995

But why "bytes"? That's an artifical value chosen to be just large
enough to encompass most of our written symbolic system, which is
itself artificial. I'd just leave it as "states", which are countable.
ajax
response 19 of 26: Nov 2 08:32 UTC 1995

  "Bytes" was chosen to answer the question, how much memory in
computer terms.  If you don't like the arbitrariness of bytes,
how about bits?  Makes little diff.  But if you prefer leaving
the brain number in states, it's easy enough to convert computer
memory capacity to a maximum number of states...a megabyte of RAM
could represent 256^1,048,576 states.  It just seems a lot harder
to comprehend such a big number.
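
Just how much harder is itself quantifiable; one line of scratch
arithmetic gives the digit count:

    # Decimal digits in 256**1048576, the state count of one megabyte:
    from math import log10
    print(round(1048576 * 8 * log10(2)))  # -> 2525223 digits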
rcurl
response 20 of 26: Nov 2 18:58 UTC 1995

Bits are the countable memory units. Bytes are an arbitrary aggregate of
bits. The biggest difference between computer memory and the brain is that
the brain to a considerable extent uses states, while the computer memory
by comparison is really dumb, being restricted to just bytes. That is, it
does not use all the *combinations* possible between byte sites. 

ajax
response 21 of 26: Nov 3 07:13 UTC 1995

You're right, but those differences are in how brains and computers
*use* memory, not in what the memory capacities are.  That type of
comparison would be much harder.  If you're interested in raw memory
capacity, what's the diff between specifying it in states, bits, or
bytes?  Whether you store data in an inefficient 8-bit ASCII format,
or an efficient JPEG format (which doesn't care about byte boundaries,
except in its header), the raw capacity of the memory chips is the same.
rcurl
response 22 of 26: Nov 3 17:47 UTC 1995

I think you are saying that you can *count* the states in binary ("bits")
or in whatever "hexadecimal-squared" is called ("bytes"), with which I
agree completely. In the hardware, though, there are different possible
organizations of the states/data, which relate to how the brain or the
computer processes data. Have we converged? :)

ajax
response 23 of 26: Nov 4 02:34 UTC 1995

I believe so.  Time to go grab someone's brain and test our theories. :)
rcurl
response 24 of 26: Nov 4 06:52 UTC 1995

Just found a review of _The Cognitive Neurosciences_ by Michael Gazzaniga (MIT
Press, 1995). Brief quote from review:

   "Many studies reveal that a brain does encode information as patterned
  activity in a population of neurons. In some species, brain maps
  in the superior colliculus, for example, encode the location of
  visual and auditory stimuli, as well as movement. On these maps,
  pinpointing the source of a sensory stimulus or directing a movement
  to a specific location involves many neurons. Population coding exists
  in other areas as well."

Sounds like "states". Incidentally, the review also reveals that "The
function of a single cortical neuron, for instance, depends on the
10,000 inputs that it probably receives and the 10,000 outputs that it
probably makes."
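
A toy version of that population coding, assuming simple Gaussian
tuning curves (an illustrative model, not one taken from the book):

    # Toy population code: broadly tuned "neurons" jointly encode a
    # stimulus location.  No single rate pins the location down, but a
    # population-weighted average recovers it.
    from math import exp

    def rate(preferred, stimulus, width=15.0):
        return exp(-((stimulus - preferred) / width) ** 2)

    preferred_locs = range(0, 181, 10)       # preferred locations, degrees
    stimulus = 73.0
    rates = [rate(p, stimulus) for p in preferred_locs]
    estimate = (sum(p * r for p, r in zip(preferred_locs, rates))
                / sum(rates))
    print(round(estimate, 1))                # -> 73.0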