From a Q&A blurb in Popular Science, answering how much memory capacity, in computer terms, the human brain has. "Most scientific research puts the number at between one and ten terabytes, according to Robert Birge, a chemistry professor at Syracuse University. ... How can researchers even hazard a guess? Generally, it's done by taking a cross section of the brain and counting the number of neurons. Each neuron is assumed to hold one bit. ... Of course, it's not that easy. For one thing, the brain stores information in more complex ways than a computer does. Moreover, much of the 'data' stored in the brain is in the form of images, which humans compress far more efficiently than computers. And some scientists believe neurons are capable of storing more than one bit apiece. These factors account for the estimates ranging as high as ten terabytes. Birge favors three terabytes, 'but this is really rough,' he admits." Interesting question...what do you think of those estimates? My gut reaction is "what, just a thousand gigs?" It seems almost insulting for it to be a number attainable with hard drives in the foreseeable future. But I have no basis for a better guess, and that is a honking lot of data.
26 responses total.
1.) I think your slight revulsion at the concept of human brain storage
being attainable with current computer hardware is based more on
human conceit and arrogance than on actual scientific evidence.
2.) Humans have a problem with conceptualizing quantities once you
get past "1, 2, 3, many....". 10 terabytes is a tremendous amount
of data. OTOH, the fact that our brains have that "little" storage
might account for why people are always misplacing their car keys. :-)
3.) I suspect that the brain's methodology of storing information is so
different from our current computers' paradigm that a direct comparison
is meaningless. Even if we can directly count the smallest quanta of
brain storage and call that a "brain bit", I suspect that 1 megabit
of brain bits is capable of storing a great deal more *information*
than 1 megabit of computer storage, due to the storage methodologies
employed in the brain.
1.) Absolutely! If I were a computer, I wouldn't be irked to hear humans
had more memory than me, but I'm human! :-)
Our brains do seem to employ tremendously lossy compression (we can recall
a scene in a movie, but it's so much less detailed than a VHS tape). I
agree that the storage method is too different to do any close comparison.
One day I expect we or computers will understand fairly accurately how our
memory works.
Those ten (or whatever) teraneurons each have dendrites making contact with the dendrites of hundreds if not thousands of other neurons, each synapse of which is capable of being in several states. It is not a counting but a combinatorial question. If A neurons have B dendrite synapses with C possible states, how many different states can the system have? I'll think about that another time....
re 2:
if HumanMemory > MyMemory then be irked
My estimate is that there are C^[A, B+1] states, where [] is the binomial coefficient. If A=10^12, B=10, and C=3, the number of possible states is about 10^[10^124]. A lot of these are hardwired, but there's enough to go around. The wonder is that we are so dumb.
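Rane's figure can be checked numerically. A minimal Python sketch, plugging in the post's values (A neurons, B dendrites, C synapse states) and reporting log base 10 since the number itself is far too large to print:

```python
import math

A = 10**12  # neurons (from the post)
B = 10      # dendrites per neuron (from the post)
C = 3       # possible states per synapse (from the post)

# Rane's formula: C ** binomial(A, B+1) possible states.
# The exponent alone has ~125 digits, so report log10 of the total.
exponent = math.comb(A, B + 1)           # binomial coefficient [A, B+1]
log10_states = exponent * math.log10(C)  # roughly 1.2 * 10^124
print(f"number of states ~ 10^{log10_states:.2e}")
```

The result agrees with the post's "about 10^[10^124]" to within the stated roughness.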
Another thing that is missing here is that if the brain is a giant interconnected mass of memory cells, then where is the CPU? Oh, the brain is the CPU? Then where is the memory? We have a massively interconnected network of undifferentiated cells that perform *both* the jobs of memory storage *and* processing. I think it's a major mistake to count every neuron as just a single bit. As Rane pointed out, every neuron is many bits *and* it's a small piece of the processor too. The differences between a human brain and a modern von Neumann-style digital computer are like comparing apples and carburetors; no direct comparison is possible.
Interesting, Greg. I was thinking along those lines - our brains hold programs and memory in a way RAM doesn't - but couldn't put my finger on the diff. It's what you said: it's also the CPU. Neurons seem perhaps more like logic gates than simple bits of memory. They're "firing" or "not firing," but doesn't that on/off output result from a variable number of inputs (dendrites)? As you said, they're both memory and logic circuits.

Rane, interesting approach. Considering neural connections helps figure how useful our neurons are, but maybe isn't relevant in computing just the raw "uncompressed" storage capacity of the brain. To elaborate: if you had a normalized relational database on an auto-compressed hard drive, the capacity on the manufacturer's spec sheet wouldn't indicate how much "info" the drive *really* stored. The related fields and compression are important to the useful storage capacity, but there is still a raw, physical bit capacity of the hard drive. Counting what our neurons can store is similar - it may give a raw data capacity, but the dendrites, "data formats," etc. are what really matter. Looking at *raw* capacities, I don't think bit/brain comparisons are as far apart as apples to carburetors...more like apples to corn :-).
There are implications of my calculation that I am only beginning to realize, in part because of Greg's and your observations. I calculated the total number of different possible *states*. The number is unimaginable. It also says nothing about where state combinations are localized for any function, or how memory and processing states are located and interact. To create a "brain", the total available states must be highly ordered and constrained, so they work together. This will constrain the *useable* memory and processing states dramatically. The notion of a "bit" may be oversimplistic, as a "bit" from the state-space perspective is just one state *of the whole brain*. A "memory" would be made up of many bits, but the related states could be suffused all over the brain (which appears to be the case, and sure starts sounding like "parallel" computation). Now my brain is sore, and I must rest it for the next cognition ;->.
You're definitely onto something. Each neuron acts like a gate, but a much more complicated gate than AND, OR, NAND, or NOR. Some areas excite, others suppress. Over time, the degree, and even the sign, of the effect at any synapse is modified. I think the "learning" and "memory" is somehow stored in these "coefficients" at the synapses. That's about all the clues I have.
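The "complicated gate with coefficients" idea can be sketched as a toy weighted-threshold unit. This is a standard perceptron-style model, not anything from the thread; the weights and threshold below are made-up illustrative values:

```python
# Toy neuron-as-gate: weighted inputs (positive weights excite, negative
# ones suppress); the cell "fires" if the weighted sum crosses a threshold.
# Learning would live in these weights - the "coefficients at the synapses".
def neuron(inputs, weights, threshold=0.5):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two excitatory synapses and one inhibitory one (weights are assumed):
print(neuron([1, 1, 0], [0.4, 0.3, -0.6]))  # sum 0.7 >= 0.5 -> fires: 1
print(neuron([1, 1, 1], [0.4, 0.3, -0.6]))  # sum 0.1 <  0.5 -> silent: 0
```

Changing a weight's magnitude or sign changes the gate's behavior, which is roughly the "degree and even the sign of the effect at any synapse is modified" point.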
Sort of like bubble memory, where the "contents" are constantly being moved around the brain as part of normal processing?
Well, no, not really - because one of the properties the brain has that bubble memory does not is that it's highly redundant. Bubble memory, even more than most computer components, is highly serialized and non-redundant. You can, in fact, only look at one bit at a time, and if any bit of the hardware goes, the whole chip goes. Human memory is a very different thing, and probably much more similar to a holograph than anything else. In a holograph, the whole image is stored "everywhere", and if you break a holograph in two, each half still has the whole picture; it's just coarser in each half. In fact, humans can suffer fairly massive amounts of brain damage and still have only minimal effects on memory. (It *is* possible to have damage to small areas that have a massive effect on memory, but that's probably because it's damaged the storage & retrieval parts, not the actual memory itself.)

Another interesting feature of human memory is that it's actually a two-stage process. There's a "fast memory" and a "slow memory" storage system. When we first learn something, it goes into short-term memory, which is probably an electro-chemical dynamic process. Several hours after we learn something, it goes into a purely chemical, static long-term memory process. There are several ways to show this must be happening - people who suffer some massive disruption of brain processing, such as a shock, will forget everything they learned just before the shock, but not older stuff. There are also ways to shut down the brain's electrical processing, and if this is done, again, the short-term memory goes, but the long-term does not.

Even though we can't recall our memories in detail, that does not mean the detail isn't there. In fact, the evidence is that all of that detail *is* recorded, and what's lost is not the memory, but the ability to retrieve it.
In certain kinds of brain operations, it's possible to stimulate certain parts of the brain, and if this is done, the patient will recall some random long-lost experience, complete in every detail, including sound, scent, & all. If another nearby spot is stimulated, a different random memory will be retrieved.
Rane, was thinking about your calculation of the number of states a brain can hold. Since a byte can represent 256 states, and two bytes can represent 65,536 states, then to convert your calculation of states to bytes of storage, would you take log base 256 of your huge number (10^[10^124])? I realize it could be many orders of magnitude off due to the limitations you later noted, and states don't convert to bytes quite so precisely for arbitrary data, but it might be an interesting number to see anyway. Though I don't think I have a calculator that will do log256(10^(10^124)).

Re 11, I think when I took Cognitive Psychology (intro), they divided human memory into three types (short-, medium-, long-term?), like remembering a phone number a few seconds until you dial it, or a conversation you can repeat a minute later but not an hour later, and long-term permanent stuff. Though perhaps the medium-term stuff does go into long-term, and is just never indexed for likely retrievability. It does have some interesting parallels to computer architecture, where storage is divided into different sections of varying speed and permanence (registers, processor cache, RAM, hard drives, tape archives). I am aware of my "memory refresh rate" when trying to remember a phone number long enough to dial it - repeating it to myself every couple seconds definitely keeps the memory active longer. And it seems like some people have something like a "direction" register, while I can barely find north on a map!
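The calculator problem is easy to dodge: since log256(N) = log10(N) / log10(256), you only need the *logarithm* of the state count, never N itself. A quick sketch, assuming the ~10^(1.2x10^124) figure from the earlier estimate:

```python
import math

# log10 of the estimated state count, i.e. N ~ 10^(1.2e124).
# This figure is taken from the thread's rough estimate, good to
# an order of magnitude at best.
log10_states = 1.2e124

# log base 256 of N = log10(N) / log10(256) -> equivalent bytes
bytes_equiv = log10_states / math.log10(256)
print(f"~10^{bytes_equiv:.1e} bytes")
```

So the state count corresponds to roughly 10^(5x10^123) bytes - still a number with no physical meaning, but it answers the "in computer terms" question on its own terms.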
I quickly read through all the previous responses and it seems no one has yet taken into consideration one's feelings, emotions, and urges. How can these things be quantified? There is more to the human brain than just different types of memory.
I am not convinced that there is anything special about those. They are much higher levels of processing. What is to keep us from quantifying them?
I don't see feelings, emotions, or urges as particularly special, but I don't count them as part of memory, either (unless remembering feelings/emotions/urges). They get handled in other parts of the brain. That borders more on the "can machines think" debate, a classic philosophical/AI question. It's an interesting issue. As "smarter" machines are created and evolve to argue the question for themselves, I'm sure it will become an important theological/political question. Unfortunately, I don't think that will happen in our lifetimes.
Re #12: it doesn't seem right to equate a "bit" with a "state". Consider a state space of all possible ways to place a set of chess pieces on a chessboard. That's a big number. Is one such "state" a "bit"? Is a set of 256 of the possible states a "byte"? The chessboard can, of course, have only *one* state at any instant, but it can change. Likewise, the brain has only *one* state at an instant (and the state changes very rapidly). Each state could be assigned a number, but in no sense is there a sequence of consecutive numbers. The analogy seems flawed.
I do not equate a bit to a state. A particular state, out of a domain of N states, can be represented with a number from 1 to N. If N=2 (maybe for a light switch, being on or off), *then* I think a bit is comparable to a state. But for N>2 (as in chess, where N is a jiggagazillion), it would require a larger number of bits (in chess, log base 2 of a jiggagazillion) to represent a given state. If it could be determined that human memory had exactly N states (as you were estimating in #5), taking the log base 256 of N seems a not-unmeaningful way of defining the answer to the question "how much memory can the brain hold in computer terms [bytes]?" It's an odd question, but suppose someone asked "how much memory does a chess board hold in computer terms?" (Interpreted in a canonical way - ignore "rotation" of pieces and such). I think the number of bytes needed to represent the layouts (or states) is a reasonable way of answering.
But why "bytes"? That's an artifical value chosen to be just large enough to encompass most of our written symbolic system, which is itself artificial. I'd just leave it as "states", which are countable.
"Bytes" was chosen to answer the question, how much memory in computer terms. If you don't like the arbitrariness of bytes, how about bits? Makes little diff. But if you prefer leaving the brain number in states, it's easy enough to convert computer memory capacity to a maximum number of states...a megabyte of RAM could represent 256^1,024,576 states. It just seems a lot harder to comprehend such a big number.
Bits are the countable memory units. Bytes are an arbitrary aggregate of bits. The biggest difference between computer memory and the brain is that the brain to a considerable extent uses states, while the computer memory by comparison is really dumb, being restricted to just bytes. That is, it does not use all the *combinations* possible between byte sites.
You're right, but those differences are in how brains and computers *use* memory, not in what the memory capacities are. That type of comparison would be much harder. If you're interested in raw memory capacity, what's the diff between specifying it in states, bits, or bytes? Whether you store data in an inefficient 8-bit ASCII format, or an efficient JPEG format (which doesn't care about byte boundaries, except in its header), the raw capacity of the memory chips is the same.
I think you are saying that you can *count* the states in binary ("bits")
or in whatever "hexadecimal-squared" is called ("bytes"), with which I
agree completely. In the hardware, though, there are different possible
organizations of the states/data, which relate to how the brain or the
computer processes data. Have we converged? :)
I believe so. Time to go grab someone's brain and test our theories. :)
Just found a review of _The Cognitive Neurosciences_ by Michael Gazzaniga (MIT Press, 1995). Brief quote from review: "Many studies reveal that a brain does encode information as patterned activity in a population of neurons. In some species, brain maps in the superior colliculus, for example, encode the location of visual and auditory stimuli, as well as movement. On these maps, pinpointing the source of a sensory stimulus or directing a movement to a specific location involves many neurons. Population coding exists in other areas as well." Sounds like "states". Incidentally, the review also reveals that "The function of a single cortical neuron, for instance, depends on the 10,000 inputs that it probably receives and the 10,000 outputs that it probably makes."
That reminds me "neural networks" on computers, where the software specs often include the maximum number of "nodes." Analogy of #24 to computers: a particular bit of memory may be set by 10,000 different program instructions, and the same bit of memory may in turn be read for use by 10,000 other program instructions. Though this again raises the issue of neurons containing code and data, while the two are usually (but rarely by necessity) separated in computers.
Feelings, emotion, & all definitely make a difference in memory. Studies that take this into account show that, to maximize your ability to retrieve information, you should be in surroundings as near as possible to those you were in when you stored it. If you think about it, this is a perfectly sensible evolutionary survival trait. If you're about to be eaten by a tiger, the *best* thing to remember is what you did the last time you were about to be eaten by a tiger. If you studied for an important test in a dark bar half drunk, then you will do best if you can also take your test in a dark bar half drunk. Even food & drink can make a difference - apparently, caffeine improves the ability to concentrate, & sugar improves memory; so a cup of tea or a can of pop may well be a very good idea, both when studying, & when taking the test. In computer terms, the human brain is an associative memory array, & some of the inputs include the current emotional state and the current environment.
- Backtalk version 1.3.30 - Copyright 1996-2006, Jan Wolter and Steve Weiss