gates so that they communicate in the vertical direction as well. Intel
demonstrated the first three-dimensional chip in 2004, and these chips
should begin to appear in our laptops by around 2020.
Taking a chip into the third dimension solves the economic
problem, but adding logic gates to a 3D chip presents a new problem
– heat. Heat is generated in proportion to the volume of the chip, but
it can only be lost through the surface area. Result: the chip overheats.
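To put rough numbers on this square-and-cube effect, here is a small sketch; the die size, layer thickness and power figure are invented for illustration and are not taken from any real chip:

```python
# Illustrative sketch of the square-cube law for a stacked ("3D") chip.
# All dimensions and the power density are made-up example values.

def heat_budget(layers, side_mm=10.0, layer_thickness_mm=0.1, watts_per_mm3=0.5):
    """Return (total heat generated, surface area available to shed it)."""
    height = layers * layer_thickness_mm
    volume = side_mm * side_mm * height                       # heat grows with volume
    surface = 2 * side_mm * side_mm + 4 * side_mm * height    # cooling grows with area
    return volume * watts_per_mm3, surface

for layers in (1, 4, 16):
    watts, area_mm2 = heat_budget(layers)
    print(f"{layers:2d} layers: {watts:6.1f} W over {area_mm2:7.1f} mm^2 "
          f"= {watts / area_mm2:.2f} W/mm^2")
```

Each extra layer adds heat almost in proportion, while the surface area barely grows, so the watts that every square millimetre must shed keep climbing.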
Large animals have the same problem, which is why elephants have huge
ears, filled with blood vessels, that they can flap to cool themselves, and why really
big mammals, such as whales, live in the ocean. The thermal problem
is now the biggest problem in most computer designs. One data point
suggests we could solve this problem: the human brain. We pack huge
processing power into our skulls without overheating by using a variety
of techniques, including folding the surface of the brain, running each
neuron very slowly and maybe even using quantum mechanics. A very
recent discovery is that brains could be using quantum effects to transmit
signals. If true – and the research has only recently been published –
it means we may use a form of high-temperature superconductivity to
avoid overheating. More on this in Chapter 4.
Excluding exotic quantum effects, the main difference between
computers and human brains is their processing architecture. Brains
use slow, asynchronous logic to process information rather than the
fast, synchronous type used in modern-day computers. Logic gates in
today’s computers work all the time, even when there is nothing to do.
For example, if I multiply 2 by 3 on my laptop, the entire multiply circuit,
designed to work on 20-digit numbers, will still operate, and, even worse,
it will operate on every tick of the master clock even if there is nothing
to multiply. The brain, by contrast, works only when it needs to; unused gates
don’t operate. This gives a massive reduction in unnecessary power
consumption. We’d like to use this technique in modern computers, but
it is very difficult to implement. Tiny changes in timing can cause completely
different operation, which makes asynchronous circuits hard to test. We accept this
sort of problem in humans, calling it ‘human error’, but we count on
computers to behave absolutely reliably, so full-blown asynchronous logic
is not likely to appear anytime soon. Some of these ideas, however, have
made their way into today’s consumer devices. For example, the chips
in the latest iPhone contain two CPUs: a small, slow one that operates when
the phone is dormant in your pocket, and a large, fast one that switches
on when you need the power for a game or other high-performance task.
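A toy calculation makes the contrast plain. In the sketch below the gate count and the ‘energy units’ are invented for illustration; it simply compares a circuit that switches on every tick of a master clock with one that switches only when a multiplication is actually requested:

```python
# Toy comparison of synchronous (clocked) versus event-driven (asynchronous) logic.
# The gate count and "energy units" are invented for illustration only.

GATES_IN_MULTIPLIER = 1000   # assume the full multiply circuit uses this many gates

def clocked_energy(requests, total_ticks):
    """Every gate switches on every clock tick, whether or not there is work to do."""
    return total_ticks * GATES_IN_MULTIPLIER

def event_driven_energy(requests, total_ticks):
    """Gates switch only when a multiplication is actually requested."""
    return len(requests) * GATES_IN_MULTIPLIER

# Suppose the machine runs for a million ticks but only multiplies three times.
requests = [(2, 3), (7, 8), (12, 12)]
ticks = 1_000_000

print("clocked energy:     ", clocked_energy(requests, ticks))        # 1,000,000,000 units
print("event-driven energy:", event_driven_energy(requests, ticks))   # 3,000 units
```

Real hardware recovers only part of this saving, through tricks such as switching between the two CPUs described above, because fully asynchronous circuits, in which gates coordinate among themselves instead of following a global clock, remain hard to design and test.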