HOUSE_OVERSIGHT_013576.jpg

Extraction Summary

People: 4
Organizations: 3
Locations: 1
Events: 1
Relationships: 1
Quotes: 3

Document Information

Type: Scientific/academic text (page from a book or paper submitted as evidence)
File Size: 2.04 MB
Summary

This document is page 76 of a scientific text bearing a House Oversight Bates stamp. The text discusses the concept of entropy in information theory and neuroscience, referencing the work of researchers Seymore Kety, Louis Sokoloff, Harold Himwich, and Claude Shannon. It compares computer computation to brain metabolism and energy consumption.

People (4)

Seymore Kety: Researcher, National Institutes of Mental Heath (sic)
Louis Sokoloff: Researcher, National Institutes of Mental Heath (sic)
Harold Himwich: Researcher, State of Illinois Thudicum Laboratory
Claude Shannon: Scientist, developer of entropy in communication theory/information theory

Organizations (3)

National Institutes of Mental Heath: research institution (note: the source text's 'Heath' is a typo for 'Health')
State of Illinois Thudicum Laboratory: research laboratory
House Oversight Committee: implied by the Bates stamp 'HOUSE_OVERSIGHT'

Timeline (1 event)

1948: Formalization of entropy by Claude Shannon for use in what was then called communication theory and is now information theory.

Locations (1)

State of Illinois: location of the Thudicum Laboratory

Relationships (1)

Seymore Kety and Louis Sokoloff: colleagues, listed together as being from the National Institutes of Mental Heath (sic)

Key Quotes (3)

"The metaphorical machine for the current age of entropy, analogous to the role of heat and steam engines in classical thermodynamics, is the computer."
Source
HOUSE_OVERSIGHT_013576.jpg
Quote #1
"In this context, entropy and information were obviously complementary descriptors."
Source
HOUSE_OVERSIGHT_013576.jpg
Quote #2
"The information reception capacity of a system is dependent upon the amount of"
Source
HOUSE_OVERSIGHT_013576.jpg
Quote #3

Full Extracted Text

Complete text extracted from the document (2,418 characters)

Leaving the framework of physical thermodynamic entropies entirely, the entropy of information was introduced in the context of communication engineering in electrical and electronic devices. The metaphorical machine for the current age of entropy, analogous to the role of heat and steam engines in classical thermodynamics, is the computer. Energy in this context is a relatively trivial property. Ammeters and other monitors of load are unable to discriminate between a computer actively engaged in encoding and computation or one simply maintaining its dynamic memory while resting in computational readiness. This situation is very analogous to the results of early work discussed previously on the metabolic rates and sources of the whole brain’s energy, oxygen and glucose metabolism, by National Institutes of Mental Heath’s Seymore Kety and Louis Sokoloff and the State of Illinois Thudicum Laboratory’s Harold Himwich. Using whole head arterial-venous, energy-in, energy-out, differences, they could not demonstrate differences in rates of whole brain metabolism between states in which the human subjects were engaged in solving mathematical problems or deeply sleep. In today’s brain imaging research, using a variety of physical reflections of the brain’s metabolic activity, it is the differences in regional distributions of metabolic activity that are relatable to subjective and behavioral states, not differences in total amount of energy expended. In graphically coded representations of the regional metabolism of the brain in action, one or another or many areas “light up” and others “grow dark” in correlation with changes in thinking, feeling and action.
The entropy first developed by Claude Shannon was formalized for use in 1948 in what was then called communication theory and now information theory. It represented a measure of the ambiguity and uncertainty that had the potential for being resolved by new knowledge. In this context, entropy and information were obviously complementary descriptors. A message that informs us about which of ten possibilities should be chosen contains less information than one that informs us about the proper choice to be made from among a thousand possibilities. The entropy of communication theory is a measure that is computed on uncertainty. The information reception capacity of a system is dependent upon the amount of
76
HOUSE_OVERSIGHT_013576
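
The ten-versus-thousand comparison in the extracted text follows directly from Shannon's formula: a uniform choice among N possibilities carries log2(N) bits of entropy. A minimal sketch of that arithmetic (illustrative only, not part of the source document):

import math

def uniform_entropy_bits(n: int) -> float:
    # Shannon entropy, in bits, of a uniform distribution over n outcomes.
    return math.log2(n)

print(uniform_entropy_bits(10))    # ~3.32 bits to resolve 1 of 10 possibilities
print(uniform_entropy_bits(1000))  # ~9.97 bits to resolve 1 of 1000 possibilities

Resolving one of a thousand equally likely possibilities thus conveys roughly three times as many bits as resolving one of ten, which is the sense in which the second message contains more information.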
