HOUSE_OVERSIGHT_013577.jpg


Extraction Summary

People: 2
Organizations: 1
Locations: 0
Events: 0
Relationships: 1
Quotes: 3

Document Information

Type: Scientific manuscript / essay page (evidence exhibit)
File Size: 1.99 MB
Summary

This document appears to be page 77 of a scientific manuscript or essay discussing information theory, specifically binary coding, probability, logarithms, and entropy. It covers the work of Claude Shannon and George Boole. While the text is academic in nature, the 'HOUSE_OVERSIGHT_013577' footer indicates the page was seized or produced as evidence in a congressional investigation, likely related to Jeffrey Epstein's connections with the scientific community.

People (2)

Name: Claude Shannon
Role: Historical figure (mathematician/engineer)
Context: Referenced for his 1938 master's thesis mapping Boolean algebra onto switching devices and for his work on entropy.

Name: George Boole
Role: Historical figure (mathematician)
Context: Referenced for his algebraic scheme used in computation.

Organizations (1)

Name: House Oversight Committee
Type: Congressional committee
Context: Implied by the Bates stamp 'HOUSE_OVERSIGHT_013577', indicating this document is part of a congressional investigation.

Relationships (1)

Claude Shannon to George Boole (intellectual/academic): Shannon mapped Boole's algebraic scheme onto switching devices.

Key Quotes (3)

"In the binary coding scheme of digital electronic operations, the unit of information is the bit"
Source
HOUSE_OVERSIGHT_013577.jpg
Quote #1
"Shannon’s 1938 master’s thesis mapped George Boole’s algebraic scheme for doing yes-no, either-or computation onto current switching devices"
Source
HOUSE_OVERSIGHT_013577.jpg
Quote #2
"The amount of uncertainty in the receiver that pre-existed the receipt of the message."
Source
HOUSE_OVERSIGHT_013577.jpg
Quote #3

Full Extracted Text

Complete text extracted from the document (2,443 characters)

uncertainty in the receiver that pre-existed the receipt of the message. In the binary
coding scheme of digital electronic operations, the unit of information is the bit, a
choice made between 0 or 1 in the resolution of a two state ambiguity at each place
of some power of two number of places. Our relatively common computers these
days have 32 or 64 bit processors. If these 0,1 choices are made in a random
sequence in which each step is independent of the previous one, the sequential
probabilities are multiplicative: e.g., the probability of getting two 1’s (heads
with a fair coin) in a row is the product of the two 0.5 probabilities: p1 = 0.5,
p2 = 0.5, p1 × p2 = 0.25. Using natural logarithms to demonstrate the algebraic
fact that multiplicative probabilities are logarithmically additive (and ignoring
the minus sign that comes with taking logarithms of the decimal fractions of
probability), we notice that ln(0.5) = 0.693147 and ln(0.25) = 1.386294, and
that 0.693147 + 0.693147 = 1.386294.
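
The logarithmic additivity just described can be checked directly; a minimal sketch in plain Python (the names p1 and p2 mirror the text, everything else is illustrative):

    import math

    # Two independent fair-coin choices, each with probability 0.5.
    p1, p2 = 0.5, 0.5

    # Multiplicative rule for independent events: P(1 then 1) = p1 * p2.
    joint = p1 * p2                    # 0.25

    # Logs turn products into sums (minus signs dropped, as in the text).
    log_p1 = abs(math.log(p1))         # 0.693147...
    log_p2 = abs(math.log(p2))         # 0.693147...
    log_joint = abs(math.log(joint))   # 1.386294...

    # The sum of the logs equals the log of the product.
    assert math.isclose(log_p1 + log_p2, log_joint)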
The dot-dash choices of Morse code machines, the go, no-go gates of
transistors, the open versus closed ion channel-mediated neuronal membrane
discharge and the left, right spins of the single electrons of today’s quantum
computers lead naturally to encoding the information of multiplicative sequences as
a sum of logarithms in base two (the base equal to the number of available states),
each p = 0.5 choice, -log2(0.5) = 1, called a bit. Shannon’s 1938 master’s thesis
mapped George Boole’s algebraic scheme for doing yes-no, either-or computation
onto current switching devices such that circuit closed was “true” and circuit open
was “false.” Boole’s laws, such as “Not(A and B)” always equaling “(Not A) or
(Not B),” led to schemes for circuit routing through electronic gates, which also
serve for information storage in gadgets ranging from cell phone directories to
computer hard disks.
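
The Boolean law quoted above ("Not(A and B)" equals "(Not A) or (Not B)", De Morgan's law) can be verified over every truth assignment; a short sketch in plain Python, not part of the exhibit:

    from itertools import product

    # De Morgan's law: Not(A and B) == (Not A) or (Not B) for all inputs.
    for a, b in product([False, True], repeat=2):
        assert (not (a and b)) == ((not a) or (not b))
        print(f"A={a!s:5} B={b!s:5}  Not(A and B) = {not (a and b)}")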
Following Claude Shannon, each logarithmically additive entropy term is
expressed as the sum, over i, of a probability, p_i, times that probability’s
logarithm in base two: Σ_i p_i log2(p_i). A logarithm is an exponent of its
relevant base such that, for example, the logarithm, base two, of 2 × 2 × 2 = 2^3
is 3, and 3 bits can encode eight binary (0,1) numbers: (000, 001, 010, 011, 100,
101, 110, and 111). Shannon used a hill-like (concave) entropy function
S(p) = -Σ(p ln(p)). The amount of
77
HOUSE_OVERSIGHT_013577
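
The entropy formula on this page can be checked numerically; a minimal sketch in plain Python (the eight-outcome uniform distribution and the biased coin are illustrative choices, not from the exhibit):

    import math

    # Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i)).
    def entropy_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Eight equally likely outcomes, i.e. the 3-bit codes 000 through 111:
    print(entropy_bits([1 / 8] * 8))   # 3.0, since log2(2**3) = 3

    # One fair-coin choice carries exactly one bit:
    print(entropy_bits([0.5, 0.5]))    # 1.0

    # The function is hill-like (concave): a biased coin carries less.
    print(entropy_bits([0.9, 0.1]))    # about 0.469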
