HOUSE_OVERSIGHT_016830.jpg


Extraction Summary

People: 4
Organizations: 2
Locations: 3
Events: 0
Relationships: 2
Quotes: 3

Document Information

Type: Scientific/academic manuscript or book page
File Size: 1.84 MB
Summary

This document appears to be page 27 of a scientific or philosophical manuscript (possibly a book draft) discussing Artificial Intelligence, Machine Learning, and Causal Inference. The text argues that 'Strong AI' requires more than data processing ('Babylonian data-fitting'); it requires causal models and theoretical understanding ('Athens'). It cites Harvard social scientist Gary King and the philosopher Stephen Toulmin. The document bears a House Oversight Bates stamp, suggesting it was obtained during an investigation, likely related to Jeffrey Epstein's funding of scientific research, though Epstein is not explicitly mentioned on this page.

People (4)

Name | Role | Context
Gary King | Social Scientist | Harvard social scientist quoted regarding causal inference.
Stephen Toulmin | Philosopher | Author of 'Foresight and Understanding' (1961), referenced for his comparison of Greek and Babylonian sciences.
Eratosthenes | Ancient Scientist | Historical figure (276-194 BC) mentioned for calculating the circumference of the Earth.
Author (Unnamed) | Writer/Researcher | First-person narrator ('I ask myself', 'My general conclusion') discussing AI and causal inference.

Organizations (2)

Name | Type | Context
Harvard | University | Affiliation of Gary King.
House Oversight Committee | Congressional committee | Referenced in the Bates stamp (HOUSE_OVERSIGHT).

Locations (3)

Location | Context
Babylon | Used metaphorically and historically regarding astronomy and data fitting.
Athens | Used metaphorically representing theoretical understanding.
Earth | Planetary body mentioned in the context of Eratosthenes' experiment.

Relationships (2)

Source | Relationship | Target | Context
Author | Academic Citation | Gary King | Author quotes King to support the concept of the 'Causal Revolution'.
Author | Academic Citation | Stephen Toulmin | Author references Toulmin's 1961 book to discuss scientific methodology.

Key Quotes (3)

"More has been learned about causal inference in the last few decades than the sum total of everything that had been learned about it in all prior recorded history."
Source
— Gary King (Quoted by the author regarding the 'Causal Revolution'.)
HOUSE_OVERSIGHT_016830.jpg
Quote #1
"Opaque learning systems may get us to Babylon, but not to Athens."
Source
— Author (Concluding thought on the limitations of data science without causal models.)
HOUSE_OVERSIGHT_016830.jpg
Quote #2
"My general conclusion is that human-level AI cannot emerge solely from model-blind learning machines; it requires the symbiotic collaboration of data and models."
Source
— Author (Thesis statement regarding Artificial Intelligence.)
HOUSE_OVERSIGHT_016830.jpg
Quote #3

Full Extracted Text

Complete text extracted from the document (2,699 characters)

scientists are doing science, especially in such data-intensive sciences as sociology and
epidemiology, for which causal models have become a second language. These
disciplines view their linguistic transformation as the Causal Revolution. As Harvard
social scientist Gary King puts it, “More has been learned about causal inference in the
last few decades than the sum total of everything that had been learned about it in all
prior recorded history.”
As I contemplate the success of machine learning and try to extrapolate it to the
future of AI, I ask myself, “Are we aware of the basic limitations that were discovered in
the causal-inference arena? Are we prepared to circumvent the theoretical impediments
that prevent us from going from one level of the hierarchy to another level?”
I view machine learning as a tool to get us from data to probabilities. But then we
still have to make two extra steps to go from probabilities into real understanding—
two big steps. One is to predict the effect of actions, and the second is counterfactual
imagination. We cannot claim to understand reality unless we make the last two steps.
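The contrast drawn here between seeing and doing can be made concrete with a small simulation. The sketch below is illustrative only (Python, with an invented toy model containing a single confounder Z; none of it comes from the manuscript): it compares the observational estimate P(Y | X = 1), which a pure data-fitter would report, with the interventional estimate P(Y | do(X = 1)) obtained by simulating the action itself.

```python
# Illustrative toy structural causal model (assumed for this sketch, not from the text):
# Z -> X and Z -> Y, so Z confounds the X -> Y relationship.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Observational world: "seeing".
z = rng.binomial(1, 0.5, n)
x = rng.binomial(1, 0.2 + 0.6 * z)            # treatment depends on the confounder
y = rng.binomial(1, 0.1 + 0.2 * x + 0.5 * z)  # outcome depends on treatment and confounder

p_see = y[x == 1].mean()                      # P(Y = 1 | X = 1): conditioning on the data

# Interventional world: "doing". Cut the Z -> X edge and force X = 1.
x_do = np.ones(n, dtype=int)
y_do = rng.binomial(1, 0.1 + 0.2 * x_do + 0.5 * z)
p_do = y_do.mean()                            # P(Y = 1 | do(X = 1))

print(f"P(Y=1 | X=1)     ~ {p_see:.2f}")      # ~0.70, inflated by the confounder
print(f"P(Y=1 | do(X=1)) ~ {p_do:.2f}")       # ~0.55, the effect of the action
```

The second step named in the passage, counterfactual imagination, asks a harder question still (what Y would have been for the same individual had X been different) and is not captured by this sketch.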
In his insightful book Foresight and Understanding (1961), the philosopher
Stephen Toulmin identified the transparency-versus-opacity contrast as the key to
understanding the ancient rivalry between Greek and Babylonian sciences. According to
Toulmin, the Babylonian astronomers were masters of black-box predictions, far
surpassing their Greek rivals in accuracy and consistency of celestial observations. Yet
Science favored the creative-speculative strategy of the Greek astronomers, which was
wild with metaphorical imagery: circular tubes full of fire, small holes through which
celestial fire was visible as stars, and hemispherical Earth riding on turtleback. It was
this wild modeling strategy, not Babylonian extrapolation, that jolted Eratosthenes (276-
194 BC) to perform one of the most creative experiments in the ancient world and
calculate the circumference of the Earth. Such an experiment would never have occurred
to a Babylonian data-fitter.
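For context, the experiment mentioned reduces to simple proportionality: if the noon sun casts a shadow of about 7.2 degrees at Alexandria while standing directly overhead at Syene, then the Syene-to-Alexandria distance spans 7.2/360 of the Earth's circumference. The figures below are the commonly cited approximate historical values, and the modern length of a stadion is uncertain; this is a sketch of the arithmetic, not a claim about the manuscript.

```python
# Eratosthenes' proportionality argument with approximate historical figures.
shadow_angle_deg = 7.2      # noon shadow angle at Alexandria, about 1/50 of a circle
distance_stadia = 5_000     # reported Syene-to-Alexandria distance
stadion_km = 0.1575         # assumed stadion length (historically uncertain)

circumference_stadia = (360.0 / shadow_angle_deg) * distance_stadia
print(circumference_stadia)                  # 250000.0 stadia
print(circumference_stadia * stadion_km)     # ~39,375 km (modern value ~40,075 km)
```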
Model-blind approaches impose intrinsic limitations on the cognitive tasks that
Strong AI can perform. My general conclusion is that human-level AI cannot emerge
solely from model-blind learning machines; it requires the symbiotic collaboration of
data and models.
Data science is a science only to the extent that it facilitates the interpretation of
data—a two-body problem, connecting data to reality. Data alone are hardly a science,
no matter how “big” they get and how skillfully they are manipulated. Opaque learning
systems may get us to Babylon, but not to Athens.
27
HOUSE_OVERSIGHT_016830
