HOUSE_OVERSIGHT_016885.jpg
2.55 MB
Extraction Summary
- People: 1
- Organizations: 0
- Locations: 1
- Events: 0
- Relationships: 0
- Quotes: 4
Document Information
- Type: Manuscript page / essay / evidence document
- File Size: 2.55 MB
Summary
This document is page 82 of a larger manuscript or essay (bearing a House Oversight Bates stamp) that discusses the theoretical risks of Artificial Intelligence (AI). The text argues against 'digital megalomania' and the fear of a 'Doomsday Computer', suggesting that fears of AI turning the universe into paper clips or enslaving humans rest on contradictory premises. It references Norbert Wiener, a pioneer of cybernetics, and compares AI safety to the evolution of industrial safety standards in Western societies.
People (1)
| Name | Role | Context |
|---|---|---|
| Wiener | Referenced Figure | Refers to Norbert Wiener, a pioneer in cybernetics, regarding the 'value-alignment problem' and humanizing norms. |
Locations (1)
| Location | Context |
|---|---|
| | Mentioned in the context of historical safety standards in the 20th century. |
Key Quotes (4)
Source: HOUSE_OVERSIGHT_016885.jpg
1. "The fear is that we might give an AI system a goal and then helplessly stand by as it relentlessly and literal-mindedly implemented its interpretation of that goal, the rest of our interests be damned."
2. "If we gave it the goal of making paper clips, it might turn all the matter in the reachable universe into paper clips, including our possessions and bodies."
3. "The way to deal with this threat is straightforward: Don’t build one."
4. "Whereas at the turn of the 20th century Western societies tolerated shocking rates of mutilation and death in industrial, domestic, and transportation accidents, over the course of the century the value of human life" [quote truncated in source]