Wiener

Type: Person
Mentions: 66
Relationships: 4
Events: 5
Documents: 29

Relationship Network

4 total relationships

Connected Entity            | Relationship Type           | Strength (mentions) | Documents
Arturo Rosenblueth (person) | Academic / scientific       | 5                   | 1
John von Neumann (person)   | Professional / intellectual | 5                   | 1
Pinker (person)             | Intellectual / commentary   | 5                   | 1
SHANNON (person)            | Intellectual / academic     | 5                   | 1

Event Timeline

Date       | Event Type | Description                                                                                 | Location
1950-01-01 | N/A        | Reference point for Wiener's predictions and the state of technology.                      | N/A
1950-01-01 | N/A        | Conversations between Wiener and John von Neumann regarding the technological singularity. | Unknown
1948-01-01 | N/A        | Publication of the appendix to the 1948 edition of Cybernetics.                            | N/A
1946-01-01 | N/A        | Macy conferences on cybernetics.                                                           | New York
1942-01-01 | N/A        | First of the Macy conferences regarding control of complex systems.                        | Unknown

HOUSE_OVERSIGHT_016981.jpg

This document appears to be page 178 of an academic essay or book regarding the history and philosophy of Artificial Intelligence and its intersection with Art. It references the foundational 1955 AI proposal by John McCarthy and Marvin Minsky (a known Epstein associate), discusses Google DeepMind, and analyzes artistic works by Philippe Parreno and mathematical models by John Horton Conway. The document bears a 'HOUSE_OVERSIGHT' Bates stamp, indicating it was collected as part of a congressional investigation, likely related to the inquiry into MIT Media Lab's funding and Minsky's ties to Epstein.

Academic essay / investigation document
2025-11-19

HOUSE_OVERSIGHT_016976.jpg

This document is a biographical summary or academic profile of Caroline A. Jones, an art historian at MIT. It details her academic focus on the intersection of art, technology, and cybernetics, specifically describing her course 'Automata, Automatism, Systems, Cybernetics.' The text explores her philosophical views on 'left' versus 'right' cybernetics and references historical figures like Wiener, Shannon, and Turing. The document bears a House Oversight Committee Bates stamp.

Biographical profile / academic summary (House Oversight Committee document)
2025-11-19

HOUSE_OVERSIGHT_016951.jpg

This text discusses artist Suzanne Treister's project *Hexen 2.0*, which explores the history of the Macy conferences on cybernetics. It highlights the historical lack of artistic representation in these scientific discussions and includes a quote from scientist von Foerster arguing against the artificial division between art and science.

Book page or report excerpt
2025-11-19

HOUSE_OVERSIGHT_016922.jpg

This document is page 119 of a larger text, marked as a House Oversight exhibit. It contains a scientific or philosophical essay discussing the convergence of computation and physical fabrication, referencing Von Neumann, Turing, and Gordon Moore. The text explores the implications of self-reproducing automata, digital fabrication, and the future of AI, suggesting a merging of artificial and natural intelligence.

Scientific essay / book page / government exhibit
2025-11-19

HOUSE_OVERSIGHT_016921.jpg

The text explores the parallels between biological evolution, specifically Hox genes, and artificial intelligence, arguing that AI currently suffers from a "mind-body problem" due to its lack of physical form. It advocates for "digital materials"—modular, programmable physical components analogous to amino acids or Lego bricks—to bridge the gap between computation and fabrication. The author references pioneers like von Neumann, Shannon, and Turing to contextualize the convergence of digital information and physical construction.

Report or book page
2025-11-19

HOUSE_OVERSIGHT_016920.jpg

This document appears to be page 117 of a book or essay discussing the philosophy and technical evolution of Artificial Intelligence (AI), specifically deep learning and neural networks. It covers concepts such as the 'curse of dimensionality,' the shift from imperative to generative design, and the 'black box' nature of AI decision-making. The page is stamped 'HOUSE_OVERSIGHT_016920', indicating it is part of a production of documents for a congressional investigation, likely related to Jeffrey Epstein's ties to the scientific community or academia.

Book page / manuscript excerpt (House Oversight evidence)
2025-11-19

HOUSE_OVERSIGHT_016917.jpg

The document appears to be an excerpt from a book or report describing the work and philosophy of Neil Gershenfeld, director of MIT's Center for Bits and Atoms. It details a visit to MIT in 2003, Gershenfeld's involvement with the 'Fab Labs' maker movement, and his critical views on Norbert Wiener's theories regarding automation and AI. The page bears a House Oversight stamp, suggesting it is part of a larger investigation file.

Narrative report / book excerpt
2025-11-19

HOUSE_OVERSIGHT_016916.jpg

This page appears to be an excerpt from a philosophical or academic text regarding Artificial Intelligence, specifically contrasting the informational theories of Norbert Wiener and Claude Shannon. It discusses the ethical implications of 'deep learning,' 'rampant militarism,' and 'runaway corporate profit-seeking.' While the document bears a 'HOUSE_OVERSIGHT' Bates stamp (suggesting it was part of a document dump, possibly related to Epstein's ties to the scientific community), the text itself does not contain specific names, dates, or transaction details related to Jeffrey Epstein.

Book excerpt or academic essay (contained within House Oversight Committee production)
2025-11-19

HOUSE_OVERSIGHT_016914.jpg

This document is page 111 of a larger text (marked HOUSE_OVERSIGHT_016914), likely a book or academic essay. It discusses the philosophical and scientific views of Norbert Wiener regarding entropy, the futility of military secrecy, and the commodification of information. It contrasts Wiener's humanistic definition of information (citing his work 'Human Use') with Claude Shannon's mathematical definition ('the bit'), and critiques the mediocrity of mass media in the postwar era.

Book excerpt / academic text (evidence file)
2025-11-19

HOUSE_OVERSIGHT_016885.jpg

This document is page 82 of a larger manuscript or essay (bearing a House Oversight Bates stamp) that discusses the theoretical risks of Artificial Intelligence (AI). The text argues against 'digital megalomania' and the fear of a 'Doomsday Computer,' suggesting that fears of AI turning the universe into paper clips or enslaving humans are based on contradictory premises. It references Norbert Wiener and compares AI safety to the evolution of industrial safety standards in Western societies.

Manuscript page / essay / evidence document
2025-11-19

HOUSE_OVERSIGHT_016884.jpg

This document appears to be page 81 of a book or essay, stamped with 'HOUSE_OVERSIGHT_016884', suggesting it was part of a document production for a congressional investigation. The text is a philosophical and scientific argument expressing skepticism about the dangers of 'superintelligence' and Artificial General Intelligence (AGI). The author argues that intelligence is distinct from motivation, critiques the idea of AI becoming 'ruthless megalomaniacs,' and dismisses the feasibility of a 'Laplace's demon' scenario due to the infinite nature of knowledge.

Book excerpt / evidence document
2025-11-19

HOUSE_OVERSIGHT_016883.jpg

This document is page 80 of a text (likely an essay or book chapter) stamped with a House Oversight control number. The text argues that societal threats stem not from technological surveillance or 'sci-fi dystopia' AI, but rather from 'oppressive political correctness' and vague laws that allow for prosecutorial overreach (citing Harvey Silverglate). The author dismisses fears of AI subjugation as 'chimerical' and based on a misunderstanding of intelligence.

Essay / book excerpt (page 80) included in congressional evidence
2025-11-19

HOUSE_OVERSIGHT_016882.jpg

The text argues against the concept of technological determinism, asserting that political freedom is driven by norms and institutions rather than technology levels. The author uses historical examples to show that repression existed in low-tech eras and that modern high-tech societies often have high degrees of freedom, countering fears of an inevitable "surveillance state."

Book page or academic text excerpt
2025-11-19

HOUSE_OVERSIGHT_016879.jpg

This page serves as a biographical introduction for psychologist Steven Pinker, likely preceding an essay written by him. It outlines his academic focus on naturalistic understanding and computational theory of mind, while summarizing his skepticism regarding catastrophic AI risk scenarios. The document is part of a House Oversight collection, indicated by the footer stamp.

Biographical introduction / essay excerpt (House Oversight Committee exhibit)
2025-11-19

HOUSE_OVERSIGHT_016877.jpg

This document is page 74 of a larger work (essay or book) titled 'Calibrating the AI-Risk Message.' It discusses the existential risks of Artificial Intelligence, arguing that superintelligent AI poses an 'environmental risk' to biological life rather than just social or economic risks. The text references Norbert Wiener, Eliezer Yudkowsky, Douglas Adams, and Eric Drexler. The document bears the Bates number HOUSE_OVERSIGHT_016877, indicating it was part of a document production for a House Oversight Committee investigation, likely related to Jeffrey Epstein's known associations with scientists and transhumanists.

Essay / book chapter / scientific paper
2025-11-19

HOUSE_OVERSIGHT_016875.jpg

This document appears to be page 72 of a larger text, stamped with 'HOUSE_OVERSIGHT_016875', indicating it is part of an evidentiary submission to the House Oversight Committee. The text is an essay or chapter discussing the existential risks of Artificial Intelligence, specifically the 'Control Problem,' drawing parallels to biological evolution. It references historical figures like Turing, Wiener, and Good, and argues that humanity is facing the end of the 'human-brain regime' as AI advances.

Essay / article excerpt (House Oversight Committee evidence)
2025-11-19

HOUSE_OVERSIGHT_016872.jpg

This document is a biographical profile of Jaan Tallinn, an Estonian developer and existential risk philanthropist. It details his founding of the Centre for the Study of Existential Risk at Cambridge in 2012 and recounts a social anecdote about him breakdancing at a high-society dinner party in London. The document bears a House Oversight Committee stamp, suggesting it was part of evidence gathered during an investigation, likely related to Epstein's connections with the scientific/intellectual community (Edge Foundation circles).

Biographical profile / book excerpt (included in House Oversight investigation files)
2025-11-19

HOUSE_OVERSIGHT_016857.jpg

This document appears to be page 54 of a philosophical or technical essay regarding the ethics of artificial intelligence, data privacy, and surveillance capitalism. The text criticizes 'West Coast companies' for monetizing user data and inferences without consent and contrasts profit-driven exploitation with government suppression of dissent. It concludes by stating that solving these issues requires engineering, legislation, and moral leadership. The document bears a House Oversight Bates stamp.

Essay / book page (likely from a House Oversight Committee production)
2025-11-19

HOUSE_OVERSIGHT_016856.jpg

This document page discusses the vulnerabilities inherent in modern software, specifically explaining "buffer overrun" exploits and comparing computer viruses to biological ones. It highlights the widespread reliance on vulnerable computer systems for critical infrastructure and daily life, while also illustrating how modern web browsing has shifted from simple data retrieval to a model involving continuous user tracking and data collection.

Government report page / oversight document
2025-11-19

HOUSE_OVERSIGHT_016837.jpg

The text discusses the concept of Cooperative Inverse-Reinforcement Learning (CIRL), a framework designed to align machine actions with human preferences through a game-theoretic approach involving partial information. Using a hypothetical example of agents named Harriet and Robby, it illustrates how uncertainty about preferences encourages cooperation and teaching, and further applies this framework to solve the "off-switch problem" by incentivizing robots to allow themselves to be deactivated.

Book page or academic report snippet
2025-11-19

HOUSE_OVERSIGHT_016836.jpg

This document discusses the risks associated with superintelligent AI, arguing that the multidimensional nature of intelligence does not negate the potential threat to humans. It explores solutions to "Wiener's warning," suggesting the need to define a formal problem ($F$) that ensures AI behavior aligns with human happiness, while cautioning against simple reward maximization which leads to the "wireheading problem."

Page from a policy report or academic book on AI safety
2025-11-19

HOUSE_OVERSIGHT_016824.jpg

A page from a scientific essay or book chapter discussing the history and skepticism of the 'technological singularity.' The author references work on quantum computers and cites historical figures like John von Neumann and Norbert Wiener, as well as modern figures like Ray Kurzweil, Stephen Hawking, and Elon Musk regarding the potential risks and benefits of AI and superintelligence. The document bears a House Oversight stamp, indicating it is part of an investigation file, likely related to Jeffrey Epstein's connections to the scientific community.

Book chapter / essay / academic paper (evidence document)
2025-11-19

HOUSE_OVERSIGHT_016823.jpg

This document is page 20 of a larger text (likely an essay or book chapter) discussing the history of cybernetics and Artificial Intelligence. It critiques the predictions of Norbert Wiener and early AI researchers like Herbert Simon, John McCarthy, and Marvin Minsky, specifically noting how they overestimated the speed of AI development in the 1950s. The document bears a 'HOUSE_OVERSIGHT' Bates stamp, indicating it was part of a congressional investigation, likely related to Epstein's connections to scientists like Marvin Minsky.

Document page (essay/article/book chapter)
2025-11-19

HOUSE_OVERSIGHT_016399.jpg

This text explores the philosophical and artistic implications of cybernetics, contrasting "right cybernetics" (corporate and military AI) with "left cybernetics" (ecological and trans-species understanding). Drawing on the work of Gregory Bateson and various artists, it argues for a view of the mind as immanent and interconnected with the environment, rather than confined to the individual cranium.

Academic text / book page
2025-11-19

HOUSE_OVERSIGHT_016398.jpg

This text critiques the evolution of artificial intelligence from a theoretical simulation to a tool of capitalism and social control, contrasting these developments with artistic interpretations. It discusses the work of artist Philippe Parreno and mathematician John Horton Conway, exploring the boundaries between simulation and life through the lens of Conway's "Game of Life" and its implications for understanding complexity and consciousness.

Book page / academic text
2025-11-19
Financial Transactions

Total Received: $0.00 (0 transactions)
Total Paid: $0.00 (0 transactions)
Net Flow: $0.00 (0 total transactions)

No financial transactions found for this entity.

Communications

As Sender: 0
As Recipient: 0
Total: 0

No communications found for this entity.
