HOUSE_OVERSIGHT_013247.jpg

2.37 MB

Extraction Summary

People: 1
Organizations: 4
Locations: 0
Events: 0
Relationships: 0
Quotes: 4

Document Information

Type: Academic glossary / scientific manuscript excerpt
File Size: 2.37 MB
Summary

This document is page 331 of Appendix A.2, 'Glossary of Specialized Terms,' from a scientific book or paper on Artificial Intelligence, specifically the OpenCog/CogPrime architecture. It defines technical terms such as 'Embodiment,' 'Emotion' (referencing Dorner's Psi theory), 'Evolutionary Learning' (referencing MOSES), and 'Fishgram.' The page bears a 'HOUSE_OVERSIGHT_013247' stamp, indicating it was part of a document production to the US House Oversight Committee, likely related to investigations into Jeffrey Epstein's funding of, or interest in, AI and transhumanist research.

People (1)

Name: Dorner
Role: Researcher/Theorist
Context: Referenced in the definition of 'Emotion' regarding 'Dorner's Psi theory of emotion'.

Organizations (4)

Name: OpenCog
Type: Open-source AI software framework
Context: Repeatedly referenced as the context for definitions (e.g. Embodiment, Exemplar).

Name: CogPrime
Type: Cognitive architecture built on OpenCog
Context: Referenced in relation to Emotion, Episodic Knowledge, and Evolutionary Learning.

Name: MOSES
Type: Evolutionary program-learning system
Context: Referenced under Evolutionary Learning and Exemplar.

Name: House Oversight Committee
Type: US congressional committee
Context: Implied by the footer stamp 'HOUSE_OVERSIGHT_013247'.

Key Quotes (4)

"Embodiment: Colloquially, in an OpenCog context, this usually means the use of an AI software system to control a spatially localized body in a complex (usually 3D) world."
Source
HOUSE_OVERSIGHT_013247.jpg
Quote #1
"Emotion: Emotions are system-wide responses to the system's current and predicted state."
Source
HOUSE_OVERSIGHT_013247.jpg
Quote #2
"Dorner's Psi theory of emotion contains explanations of many human emotions in terms of underlying dynamics and motivations..."
Source
HOUSE_OVERSIGHT_013247.jpg
Quote #3
"Exemplar: (in the context of imitation learning) - When the owner wants to teach an OpenCog controlled agent a behavior by imitation, he/she gives the pet an exemplar."
Source
HOUSE_OVERSIGHT_013247.jpg
Quote #4

Full Extracted Text

Complete text extracted from the document (3,787 characters)

A.2 Glossary of Specialized Terms 331
• Embodied Communication Prior: The class of prior distributions over (goal, environ-
ment pairs), that are imposed by placing an intelligent system in an environment where
most of its tasks involve controlling a spatially localized body in a complex world, and in-
teracting with other intelligent spatially localized bodies. It is hypothesized that many key
aspects of human-like intelligence (e.g. the use of different subsystems for different memory
types, and cognitive synergy between the dynamics associated with these subsystems) are
consequences of this prior assumption. This is related to the Mind-World Correspondence
Principle.
• Embodiment: Colloquially, in an OpenCog context, this usually means the use of an AI
software system to control a spatially localized body in a complex (usually 3D) world. There
are also possible "borderline cases" of embodiment, such as a search agent on the Internet.
In a sense any AI is embodied, because it occupies some physical system (e.g. computer
hardware) and has some way of interfacing with the outside world.
• Emergence: A property or pattern in a system is emergent if it arises via the combination
of other system components or aspects, in such a way that its details would be very difficult
(not necessarily impossible in principle) to predict from these other system components or
aspects.
• Emotion: Emotions are system-wide responses to the system's current and predicted state.
Dorner's Psi theory of emotion contains explanations of many human emotions in terms
of underlying dynamics and motivations, and most of these explanations make sense in a
CogPrime context, due to CogPrime's use of OpenPsi (modeled on Psi) for motivation and
action selection.
• Episodic Knowledge: Knowledge about episodes in an agent's life-history, or the life-
history of other agents. CogPrime includes a special dimensional embedding space only for
episodic knowledge, easing organization and recall.
• Evolutionary Learning: Learning that proceeds via the rough process of iterated differen-
tial reproduction based on fitness, incorporating variations of reproduced entities. MOSES
is an explicitly evolutionary-learning-based portion of CogPrime; but CogPrime's dynamics
as a whole may also be conceived as evolutionary.
• Exemplar: (in the context of imitation learning) - When the owner wants to teach an
OpenCog controlled agent a behavior by imitation, he/she gives the pet an exemplar. To
teach a virtual pet "fetch" for instance, the owner is going to throw a stick, run to it, grab
it with his/her mouth and come back to its initial position.
• Exemplar: (in the context of MOSES) – Candidate chosen as the core of a new deme, or
as the central program within a deme, to be varied by representation building for ongoing
exploration of program space.
• Explicit Knowledge Representation: Knowledge representation in which individual,
easily humanly identifiable pieces of knowledge correspond to individual elements in a knowl-
edge store (elements that are explicitly there in the software and accessible via very rapid,
deterministic operations)
• Extension: In PLN, the extension of a node refers to the instances of the category that
the node represents. In contrast is the intension.
• Fishgram (Frequent and Interesting Sub-hypergraph Mining): A pattern mining
algorithm for identifying frequent and/or interesting sub-hypergraphs in the Atomspace.
• First-Order Inference (FOI): The subset of PLN that handles Logical Links not in-
volving VariableAtoms or higher-order functions. The other aspect of PLN, Higher-Order
Inference, uses Truth Value formulas derived from First-Order Inference.
HOUSE_OVERSIGHT_013247
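
Illustrative Code Sketches

The glossary's 'Episodic Knowledge' entry mentions a dedicated dimensional embedding space that eases organization and recall of episodes. The Python sketch below is not CogPrime's actual mechanism; it is a minimal illustration of the general idea, assuming a hand-made 3-dimensional embedding and simple nearest-neighbour recall. The EpisodeStore class and the example vectors are invented for illustration.

    # Minimal sketch: episodes stored as points in a small embedding space,
    # recalled by nearest-neighbour distance. Purely illustrative; the actual
    # CogPrime embedding space is not specified by this glossary entry.
    import math

    class EpisodeStore:
        def __init__(self):
            self._episodes = []          # list of (vector, description) pairs

        def add(self, vector, description):
            self._episodes.append((vector, description))

        def recall(self, query, k=2):
            """Return the k stored episodes closest to the query vector."""
            def dist(vec):
                return math.sqrt(sum((a - b) ** 2 for a, b in zip(vec, query)))
            ranked = sorted(self._episodes, key=lambda e: dist(e[0]))
            return [description for _, description in ranked[:k]]

    store = EpisodeStore()
    store.add((0.9, 0.1, 0.0), "owner threw a stick in the park")
    store.add((0.8, 0.2, 0.1), "owner threw a ball in the garden")
    store.add((0.0, 0.9, 0.5), "agent was scolded for chewing shoes")
    print(store.recall((0.85, 0.15, 0.05)))   # recalls the two fetch-like episodes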
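The 'Evolutionary Learning' entry describes iterated differential reproduction based on fitness with variation, and the MOSES sense of 'Exemplar' describes a candidate chosen as the core of a new deme to be varied further. The sketch below is a generic evolutionary loop over bit strings, not MOSES itself (which varies program trees via representation building); the target string, mutation rate, and deme size are arbitrary illustrative choices.

    # Generic evolutionary-learning sketch: an exemplar seeds a deme (population),
    # variations are produced by mutation, and fitter candidates reproduce.
    # This is NOT MOSES; MOSES additionally performs representation building
    # over program trees, which is beyond this illustration.
    import random

    random.seed(0)

    TARGET = [1, 0, 1, 1, 0, 1, 0, 0]           # toy goal: match this bit string

    def fitness(candidate):
        return sum(1 for c, t in zip(candidate, TARGET) if c == t)

    def mutate(candidate, rate=0.1):
        return [1 - bit if random.random() < rate else bit for bit in candidate]

    def evolve(exemplar, deme_size=20, generations=30):
        # The exemplar is the core of the deme: the initial population is
        # produced by varying it.
        deme = [mutate(exemplar, rate=0.3) for _ in range(deme_size)]
        for _ in range(generations):
            deme.sort(key=fitness, reverse=True)
            survivors = deme[: deme_size // 2]            # differential reproduction
            deme = survivors + [mutate(random.choice(survivors)) for _ in survivors]
        return max(deme, key=fitness)

    best = evolve(exemplar=[0] * 8)
    print(best, fitness(best))

The exemplar-seeded population plays the role of a deme in the loose sense used above: a local region of the search space explored by varying one promising candidate.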
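The 'Explicit Knowledge Representation' entry stresses that individual, humanly identifiable pieces of knowledge correspond to individual elements in a store, reachable by rapid deterministic operations. A dictionary keyed by a (predicate, subject, object) triple is enough to illustrate that property; the triple format is an assumption for this sketch, not the Atomspace's actual schema.

    # Sketch of an explicit knowledge store: each piece of knowledge is a
    # separate, directly addressable element. Lookup is a deterministic
    # dictionary access, not an inference process. The triple format is an
    # illustrative assumption, not the actual OpenCog Atomspace schema.
    knowledge = {}

    def assert_fact(predicate, subject, obj, truth=1.0):
        knowledge[(predicate, subject, obj)] = truth

    def lookup(predicate, subject, obj):
        return knowledge.get((predicate, subject, obj))   # O(1), deterministic

    assert_fact("is_a", "cat", "animal")
    assert_fact("likes", "pet", "fetch", truth=0.9)
    print(lookup("is_a", "cat", "animal"))    # 1.0
    print(lookup("is_a", "cat", "mineral"))   # None: nothing stored implicitly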
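The 'Extension' entry contrasts a node's extension (the instances of the category it represents) with its intension. The toy sets below only illustrate that distinction; PLN actually attaches uncertain truth values to such relationships rather than using crisp sets.

    # Toy contrast between extension and intension for the category "bird".
    # PLN attaches probabilistic truth values to such memberships; crisp sets
    # are used here only to make the distinction visible.
    extension_of_bird = {"sparrow", "penguin", "ostrich"}          # instances
    intension_of_bird = {"has_feathers", "lays_eggs", "has_beak"}  # properties

    def is_instance(x):
        return x in extension_of_bird          # extensional question

    def has_property(p):
        return p in intension_of_bird          # intensional question

    print(is_instance("penguin"))      # True
    print(has_property("can_fly"))     # False: not part of this toy intension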
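Fishgram is described as mining frequent and/or interesting sub-hypergraphs from the Atomspace. A faithful implementation would need the Atomspace itself, so the sketch below only shows the 'frequent' half of the idea in miniature: each argument of a toy link is abstracted to a variable in turn, and the resulting single-link patterns are counted against a support threshold. The link tuples and threshold are assumptions made for illustration.

    # Miniature 'frequent pattern' count in the spirit of Fishgram: abstract one
    # argument of each link to a variable and count how often the resulting
    # pattern occurs. Real Fishgram grows multi-link sub-hypergraphs and also
    # scores interestingness; this shows only frequency counting on single links.
    from collections import Counter

    links = [
        ("EvaluationLink", "eats", "cat", "fish"),
        ("EvaluationLink", "eats", "dog", "meat"),
        ("EvaluationLink", "eats", "bird", "seed"),
        ("InheritanceLink", "cat", "animal"),
        ("InheritanceLink", "dog", "animal"),
    ]

    patterns = Counter()
    for link in links:
        for i in range(1, len(link)):                 # abstract each argument in turn
            pattern = link[:i] + ("$X",) + link[i + 1:]
            patterns[pattern] += 1

    min_support = 2
    frequent = {p: n for p, n in patterns.items() if n >= min_support}
    print(frequent)   # {('InheritanceLink', '$X', 'animal'): 2}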
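'First-Order Inference' covers PLN rules over Logical Links that involve no VariableAtoms, with truth-value formulas that Higher-Order Inference then reuses. The deduction step below is a sketch of that kind of rule; the independence-based strength formula and the example probabilities are assumptions made for illustration, and real PLN truth values also carry confidence, which is omitted here.

    # Sketch of a first-order deduction step over Inheritance links, i.e. links
    # with no VariableAtoms involved. The strength formula is an
    # independence-based approximation included as an assumption for
    # illustration; actual PLN truth-value formulas also track confidence.
    def deduction_strength(s_ab, s_bc, s_b, s_c):
        """Estimate the strength of A->C from A->B, B->C and node strengths."""
        if s_b >= 1.0:                      # degenerate case: B covers everything
            return s_c
        return s_ab * s_bc + (1.0 - s_ab) * (s_c - s_b * s_bc) / (1.0 - s_b)

    # Inheritance cat -> mammal (0.95) and mammal -> animal (0.98),
    # with assumed node strengths P(mammal) = 0.2 and P(animal) = 0.3.
    print(round(deduction_strength(0.95, 0.98, 0.2, 0.3), 3))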
