HOUSE_OVERSIGHT_013091.jpg

Extraction Summary

People: 0
Organizations: 5
Locations: 0
Events: 0
Relationships: 2
Quotes: 4

Document Information

Type: Academic text / book excerpt (evidence file)
File Size: 2.09 MB
Summary

This document appears to be a page from a technical book or academic text on Artificial General Intelligence (AGI). The page belongs to a section titled "Body and Mind" and discusses how inductive bias can be embedded in an AGI system, contrasting explicit, hand-coded approaches such as Cyc and SOAR with the implicit, architecture-based approach of OpenCog/CogPrime. The page carries a "HOUSE_OVERSIGHT" Bates stamp, indicating it was collected as evidence, likely as part of a larger investigation (potentially related to Jeffrey Epstein's funding of scientific research, though no specific names appear on this page).

Organizations (5)

Cyc
SOAR
OpenCog
CogPrime
DeSTIN

Relationships (2)

1. CogPrime -> OpenCog (System Integration)
   Context: "CogPrime incorporates a combination of the third and fourth options... An artificial endocrine system for OpenCog is also under development..."
2. DeSTIN -> Atomspace (System Integration)
   Context: "CogPrime's generic dynamic knowledge store, the Atomspace, is coupled with specialized hierarchical networks (DeSTIN)..."

Key Quotes (4)

"The particularities of the human mind/body should not be taken as general requirements for general intelligence."
Source
HOUSE_OVERSIGHT_013091.jpg
Quote #1
"To solve this problem without some sort of strong inductive biasing may require massively more experience than young humans obtain."
Source
HOUSE_OVERSIGHT_013091.jpg
Quote #2
"CogPrime incorporates a combination of the third and fourth options."
Source
HOUSE_OVERSIGHT_013091.jpg
Quote #3
"OpenCog has no gastrointestinal nor cardiological nervous system..."
Source
HOUSE_OVERSIGHT_013091.jpg
Quote #4

Full Extracted Text

Complete text extracted from the document (3,417 characters)

9.6 Body and Mind 175
9.6.2.2 Implications for AGI
What lesson should the AGI developer draw from all this? The particularities of the human mind/body should not be taken as general requirements for general intelligence. However, it is worth remembering just how difficult is the computational problem of learning, based on experiential feedback alone, the right way to achieve the complex goal of controlling a system with general intelligence at the human level or beyond. To solve this problem without some sort of strong inductive biasing may require massively more experience than young humans obtain.
Appropriate inductive bias may be embedded in an AGI system in many different ways. Some AGI designers have sought to embed it very explicitly, e.g. with hand-coded declarative knowledge as in Cyc, SOAR and other "GOFAI" type systems. On the other hand, the human brain receives its inductive bias much more subtly and implicitly, both via the specifics of the initial structure of the cognitive cortex, and via ongoing coupling of the cognitive cortex with other systems possessing more focused types of intelligence and more specific structures and/or dynamics.
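[Editor's illustration] The distinction between explicit and implicit inductive bias can be made concrete with a small sketch. The following Python fragment is not part of the source document, and its rules and function names are purely illustrative: it contrasts a Cyc/SOAR-style hand-coded declarative assertion with a structural prior of the kind a perception module embodies implicitly.

```python
# Toy illustration (not from the source): two ways inductive bias can enter a system.

# (a) Explicit bias, Cyc/SOAR style: a human engineer asserts declarative knowledge
#     directly, so the "prior" is the content of the rule base itself.
HAND_CODED_RULES = {
    ("bird", "can"): "fly",
    ("penguin", "cannot"): "fly",
}

def explicit_lookup(subject, relation):
    """Return a hand-coded assertion, if one exists."""
    return HAND_CODED_RULES.get((subject, relation))

# (b) Implicit bias: nothing is asserted, but the structure of the learner
#     constrains what it can easily represent. Here the bias is a locality
#     assumption (each output depends only on neighbouring inputs), loosely
#     analogous to the structural priors a perception hierarchy carries.
def local_smooth(signal, window=3):
    """Average each value with its neighbours; locality is baked into the code."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

if __name__ == "__main__":
    print(explicit_lookup("bird", "can"))       # bias stated outright: "fly"
    print(local_smooth([0.0, 1.0, 0.0, 1.0]))   # bias expressed only by structure
```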
In building an AGI system, one has four choices, very broadly speaking:
1. Create a flexible mind-network, as unbiased as feasible, and attempt to have it learn how to achieve its goals via experience
2. Closely emulate key aspects of the human body along with the human mind
3. Imitate the human mind-body, conceptually if not in detail, and create a number of structurally and dynamically simpler intelligent systems, closely and appropriately coupled to the abstract cognitive mind-network, to provide useful inductive bias.
4. Find some other, creative way to guide and probabilistically constrain one's AGI system's mind-network, providing inductive bias appropriate to the tasks at hand, without emulating even conceptually the way the human mind-brain receives its inductive bias via coupling with simpler intelligent systems.
Our suspicion is that the first option will not be viable. On the other hand, to do the second option would require more knowledge of the human body than biology currently possesses. This leaves the third and fourth options, both of which seem viable to us.
CogPrime incorporates a combination of the third and fourth options. CogPrime's generic dynamic knowledge store, the Atomspace, is coupled with specialized hierarchical networks (DeSTIN) for vision and audition, somewhat mirroring the human cortex. An artificial endocrine system for OpenCog is also under development, speculatively, as part of a project using OpenCog to control humanoid robots. On the other hand, OpenCog has no gastrointestinal nor cardiological nervous system, and the stress-response-based guidance provided to the human brain by a combination of the heart, gut, immune system and other body systems, is achieved in CogPrime in a more explicit way using the OpenPsi model of motivated cognition, and its integration with the system's attention allocation dynamics.
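[Editor's illustration] To make the coupling described above concrete, here is a minimal, hypothetical Python sketch. It does not use the actual OpenCog, Atomspace, DeSTIN or OpenPsi APIs; every class and method name below is an assumption introduced for illustration. It shows a generic knowledge store receiving symbolic percepts from a specialized perception module, with a simple motivation model boosting the attention of goal-relevant items, roughly in the spirit of the text's description.

```python
# Hypothetical sketch only: KnowledgeStore, PerceptionHierarchy and MotivationModel
# are illustrative stand-ins, not the real OpenCog components they are named after.
from dataclasses import dataclass, field

@dataclass
class Atom:
    name: str
    attention: float = 0.0  # short-term importance, adjusted by attention allocation

@dataclass
class KnowledgeStore:
    """Generic dynamic knowledge store (playing the Atomspace's role in this sketch)."""
    atoms: dict = field(default_factory=dict)

    def add(self, name):
        return self.atoms.setdefault(name, Atom(name))

class PerceptionHierarchy:
    """Specialized perceptual module (playing DeSTIN's role in this sketch):
    turns raw input into symbolic percepts for the knowledge store."""
    def perceive(self, raw):
        # Trivial stand-in for hierarchical pattern recognition.
        return raw.lower().split()

class MotivationModel:
    """Motivation model (playing OpenPsi's role in this sketch): boosts the
    attention of atoms relevant to current goals."""
    def __init__(self, goals):
        self.goals = set(goals)

    def allocate_attention(self, store):
        for atom in store.atoms.values():
            atom.attention += 1.0 if atom.name in self.goals else 0.1

if __name__ == "__main__":
    store = KnowledgeStore()
    for percept in PerceptionHierarchy().perceive("red ball near charger"):
        store.add(percept)
    MotivationModel(goals={"charger"}).allocate_attention(store)
    # Goal-relevant percepts end up with more attention than background ones.
    for atom in sorted(store.atoms.values(), key=lambda a: -a.attention):
        print(atom.name, round(atom.attention, 2))
```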
Likely there is no single correct way to incorporate the lessons of intelligent human body-system networks into AGI designs. But these are aspects of human cognition that all AGI researchers should be aware of.
HOUSE_OVERSIGHT_013091
