HOUSE_OVERSIGHT_013229.jpg

1.71 MB

Extraction Summary

People: 4
Organizations: 0
Locations: 2
Events: 0
Relationships: 2
Quotes: 2

Document Information

Type: Academic paper / technical report (excerpt)
File Size: 1.71 MB
Summary

This document appears to be page 313 from a technical book or paper titled 'Measuring Incremental Progress Toward Human-Level AGI' (Artificial General Intelligence). It outlines specific criteria and hypothetical testing scenarios for AI development, focusing on Emotion, Modeling Self and Other, and Social Interaction. The text uses names like Hugo, Cassio, Ben, and Itamar in these scenarios, which likely correspond to real-world AI researchers (e.g., Ben Goertzel). The page bears a 'HOUSE_OVERSIGHT_013229' stamp, indicating it is part of a larger government document collection.

People (4)

Name | Role | Context
Hugo | Character in AI scenario | Used in an example regarding subgoal creation and pleasing a subject.
Cassio | Character in AI scenario | Used in examples involving emotion, theory of mind, and other-awareness. (Likely a reference to AI researcher Cassio ...)
Ben | Character in AI scenario | Used in examples involving emotion, theory of mind, and interaction. (Likely a reference to AI researcher Ben Goertzel.)
Itamar | Character in AI scenario | Used in an example regarding empathy. (Likely a reference to AI researcher Itamar Arel.)

Locations (2)

Location | Context
Preschool | Hypothetical setting for an AI learning task.
Lab | Hypothetical setting for an AI self-control task.

Relationships (2)

Ben and Cassio (interaction within scenario): Ben points at Cassio's tower; Ben asks Cassio to carry out a task.
Ben and Itamar (interaction within scenario): Itamar is happy because Ben likes his tower of blocks.

Key Quotes (2)

"Given the goal of pleasing Hugo, can the robot learn that telling Hugo facts it has learned but not told Hugo before, will tend to make Hugo happy?"
Source
HOUSE_OVERSIGHT_013229.jpg
Quote #1
"The robot needs to set these experiences aside, and not let them impair its self-model significantly; it needs to keep on thinking it’s a good robot"
Source
HOUSE_OVERSIGHT_013229.jpg
Quote #2

Full Extracted Text

Complete text extracted from the document (2,699 characters)

17.2 Measuring Incremental Progress Toward Human-Level AGI 313
• Subgoal creation, based on its preprogrammed goals and its reasoning and planning
– Example task: Given the goal of pleasing Hugo, can the robot learn that telling
Hugo facts it has learned but not told Hugo before, will tend to make Hugo happy?
• Affect-based motivation
– Example task: Given the goal of gratifying its curiosity, can the robot figure out that
when someone it’s never seen before has come into the preschool, it should watch
them because they are more likely to do something new?
• Control of emotions
– Example task: When the robot is very curious about someone new, but is in the
middle of learning something from its teacher (who it wants to please), can it control
its curiosity and keep paying attention to the teacher?
9. Emotion
• Expressing Emotion
– Example task: Cassio steals the robot’s toy, but Ben gives it back to the robot. The
robot should appropriately display anger at Cassio, and gratitude to Ben.
• Understanding Emotion
– Example task: Cassio and the robot are both building towers of blocks. Ben points
at Cassio’s tower and expresses happiness. The robot should understand that Ben
is happy with Cassio’s tower.
10. Modeling Self and Other
• Self-Awareness
– Example task: When someone asks the robot to perform an act it can’t do (say,
reaching an object in a very high place), it should say so. When the robot is given
the chance to get an equal reward for a task it can complete only occasionally, versus
a task it finds easy, it should choose the easier one.
• Theory of Mind
– Example task: While Cassio is in the room, Ben puts the red ball in the red box.
Then Cassio leaves and Ben moves the red ball to the blue box. Cassio returns and
Ben asks him to get the red ball. The robot is asked to go to the place Cassio is
about to go.
• Self-Control
– Example task: Nasty people come into the lab and knock down the robot’s towers,
and tell the robot he’s a bad boy. The robot needs to set these experiences aside,
and not let them impair its self-model significantly; it needs to keep on thinking it’s
a good robot, and keep building towers (that its teachers will reward it for).
• Other-Awareness
– Example task: If Ben asks Cassio to carry out a task that the robot knows Cassio
cannot do or does not like to do, the robot should be aware of this, and should bet
that Cassio will not do it.
• Empathy
– Example task: If Itamar is happy because Ben likes his tower of blocks, or upset
because his tower of blocks is knocked down, the robot is asked to identify and then
display these same emotions
11. Social Interaction
HOUSE_OVERSIGHT_013229
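The Theory of Mind task above is a classic false-belief (Sally-Anne style) test: the robot must predict where Cassio will look based on what Cassio last saw, not on the true state of the world. As an illustrative sketch only (the `BeliefTracker` class and its methods are hypothetical, not from the source), the logic can be modeled by tracking each observer's last-seen location per object:

```python
# Minimal sketch of a false-belief tracker: each agent's belief about an
# object's location is updated only when that agent witnesses a move.
# All class and method names here are illustrative, not from the source.

class BeliefTracker:
    def __init__(self, agents):
        # Each agent's beliefs: object name -> believed location.
        self.beliefs = {agent: {} for agent in agents}

    def move(self, obj, location, present):
        # Only agents present in the room observe the move.
        for agent in present:
            self.beliefs[agent][obj] = location

    def where_will_look(self, agent, obj):
        # An agent searches where it last saw the object.
        return self.beliefs[agent].get(obj)

# Scenario from the text: while Cassio is in the room, Ben puts the red
# ball in the red box; Cassio leaves; Ben moves it to the blue box.
tracker = BeliefTracker(["Ben", "Cassio", "robot"])
tracker.move("red ball", "red box", present=["Ben", "Cassio", "robot"])
tracker.move("red ball", "blue box", present=["Ben", "robot"])  # Cassio is out

print(tracker.where_will_look("Cassio", "red ball"))  # → red box
print(tracker.where_will_look("robot", "red ball"))   # → blue box
```

The robot passes the task by answering with Cassio's (now false) belief, the red box, rather than the ball's true location.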
