
Edsinger+Kemp NEMS 2006

Manipulation in Human Environments

Aaron Edsinger & Charlie Kemp
Humanoid Robotics Group

MIT CSAIL


Domo

• 29 DOF
• 6 DOF Series Elastic Actuator (SEA) arms
• 4 DOF SEA hands
• 2 DOF SEA neck
• Active vision head
• Stereo cameras
• Gyroscope
• Sense joint angle + torque
• 15 node Linux cluster
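The arms, hands, and neck use Series Elastic Actuators, which place a spring between the motor and the joint so that joint torque can be read from the spring's deflection. A minimal sketch of that sensing principle (the stiffness and angle values are illustrative, not Domo's actual parameters):

```python
# Sketch: torque sensing in a Series Elastic Actuator (SEA).
# A spring of known stiffness sits between the motor output and the joint;
# by Hooke's law, torque is proportional to the measured deflection.
# k_spring below is an illustrative value, not Domo's actual stiffness.

def sea_torque(theta_motor: float, theta_joint: float, k_spring: float) -> float:
    """Estimate joint torque (N*m) from spring deflection (rad)."""
    return k_spring * (theta_motor - theta_joint)

# 0.01 rad of deflection across a 100 N*m/rad spring is about 1 N*m of torque
print(sea_torque(theta_motor=0.51, theta_joint=0.50, k_spring=100.0))
```

Because the same deflection reading gives both force sensing and passive compliance, SEAs suit contact-rich manipulation in human environments.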


Manipulation in Human Environments

● Work with everyday objects
● Collaborate with people
● Perform useful tasks

Human environments are designed to match our cognitive and physical abilities


Applications

● Aging in place

● Cooperative manufacturing

● Household chores


Three Themes

● Let the body do the thinking

● Collaborative manipulation

● Task relevant features


Let the Body do the Thinking

● Design
  – Passive compliance
  – Force control
  – Human morphology


Let the Body do the Thinking

● Compensatory behaviors
  – Reduce uncertainty
  – Modulate arm stiffness
  – Aid perception (motion, visibility)
  – Test assumptions (explore)



Collaborative Manipulation

• Complementary actions
• Person can simplify perception and action for the robot
• Robot can provide intuitive cues for the human

• Requires matching to our social interface


Collaborative Manipulation

Social amplification


Collaborative Manipulation

● A third arm:
  – Hold a flashlight
  – Fixture a part

● Extend our physical abilities:
  – Carry groceries
  – Open a jar

● Expand our workspace:
  – Place dishes in a cabinet
  – Hand a tool
  – Reach a shelf


Task Relevant Features

● What is important?
● What is irrelevant?

*Distinct from object detection/recognition.


Structure In Human Environments

Donald Norman, The Design of Everyday Things


Structure In Human Environments

● Sense from above
● Flat surfaces
● Objects for human hands
● Objects for use by humans

Human environments are constrained to match our cognitive and physical abilities


Why are tool tips common?

● Single, localized interface to the world
● Physical isolation helps avoid irrelevant contact
  – Helps perception
  – Helps control


Tool Tip Detection

● Visual + motor detection method
● Kinematic estimate
● Visual model
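Combining the kinematic estimate with visual detections can be sketched as a simple gated fusion: the arm's kinematic model predicts roughly where the grasped tool's tip should appear in the image, and visual detections near that prediction are trusted while distant ones are rejected. The gating radius and fallback rule below are assumptions for illustration, not the authors' exact method.

```python
import math

def fuse_tip(kinematic_xy, detections, gate_px=40.0):
    """Return the visual tip detection closest to the kinematic prediction,
    falling back to the kinematic estimate if none lies within the gate.
    gate_px is an assumed threshold, not a value from the talk."""
    best, best_d = None, gate_px
    for (x, y) in detections:
        d = math.hypot(x - kinematic_xy[0], y - kinematic_xy[1])
        if d < best_d:
            best, best_d = (x, y), d
    return best if best is not None else kinematic_xy

# A detection near the predicted tip wins; a far-off spurious one is ignored.
print(fuse_tip((100, 100), [(300, 20), (110, 95)]))  # (110, 95)
```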


Mean Pixel Error for Automatic and Hand Labelled Tip Detection


Mean Pixel Error for Hand Labeled, Multi-Scale Detector, and Point Detector
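The error metric behind both comparisons is presumably the mean Euclidean distance in pixels between detected tip positions and hand-labeled ground truth; a minimal sketch of that computation:

```python
import math

def mean_pixel_error(detected, labeled):
    """Mean Euclidean distance (pixels) between paired detections and labels."""
    dists = [math.hypot(dx - lx, dy - ly)
             for (dx, dy), (lx, ly) in zip(detected, labeled)]
    return sum(dists) / len(dists)

# Two detections, 3 px and 4 px off their labels: mean error 3.5 px
print(mean_pixel_error([(10, 10), (20, 24)], [(10, 13), (20, 20)]))  # 3.5
```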


Model-Free Insertion


• Active tip perception
• Arm stiffness modulation
• Human interaction


Other Examples

● Circular openings● Handles● Contact Surfaces● Gravity Alignment


Future: Generalize What You've Learned

● Across objects
  – Perceptually map tasks across objects
  – Key features map to key features

● Across manipulators
  – Motor equivalence
  – Manipulator details may be irrelevant


RSS 2006 Workshop
Manipulation for Human Environments

Robotics: Science and Systems, University of Pennsylvania, August 19th, 2006

manipulation.csail.mit.edu/rss06


Summary

● Importance of Task Relevant Features

● Example of the tool tip
  – Large set of hand tools
  – Robust detection (visual + motor)
  – Kinematic estimate
  – Visual model


In Progress

● Perform a variety of tasks
  – Insertion
  – Pouring
  – Brushing


Learning from Demonstration


The Detector Responds To

● Fast motion
● Convexity


Detection pipeline (figure): Video from Eye Camera → Motion Weighted Edge Map → Multi-scale Histogram (Medial-Axis, Hough Transform for Circles) → Local Maxima
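The motion-weighting stage can be illustrated on a toy 1-D "image": edge strength (gradient magnitude) is multiplied by the frame-to-frame change at each location, so static edges are suppressed and only moving edges survive. The real detector operates on 2-D video; this sketch shows only the weighting idea, with made-up pixel values.

```python
# Sketch: motion-weighted edge map on 1-D rows (illustrative, not the
# authors' implementation).

def motion_weighted_edges(prev_row, cur_row):
    """Edge magnitude weighted by local frame-to-frame change."""
    d = [abs(c - p) for c, p in zip(cur_row, prev_row)]              # per-pixel change
    edges = [abs(cur_row[i + 1] - cur_row[i]) for i in range(len(cur_row) - 1)]
    motion = [max(d[i], d[i + 1]) for i in range(len(d) - 1)]        # change near each edge
    return [e * m for e, m in zip(edges, motion)]

prev = [0, 0, 0, 0, 0]
cur  = [0, 0, 9, 9, 0]   # a bright blob that just moved into view
print(motion_weighted_edges(prev, cur))  # [0, 81, 0, 81]
```

Both edges of the newly appeared blob get a strong response, while a blob present in both frames would score zero.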


Defining Characteristics

● Geometric
  – Isolated
  – Distal
  – Localized
  – Convex

● Cultural/Design
  – Far from natural grasp location
  – Long distance relative to hand size


Other Task Relevant Features?


Detecting the Tip


Include Scale and Convexity


Distinct Perceptual Problem

● Not object recognition
● How should it be used?
● Distinct methods and features


Use The Hand's Frame

● Combine weak evidence
● Rigidly grasped
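Because the tool is rigidly grasped, its tip is a fixed point in the hand's coordinate frame: transforming each noisy per-frame detection into that frame lets weak evidence be averaged into a stable estimate. A 2-D sketch under that assumption (poses and pixel values are illustrative):

```python
import math

def to_hand_frame(det_xy, hand_pose):
    """Express a world-frame detection in the hand frame given (x, y, theta)."""
    hx, hy, th = hand_pose
    dx, dy = det_xy[0] - hx, det_xy[1] - hy
    c, s = math.cos(-th), math.sin(-th)   # rotate by -theta into hand frame
    return (c * dx - s * dy, s * dx + c * dy)

def average_tip(detections, poses):
    """Average per-frame detections in the hand frame into one tip estimate."""
    pts = [to_hand_frame(d, p) for d, p in zip(detections, poses)]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

# Two frames: hand at identity, then rotated 90 degrees; the true tip sits
# at (10, 0) in the hand frame, and each detection is slightly noisy.
dets = [(10.1, -0.1), (-0.1, 9.9)]
poses = [(0.0, 0.0, 0.0), (0.0, 0.0, math.pi / 2)]
print(average_tip(dets, poses))  # close to (10.0, 0.0)
```

Each individual detection is weak, but the rigid-grasp constraint makes them consistent in the hand frame, so the average converges on the true tip.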


Acquire a Visual Model
