
Haptic Sensing and Perception for Telerobotic Manipulation

Emil M. Petriu, Dr. Eng., P.Eng., FIEEE, Professor

School of Information Technology and Engineering
University of Ottawa
Ottawa, ON, K1N 6N5, Canada
http://www.site.uottawa.ca/~petriu

Telepresence for Remote Robotic Applications

Teleoperation has been developed as an extension of the human intervention capability in remote or hazardous environments.

Remotely operated robots are designed to work in environments that are unknown or imprecisely known. Beyond planetary applications, this type of robot control is used for work in the deep sea, in harsh operating environments, or in remote medical operations.

Telepresence allows human teleoperators to experience the feeling that they are present in the remote environment.

[Figure: the telepresence loop. A robot arm fitted with a video camera and tactile sensors handles the manipulated object; the human operator receives haptic feedback and views, through a head-mounted display, a virtual model of the object manipulated in the physical world.]

Video and haptic virtual reality interfaces allow a human operator to remotely control a robot manipulator equipped with a video camera and tactile sensors.

Two-point limen test: 2.5 mm for the fingertip, 11 mm for the palm, 67 mm for the thigh (from [Burdea & Coiffet 2003]).

[Burdea & Coiffet 2003] G. Burdea and P. Coiffet, Virtual Reality Technology, 2nd ed., Wiley, New Jersey, 2003.

Human Haptic System

The robotic telemanipulation system has a bilateral architecture that connects the human operator and the telerobotic manipulator as transparently as possible.

Using a head-mounted display for augmented visual virtual reality and a haptic feedback system, the human operator controls the operation of a remote robot manipulator equipped with a video camera and tactile sensors placed in the robot's hand.

The tactile sensors provide the cutaneous information at the remote robotic operation site. The joint sensors of the robot arm provide the kinesthetic information.

A tactile human-computer interface provides the cutaneous feedback allowing the human operator to feel with his/her own sense of touch the same sensation as that acquired by the remote robot hand from its artificial tactile sensor. A robot-like kinematic structure provides the kinesthetic feedback for the haptic human-computer interface.

Robotic Haptic System

Sensory data gathered from vision, joint encoders, and tactile sensors are integrated in a unique framework which allows one to deal in a common way with all the properties of the manipulated object. These include the object's 3D geometry/shape, its surface-material properties, and the contact forces occurring during object handling by the robot. This framework also solves the sensor redundancy problems that occur when more sensors than strictly required are used to measure a given object parameter.

Multi Sensor Data Fusion

The multi-sensor data fusion system has a hierarchical architecture with an ascending sensory processing path in parallel with a descending task-decomposition control path connected via world models at each level. The processing time at each level is reduced by the use of a priori knowledge from the world model that provides predictions to the sensory system. The use of a world model promotes modularity because the specific information requirements of the sensory and control hierarchies are decoupled.
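As an illustration only, the following minimal Python sketch shows how such a prediction-driven coupling between one sensory level and its world model can work; every class and function name here is hypothetical, not taken from the actual system:

```python
# Minimal sketch of one level of the hierarchical fusion architecture:
# the ascending sensory path checks measurements against world-model
# predictions, and the descending control path plans from the same model.
# All names are hypothetical.

class WorldModel:
    """Shared state linking the sensory and control hierarchies."""
    def __init__(self, state):
        self.state = dict(state)

    def predict(self):
        # A priori knowledge handed to the sensory system, e.g. the
        # expected pose of the manipulated object.
        return dict(self.state)

    def update(self, corrections):
        self.state.update(corrections)

def sensory_step(measurements, model):
    """Ascending path: report only deviations from the prediction,
    which is what keeps per-level processing time low."""
    predicted = model.predict()
    corrections = {key: value for key, value in measurements.items()
                   if predicted.get(key) != value}
    model.update(corrections)
    return corrections

def control_step(goal, model):
    """Descending path: decompose the task using the current model."""
    return {"goal": goal, "object_pose": model.predict()["object_pose"]}

model = WorldModel({"object_pose": (0.00, 0.00, 0.00)})
sensory_step({"object_pose": (0.01, 0.00, 0.00)}, model)  # vision correction
print(control_step("grasp", model))
```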

The time clutch concept is used to disengage the synchrony between operator specification time and telerobot operation time during path specification. In order to avoid fatal errors and reduce the effect of the communication delay, we use a distributed virtual environment that maintains a shared world model of the physical environment where the telemanipulation operation is conducted.

Another critical requirement is the need to maintain synchronism between the visual and the haptic feedback. Although they are two distinct sensing modalities, haptic perception and vision measure the same type of geometric parameters of the manipulated 3D objects.

Multi Sensor Data Fusion (continued) …

[Figure: block diagram of the model-based telepresence control system. The ROBOT (joint/wheel servos, wheel/steer encoders, I.R. range sensors, vision, onboard computer) interacts with the ENVIRONMENT and exchanges raster images, wheel positions, and actuator commands over a local connection. A remote connection links it to the model-based controller: object recognition and trajectory-parameter estimation feed a geometric world model and a position model, which supply object identities and POSEs; the task planner, trajectory planner, robot model, and frame transforms generate path and position specifications under trajectory constraints. The TELEOPERATOR closes the loop through a video monitor, a tactile monitor & joystick, a tactile sensor, and a composite world model.]

Model-based telepresence control of a robot

Haptic Control for Object Manipulation.

Human tactile perception is a complex act with two distinct modes. First, passive sensing, which is produced by the "cutaneous" sensory network, provides information about contact force, contact geometric profile, and temperature. Second, active sensing integrates the cutaneous sensory information with "kinesthetic" sensory information (the limb/joint positions and velocities).

Tactile sensing is the result of a deliberate exploratory perception act. Since the cutaneous information provided by the tactile probe is local, it is necessary to add a "carrier" (kinesthetic capability) to allow the probe to move around on the explored object surface. The local cutaneous information provided by the different touch frames is integrated with the kinesthetic position parameters of the carrier resulting in a "global view" (geometric model) of the explored object.

The robotic manipulator consists of a 5-axis commercial robot, an instrumented passive-compliant wrist and a 16-by-16 tactile probe. Position sensors placed in the robot’s joints and on the instrumented passive-compliant wrist provide the kinesthetic information. The compliant wrist allows the robot-hand equipped with tactile sensors to accommodate the constraints of the explored object surface and thus to increase the amount of cutaneous tactile information.

Instrumented passive-compliant wrist for the tactile exploration of objects.


The tactile probe is based on a 16-by-16 matrix of Force Sensing Resistor (FSR) elements spaced 1.58 mm apart on a 6.5 cm2 (1 square inch) area. The FSR elements exhibit exponentially decreasing electrical resistance with applied normal force: the resistance changes by two orders of magnitude over a pressure range of 1 N/cm2 to 100 N/cm2.
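Because the response is close to a straight line on a log-log plot, a per-taxel pressure estimate can be obtained from two calibration points. The Python sketch below is illustrative only; the calibration resistances are assumed values, not measurements of the actual probe.

```python
import math

# Hypothetical two-point calibration of one FSR taxel: resistance drops
# about two orders of magnitude as pressure rises from 1 to 100 N/cm^2.
R_LOW = 100e3    # ohms at 1 N/cm^2 (assumed)
R_HIGH = 1e3     # ohms at 100 N/cm^2 (assumed)

def pressure_from_resistance(r_ohms):
    """Log-log interpolation between the two calibration points,
    returning the estimated normal pressure in N/cm^2."""
    slope = (math.log10(100.0) - math.log10(1.0)) / (
        math.log10(R_HIGH) - math.log10(R_LOW))
    log_p = slope * (math.log10(r_ohms) - math.log10(R_LOW))
    return 10.0 ** log_p

print(pressure_from_resistance(10e3))   # mid-range reading -> ~10 N/cm^2
```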

[Figure: force-sensitive transducer covered by a 2D-sampling elastic overlay; an external force presses a 3D object onto the probe (x, y, z axes shown).]

Tab-shaped elastic overlay.

Protruding round tabs allow the material to expand without stress in the x and y directions and to compress in the z direction, reducing the elastic overlay's blurring effect.

The elastic overlay has a protective damping effect against impulsive contact forces, and its elasticity resets the transducer system when the probe ceases to touch the object. However, the overlay may cause considerable blurring distortion in the sensing process if it is not properly controlled. We avoided this by replacing the one-piece pad with a custom-designed elastic overlay consisting of a relatively thin membrane with protruding round tabs. This construction allows the material to expand without any stress in the x and y directions, making possible its compression in the z direction proportionally with the normal stress component. The tabs are arranged in a 16-by-16 array, with a tab on top of each node of the FSR matrix. This tab configuration provides a de facto spatial sampling, which reduces the elastic overlay's blurring effect on the high 2D sampling resolution of the FSR transducer.

Experimental results illustrating the positive effect of the tab-shaped overlay: 16-by-16 median-filtered tactile image of a washer.

Human-oriented sensory interfaces allow the teleoperator to experience virtual reality sensations of visual, force, and tactile nature.

These interfaces should present easily perceivable and task-related sensor information displays (monitors) in such a way as to enhance the teleoperator's control capabilities.

Human - Computer Interfaces

Haptic perception is a complex exploratory act integrating two distinct modes: (i) a cutaneous tactile sensor provides information about contact force, contact geometric profile, and the temperature of the touched object; (ii) a kinesthetic sensory system provides information about the positions and velocities of the kinematic structure (e.g. the hand) carrying the tactile sensor.

There are various cutaneous mechanoreceptors located in the outer layers of the skin. These receptors are specialized nervous cells or neurons. The free nerve endings are the most numerous and play an active role in the perception of pain, cold and warmth.

These mechanoreceptors have preferential frequency response characteristics: the Pacinian corpuscle (PC) units are most sensitive around 250-300 Hz but respond from 30 Hz up to very high frequencies; the Rapidly Adapting (RA) units' effective frequency range is between 10 and 200 Hz, with more sensitivity below 100 Hz; the Slowly Adapting (SA) units respond at low frequencies, under 40-50 Hz.

The human kinesthetic function has a much lower frequency band.

The Human Haptic Perception

• Uses 18-22 linear sensors (electrical strain gauges);
• Angles are obtained by measuring voltages on a Wheatstone bridge;
• 112 "filtered" gesture records per second;
• Sensor resolution 0.5 degrees, but errors accumulate toward the fingertip (open kinematic chain);
• Sensor repeatability 1 degree;
• Needs calibration when put on the hand.

CyberGlove®

Immersion 3D Interaction <http://www.immersion.com/>

[Figures: the CyberGrasp™ pack, CyberForce®, CyberTouch™, and CyberGrasp™ haptic interface products. Immersion 3D Interaction <http://www.immersion.com/>]

The commercial "Virtual Hand Toolkit for CyberGlove/Grasp" provides the kinesthetic human-computer interface.

Performance comparison of various sensing gloves (from [Burdea & Coiffet 2003]).

Haptic Feedback to the Human Operator

[Figure: tactile sensation reconstruction. Tactile monitors (TM) on the human hand reproduce the tactile images acquired by the tactile sensors (TS) on the robot hand.]

A tactile monitor placed on the operator's palm should allow the human teleoperator to virtually feel by touch the object profile measured by the tactile sensors placed in the jaws of the robot gripper.

“Tactile Display” for Computer-Human Interface

Cutaneous tactile monitor developed at the University of Ottawa in the early 1990s: an 8-by-8 array of electromagnetic vibrotactile stimulators with a 6.5 cm2 active area (the same as the tactile sensor).

Pseudo-Random Multi-Valued Sequences (PRMVS)

A more compact absolute position encoding can be obtained by using Pseudo-Random Multi-Valued Sequences (PRMVS), where the sequence elements are entries taken from an alphabet with more than two symbols.

[Figure: PRMVS-encoded track. A window sliding over the sequence recovers the position index p = 0 … 14; the absolute position is P = p·q, where q is the quantization step, measured from an origin pointer.]

As an example, a two-stage shift register (n = 2), with the feedback defined by the primitive polynomial h(x) = x^2 + x + A over GF(4) = {0, 1, A, A^2}, with A^2 + A + 1 = 0 and A^3 = 1, generates the 15-term PRMVS {0, 1, 1, A^2, 1, 0, A, A, 1, A, 0, A^2, A^2, A, A^2}. Any 2-tuple seen through a 2-position window sliding over this sequence is unique.
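To make the recurrence concrete, here is a minimal Python sketch (with an assumed integer encoding 0, 1, 2, 3 for the field elements 0, 1, A, A^2) that reproduces the 15-term PRMVS above. In GF(4) addition is bitwise XOR of the 2-bit representations, and since the characteristic is 2, h(x) = x^2 + x + A gives the recurrence a(t+2) = a(t+1) + A·a(t).

```python
# GF(4) = {0, 1, A, A^2} encoded as integers 0, 1, 2, 3 (A = 2, A^2 = 3).
# Addition in GF(4) is XOR of the 2-bit representations.

GF4_MUL = [  # multiplication table, using A^2 = A + 1 and A^3 = 1
    [0, 0, 0, 0],
    [0, 1, 2, 3],
    [0, 2, 3, 1],   # A*A = A^2, A*A^2 = 1
    [0, 3, 1, 2],   # A^2*A^2 = A
]
A = 2

def prmvs_gf4(seed=(0, 1), length=15):
    """Generate the PRMVS from h(x) = x^2 + x + A over GF(4):
    a(t+2) = a(t+1) XOR A*a(t)  (characteristic 2, so + is XOR)."""
    a = list(seed)
    while len(a) < length:
        a.append(a[-1] ^ GF4_MUL[A][a[-2]])
    return a

# Prints [0, 1, 1, 3, 1, 0, 2, 2, 1, 2, 0, 3, 3, 2, 3], i.e. the sequence
# {0, 1, 1, A^2, 1, 0, A, A, 1, A, 0, A^2, A^2, A, A^2} given in the text.
print(prmvs_gf4())
```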

Each stimulator corresponds to a 2-by-2 window in the tactile sensor array. The vibrotactile stimulators are used as binary devices that are activated when at least two of the corresponding taxels (tactile elements) in the tactile sensor array window are "on". The figure shows the tactile feedback of a curved edge.
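A minimal sketch of this 16-by-16 sensor to 8-by-8 display mapping (Python; the input image below is synthetic, and the thresholding rule is exactly the one described above):

```python
import numpy as np

def tactile_to_vibrotactile(taxels):
    """Map a 16x16 binary tactile image to an 8x8 binary stimulator
    pattern: each stimulator covers a 2x2 taxel window and is switched
    on when at least two taxels in its window are on."""
    taxels = np.asarray(taxels, dtype=int)
    assert taxels.shape == (16, 16)
    # Sum each non-overlapping 2x2 window.
    window_sums = taxels.reshape(8, 2, 8, 2).sum(axis=(1, 3))
    return (window_sums >= 2).astype(int)

# Example: a synthetic curved edge in the tactile image.
img = np.zeros((16, 16), dtype=int)
for x in range(16):
    y = int(0.05 * (x - 8) ** 2) + 4   # hypothetical parabolic edge
    img[min(y, 15), x] = 1
    img[min(y + 1, 15), x] = 1
print(tactile_to_vibrotactile(img))
```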

Pseudo-Random Binary Encoding

A practical solution allowing absolute position recovery with any desired n-bit resolution while employing only one binary track, regardless of the value of n.

[Figure: n-stage feedback shift register with cells R(n), R(n-1), …, R(2), R(1); the feedback produces the new element R(0).]

Feedback for direct PRBS: R(0) = R(n) ⊕ c(n-1)·R(n-1) ⊕ … ⊕ c(1)·R(1)

Feedback for reverse PRBS: R(n+1) = R(1) ⊕ b(2)·R(2) ⊕ … ⊕ b(n)·R(n)

Shift register length n | Feedback for direct PRBS         | Feedback for reverse PRBS
 4                      | R(0) = R(4) ⊕ R(1)               | R(5) = R(1) ⊕ R(2)
 5                      | R(0) = R(5) ⊕ R(2)               | R(6) = R(1) ⊕ R(3)
 6                      | R(0) = R(6) ⊕ R(1)               | R(7) = R(1) ⊕ R(2)
 7                      | R(0) = R(7) ⊕ R(3)               | R(8) = R(1) ⊕ R(4)
 8                      | R(0) = R(8) ⊕ R(4) ⊕ R(3) ⊕ R(2) | R(9) = R(1) ⊕ R(3) ⊕ R(4) ⊕ R(5)
 9                      | R(0) = R(9) ⊕ R(4)               | R(10) = R(1) ⊕ R(5)
10                      | R(0) = R(10) ⊕ R(3)              | R(11) = R(1) ⊕ R(4)

Table 1. Feedback equations for PRBS generation
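As a sketch of how such a generator runs (Python; using the n = 5 feedback from Table 1, which reproduces the 31-term PRBS shown in the example that follows):

```python
from functools import reduce

def prbs(n, taps, seed):
    """One period (2^n - 1 bits) of a PRBS from the Table 1 recurrence
    R(0) = R(taps[0]) XOR R(taps[1]) XOR ...; equivalently the new bit
    s(t) is the XOR of the bits s(t - k) for each tap k."""
    s = list(seed)                        # the first n bits (not all zero)
    for t in range(n, 2 ** n - 1):
        s.append(reduce(lambda a, b: a ^ b, (s[t - k] for k in taps)))
    return s

# n = 5 with R(0) = R(5) XOR R(2) and seed 0 0 0 0 1 reproduces the
# 31-term PRBS 0000101011101100011111001101001 used below.
seq = prbs(5, (5, 2), [0, 0, 0, 0, 1])
print("".join(map(str, seq)))
```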

Robotic Tactile Recognition of Pseudo-Random Encoded Objects

p =    0         5         10        15        20        25        30
PRBS = 0 0 0 0 1 0 1 0 1 1 1 0 1 1 0 0 0 1 1 1 1 1 0 0 1 1 0 1 0 0 1

A (2^n − 1)-term Pseudo-Random Binary Sequence (PRBS) generated by an n-bit modulo-2 feedback shift register is used as a one-bit-per-quantization-step absolute code. The absolute position identification is based on the PRBS window property: any n-tuple seen through an n-bit window sliding over the PRBS is unique and hence fully identifies each position of the window.

[Figure: the same 31-position PRBS-encoded track, read through the milestones Q(0), Q(1), Q(2), Q(3).]

Serial-parallel code conversion of the absolute position p=18 on a 31-position PRBS encoded track with four milestones.

The figure shows, as an example, a 31-term PRBS: 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, generated by a 5-bit shift register. The 5-bit n-tuples seen through a window sliding over this PRBS are unique and represent a 1-bit-wide absolute position code.
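A sketch of the window-property decoding (Python; the lookup table is built offline from one period of the sequence):

```python
SEQ = [0,0,0,0,1,0,1,0,1,1,1,0,1,1,0,0,0,1,1,1,1,1,0,0,1,1,0,1,0,0,1]

def window_lookup(seq, n):
    """Map each n-bit window (read cyclically) of a PRBS to its unique
    starting position p, using the PRBS window property."""
    table = {}
    m = len(seq)
    for p in range(m):
        window = tuple(seq[(p + i) % m] for i in range(n))
        table[window] = p
    return table

lut = window_lookup(SEQ, 5)
# Reading the 5 bits that start at track position 18 recovers p = 18,
# matching the serial-parallel conversion example above.
print(lut[tuple(SEQ[18:23])])   # -> 18
```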

[Figure: PRBS-encoded track with origin and pointer; the absolute position is P = p·q, where p = 0 … 30 is the decoded window index and q is the quantization step.]

Primitive polynomials h(x) generating PRMVS over GF(q):

n  | q = 3           | q = 4               | q = 8           | q = 9
2  | x^2+x+2         | x^2+x+A             | x^2+Ax+A        | x^2+x+A
3  | x^3+2x+1        | x^3+x^2+x+A         | x^3+x+A         | x^3+x+A
4  | x^4+x+2         | x^4+x^2+Ax+A^2      | x^4+x+A^3       | x^4+x+A^5
5  | x^5+2x+1        | x^5+x+A             | x^5+x^2+x+A^3   | x^5+x^2+A
6  | x^6+x+2         | x^6+x^2+x+A         | x^6+x+A         | x^6+x^2+Ax+A
7  | x^7+x^6+x^4+1   | x^7+x^2+Ax+A^2      | x^7+x^2+Ax+A^3  | x^7+x+A
8  | x^8+x^5+2       | x^8+x^3+x+A         |                 |
9  | x^9+x^7+x^5+1   | x^9+x^2+x+A         |                 |
10 | x^10+x^9+x^7+2  | x^10+x^3+A(x^2+x+1) |                 |

The following relations apply:
for GF(4) = GF(2^2): A^2+A+1 = 0, A^2 = A+1, and A^3 = 1;
for GF(8) = GF(2^3): A^3+A+1 = 0, A^3 = A+1, A^4 = A^2+A, A^5 = A^2+A+1, A^6 = A^2+1, and A^7 = 1;
for GF(9) = GF(3^2): A^2+2A+2 = 0, A^2 = A+1, A^3 = 2A+1, A^4 = 2, A^5 = 2A, A^6 = 2A+2, A^7 = A+2, and A^8 = 1.

According to the PRMVS window property, any q-valued content observed through an n-position window sliding over the PRMVS is unique and fully identifies the current position of the window.

[Figure: the 15-by-17 array of GF(4) symbols 0, 1, A, A^2.]

15-by-17 PRA obtained by folding a 255-element PRS defined over GF(4), with q = 4, n = 4, k1 = 2, k2 = 2, n1 = q^k1 − 1 = 15, and n2 = (q^n − 1)/n1 = 17.
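One common folding scheme for such arrays writes element k of the sequence into row k mod n1, column k mod n2, which covers every cell exactly once because gcd(15, 17) = 1. The Python sketch below illustrates that scheme; whether it is the exact folding used for the array above is an assumption, and the demo sequence is a placeholder, not the actual PRS.

```python
def fold_sequence(seq, n1, n2):
    """Fold a pseudo-random sequence of length n1*n2 into an n1-by-n2
    array by the diagonal rule array[k % n1][k % n2] = seq[k].
    Because gcd(n1, n2) = 1, each cell is written exactly once (CRT)."""
    assert len(seq) == n1 * n2
    array = [[None] * n2 for _ in range(n1)]
    for k, symbol in enumerate(seq):
        array[k % n1][k % n2] = symbol
    return array

# Example with the parameters from the caption: a 255-term sequence over
# GF(4) (elements encoded 0..3) folded into a 15-by-17 array.
demo_seq = [(3 * k + k * k) % 4 for k in range(255)]   # hypothetical data
pra = fold_sequence(demo_seq, 15, 17)
print(len(pra), len(pra[0]))   # -> 15 17
```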

[Figures: mapping geometry. An M-by-N encoding array with cell indices i = 0 … M−1, j = 0 … N−1 along the X and Y axes; object-model vectors V(k, 1) … V(k, 4) in a frame with origin O(k), and array vectors V(r, 1) … V(r, 6) in a frame with origin O(r), both with X, Y, Z axes.]

3D object models are unfolded and mapped on the encoding pseudo-random array.

PRA code elements are Braille-like embossed on object surfaces.

The binary symbols used to mark "0" and "1" are recognized on the basis of the number of end-points and vertices: the symbol on the left, representing "0", has 2 end-points and 1 vertex-point, and the symbol on the right, representing "1", has 3 end-points and 2 vertex-points.

The shape of the embossed symbols is specially designed for easy tactile recognition. For efficient pattern recognition, the particular shapes of the binary symbols were selected to meet the following conditions: (i) there is enough information at the symbol level to provide an immediate indication of the grid orientation; (ii) the symbol recognition procedure is invariant to position and orientation; (iii) the symbols have a certain peculiarity so that other objects in the scene will not be mistaken for encoding symbols.
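A minimal sketch of the end-point/vertex counting idea on a skeletonized binary tactile image (Python; 4-connectivity is assumed here, and the skeletonization step is assumed to be done beforehand):

```python
import numpy as np

def endpoints_and_vertices(skel):
    """Count end-points (pixels with exactly 1 neighbour) and vertex
    points (pixels with 3 or more neighbours) on a binary skeleton,
    using 4-connectivity."""
    p = np.pad(np.asarray(skel, dtype=int), 1)
    ends = vertices = 0
    rows, cols = p.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            if not p[i, j]:
                continue
            n = p[i-1, j] + p[i+1, j] + p[i, j-1] + p[i, j+1]
            if n == 1:
                ends += 1
            elif n >= 3:
                vertices += 1
    return ends, vertices

# A T-shaped skeleton has 3 end-points and 1 vertex (junction) point;
# the "0" and "1" symbols are told apart the same way by their counts.
t_shape = [[1, 1, 1],
           [0, 1, 0],
           [0, 1, 0]]
print(endpoints_and_vertices(t_shape))   # -> (3, 1)
```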

[Figure: the shapes of the four code symbols "0", "1", "A", and "A^2" for a PRA over GF(4), embossed on the object's surface.]

[Figure: the 15-by-17 PRA defined over GF(4), embossed using the four code symbols; marked dimensions 1", 1¼", and ½".]

[Figure: two simple 3D objects with vertices labeled C1 … C8 and P1 … P8.]

The vertex-labeled models of two simple 3D objects.

[Figure: the object faces unfolded onto the encoding array, identified by their corner labels C1 … C8 and P1 … P8.]

Mapping the embossed PRA onto the surfaces of the two 3D objects.

Tactile images of the four GF(4) encoding symbols

Composite tactile image of four symbols on an encoded object surface


The four tactilely recovered symbols are recognized (using a neural network) and, using the PRA window property, their locations are unequivocally identified on the faces of the 3D objects.

Current Research Objectives

Object recognition by integrating haptic and visual sensory data into a composite haptic-and-vision model of 3D object shape, surface and material properties (e.g. texture, elasticity, heat-transfer characteristics), and contact forces and slippage.

Development of a haptic-and-vision model-based coding for distributed interactive virtual reality applications. Using a unique composite haptic-and-vision model of the manipulated objects could be more efficient than using separate vision and haptic data streams. Such an integral approach makes sense because the role of haptics is to enhance vision: both haptic perception and vision measure the same type of geometric parameters of the manipulated 3D objects.

Study of the human-computer interaction aspects of haptic and visual perception for interactive distributed virtual environments. The aim is to develop a more efficient haptic feedback that enhances the visual feedback to human operators during virtual object manipulation.