Microsoft PowerPoint - VRcourse_Haptics.ppt [Compatibility Mode]

TRANSCRIPT

1. 1 Haptic Interfaces. Daniel Thalmann, EPFL VRlab. What is a haptic interface? It is a force-reflecting device which allows a user to touch, feel, manipulate, create, and/or alter simulated 3D objects in a virtual environment. It could be used to train physical skills for jobs requiring specialized hand-held tools (e.g. surgeons, astronauts, mechanics), to provide haptic-feedback modeling of three-dimensional objects without a physical medium (such as automobile body designers working with clay models), or to mock up developmental prototypes directly from CAD databases (rather than in a machine shop).

Definition of Haptic. Gibson (1966): a haptic system is defined as "the sensibility of the individual to the world adjacent to his body by use of his body". The haptic perceptual system is unusual in that it can include sensory receptors from the whole body and is closely linked to the movement of the body, so it can have a direct effect on the world being perceived. [Gibson, R., 1966, "Electrical Stimulation of Pain and Touch", International Symposium on Skin Senses, D. Kenshalo, ed., Springfield, IL]

Devices which track movement, such as head-trackers, eye-trackers, and magnetic or optical motion trackers, but do not provide force-reflecting feedback to the user, are not haptic interfaces. To gain a deeper understanding of human haptics, multidisciplinary investigations involving the biomechanics of touch, tactile neurophysiology, psychophysics, visual-haptic interactions, motor control, and computational models are employed.

2. 2 Human Haptics. Two complementary channels: Tactile, strictly responsible for the variation of the cutaneous stimuli, presents the spatial distribution of forces; Kinesthetic (proprioception) refers to the human perception of one's own body position and motion, and presents only the net force information.

Tactile Display. Skin sensation is essential for many manipulation and exploration tasks, for example medical palpation. Tactile display devices stimulate the skin to generate these sensations of contact. The skin responds to several distributed physical quantities: 1. high-frequency vibrations (surface texture, slip, impact, and puncture); 2. small-scale shape or pressure distribution; 3. thermal properties.

NIST Pins Down Imaging System for the Blind (2002). Computer scientists and engineers at NIST have created a tactile graphic display that brings electronic images to the blind and visually impaired in the same way that Braille makes words readable.

High-frequency vibrations: e.g. CyberTouch of Immersion, based on the CyberGlove. The CyberGlove is a fully instrumented glove that provides up to 22 high-accuracy joint-angle measurements. It uses proprietary resistive bend-sensing technology to accurately transform hand and finger motions into real-time digital joint-angle data.

3. 3 CyberTouch of Immersion. CyberTouch is a tactile feedback option for Immersion's CyberGlove. It features small vibrotactile stimulators on each finger and on the palm of the CyberGlove. Each stimulator can be individually programmed to vary the strength of the touch sensation. The array of stimulators can generate simple sensations such as pulses or sustained vibration, and they can be used in combination to produce complex tactile feedback patterns (see the sketch below).

Small-scale shape or pressure distribution: commonly modeled by an array of closely spaced pins that can be individually raised and lowered. Design challenge: packing dozens of fast actuators into a few cubic centimeters. E.g. the Refreshable Tactile Display, a computer monitor for the blind: http://www.webdesk.com/refreshable-tactile-display/

Other Tactile Displays. Thermal properties: we infer material composition and temperature difference; thermal display devices are usually based on Peltier thermoelectric coolers. Many other tactile display modalities exist: electrorheological devices for conveying compliance, electrocutaneous stimulators, ultrasonic friction displays, and rotating disks for creating slip sensations.

Haptic Rendering of Surfaces.

4. 4 The models of the probe: a point, a 3D object, a line segment. State of the art: it is difficult to simulate proprioception on the entire body (fingertip, wrist, arm + 2 fingers, foot).

Hardware Issues. A haptic device is worn by a human; it attaches to the arm and/or hand and measures the operator's movements so they can be mimicked by a remote manipulator. A force-reflecting device can additionally make the operator feel the forces measured by the remote manipulator. Hardware designs: finger-based systems, hand-based systems, exoskeletal systems, inherently passive devices. http://haptic.mech.northwestern.edu/intro/gallery/index.html/

Finger-Based Systems. PHANToM (TM), the Personal Haptic Interface Mechanism, made by SensAble Devices, Inc. and developed at the MIT Artificial Intelligence Laboratory. Users insert their index finger into a thimble. The PHANToM tracks the motion of the user's fingertip and can actively exert an external force on the finger, creating compelling illusions of interaction with solid physical objects. http://www.sensable.com/products-haptic-devices.htm
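The per-stimulator pulse patterns mentioned for the CyberTouch can be illustrated with a short sketch. This is a minimal illustration, not the Immersion CyberTouch API: the stimulator count and layout, the raised-cosine pulse shape, and the traveling pulse are all assumptions, and a real application would pass the resulting amplitudes to the glove driver.

```cpp
// Minimal sketch (not the Immersion API): computing the amplitude envelope of
// a short vibrotactile pulse that travels across six stimulators
// (five fingers + palm, an assumed layout). Amplitudes are normalized to [0,1].
#include <array>
#include <cmath>
#include <cstdio>

constexpr int kStimulators = 6;
const double kPi = 3.14159265358979;

// Raised-cosine pulse: smooth rise and fall over `duration` seconds.
double pulseAmplitude(double t, double start, double duration) {
    if (t < start || t > start + duration) return 0.0;
    double phase = (t - start) / duration;            // 0..1 inside the pulse
    return 0.5 * (1.0 - std::cos(2.0 * kPi * phase)); // 0 -> 1 -> 0
}

int main() {
    const double dt = 0.01;                           // 100 Hz pattern update
    for (double t = 0.0; t < 0.3; t += dt) {
        std::array<double, kStimulators> amp{};
        // Example pattern: each stimulator fires 30 ms after the previous one.
        for (int i = 0; i < kStimulators; ++i)
            amp[i] = pulseAmplitude(t, 0.03 * i, 0.1);
        std::printf("t=%.2f  amps:", t);
        for (double a : amp) std::printf(" %.2f", a);
        std::printf("\n");
        // A real application would now send `amp` to the glove driver.
    }
    return 0;
}
```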
5. 5 Finger-Based Systems (2). The Pen-Based Force Display, developed at the University of Washington, is a direct-drive, parallel, actuation-redundant, three-degree-of-freedom haptic device, designed to provide the force-feedback information generated by either a master-slave system or a virtual simulation. The operator interacts with it using either the fingertip or a freely held pen-like tool.

Hand-based systems. Examples of joysticks: TopShot of Immersion Corp., Microsoft's SideWinder Force Feedback Pro joystick.

Hand-Based Systems (2). Examples of steering wheels: TopShot of Immersion.

Hand-based haptic devices: Novint Falcon (2007), a low-cost hand-based haptic device for video games. The device, which is designed to retail for under $100 in mass-market quantities, performs comparably to commercial devices that cost thousands of dollars, making the technology practical for consumer applications for the first time. http://home.novint.com/ http://www.youtube.com/watch?v=f3r3od7nah4

6. 6 CyberGloveTM and derived products. The CyberGlove provides a real-time evaluation of the hand posture; 22 angles are measured (each phalange, the abductions, and the wrist).

CyberTrackTM. The CyberTrack is a device embedded in the CyberForce exoskeleton. It provides very accurate tracking (orientation and position) of the user's wrist. Refresh rate: near 1000 Hz; sensitivity: less than 0.1 mm.

CyberGraspTM. A hand exoskeleton that provides a 0.5 degree-of-freedom force feedback on each finger: strings can pull a finger. Useful for grasping simulation: the hand cannot be closed when an object is grasped.

CyberGraspTM of Immersion. The CyberGrasp is a lightweight, force-reflecting exoskeleton that fits over a CyberGlove and adds resistive force feedback to each finger. With the CyberGrasp force-feedback system, users are able to feel the size and shape of computer-generated 3D objects in a simulated virtual world.

7. 7 Grasp forces are produced by a network of tendons routed to the fingertips via the exoskeleton. There are five actuators, one for each finger, which can be individually programmed to prevent the user's fingers from penetrating or crushing a virtual solid object (see the sketch below).

CyberGraspTM of Immersion (2). The device exerts grasp forces that are roughly perpendicular to the fingertips throughout the range of motion, and the forces can be specified individually.

CyberForce. CyberForce provides 3D force feedback on the wrist. It is useful for giving the impression of contact with the virtual environment. Max. force: 60 N.

CyberForceTM of Immersion. CyberForce is a force-feedback armature that not only conveys realistic grounded forces to the hand and arm but also provides six-degree-of-freedom positional tracking that accurately measures translation and rotation of the hand in three dimensions.

Complete Haptics Workstation. The Haptics Workstation includes right-hand and left-hand CyberForce whole-hand force-feedback systems mounted behind the user on vertically adjustable columns; it permits both seated and standing configurations. The CyberForce systems apply ground-referenced forces to each of the fingers and hands.
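One common way to realize the "prevent penetration" behaviour described for the CyberGrasp actuators is a penalty (spring) force on the penetration depth, clamped to the actuator's maximum pull. The sketch below is only illustrative and is not Immersion's control code: the sphere object, stiffness value, and force limit are assumptions.

```cpp
// Illustrative sketch: penalty-based resistive force for one fingertip.
// When the fingertip penetrates a virtual sphere, push back along the outward
// surface normal with a force proportional to the penetration depth, clamped
// to the actuator's maximum pull.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(Vec3 v, double s) { return {v.x * s, v.y * s, v.z * s}; }
double length(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// Assumed virtual object: a sphere of radius `radius` centred at `center`.
Vec3 graspForce(Vec3 fingertip, Vec3 center, double radius,
                double stiffness /* N/m, assumed */, double maxForce /* N */) {
    Vec3 d = fingertip - center;
    double dist = length(d);
    if (dist >= radius || dist == 0.0) return {0, 0, 0};  // no contact
    double penetration = radius - dist;                   // metres inside
    double magnitude = std::min(stiffness * penetration, maxForce);
    Vec3 normal = d * (1.0 / dist);                       // outward normal
    return normal * magnitude;                            // resistive force
}

int main() {
    // Fingertip 5 mm inside a 5 cm sphere, 800 N/m stiffness, 12 N max pull.
    Vec3 f = graspForce({0.045, 0, 0}, {0, 0, 0}, 0.05, 800.0, 12.0);
    std::printf("force = (%.2f, %.2f, %.2f) N\n", f.x, f.y, f.z);
    return 0;
}
```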
8. 8 EXOS Dextrous Hand Master. Originally developed as a master controller for the Utah/MIT Dexterous Hand robot, it is an exoskeleton-like device worn on the fingers and hand. Using Hall-effect sensors as potentiometers at the joints, it accurately measures the bending of the three joints of each finger as well as the abduction of the fingers and the complex motion of the thumb.

Safire. This is a force-reflecting master for the fingers and wrist. It has a total of 2, 5, 8, or 11 DOF; the 11-DOF version weighs 3.4 kg. The torque on the fingers can be 35-70 N.cm.

Exoskeletons of PERCRO (Pisa) and Iowa State University.

MASTER ARM (Southern Methodist University). Force feedback is realized by actuating the joints of the master manipulator with pneumatic cylinders. Each double-acting cylinder is connected to a pneumatic proportional valve. Pressure transducers are used as sensors in the control loop that is responsible for applying the desired forces at the actuators. The pressure signals are amplified and communicated to A/D converters on a Mac that houses the force actuation controller.

9. 9 Other Haptic Devices. Adaptive Impedance Control for Haptic Interfaces (Georgia Institute of Technology): adjusts the resistance of a robot to variations in its environment dynamics and extends its capabilities towards unstructured tasks. The high-bandwidth force display of the University of Washington.

Magnetic Levitation Haptic Interface (CMU), based on a recently developed magnetic levitation technology. The magnetic levitation technology uses Lorentz forces to stably levitate and control a rigid body in six degrees of freedom. http://www.msl.ri.cmu.edu/projects/haptic/

HapticMASTER: an admittance-controlled haptic device, FCS (2005). The HapticMASTER is a 3-degrees-of-freedom, force-controlled haptic interface. Racing simulation, FCS 2004 (now Cruden).

10. 10 SOFTWARE. Introduction: because of this diversity of devices, it is difficult to write generic software or an API that handles them all. By analogy with visual rendering, it is difficult to create an OpenGL- or Direct3D-like API; unlike haptic devices, 3D graphics cards all follow the same principles (same architecture).

Existing Related Work, Existing Software. GHOST SDK / 3D Touch SDK (SensAble): eases the programming of VR applications with haptic feedback and eases the physics-of-touch calculation; however, it works only with Phantom devices, and there is no graphical rendering.

Existing Related Work, Existing Software. OpenHaptics SDK: looks like the OpenGL API (easy for graphics programmers), but it works only with SensAble devices (a strange notion of "open").

11. 11 Existing Related Work, Existing Software. Force Dimension API: a Swiss product; works with Force Dimension products; a low-level API.

Existing Related Work, Existing Software. Virtual Hand Library: a hand-grasping mechanism is implemented and it is linkable with Catia, but it works only with Immersion products and the scene graphs are static.

Existing Related Work, Existing Software. ReachIn API: a complete library (visualization and force feedback) that also includes sound generation; supports Force Dimension, SensAble products, and others; drawback: price.

Existing Related Work, Existing Software. Chai3D API: a complete library (visualization and force feedback); supports different commercially available hardware (Force Dimension, SensAble, FCS); allows virtualization of devices; really open source.
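To make the point-to-object rendering that these APIs provide more concrete, the sketch below shows the shape of the roughly 1 kHz servo loop such libraries run internally: read the probe position, evaluate a contact force, send it to the device. It is not the API of GHOST, OpenHaptics, or Chai3D; the simulated device motion, the single-plane scene, and the penalty force law are assumptions.

```cpp
// Generic sketch of a ~1 kHz point-to-object haptic servo loop.
// The device is simulated here, and the "scene" is a single horizontal plane
// at y = 0 rendered with a penalty (spring) force.
#include <chrono>
#include <cmath>
#include <cstdio>
#include <thread>

struct Vec3 { double x, y, z; };

// Simulated probe motion: bobs 2 cm up and down through the plane at 1 Hz.
Vec3 readDevicePosition(double t) {
    return {0.0, 0.02 * std::sin(2.0 * 3.14159265 * t), 0.0};
}

void sendDeviceForce(const Vec3& f) { (void)f; /* stand-in for the driver call */ }

// Penalty contact with the plane y = 0: push the point back to the surface.
Vec3 planeContactForce(const Vec3& p, double stiffness) {
    if (p.y >= 0.0) return {0, 0, 0};          // above the surface: free space
    return {0.0, -stiffness * p.y, 0.0};
}

int main() {
    using namespace std::chrono;
    const auto period = milliseconds(1);       // ~1 kHz haptic update rate
    const double k = 1000.0;                   // N/m, assumed surface stiffness
    for (int i = 0; i < 1000; ++i) {           // run one second of the loop
        auto t0 = steady_clock::now();
        Vec3 pos = readDevicePosition(i / 1000.0);
        Vec3 force = planeContactForce(pos, k);
        sendDeviceForce(force);
        if (i % 250 == 0)
            std::printf("y=%.3f m  force_y=%.2f N\n", pos.y, force.y);
        std::this_thread::sleep_until(t0 + period);
    }
    return 0;
}
```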
12. 12 Existing Related Work, Existing Software. These APIs have a main problem in our case: they are mainly suitable for point-to-object collision detection and force-feedback response (except VirtualHand). We need at least deformable-object (the hand) to object collision handling.

Software techniques for two-handed interaction. Two-handed interaction seems more natural, but simulating it is difficult: lack of devices, lack of software. To perform nearly realistic interaction, we need a system for acquiring the posture and position of both hands, a system for applying force feedback on the hands and on the arms, and a workspace that is large enough.

Main Requirements. An API that connects to the Haptic Workstation; a library that handles collisions between virtual objects and the two hands (deformable objects); integration with a visualization rendering system; it should be simple enough to create VR applications quickly.

3. Realistic Haptic Interaction. To enable the interaction with, and the feeling of, various virtual environments: access to the hardware, calibration of the devices, user comfort improvement, contact detection, virtual object animation, two-handed force-feedback computation, and simplification of the integration with existing VEs.

13. 13 Access to the Hardware (1). We need to get input data (dataglove angle values, position/rotation of the wrist, string positions) and to set output data (forces on the fingers, force on the wrist). This should be performed as fast as possible.

Access to the Hardware (2). [Figure: timing comparison, 1337 s vs. 2674 s.]

Calibration of the devices (1). Objectives: to obtain visual fidelity and to allow a fast reconfiguration. CyberGloves: the linear bend sensors deliver raw values which need to be converted into degrees; a 2-point interpolation is used for each sensor (see the sketch below), and 4 postures calibrate the phalanges and the abductions between the fingers.

Calibration of the devices (2). Visual fidelity is achieved; limitation: the thumb.

14. 14 Calibration of the devices (3). CyberTrack, two problems: the conversion from the raw optical sensors to Cartesian space (already done), and the fact that the two hands are not in the same space; methods to put them in the same coordinate system are provided in the API. Fortunately, only the position needs to be calibrated after each startup (rotation and scaling remain valid after shutdown).

Calibration of the devices (4). Validation using a secondary tracking system.

Calibration of the devices (5). CyberGrasp and CyberForce: the values need to be converted into Newton units; we use a spring weighing scale to find the conversion.

User Comfort Improvement (1). Not mandatory, but it increases immersion. We use a biofeedback device to measure the muscular activity of the arm; this provides an indication of fatigue.

15. 15 User Comfort Improvement (2). Measurement of the exoskeleton's influence.

User Comfort Improvement (3). Zero-G and One-G comfort improvement: the exoskeleton is heavy and long sessions are exhausting, and the weight felt from the exoskeleton is a function of the position of the wrist. We use the CyberForce itself to counter the exoskeleton weight, with an algorithm that calculates the vertical force that should be applied by the CyberForce according to its position.

User Comfort Improvement (4). [Figure: EMG amplitude (mV) over time.]

User Comfort Improvement (5). [Figure: EMG amplitude (mV) over time.]
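The 2-point interpolation mentioned above amounts to fitting a line per bend sensor from two known calibration postures. The sketch below shows that mapping; the raw readings and posture angles are invented for illustration, and this is not Immersion's calibration code.

```cpp
// Sketch of a 2-point-per-sensor calibration: each bend sensor is assumed
// linear, so two known postures (e.g. flat hand = 0 deg, closed fist = 90 deg
// for a finger joint) give the gain and offset that map raw values to degrees.
#include <cstdio>

struct SensorCalibration {
    double gain;    // degrees per raw unit
    double offset;  // degrees at raw value 0
};

// Fit the line angle = gain * raw + offset through the two calibration points.
SensorCalibration calibrate(double raw1, double angle1,
                            double raw2, double angle2) {
    SensorCalibration c;
    c.gain = (angle2 - angle1) / (raw2 - raw1);
    c.offset = angle1 - c.gain * raw1;
    return c;
}

double toDegrees(const SensorCalibration& c, double raw) {
    return c.gain * raw + c.offset;
}

int main() {
    // Hypothetical raw readings for one proximal-joint sensor:
    // 42 when the hand is flat (0 deg), 188 when the fist is closed (90 deg).
    SensorCalibration c = calibrate(42.0, 0.0, 188.0, 90.0);
    std::printf("raw 120 -> %.1f degrees\n", toDegrees(c, 120.0));
    return 0;
}
```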
16. 16 Contact Detection (1). Two problems: the intersection test between two objects (difficult with triangle meshes, O(nm); easier with basic primitives such as spheres and boxes) and the intersection between many objects (bounding volumes, spatial partitioning, time coherence).

Virtual objects animation (1). A dynamic rigid object has a mass, a center of gravity, and an inertia tensor, and follows the Newton and Euler laws of motion. In our context, we chose the Ageia (NVIDIA) PhysX library to perform these tasks (collision detection and animation).

Two-handed force-feedback computation (1). Two approaches: direct mapping and the proxy-based technique.

Two-handed force-feedback computation (2). Direct mapping technique: consists in reproducing a collision hand that is similar to the calibrated visual hand.

17. 17 Two-handed force-feedback computation (3). Proxy-based technique: the real hand drives an intermediate hand (wireframe); this intermediate hand applies soft constraints on a mass-spring hand model; the position of the mass-spring hand is what is displayed to the user (see the sketch below).

Software framework (1). We want to produce reusable code, to ease the programming of haptic applications, and to produce stable and realistic interaction. Optimizations need to be performed: multithreading of tasks, synchronization of components, and embedding a visual rendering system.

Software framework (2). General organization.

Software framework (3). Class diagram of the library (main components).

18. 18 Software framework (4). Dynamics engine: ODE (Open Dynamics Engine) provides physical animation of objects and a joint mechanism between objects (mass-spring systems, elastic models, strings, etc.). Collision detection: OPCODE (well integrated with ODE), fast and accurate computation, provides important data (collision normal, penetration depth).

Software framework (5). Haptic scene: contains haptic nodes (scene-graph representation). Haptic nodes are of two kinds, dynamic and static. Dynamic data: inertia tensor, mass. Collision data (a list of geometries): primitives (cylinders, spheres, boxes) or meshes (soups of triangles).

Software framework (6).

Software framework (7). Synchronization of haptic input/output data: 950 Hz / 300 Hz.

19. 19 Software framework (8). Synchronization of the virtual objects: 300 Hz / 100 Hz.

Software framework (9). Interaction with a generic VE. Problem: an arbitrary virtual environment does not include proper haptic data. Triangle meshes are too complicated to perform collisions with; mass and center-of-gravity information is not easily computable from the triangle meshes; it is not possible to extrapolate haptic texture.

5. Interaction with generic VE. Objectives: ease the addition of this important data to the visual environments, simplify the triangle meshes, and minimize the user's intervention by automating the addition of haptic data as much as possible. Solution: the Haptic Scene Creator.

20. 20 Visualization engine: MVISIO. Developed in VRlab by Achille Peternier; an easy-to-use 3D engine intended for students (learning 3D graphics); a multi-device 3D engine that works with PDAs, HMDs, and CAVEs. MHAPTIC is built on top of MVISIO; MHAPTIC provides user interfaces to connect the hardware, calibrate, debug, etc.

APPLICATIONS. Applications: Virtual Juggler, made by a student last year.
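A common way to implement the soft constraint between the tracked hand and the simulated hand is a spring-damper "virtual coupling", whose reaction is also the force displayed to the user. The sketch below reduces this to a single point in one dimension purely for illustration; the gains, mass, and wall obstacle are assumptions, and the slides' actual model is a full mass-spring hand.

```cpp
// Sketch of a spring-damper "virtual coupling" for a proxy-based scheme:
// the tracked device position pulls a simulated proxy through a soft
// constraint, and the reaction of that coupling is fed back to the user.
#include <cstdio>

struct ProxyState {
    double x = 0.0;   // proxy position (m)
    double v = 0.0;   // proxy velocity (m/s)
};

// One simulation step: the proxy is pulled toward the device position by a
// spring-damper and blocked by a wall at x = 0.05 m (a stand-in obstacle).
double stepProxy(ProxyState& p, double deviceX, double dt,
                 double k = 300.0 /* N/m */, double b = 5.0 /* N·s/m */,
                 double mass = 0.1 /* kg */) {
    double couplingForce = k * (deviceX - p.x) - b * p.v;  // pulls proxy to hand
    p.v += (couplingForce / mass) * dt;                    // semi-implicit Euler
    p.x += p.v * dt;
    if (p.x > 0.05) { p.x = 0.05; p.v = 0.0; }             // proxy stays outside wall
    return -couplingForce;   // reaction force to display on the haptic device
}

int main() {
    ProxyState proxy;
    // The user's hand moves steadily into the wall; the proxy stops at the
    // surface, so the coupling stretches and the feedback force grows.
    for (int i = 0; i <= 200; ++i) {
        double deviceX = 0.0005 * i;                       // 0 -> 0.10 m
        double feedback = stepProxy(proxy, deviceX, 0.001);
        if (i % 50 == 0)
            std::printf("device=%.3f m  proxy=%.3f m  force=%.2f N\n",
                        deviceX, proxy.x, feedback);
    }
    return 0;
}
```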
21. 21 Applications. Volvo truck gearbox simulation, made by Patrick Salamin.

HAPTIC Feedback in Truck Design. Force calculation under many constraints: avoid the step effect, the force must be smooth, and it must change quickly. Based on the PID algorithm (see the sketch at the end of this section). List of the forces: static forces (double-H, 2 supports, special), dynamic forces (gear changing), pseudo-grasping.

22. 22 System architecture. Used peripherals and libraries; CAD-to-VR models; modular architecture. EMG analysis: muscular comfort, graphs obtained with the help of an EMG.

Applications. Metaball modeling: a CAD application.

Applications. Assembly training under mixed-reality conditions.

23. 23 The "Rutgers Ankle" rehabilitation interface. The foot is attached to a multi-directional platform. The movements of the patient (the different forces applied) are detected by the platform and the data are transmitted to the system. The system displays a virtual environment (VE) and the user interacts with it. There is also force feedback: resistive forces depending on the rehabilitation exercise performed by the patient. The Rutgers Ankle interface is integrated with the PC workstation (NSF funded). When the system is connected to a network, telerehabilitation becomes possible, with a therapist monitoring the session and sending instructions to the devices.

The "Rutgers Ankle" rehabilitation interface (2). The movements of the patient are registered by the PC, and the parameters of the exercise are monitored and changed using these data. It is also possible to find libraries of exercises on the Internet.

24. 24 The "Rutgers Ankle" rehabilitation interface (3). The haptic device: the interface is made up of 4 pneumatic cylinders. Each cylinder is accompanied by a potentiometer used to collect its current position. Everything is designed so that the user has a good range of movement without risk. Between the foot and the pistons there is a platform containing a 3D force sensor used to measure the actual forces and torques received by this platform. VR ankle rehabilitation exercise (Burdea et al.).
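The slides only state that the gearbox forces are "based on the PID algorithm", so the sketch below is a plain textbook PID force loop rather than the Volvo demonstrator's code; the gains, target force, and first-order actuator model are assumptions chosen for illustration.

```cpp
// Textbook PID sketch in the spirit of the gearbox force computation: drive
// the rendered force smoothly (no step effect) toward a target while still
// changing quickly.
#include <cstdio>

struct PID {
    double kp, ki, kd;      // proportional, integral, derivative gains
    double integral;
    double prevError;

    double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - prevError) / dt;  // note: small derivative
        prevError = error;                             // kick at the first step
        return kp * error + ki * integral + kd * derivative;
    }
};

int main() {
    PID pid{5.0, 30.0, 0.002, 0.0, 0.0};  // assumed gains
    const double target = 10.0;           // desired resistance on the lever (N)
    const double dt = 0.001;              // 1 kHz haptic update
    const double tau = 0.05;              // assumed actuator lag (s)
    double force = 0.0;                   // force currently rendered

    for (int i = 1; i <= 2000; ++i) {     // simulate two seconds
        double command = pid.update(target, force, dt);
        force += dt * (command - force) / tau;  // first-order actuator response
        if (i % 500 == 0)
            std::printf("t=%.1f s  force=%.2f N\n", i * dt, force);
    }
    return 0;
}
```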