
Page 1:

Gesture-Based Interactive Beam-Bending

By Justin Gigliotti

Mentored by Professor Tarek El Dokor and Dr. David Lanning

Arizona/NASA Space Grant Symposium, University of Arizona

April 19, 2008

Page 2:

Introduction to gesture-based interactive beam-bending

• A teaching tool that allows students to interact with a three-dimensional virtual I-beam
  – User input is captured with a visual sensor
  – Markerless tracking and associated gesture recognition is performed on the captured frames
  – The tracked point is interpreted as a force applied to the beam
  – The I-beam deforms according to the applied force
  – The I-beam stresses are color coded and displayed according to the applied force
  – The 3D model is rendered in real time

Page 3:

Advances made to the project

• Added a third degree of freedom to the gesture-based interaction
  – The third degree of freedom allows the user to apply tension and compression forces to the beam
  – Application of these forces requires the user to move the hand toward or away from the visual sensor
  – Specialized machine vision algorithms are utilized to analyze the gesture

• The user interface is updated to reflect the extra degree of freedom
  – The tension and compression forces respectively increase or decrease the length of the beam
  – The corresponding axial load and axial displacement are displayed to the user in the statistics window

Page 4:

Gesture-based interaction

• A single point of motion is used to determine the direction of beam deformation
  – The point serves as a reference for beam bending
  – The point is treated as if the user reached out and grabbed the end of the virtual beam

• The user has three degrees of freedom
  – The hand tracking algorithm tracks the depth of the user's hand
  – This allows movement forward/backward in addition to up/down and left/right
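The slides do not give the actual transfer function from tracked hand position to force, so the sketch below is a hypothetical linear mapping: offsets of the hand from an assumed neutral position (pixel coordinates plus a normalized depth) are scaled into the three force components. The `neutral` position and `gain` values are illustrative placeholders, and the sign convention for the axial (depth) axis is assumed.

```python
def hand_point_to_force(px, py, depth, neutral=(160.0, 120.0, 0.5), gain=500.0):
    """Map a tracked hand point (pixel x, pixel y, normalized depth)
    to three force components. `neutral` is the assumed rest position
    of the hand; `gain` converts offset to force. Both are illustrative."""
    nx, ny, nd = neutral
    f_horizontal = gain * (px - nx)   # left/right motion -> horizontal load
    f_vertical = gain * (ny - py)     # up/down motion (image y grows downward)
    f_axial = gain * (nd - depth)     # toward/away from sensor -> tension/compression
    return f_vertical, f_horizontal, f_axial
```

Holding the hand at the neutral position applies no force; moving it in any of the three axes scales linearly into the corresponding load.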

Page 5:

Beam bending

• Once the gesture is tracked, an applied force is simulated based on the gesture

• Beam bending algorithm
  – The simulated applied force is used to calculate a beam deformation model for the X, Y and Z directions
  – The individual stresses across the entire beam are calculated
  – The beam is rendered in virtual reality and color coded according to the stresses distributed across the beam
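The slides do not specify the exact color scheme used for the stress coding, so the following is one plausible sketch: a linear ramp that renders maximum compression as blue, zero stress as green, and maximum tension as red, clamped at a chosen maximum stress.

```python
def stress_to_rgb(stress, max_stress):
    """Map a signed point stress to an RGB triple: blue at maximum
    compression, green near zero, red at maximum tension.
    A simple linear ramp; the actual scheme in the project may differ."""
    t = max(-1.0, min(1.0, stress / max_stress))   # clamp to [-1, 1]
    if t >= 0:                                     # tension: green -> red
        return (int(255 * t), int(255 * (1 - t)), 0)
    return (0, int(255 * (1 + t)), int(-255 * t))  # compression: green -> blue
```

Each rendered point of the beam would be colored by passing its computed stress through a mapping like this one.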

Page 6:

Deformation model

• Y-direction deflection:

    δ_y = F_y · D² · (3L − D) / (6 · E · I_y)

• Corresponding change in length:

    δ_x = P · L / (E · A)

• Z-direction deflection:

    δ_z = F_z · D³ / (3 · E · I_z)

• Point stress:

    PS = (F_y · D · y) / I_y + (F_z · D · z) / I_z + P / A

where:

    F_y = y-direction force on beam
    F_z = z-direction force on beam
    P   = compressive/tensile force
    A   = cross-sectional area of beam
    D   = distance of force from beam base
    L   = length of the beam
    E   = modulus of elasticity
    I_y = y-direction moment of inertia
    I_z = z-direction moment of inertia
    y   = y-coordinate of point
    z   = z-coordinate of point
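The slide's deflection and point-stress formulas (standard small-deflection cantilever results plus superposed axial stress) translate directly into code. A minimal Python sketch, with variable names following the slide's symbol list (the original implementation is not shown in the slides):

```python
def cantilever_response(Fy, Fz, P, D, L, E, A, Iy, Iz, y, z):
    """Deflections and point stress for a cantilever I-beam with
    transverse loads Fy, Fz applied at distance D from the base
    and an axial load P, evaluated at cross-section point (y, z)."""
    dy = Fy * D**2 * (3 * L - D) / (6 * E * Iy)  # y-direction deflection
    dz = Fz * D**3 / (3 * E * Iz)                # z-direction deflection
    dx = P * L / (E * A)                         # axial change in length
    # superposed bending stresses plus uniform axial stress
    ps = Fy * D * y / Iy + Fz * D * z / Iz + P / A
    return dx, dy, dz, ps
```

All four outputs are what the statistics window and the color-coded rendering consume: the three displacements plus the stress at a given point of the beam.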

Page 7:

Real-time rendering in virtual reality

• Using the calculated deformation model, the beam is mapped into 3D space
• The 3D space allows the beam to be rendered point by point; therefore, each point is colored according to its stress value
• Vertical load and displacement, horizontal load and displacement, axial load and displacement, and the beam's modulus of elasticity are all updated based on the physical parameters of the beam and the force being applied
• Real-time update
  – Tracking, applied force calculations, deflection calculations and stress color coding are all performed numerous times every second
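The per-frame cycle described on this slide can be sketched as a simple paced loop. The four callables below are stand-ins for the project's actual vision and rendering components, which are not shown in the slides; the frame count and target rate are illustrative.

```python
import time

def run_frame_loop(capture, track, simulate, render, n_frames, target_fps=30):
    """Per-frame update sketch: capture a frame, track the hand,
    simulate the deflections and stresses, render the beam, then
    sleep to hold the target frame rate."""
    period = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        frame = capture()      # grab a frame from the visual sensor
        point = track(frame)   # markerless hand tracking
        model = simulate(point)  # applied force -> deflections + stresses
        render(model)          # color-coded 3D beam + statistics window
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, period - elapsed))  # pace the loop
```

In the real system this loop would run until the user exits rather than for a fixed number of frames.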

Page 8:

Beam bending environment

Page 9:

Demonstration

Page 10:

Conclusion

• The beam bending project
  – User motion is tracked and a reference point is calculated
  – An applied force is simulated based on the point tracked by the machine vision algorithms
  – A deformation model, complete with statistics and stresses, is created
  – The beam is rendered in 3D and statistics are displayed

• Result: a low-cost teaching tool
• One conference paper and one journal paper were submitted on this work