Gesture Analysis
Student Name: Saif Samaan
Student Number: 42083850
Supervisor: Manolya Kavakli
11/11/11
◦ Abstract
◦ Introduction
◦ Literature Review
◦ Experiments and Analysis
◦ Experiments
◦ Survey and Results
◦ Conclusion
A computer contains three main parts: input interfaces, processing devices, and output interfaces.
Input and output interfaces are called Human-Computer Interfaces (HCI); these interfaces provide the communication environment between humans and computers.
This project aims to explore human hand-gesture classification from a designer's point of view, to support a discovery project sponsored by the Australian Research Council.
The discovery project examines a novel environment in which a designer can define the contour of a sketch by controlling a pointer using a pair of data gloves and can interact with the design product in 3D space.
I provide an analysis of current and previous hand-gesture recognition systems. I also experiment with the gesture design process with twenty volunteers, and I define a set of suggested gestures, techniques, and possible future challenges.
This section presents the background, a general description of the problem, and the aim of this project.
◦ Background
Human-computer interfaces (HCI) have developed through different stages using different tools: traditional tools such as keyboards for text input, mice, 2D and 3D graphical interfaces, and sophisticated 3D virtual environment (VE) systems.
◦ Problem Description
Most human-machine interfaces are based on joysticks, keyboards, and keypads, but few such interfaces use gestures. Interaction based on hand gestures commonly uses two types of interface: glove-based and vision-based.
◦ Vision-based interfaces require powerful image processing algorithms to segment the hand from a stationary background, and lighting conditions that enhance gesture classification accuracy.
◦ A glove-based system requires the user to be connected to the computer. Even the wireless systems currently offered on the market require the user to wear a glove. Furthermore, accurate devices are expensive and hard to calibrate.
◦ The Aim of this Work
The main aim of this report is to explore human-computer interaction and gesture recognition in order to support the design process.
1- Video Capture (Vision-Based Technology)
◦ Using one or more cameras to record images of human hand gestures:
Infrared-camera based systems
Mono-camera based gesture recognition systems
Multi-camera based gesture recognition systems
[Figure: infrared-camera, mono-camera, and multi-camera capture setups]
2- Image Segmentation
◦ Threshold-Based Segmentation
◦ Active Contour Models
◦ Split and Merge
◦ Boundary-Based Segmentation
3- Image Processing
The computer detects images produced from the original hand image, but the detected images usually come with a level of noise.
Image capture
// Code for capturing an image sequence
#include <opencv/highgui.h>   // OpenCV 1.x C API
#include "videoInput.h"       // videoInput capture library

int main()
{
    int width = 320;   // Define the image resolution
    int height = 240;
    IplImage *pRgb = cvCreateImage(cvSize(width, height), IPL_DEPTH_8U, 3);
    videoInput video;                     // Create capture object
    video.setupDevice(0, width, height);  // Configure device 0
    while (1)
    {
        if (video.isFrameNew(0))          // A new frame is available
        {
            video.getPixels(0, (unsigned char *)pRgb->imageData, false, true);  // Capture one frame
            cvShowImage("Video", pRgb);   // Show the display window
            char c = cvWaitKey(1);
            if (c == 27) break;           // Press ESC to exit
        }
    }
    cvReleaseImage(&pRgb);
    return 0;
}
Cyber-glove (http://www.immersion.com/)
Vision-Based and Sensors-Based
"Rise of the Planet of the Apes (2011)"
4- Gesture Classification
David McNeill specifies four types of gestures that are part of human communication: iconics, metaphorics, beats, and deictics.
Twenty volunteers helped me in my experiments.
J. Liu and M. Kavakli, "Temporal Relation between Speech and Co-verbal Iconic Gestures in Multimodal Interface Design"
◦ Differences and similarities in human gestures
◦ Single-line description
◦ 13/20 volunteers:
◦ Differences and similarities in human gestures
◦ Single-line description
◦ 1/20 volunteer:
◦ Differences and similarities in human gestures
◦ Single-line description
◦ 6/20 volunteers:
Two parallel lines description.
Twenty volunteers helped me in my experiments.
Curve description.
Circle description.
Start and end point.
First hand
Second hand
Critical region problem.
First hand
Second hand
Curve description:
The survey was left public for future study:
https://docs.google.com/spreadsheet/viewform?formkey=dFVOcklIaXV1b3c0d25BaHhpMVp5Q3c6MQ
5- Do you agree that gesture-recognition systems will replace the keyboard and mouse?
80% Not now
17% Disagree
3% Agree
9- If you had to choose only one input device to use with a gesture recognition system, which one would you select?
80% Touch screen
15% Mouse
5% Keyboard
The project outlines a hand gesture recognition system designed to recognize hand gestures in real time. This supports the examined environment in which a designer can sketch by controlling a pointer using a pair of cyber gloves and can interact with the design product in 3D space.