Journal of International Council on Electrical Engineering, Vol. 1, No. 2, pp. 151~155, 2011
A Study on Robot Motion Authoring System using
Finger-Robot Interaction
Kwang-Ho Seok *, Junho Ko ** and Yoon Sang Kim
Abstract – This paper proposes a robot motion authoring system using finger-robot interaction. The proposed robot motion authoring tool is a user-friendly method that easily authors (creates and controls) robot motion according to the number of fingers. Furthermore, the system can be used to simulate user-created robot contents in a 3D virtual environment. This allows the user not only to view the authoring process in real time but also to transmit the final authored contents. The effectiveness of the proposed motion authoring system was verified based on a motion authoring simulation of an industrial robot.
Keywords: Finger-robot interaction, Industrial robot, Robot motion authoring system
1. Introduction
With rapid developments in today's robot technology,
robots have been used in a wide range of fields. As an
increasing number of applications adopt intelligence,
robots require a new type of software environment that
allows users to author various commands. Moreover, robots
must become easier to use for general users in order to
expand the scope of applications of intelligent robots in
our everyday life. In response to this need, a number of
studies are being conducted on new types of user-friendly
and intuitive robot motion authoring that can render
various motions [1]-[4]. Also, there are
research projects actively taking place in human-robot
interaction, thanks to the progress that has been made in
vision technology. A study is being conducted to develop a
finger pad that can replace the computer mouse [5], and
there are other studies proposing algorithms to improve
hand gesture recognition [6]-[8]. However, user-intuitive
or user-friendly robot command authoring tools that focus
on serving general users are still rare. The lack of such
tools is likely to create a barrier to providing robot
services (commands/motions) that correspond with the
preferences and demands of general users, including
children, the elderly, and housewives, who are expected to
be the major clients in the service robot industry.
This paper proposes a robot motion authoring system
using finger-robot interaction. The proposed system is
capable of authoring (creating and controlling) robot
motion according to the number of fingers. It is an intuitive
robot motion authoring method that allows the user to
easily author robot motions. The effectiveness of the
proposed method is verified based on motion authoring
simulation of an actual industrial robot.
2. Proposed Robot Motion Authoring System using
Finger-Robot Interaction
With the exception of language, the hand is the body part
most frequently used for human communication among parts
such as the hands, eyes, mouth, arms, and legs. This
section explains how industrial robot motions can be
authored according to the number of fingers and verifies
the effectiveness of the proposed approach based on
simulation.
Fig. 1 illustrates the overall authoring process of the robot
motion authoring system implemented for this study.
Fig. 1. Robot motion authoring system.
Corresponding Author: School of Computer Science and
Engineering, Korea University of Technology and Education, Korea
* Dept. of Computer Science and Engineering, Korea University of
Technology and Education, Korea ([email protected])
** Dept. of Computer Science and Engineering, Korea University of Technology and Education, Korea ([email protected])
Received: January 17, 2011; Accepted: March 23, 2011
2.1 Finger recognition
There are a number of techniques for finger recognition,
and they are used in various fields. In this section, a
finger recognition unit is implemented using a color-value
scheme [8], [9]. The finger recognition unit first converts
RGB colors into gray scale and YCrCb for binary
representation. Then the region inside the hand is filled
by masking and noise is removed. The binary image is
examined in 25-pixel units, as shown in Fig. 2. If the sum
of 1s in a unit is greater than ten, every pixel in the
unit is set to 1. With the image obtained by the masking
performed by the finger recognition unit, the center of the
hand can be calculated to identify the hand's location as
well as the hand region furthest from the center. The
center of the hand is marked with a red dot, the palm
region with a red circle, and the hand region with a white
circle. If the number of fingers is 0 or 5, a circle image
is displayed. For 1, 2, and 3 fingers, the image is shown
in blue, green, and red, respectively. If there are 4
fingers, the hand is shown in white on a black background
(Fig. 2).
Fig. 2. Finger recognition process.
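The segmentation and 25-pixel (5×5) unit step described above can be sketched as follows. The BT.601 YCrCb conversion is standard, but the Cr/Cb skin thresholds and function names below are illustrative assumptions, not values taken from the paper; the paper also does not state what happens to a unit with ten or fewer 1s, so leaving such units at 0 is an assumption here.

```python
import numpy as np

def skin_mask(rgb):
    """Rough YCrCb skin segmentation of an RGB image (H, W, 3).
    The Cr/Cb threshold window is an illustrative assumption."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    cr = 128 + 0.5 * r - 0.4187 * g - 0.0813 * b   # BT.601 Cr
    cb = 128 - 0.1687 * r - 0.3313 * g + 0.5 * b   # BT.601 Cb
    return ((cr > 133) & (cr < 173) & (cb > 77) & (cb < 127)).astype(np.uint8)

def binarize_blocks(mask, block=5, thresh=10):
    """Scan the 0/1 mask in 25-pixel (5x5) units; if a unit holds more
    than `thresh` ones, set the whole unit to 1 (otherwise it stays 0)."""
    out = np.zeros_like(mask)
    h, w = mask.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            if mask[y:y + block, x:x + block].sum() > thresh:
                out[y:y + block, x:x + block] = 1
    return out
```

The block step acts as a coarse majority filter, which is one way the noise removal mentioned above could be realized.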
2.2 Robot Motion Authoring Tool
In this section, a motion authoring unit capable of
creating and controlling robot motions (for an industrial
robot in this paper) is implemented using the finger-robot
interaction based on the finger recognition unit explained
in the previous section. The motion authoring unit (editor)
consists of four panels: the finger interface panel, mode
panel, simulation panel, and data panel (Fig. 3). The finger
interface panel corresponds to the finger recognition unit
that recognizes the number of fingers (from 0 to 5) in the
finger images taken by a camera. The number of fingers
recognized by the finger interface panel allows the user to
control the industrial robots (such as SCARA and
articulated robots) displayed on the simulation panel. The
mode panel provides three modes, WIRE, SOLID, and
ZOOM/VIEW, together with a TRANSMISSION button
and a STOP button. The WIRE mode allows viewing of the
robot's internal structure, while the SOLID mode only
provides the robot's external view. In the ZOOM/VIEW
mode, the view can be zoomed in/out and rotated, allowing
the user to view the robot's entire structure in detail. The
simulation panel provides the user with real-time 3D
viewing of the robot motion being authored with finger-
robot interaction. The authored contents (motions) can be
transmitted to control the robot by clicking the
TRANSMISSION button, and the actual robot motion can
be stopped using the STOP button. The data panel provides
the virtual robot motion degrees and PWMs to the actual
industrial robot in real time. The data panel also provides
RESET, PAUSE, and RESTART buttons. If the RESET
button is clicked, the robot is initialized to the default
value. If the RESTART button is clicked, the robot is
restarted.
Fig. 3. Robot motion authoring tool.
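The data panel reports joint angles in degrees alongside the PWM values driven to the actual robot. The paper does not give its conversion formula; the sketch below assumes the common RC-servo convention of a 1000-2000 µs pulse width mapped linearly over the joint's motion range, so the endpoints and function name are illustrative assumptions.

```python
def angle_to_pwm_us(angle_deg, lo_deg, hi_deg, lo_us=1000.0, hi_us=2000.0):
    """Linearly map a joint angle within [lo_deg, hi_deg] to an RC-servo
    pulse width in microseconds. The 1000-2000 us endpoints are a common
    RC-servo convention, assumed here rather than taken from the paper."""
    angle_deg = max(lo_deg, min(hi_deg, angle_deg))  # clamp to the motion range
    t = (angle_deg - lo_deg) / (hi_deg - lo_deg)     # normalize to [0, 1]
    return lo_us + t * (hi_us - lo_us)
```

For example, a 2-Axis joint with a -60° to 60° range at its center angle would map to a 1500 µs pulse under these assumptions.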
2.3 Virtual Robot Motion authoring simulation
This section evaluates the effectiveness of the proposed
authoring method based on a robot motion authoring
simulation. The virtual industrial robot used for the robot
motion authoring simulation is an articulated robot with 6
DOF, and its motion ranges and definition are given in
Table 1 and Fig. 4.
Table 1. Motion Range
AXIS ANGLE (degree)
1-Axis 0 ~ -360
2-Axis -60 ~ 60
3-Axis -90 ~ 90
4-Axis 0 ~ -60
5-Axis 0 ~ -360
6-Axis 0 ~ 90
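The motion ranges of Table 1 can be enforced in software before any command reaches a joint. The sketch below simply encodes the table as per-axis (min, max) pairs and clamps commanded angles; the dictionary layout and function name are illustrative, not part of the authors' tool.

```python
# Motion ranges from Table 1, stored as (min_deg, max_deg) per axis.
MOTION_RANGE = {
    1: (-360, 0),
    2: (-60, 60),
    3: (-90, 90),
    4: (-60, 0),
    5: (-360, 0),
    6: (0, 90),
}

def clamp_axis(axis, angle_deg):
    """Clamp a commanded angle to the allowed range of the given axis."""
    lo, hi = MOTION_RANGE[axis]
    return max(lo, min(hi, angle_deg))
```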
Fig. 4 depicts the virtual and actual articulated robots
used for the motion authoring simulation. The motion
ranges and directions of the virtual articulated robot
closely match those of the actual robot.
Fig. 4. Definition of an articulated robot used for the
motion authoring simulation.
In the WIRE and SOLID modes, the internal and external
structures of the robot can be viewed, respectively. In
either mode, if the number of fingers recognized by the
finger interface is 0 or 4, the 1-Axis or 5-Axis rotates
around the Y axis. For 1, 2, and 3 fingers, the 2-Axis,
3-Axis, and 4-Axis rotate around the Z axis, respectively
(Fig. 5). The ZOOM/VIEW mode allows the user to zoom
in and out of the robot structure and vary the camera angle
for a detailed view. In this mode, the camera is rotated
clockwise and counter-clockwise for 3 and 4 fingers,
respectively, according to the definitions of Table 3, so
that the user can view the robot from a desired angle.
Accordingly, the user is able to not only create but also
control motions for a part of a robot based on the number
of fingers recognized.
Tables 2 and 3 list the definitions of finger-value events
for the WIRE, SOLID, and ZOOM/VIEW modes.
Table 2. Definitions of finger-value events for WIRE (A)
and SOLID (B) modes
FINGER | EVENT
0 | 1-Axis rotation (base Y)
1 | 2-Axis rotation (base Z)
2 | 3-Axis rotation (base Z)
3 | 4-Axis rotation (base Z)
4 | 5-Axis rotation (base Y)
5 | 6-Axis rotation (Gripper ON/OFF)
Table 3. Definitions of finger-value events for ZOOM/VIEW
(C) mode
FINGER | EVENT
0 | default
1 | ZOOM OUT
2 | ZOOM IN
3 | VIEW rotation (CW)
4 | VIEW rotation (CCW)
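The finger-value events of Tables 2 and 3 amount to a per-mode lookup from finger count to action. A minimal dispatch table might look like the following; the event names are illustrative strings describing the table entries, not identifiers from the authors' tool.

```python
# Finger count -> event, per mode (encoding Tables 2 and 3).
EVENTS = {
    "WIRE": {
        0: "1-Axis rotation (base Y)",
        1: "2-Axis rotation (base Z)",
        2: "3-Axis rotation (base Z)",
        3: "4-Axis rotation (base Z)",
        4: "5-Axis rotation (base Y)",
        5: "6-Axis rotation (Gripper ON/OFF)",
    },
    "ZOOM/VIEW": {
        0: "default",
        1: "ZOOM OUT",
        2: "ZOOM IN",
        3: "VIEW rotation (CW)",
        4: "VIEW rotation (CCW)",
    },
}
EVENTS["SOLID"] = EVENTS["WIRE"]  # Table 2 applies to both modes

def finger_event(mode, fingers):
    """Return the authoring event for a recognized finger count,
    or None if the count is undefined in the current mode."""
    return EVENTS.get(mode, {}).get(fingers)
```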
Fig. 5. The motion authoring simulation in WIRE and
SOLID mode.
Fig. 6(a) depicts how a robot rotation part is enlarged in
the ZOOM/VIEW mode when the number of fingers
recognized is 2 and 3. Also, Fig. 6(b) depicts how the
robot view is zoomed out in the ZOOM/VIEW mode when
the number of fingers recognized is 1.
Fig. 6. (a) Enlarged robot rotation part in ZOOM/VIEW
mode with 2 and 3 fingers recognized; (b) zoomed-out
robot in ZOOM/VIEW mode with 1 finger recognized.
3. Experimental Results and Discussions
3.1 Virtual Robot Experimental Result
The robot motion authoring system proposed in this
paper was implemented on a personal computer with 1 GB
of memory, a 1.80 GHz AMD Athlon™ 64 processor, and
an ATI Radeon X1600 graphics card that supports
DirectX. In addition, a Logitech Webcam Pro 9000 was
used. The robot motion authoring tool in this paper was
implemented with OpenGL, the open graphics library.
Authoring was effective because different colors were
displayed according to the number of fingers recognized,
allowing the user to immediately discern the part of the
robot being authored. Fig. 7 is a captured image of the robot
motion authoring process using the proposed method. The
experimental results confirmed that the robot motion
authoring system proposed by this study provides general
users with an intuitive, fun and easy way to create motions.
Fig. 7. Robot motion authoring process using the proposed method.
3.2 Actual Robot Experimental Result
The actual articulated robot is a miniature-type robot
(the NT-Robot Arm, a product of NTREX CO., LTD).
Seven RC servo motors were used, and 5 ball casters were
used on the BASE part for smooth turning. The gripper can
seize an object of maximum 80 mm. For control, the robot
arm can be easily controlled by a computer or embedded
board using NT-SERVO-16CH-1 and NT-USB2UART
(Fig. 8). Table 4 lists the NT-Robot Arm specifications.
Fig. 8. Actual articulated robot structure.
Table 4. NT-Robot Arm specification [10]
Speed | 6 V servomotor
Voltage | DC 6 V
Size | 100550
Power | 16.1 kg-cm (max)
Weight | 1 kg
Structure | 6 axes + gripper
Center distance | 130 mm
Current | 12.5 A
The purpose of this experiment was to confirm whether
or not robot motions authored by the proposed tool work
well with an actual robot. The motions authored by the tool
were applied to an actual robot for the experiment (Fig. 9).
The motion data were transmitted to the robot using serial
communication, and it was examined how the contents
produced by the robot motion authoring tool controlled the
robot according to the user's intention.
Fig. 9. Experiment results using the actual robot.
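The paper states only that the motion data were sent over serial communication. The framing below (a start byte, a channel count, big-endian 16-bit PWM values, and a modulo-256 checksum) is a hypothetical sketch of how such a frame could be packed, not the NT-SERVO-16CH-1 protocol; the constant and function names are likewise assumptions.

```python
import struct

START_BYTE = 0xFF  # hypothetical frame marker, not from the NTREX protocol

def build_motion_packet(pwms_us):
    """Pack one frame of servo pulse widths (in microseconds) as:
    start byte, channel count, one big-endian uint16 per channel,
    then a modulo-256 checksum over the count and channel bytes."""
    payload = struct.pack(">B", len(pwms_us))
    for pwm in pwms_us:
        payload += struct.pack(">H", int(pwm))
    checksum = sum(payload) % 256
    return bytes([START_BYTE]) + payload + bytes([checksum])
```

Actual transmission would then hand the packet to a serial library, e.g. `serial.Serial("COM3", 115200).write(packet)` with pyserial (port name and baud rate being further assumptions).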
4. Conclusion
This paper proposed an authoring system capable of
creating and controlling motions of industrial robots based
on finger-robot interaction. The proposed system is user-
friendly and intuitive, and facilitates motion authoring of
industrial robots using the fingers, which are second only
to language as a means of communication. The
proposed authoring system also uses graphic motion data to
simulate user-created robot contents in a 3D virtual
environment so that the user can view the authoring process
in real time and transmit the authored robot contents to
control the robot. The validity of the proposed robot motion
authoring system was examined based on simulations using
the authored motion robot contents as well as experiments
of actual robot motions. The proposed robot motion
authoring method is expected to provide user-friendly and
intuitive solutions not only for various industrial robots
but also for other types of robots, including humanoids.
References
[1] http://www.cyberbotics.com/ (Cyberbotics Webots™)
[2] http://www.microsoft.com/korea/robotics/studio.mspx
[3] Sangyep Nam, Hyoyoung Lee, Sukjoong Kim, Yichul
Kang and Keuneun Kim, A study on design and
implementation of the intelligent robot simulator
which is connected to an URC system, IE, vol. 44,
no. 4, pp. 157-164, 2007.
[4] Kwangho Seok and Yoonsang Kim, A new robot
motion authoring method using HTM, International
Conference on Control, Automation and Systems
(ICCAS), pp. 2058-2061, Oct. 2008.
[5] Hyun Kim, Myeongyun Ock, Taewook Kang and
Sangwan Kim, Development of Finger Pad by
Webcam, The Korea Society of Marine Engineering,
pp. 397-398, June 2008.
[6] Kangho Lee and Jongho Choi, Hand Gesture
Sequence Recognition using Morphological Chain
Code Edge Vector, Korea Society of Computer
Information, vol. 9, pp. 85-91.
[7] Attila Licsar and Tamas Sziranyi, User-adaptive hand
gesture recognition system with interactive training,
Image and Vision Computing 23, pp. 1102-1114, 2005.
[8] Kunwoo Kim, Wonjoo Lee and Changho Jeon, A
Hand Gesture Recognition Scheme using WebCAM,
The Institute of Electronics Engineers of Korea, pp.
619-620, June 2008.
[9] Mark W. Spong, Seth Hutchinson and Mathukumalli
Vidyasagar, ROBOT MODELING AND CONTROL,
WILEY, 2006.
[10] http://www.ntrex.co.kr/
Kwang-Ho Seok received the B.S.
degree in Internet-media engineering,
from Korea University of Technology
and Education (KUT), Korea, in 2007
and the M.S. degree in Information
media engineering, from Korea
University of Technology and Education
(KUT), Korea, in 2009. Since 2009, he has been a Ph.D.
student in the Humane Interaction Lab (HILab), Korea
University of Technology and Education (KUT), Cheonan,
Korea. His current research interests include immersive
virtual reality, human-robot interaction, and power
systems.
Junho Ko received the B.S. degree in Internet-software
engineering from Korea University of Technology and
Education (KUT), Korea, in 2009, and the M.S. degree in
Information media engineering from Korea University of
Technology and Education (KUT), Korea, in 2011. Since
2011, he has been a Ph.D. student in the Humane
Interaction Lab (HILab), Korea University of Technology
and Education (KUT), Cheonan, Korea. His research fields
are artificial intelligence and vehicle interaction.
Yoon Sang Kim received the B.S.,
M.S., and Ph.D. degrees in electrical
engineering from Sungkyunkwan
University, Seoul, Korea, in 1993, 1995,
and 1999, respectively. He was a
Member of the Postdoctoral Research
Staff of Korea Institute of Science and
Technology (KIST), Seoul, Korea. He was also a Faculty
Research Associate in the Department of Electrical
Engineering, University of Washington, Seattle. He was a
Member of the Senior Research Staff, Samsung Advanced
Institute of Technology (SAIT), Suwon, Korea. Since
March 2005, he has been an Associate Professor at the
School of Computer Science and Engineering, Korea
University of Technology and Education (KUT), Cheonan,
Korea. His current research interests include Virtual
simulation, Power-IT technology, and device-based
interactive application. Dr. Kim was awarded a Korea
Science and Engineering Foundation (KOSEF) Overseas
Postdoctoral Fellowship in 2000. He is a member of IEEE,
IEICE, ICASE, KIPS, and KIEE.