Int J Soc Robot (2012) 4:97–105
DOI 10.1007/s12369-011-0131-x

Two-Hand-Arm Manipulation Based on Tri-axial Tactile Data

Masahiro Ohka · Sukarnur Che Abdullah · Jiro Wada · Hanafiah Bin Yussof

Accepted: 19 November 2011 / Published online: 21 December 2011
© Springer Science & Business Media BV 2011

Abstract Tactile sensing ability is important for social robots, which perform daily work in place of people. The authors have developed a three-axis tactile sensor based on an optical measurement method. Since our optical three-axis tactile sensor can measure distributed tri-axial tactile data, a robot equipped with the tactile sensors can detect not only grasping force but also slippage on its hands. In this paper, the authors have two objectives: to evaluate the three-axis tactile sensor in actual robotic tasks, and to demonstrate the effectiveness of tri-axial tactile data for motion control. To accomplish these objectives, the authors have developed a two-hand-arm robot equipped with three-axis tactile sensors. In the robot motion control, we implement a recurrent mechanism in which the next behavior is induced by the tactile data, making the robot accept intention embedded in the environment. Since this mechanism is based on the tactile data, it is easy to apply to communication between the hand-arms to obtain the best timing for cooperative work. In a series of experiments, the two-hand-arm robot performed object-transfer and assembly tasks. Experimental results show that this tri-axial-tactile-based programming works well because appropriate behavior is induced according to slippage direction.

Keywords Robot tactile systems · Tactile sensors · Tri-axial tactile data · Two-manipulator systems

M. Ohka (✉) · S.C. Abdullah · J. Wada
Graduate School of Information Science, Nagoya University, Nagoya, Japan
e-mail: [email protected]

H.B. Yussof
Faculty of Mechanical Engineering, Universiti Teknologi MARA, Shah Alam, Malaysia

1 Introduction

Tactile sensation possesses a salient characteristic compared to vision and hearing: it does not occur without interaction between the sensory organ and an object. Touching an object deforms both the object and the sensory organ. Since tactile sensing is important for robots performing any task [1–3], we are developing an optical three-axis tactile sensor that measures distributed tri-axial tactile data using the principle of a uni-axial optical tactile sensor [4–7]. If a robot is equipped with tactile sensors on its fingers, it can detect not only grasping force but also slippage arising on its hands [8–10].

In this paper, the authors have two objectives: to evaluate the three-axis tactile sensor in actual robotic tasks, and to demonstrate the effectiveness of tri-axial tactile data for motion control. To accomplish these objectives, the authors have developed a two-hand-arm robot equipped with three-axis tactile sensors on both hands. In the robot motion control, we implement a recurrent mechanism in which the next behavior is induced by the tactile data, making the robot accept intention embedded in the environment. Since this mechanism is based on the tri-axial tactile data, it is easy to apply to communication between the hand-arms to obtain the best timing for cooperative work.

In the experiments performed for these objectives, a small set of sensing-action programs was first produced to accomplish basic tasks such as transferring an object and assembling two parts. In the former task, a fragile object is grasped by the left hand and transferred to the right hand. In the latter task, a cap grasped by the right hand is twisted onto a bottle grasped by the left hand. These tasks require cooperative work of the robot's two hand-arms, in which appropriate grasping force must be generated to avoid dropping or crushing the object.


Fig. 1 Two-hand-arm robot equipped with optical three-axis tactile sensors

Program modules included in the experimental software are very simple; they adopt the tangential force vector and slippage information (defined as the time derivative of tangential force) as key signals that induce robot behaviors. Since each hand-arm determines its motion from these keys without a wired connection to the other, each is controlled by an isolated controller. In a series of verification tests, we show that the above tasks are accomplished using the information in these keys.
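To make the key signal concrete, here is a minimal Python sketch, our own illustration rather than the authors' code, of how the slippage information could be computed as the time derivative of tangential force on the element bearing the largest normal force (the array layout and sampling period are assumptions):

```python
import numpy as np

def slippage_key(f_prev, f_curr, dt=0.01):
    """Slippage signal dFt: time derivative of tangential force on the
    element accepting the largest normal force.

    f_prev, f_curr: (N, 3) per-element forces [Fx, Fy, Fz] at successive
    samples; Fz is taken as the normal component (an assumption).
    dt: sampling period in seconds (an assumption).
    """
    i = int(np.argmax(f_curr[:, 2]))         # element with largest normal force
    ft_prev = np.linalg.norm(f_prev[i, :2])  # tangential magnitude, previous sample
    ft_curr = np.linalg.norm(f_curr[i, :2])  # tangential magnitude, current sample
    return (ft_curr - ft_prev) / dt          # dFt, the behavior-inducing key
```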

2 Two-Hand-Arm Robot Equipped with Optical Three-Axis Tactile Sensors

2.1 Robot Hardware

We adopted a two-hand-arm robot to accomplish the present objectives. Figure 1 shows the two-hand-arm robot, which is an advancement over the one-finger robot presented in a previous article [10]. Figure 2 shows the structure of the present robot; each arm has 5 DOF and each finger has 3 DOF. To compensate for the arm's lack of DOF, the robot uses its finger's root joint as a wrist DOF (two joints of the finger mounted on the left hand are fixed). The finger-joint actuator has the following specifications: maximum torque, 0.7 Nm, generating a 10 N force; encoder resolution, 40,960 pulses/revolution.

Each fingertip carries a novel optical three-axis tactile sensor (Fig. 3) that can obtain not only the normal force distribution but also the tangential force distribution. We presented this sensor in previous articles [8–10], which give a fuller explanation. Whereas ordinary tactile sensors detect only normal or only tangential force, this sensor detects both, so the robot can use the sensor information as an effective key to induce a specified behavior.

Fig. 2 DOFs of two-hand-arm robot

Fig. 3 Optical three-axis tactile sensor

A local coordinate system is embedded in the sensing elements of the three-axis sensor attached to the fingertip. Figure 4 shows the relationship between the location of a sensing element and the local coordinate system. Although the three-axis tactile sensor has 41 sensing elements, experimental results for specified sensing elements are presented in a subsequent section. Using a coordinate transformation, component slippage vectors are calculated with respect to the global coordinate system O-XGYGZG in Fig. 2.
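As a sketch of this transformation, our own illustration under the assumption that a fingertip-to-global rotation matrix is available from the arm-hand kinematics, a slippage vector measured in the element's local frame can be mapped into O-XGYGZG as follows:

```python
import numpy as np

def to_global(slip_local, R_fingertip_to_global):
    """Express a local-frame slippage vector in the global frame O-XGYGZG.

    slip_local: (3,) slippage vector in the fingertip/element frame.
    R_fingertip_to_global: (3, 3) rotation matrix from forward kinematics
    (assumed to be available).
    """
    return R_fingertip_to_global @ slip_local

# Example: a fingertip frame rotated 90 degrees about the ZG axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
print(to_global(np.array([1.0, 0.0, 0.0]), R))  # -> [0. 1. 0.]
```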

2.2 Control Program

The architecture of the present control program resembles that of many ordinary behavior-based robots, such as subsumption architecture (SSA) [11, 12], which was proposed as a hardware architecture to mimic the reflexive behavior of insects. However, since the present architecture is established as a software architecture to accomplish more complicated tasks than those performed by insects, it differs from SSA as follows.


Fig. 4 Local coordinate on sensor

Behaviors included in the present software have no marked priority, while behaviors in SSA are prioritized. When tactile information obtained from the environment is input into the software, a command for robot behavior is output, and the robot actuators are controlled based on that command. The environment then changes due to the deformation and transformation caused by the robot's behavior, and the result returns to the software's input as new tactile information, closing a feedback loop. In this software, basic tasks such as pick-up-and-place, cap twisting, and object transfer are accomplished with several program modules, as described in the following sections.

In each module, a specific pattern of tactile data is tied to a specific simple behavior, such as "release" or "grasp," realizing a reflex mapping of sensor input to behavior. This control-program concept resembles an expert system in artificial intelligence [13]. The difference is that an expert system stores its fact database inside the computer, whereas this program treats the whole environment as the fact database. In the action module, we can include a relatively complex procedure if needed.

In our architecture, each module is programmed from a simple algorithm, which is an advantage because each module can be checked individually. The timing of module activation is not decided in advance; a specific module is activated when its input-sensor condition is satisfied.
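A minimal sketch of this recurrent scheme, our own illustration with hypothetical module names and threshold values, pairs each module's input-sensor condition with a simple behavior command:

```python
DR = 0.5        # slippage threshold dr (value is an assumption)
FN_LIMIT = 5.0  # normal-force limit Fnmax (value is an assumption)

# Each module is a (condition, behavior command) pair; the first module
# whose sensor condition is satisfied fires on the current cycle.
MODULES = [
    (lambda s: s["Fn"] > FN_LIMIT, "stop_enhancement"),  # protect object and sensor
    (lambda s: s["dFt"] > DR,      "re_push"),           # slippage -> enhance grasp
    (lambda s: True,               "hold"),              # default: keep current force
]

def step(sensor_sample):
    """One cycle: tactile data in, behavior command out. The behavior then
    changes the environment, and the changed environment returns as the
    next tactile sample, closing the feedback loop."""
    for condition, behavior in MODULES:
        if condition(sensor_sample):
            return behavior

print(step({"Fn": 1.0, "dFt": 0.8}))  # -> re_push
print(step({"Fn": 1.0, "dFt": 0.1}))  # -> hold
```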

3 Basic Experiment

3.1 Object Grasping

In this section, the simplest and most basic case is explained. Since the relationship between sensor input and action is complicated, we use a schematic diagram, shown in Fig. 5, and reuse this style in subsequent sections. For simplicity, the object-grasping task is adopted as the most basic case.

Fig. 5 Object grasping task

First, the object-grasping task is a common problem for both one hand and two hands. Figure 5 shows a typical program module. To accomplish object-grasping tasks, we used two basic behaviors: (1) grasping force is increased in response to slippage to prevent unstable grasping; (2) grasping force is maintained, to prevent crushing, if no slippage occurs.

In subsequent sections, we will introduce several parameters, summarized in Table 1. Threshold values were tuned by ad hoc trial and error so that the robot could grasp several specimens (aluminum, wood, Styrofoam, and paper dies, 30 mm on each side) and perform pick-up-and-place.

We produced three program modules for the grasping task; their actions change according to two modes: search mode and grasping mode. Before the hand touches an object, search mode is active and grasping mode is not.

In search mode, touch is judged from the time derivative of tangential force on one element, because it is more sensitive than the time derivative of normal force. This quantity, dFt, is defined as the derivative of the tangential force measured on the element accepting the largest normal force among the 41 elements. The three modules' functions are as follows.


Table 1 Parameters used for experiments

Module 1: if dFt ≤ dr (dr is a threshold), the initial velocity v0 (magnitude 2 mm/s) is substituted into the fingertip velocity v. Module 2: if dFt > dr, the first touch is judged to have occurred, and the mode changes to grasping mode. Module 3: if the normal force on even one element exceeds the normal-force threshold Fnmax, grasping-force enhancement is immediately stopped to prevent crushing the object and the sensing element itself.

In grasping mode, grasping force is changed if slippage occurs; otherwise the current grasping force is maintained. Module 1: if dFt ≤ dr, zero velocity is substituted into the fingertip velocity v to maintain the current grasping force. Module 2: if dFt > dr, the re-push velocity cv0 is substituted into the fingertip velocity v, where c is determined by the object's hardness (soft: c = 0.25; medium: c = 0.5; hard: c = 0.6). Module 3: the same as in search mode.

During the first 100 ms after the first touch, the robot measures the object's hardness (three levels: soft, medium, hard). The re-push velocity coefficient c is set to an appropriate value based on the maximum normal-force increase among the elements, which indicates the object's hardness.
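Condensing the two modes, the following sketch, our own illustration in which v0 and the c values come from the text while dr and Fnmax are assumed placeholders, shows the fingertip-velocity logic of Modules 1 to 3:

```python
V0 = 2.0                                        # initial fingertip speed, mm/s
C = {"soft": 0.25, "medium": 0.5, "hard": 0.6}  # re-push coefficients c
DR = 0.1                                        # slippage threshold dr (assumed)
FN_MAX = 5.0                                    # normal-force threshold Fnmax (assumed)

def fingertip_velocity(mode, dFt, Fn, hardness="medium"):
    """Return (fingertip velocity v, next mode) per Modules 1-3."""
    if Fn > FN_MAX:               # Module 3: stop force enhancement immediately
        return 0.0, mode
    if mode == "search":
        if dFt <= DR:             # Search Module 1: keep closing at v0
            return V0, "search"
        return 0.0, "grasping"    # Search Module 2: first touch -> switch modes
    if dFt <= DR:                 # Grasping Module 1: hold current force
        return 0.0, "grasping"
    return C[hardness] * V0, "grasping"  # Grasping Module 2: re-push at c*v0

print(fingertip_velocity("search", 0.05, 0.2))   # -> (2.0, 'search')
print(fingertip_velocity("grasping", 0.3, 1.0))  # -> (1.0, 'grasping')
```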

With these modules, the robot can grasp not only a hard object but also a very fragile one, such as a paper die, without applying insufficient grasping force. Since about seven tactile elements touch the object under common conditions and slippage begins at the outer elements, the robot can increase grasping force before fatal slippage occurs. Consequently, the robot grasps the object with the minimum force needed to avoid crushing it.

Since the rubber sensing element acts as a nonskid pad, the hand can grasp a light object with even single-element contact on each finger, except when the object's weight changes, as when water is poured into a paper cup.

3.2 Cap-Twisting Task

The cap-twisting task is performed based on Modules 1, 2, and 3 described in the previous section. An initial finger trajectory is given to the robot to start cap twisting, and a termination condition for the task must be determined. For these behaviors, we added two modules to those in Fig. 5.

Module 4 provides the initial cap-twisting behavior, in which the fingers follow the square trajectories shown in the inset of Fig. 6. This behavior is repeated until Module 5, defined below, terminates it. Since the finger trajectory intersects the cap's contour, the fingertips slip on its surface. When slippage occurs, an increment in shearing force is observed; at that moment, Module 2 of grasping mode is activated to enhance the grasping force.

Module 5 decides when to terminate the twisting task. Empirically, we know that considerable slippage occurs when the cap is tightened. If the increment of shearing force on an element, dFt, exceeds 1.4 times dFt max, the maximum observed during the first touch, the finger motion is terminated. In the authors' previous work [14], this termination rule was not included.
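Module 5's rule reduces to a one-line comparison; here is a sketch, our own illustration with hypothetical variable names:

```python
def twisting_finished(dFt, dFt_max_first_touch, factor=1.4):
    """Module 5: terminate cap twisting once the shearing-force increment
    dFt exceeds 1.4 times the maximum observed during the first touch."""
    return dFt > factor * dFt_max_first_touch

print(twisting_finished(0.8, 0.5))  # -> True: the cap is judged tight
```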

The fingertip trajectory is shown in Fig. 6. Modification of the initial trajectory saturates after the cap is closed. Although the initial finger trajectory is a rough rectangle chosen to touch and turn the cap, one of its segments changes from a straight line to a curve that fits the cap contour. The curve is obtained through this task.

3.3 Pick up and Place

Since the pick-up task is accomplished by the object grasping explained in Fig. 5, a releasing-object behavior is added to accomplish the placing task.


Fig. 6 Modified trajectory based on interaction

Fig. 7 Picking up and placing task

Accordingly, Modules 4 and 5 of the cap-twisting mode are replaced by new Modules 6 and 7.

Module 6 picks the object up from a specified position, conveys it, and sets it down at a specified planar position. The lowering motion is terminated by the activation of Module 7, described below.

Module 7 releases the object when the tactile sensors catch upward slippage. If the robotic hand feels upward slippage while moving down, the object's bottom has touched a table or the floor. In this module, the upward slippage induces opening of the hand. Using this program, the pick-up-and-place task shown in Fig. 7 is accomplished.

4 Task Accomplished by Tri-axial Tactile Data

4.1 Object Transfer

When an object grasped by one hand of a robot is transferred to the other hand, both hands must adjust their grasping force. When a person transfers an object from one hand to the other, one hand increases its grasping force while the other reduces it. In this study, we are attempting to achieve this type of handover between the two hand-arms of a robot.

The object-transfer task is performed as an application of the pick-up-and-place task described in the preceding section. Whereas the key for releasing an object in the pick-up-and-place task is defined by the direction of the reaction from the floor, for the object-transfer task we improve Module 7 so that an arbitrary reaction-vector direction d can be specified.
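Here is a sketch of the improved release key, our own illustration in which the trigger threshold is an assumption: the global-frame slippage vector is projected onto the specified direction d, so d = +ZG reproduces the pick-up-and-place release, while a horizontal d implements the hand-to-hand transfer:

```python
import numpy as np

def release_triggered(slip_global, d, threshold=0.1):
    """Improved Module 7: fire the release behavior when slippage projects
    onto the specified reaction direction d.

    slip_global: (3,) slippage vector in O-XGYGZG.
    d: (3,) reaction direction; +ZG for placing on a surface, a horizontal
       direction for hand-to-hand transfer.
    threshold: trigger level (an assumption).
    """
    d_unit = np.asarray(d, dtype=float)
    d_unit = d_unit / np.linalg.norm(d_unit)
    return float(np.dot(slip_global, d_unit)) > threshold

# Placing: upward slippage while moving down -> release the object.
print(release_triggered(np.array([0.0, 0.0, 0.3]), [0.0, 0.0, 1.0]))  # -> True
```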

In this experiment, the robot grasps an object (a paper die, 30 mm on each side) at point A with the left hand and puts it at point B. Points A and B are located on the ends of posts. The planar positions of A and B are programmed into the robot, while the vertical position of B is not provided. Figure 8 shows the relationship between the modules of the left and right hands.

After picking up the object with the left hand, the robot passes it to the right hand. At that moment, Modules 6 and 7 of pick-up-and-place mode for the right and left hands work as shown in Fig. 8(a). After that, the right hand places the object according to module activation as shown in Fig. 8(b). This task proceeded as shown in Fig. 9.

To show this succession of tasks, Fig. 10 shows the variation in normal force on 17 of the 41 sensing elements, selected because they usually touch the object; the selected element numbers appear in the figure. At around 7 s, normal force increases abruptly in Fig. 10(a) as the left hand grasps the object. At around 40 s, normal force increases abruptly in Fig. 10(b) as well, as the right hand grasps the object.

Next, for the variation in slippage (the time derivative of tangential force in the global coordinate system, obtained from the element bearing the largest normal force), we select one finger from each hand and show their variations in Fig. 11. Since the finger position along the hand's open-and-close axis in the local hand coordinate system is shown by a solid line, opening and closing of the hand correspond to decreases and increases of that line. At around 40 seconds, the right hand grasps and retrieves the object, and horizontal slippage is generated on the left hand. This horizontal slippage induces a releasing motion of the left hand. After that, upward slippage occurs on the right hand as the bottom of the object contacts the end of post B.


Fig. 8 Object-transfer task

4.2 Assembly

If all the modules described in the previous sections are combined, the robot can perform simple assembly tasks. In this experiment, the right and left hands first grasp a cap and a bottle-like object, respectively. The cap and bottle are made of plastic and wood, respectively; their sizes are ∅32 × 14 mm and 30 × 30 × 93 mm, and their masses are around 2.5 g and 32.8 g.

Fig. 10 Variation in normal force of each sensor element

Fig. 9 Sequential pictures in object-transfer task (http://ns1.ohka.cs.is.nagoya-u.ac.jp/ohka_lab2/images/Passing(short).wmv)


Fig. 11 Observed slippage during object-transfer task on the sensor element accepting the largest normal force

The cap and bottle were set in place by a tester before the start of the experiment. The left hand then approaches the right hand to mate the bottle's screw thread with the cap. After that, the twisting task is started in the same manner as the twisting task described in Sect. 3. After screwing is completed, the left hand moves the assembled bottle to the home position. At that time, horizontal slippage occurs, and it induces release of the cap.

Figure 12 shows the chain reaction of the important modules during this assembly task. In this chain reaction, slippages in the ±XG, ±YG, and ±ZG directions become keys for specified behaviors. These directions are defined in Fig. 3. Using this program software, the assembly task is completed as shown in Fig. 13. This in-air assembly using the two-hand-arm robot is an advance over our previous work [14].
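To illustrate how directional slippage could serve as a behavior key, here is our own sketch, with the behavior labels paraphrasing Fig. 12 and the threshold assumed:

```python
import numpy as np

# Hypothetical mapping from dominant slippage direction to behavior,
# paraphrasing the chain reaction of Fig. 12.
BEHAVIOR = {
    ("Z", -1): "start_cap_twisting",  # bottle mouth hits cap (-ZG, ~45 s)
    ("Y", +1): "return_to_home",      # cap tightened (+YG, ~104 s)
}

def slippage_direction(slip_global, threshold=0.1):
    """Return (axis, sign) of the dominant global slippage component,
    or None if no component exceeds the (assumed) threshold."""
    i = int(np.argmax(np.abs(slip_global)))
    if abs(slip_global[i]) < threshold:
        return None
    return ("XYZ"[i], 1 if slip_global[i] > 0 else -1)

key = slippage_direction(np.array([0.02, 0.0, -0.4]))
print(BEHAVIOR.get(key))  # -> start_cap_twisting
```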

To show this task succession, we select one finger from the left hand and show its variations in Fig. 14.

Fig. 12 Chain-reaction of modules

Figure 14 shows the variation in slippage during the interaction between the right and left hands. In Fig. 14(a), slippage in the −ZG direction occurs at around 45 seconds; at this time, the mouth of the bottle hits the cap grasped by the right hand. Since the finger position along the axis perpendicular to the hand's open-and-close axis in the local hand coordinate system is shown by a solid line, the hand movement can be read from that line.


Fig. 13 Sequential pictures in assembly task (http://ns1.ohka.cs.is.nagoya-u.ac.jp/ohka_lab2/images/Assy(short).wmv)

Fig. 14 Observed slippage in left-hand finger during assembly task

The reaction causes −ZG-directional slippage on the fingertips of the left hand.

At around 104 seconds in Fig. 14(b), slippage in the +YG direction occurs, caused by tightening of the cap. Since the right hand screws on the cap, information about the completion of screwing is transmitted from the right hand to the left hand through this +YG-directional slippage. The former slippage causes the behavior in which the right hand screws on the cap; the latter causes the behavior in which the left hand moves the assembled bottle to the home position.

In the object-passing test, the left hand acquires the right hand's intention to take the object through the slippage. In the assembly test, the former slippage signals the right hand to screw on the cap, and the latter signals the left hand to return the assembled bottle to the home position. Since the present tactile sensor can distinguish the direction of slippage, the intentions embedded in slippage direction are acquired through the sensor to perform both the object-passing and assembly tasks.

Since the present robot has no vision system, it cannot perform any task without a priori knowledge. For example, the global coordinates (XG, YG) of the cap grasped by the right hand are provided before the assembly task begins; the ZG coordinate, however, is obtained by hitting the bottle against the cap.

5 Conclusion

To evaluate the three-axis tactile sensor in actual robotic tasks and to demonstrate the effectiveness of tri-axial tactile data for motion control, we implemented program software that induces a robot's behavior from the tri-axial tactile information acquired by the two-hand-arm robot. After each rule is transformed into an algorithm, a program module is coded from that algorithm. The program software is composed of several program modules, and a module is selected from the set based on sensor information. We built the software from 3 to 7 modules to accomplish such tasks as object grasping, pick-up-and-place, cap screwing, and assembly. The two-hand-arm robot equipped with the optical three-axis tactile sensors performed these tasks.


Since the logged tri-axial tactile data show several keys for specified behaviors, the effectiveness of the present tactile sensor is confirmed.

In the tasks performed in this study, a series of reflexes is accomplished based on tri-axial tactile data. Because a short interval is required between behaviors in this series, the tasks are not always performed smoothly. Although eliminating this idle period is difficult, because the basic behavior is switched based on tactile data, we will address this issue in future work by increasing the processing speed of the tactile information.

The robot cannot assemble the cap and bottle if they are misaligned. Furthermore, if an unknown external force coincides with a sensory key that induces a specific behavior, the robot might behave incorrectly. To overcome these problems, we will incorporate visual keys for specified behaviors into the present software in future work.

References

1. Nicholls HR, Lee MH (1989) A survey of robot tactile sensing technology. Int J Robot Res 8(3):3–30

2. Göger D, Gorges N, Wörn H (2009) Tactile sensing for an anthropomorphic robotic hand: hardware and signal processing. In: 2009 IEEE int conf on robotics and automation, pp 895–901

3. Ho VA, Dao DV, Sugiyama S, Hirai S (2009) Analysis of sliding of a soft fingertip embedded with a novel micro force/moment sensor: simulation, experiment, and application. In: 2009 IEEE int conf on robotics and automation, pp 889–894

4. Mott H, Lee MH, Nicholls HR (1984) An experimental very-high-resolution tactile sensor array. In: Proc 4th int conf on robot vision and sensory control, pp 241–250

5. Tanie K, Komoriya K, Kaneko M, Tachi S, Fujiwara A (1986) A high-resolution tactile sensor array. In: Robot sensors vol 2: tactile and non-vision. IFS, Kempston, pp 189–198

6. Kaneko M, Maekawa H, Tanie K (1992) Active tactile sensing by robotic fingers based on minimum-external-sensor-realization. In: 1992 IEEE int conf on robotics and automation, pp 1289–1294

7. Maekawa H, Tanie K, Komoriya K, Kaneko M, Horiguchi C, Sugawara T (1992) Development of a finger-shaped tactile sensor and its evaluation by active touch. In: 1992 IEEE int conf on robotics and automation, pp 1327–1334

8. Ohka M, Mitsuya Y, Matsunaga Y, Takeuchi S (2004) Sensing characteristics of an optical three-axis tactile sensor under combined loading. Robotica 22(2):213–221

9. Ohka M, Kobayashi H, Takata J, Mitsuya Y (2008) An experimental optical three-axis tactile sensor featured with hemispherical surface. J Adv Mech Des Syst Manuf 2(5):860–873

10. Ohka M, Takata J, Kobayashi H, Suzuki H, Morisawa N, Yussof HB (2009) Object exploration and manipulation using a robotic finger equipped with an optical three-axis tactile sensor. Robotica 27:763–770

11. Brooks RA (1986) A robust layered control system for a mobile robot. IEEE J Robot Autom 2(1):14–23

12. Kube CR, Zhang H (1993) Controlling collective tasks with an ALN. In: IEEE/RSJ int conf on intelligent robots and systems, pp 289–293

13. Winston PH (1984) Artificial intelligence, 2nd edn. Addison-Wesley, Reading, pp 159–204

14. Ohka M, Morisawa N, Yussof HB (2009) Trajectory generation of robotic fingers based on tri-axial tactile data for cap screwing task. In: 2009 IEEE int conf on robotics and automation, pp 883–888

Masahiro Ohka is a Professor in the Graduate School of Information Science at Nagoya University. In 1986, he received a Doctorate in Engineering from Nagoya University. He was a researcher at Fuji Electric Company, working to produce three-axis tactile sensors for an advanced robot in a Japanese national project promoted by MITI. His research focuses on robotic tactile sensors, complex systems science, human-robot interaction, and behavior-based robotics.

Sukarnur Che Abdullah is a postgraduate student in the doctoral course of the Graduate School of Information Science, Nagoya University. He received an M.Sc. in Manufacturing System Engineering from Universiti Putra Malaysia in 2005. He has been an academic staff member and assistant researcher at Universiti Teknologi MARA, Malaysia, since 2005 and is currently on leave to pursue his Ph.D. His research focuses on robotic tactile sensors, humanoid robots, and robot vision sensors.

Jiro Wada is an employee of Mitsubishi Heavy Industries. In 2009, he received a Master's degree in Engineering from Nagoya University.

Hanafiah Bin Yussof is a Senior Lecturer at Universiti Teknologi MARA (UiTM). In 2008, he received a Doctorate of Information Science from Nagoya University. His research focuses on humanoid robotics, tactile sensors, and behavior-based robotics.