
October 4, 2014 17:46 WSPC/INSTRUCTION FILE ws-ijseke˙sand˙kinect

International Journal of Software Engineering and Knowledge Engineering © World Scientific Publishing Company

Interactive Sand Art Drawing Using RGB-D Sensor

Sai-Keung Wong

National Chiao Tung University

Hsinchu, Taiwan (R.O.C.)

Kai-Min Chen

National Chiao Tung University
Hsinchu, Taiwan (R.O.C.)

Ting-Yu Chen

National Chiao Tung University
Hsinchu, Taiwan (R.O.C.)

Received (Day Month Year)

Revised (Day Month Year)

Accepted (Day Month Year)

We present an interactive sand art drawing system using one RGB-D sensor, such as Kinect. Our system supports the common sand drawing functions, such as sand erosion, sand spilling, and sand leaking. To let users draw virtual sand on a drawing plane with their hands, we design four key gestures which enable the drawing manipulation actions for sand. The basic idea is that the gesture of one hand controls the drawing action, while the motion and gesture of the other hand control the drawing position.

The process of our system has three major steps. First, our system adopts a vision-based bare-hand detection method which computes the hand position and recognizes the hand gestures. Second, the drawing position and the drawing action are sent to the sand drawing system. Finally, the sand drawing system performs the corresponding sand manipulation action. Experimental results show that our system enables sand drawing with bare hands and can produce a variety of pictures.

Keywords: Human-computer interaction; sand art; hand tracking; gesture recognition.

1. Introduction

Sand drawing is a drawing technique in which people manipulate sand on a beach or sandy ground to create beautiful pictures. Sand art is a style of live performance art that introduces an indoor and dynamic way of sand painting. A sand art performance is improvisational and continuous. Furthermore, the changes of hand gestures during the drawing process are attractive but complicated. Therefore, it is a good subject for developing a user interface that supports such a sand art performance in a virtual environment.

Kazi et al. [19] developed a system called SandCanvas. There are many useful



Fig. 1. Our system using one RGB-D sensor for sand drawing. (a) Left: the user interface; a user uses four key gestures to control the manipulation operations for sand drawing. Right: the red point indicates the position of the fingertip, and the pink point indicates the position of the palm. (b) A mushroom produced by our system.

functions which are provided for users to draw pictures with virtual sand in a creative manner. Chen and Wong [8] presented a sand art drawing system which can perform basic sand drawing actions to manipulate the virtual sand to draw pictures. The basic actions include sand erosion, sand spilling, and sand leaking. The manipulation part for sand in these two systems is limited to a multi-touch screen.

Gestures and finger motions play important roles in sand drawing. In this paper, we aim to design an HCI (human-computer interface) for performing sand art drawing with bare hands. There are two critical issues that we need to tackle. First, we need to use the gestures of one hand to control the drawing actions. Second, once the drawing actions are detected, we need to analyze the motion and gesture of the other hand to draw the virtual sand. We propose efficient methods to handle these two critical issues. We adopt a vision-based hand tracking system using one RGB-D sensor. The hand tracking system is integrated with the sand drawing system [8]. Our hand tracking system computes the positions of the hands of a user and also recognizes the hand gestures. Then, the hand tracking system drives the sand drawing system to draw sand. Thus, our system enables sand art drawing with bare hands. Our system uses a low-cost RGB-D sensor, such as Kinect, for implementing the hand tracking technique. A conference version of this paper was published in [7].

Figure 1(a) shows the user interface of our system. We define four key gestures: closed palm, opened palm, single-finger, and two-finger. The system has two active areas: the command area and the sand art drawing area. In the command area, a user can change the drawing actions. When the user performs a hand gesture in the command area, the system detects and recognizes the hand gesture, and the corresponding drawing action is enabled. Then, the user uses the other hand to control the drawing position of the sand. The drawing action and the drawing position are employed to drive the sand art drawing system.


The organization of the remaining sections of this paper is as follows. Section 2 reviews techniques on vision-based hand tracking, HCI based on hand interaction, and sand simulation. Section 3 gives an overview of our system architecture. Sections 4, 5, and 6 present the hand tracking algorithm and the integration of our system with the sand drawing system. Section 7 presents the experimental results. Finally, Section 8 concludes this paper and presents future work.

2. RELATED WORKS

We review three different classes of techniques: 1) HCI based on hand interaction, 2) sand simulation, and 3) sand drawing.

2.1. HCI based on hand interaction

Many techniques have been developed in the field of HCI, such as hand detection and gesture recognition techniques. The resulting information can be applied to interact with virtual objects. As a hand has many degrees of freedom, it is critical to compute the hand features that are needed for HCI. Feng et al. [11] introduced a writing-in-the-air system which focuses on the 2D motion of a single finger. The major result of the work is the detection of single-finger gestures. Jia et al. [17] developed a method which aims at computing the 3D position and direction of a finger.

Hand detection and gesture recognition are two major tasks that must be performed properly in applications using hand interaction. Von Hardenberg et al. [31] developed a shape-based method to extract fingertips in the scene. Song et al. [29] improved on the method by taking into account the lengths of fingers. Their technique was applied to some interesting virtual reality applications with physics simulation. There are also techniques which focus on curvature-based hand tracking, such as the work by Shi et al. [28] and Yeo et al. [33].

Stereo images and depth images provide useful information for segmentation. The techniques proposed in [11, 33] employed depth data to perform depth segmentation. There is a large volume of work [12, 13, 22, 29, 32] which relies on depth data to compute the three-dimensional positions of hands and fingers.

2.2. HCI for artistic authoring

The HCI itself is an important element in designing an artistic authoring system. In painting, the writing pad is an intuitive tool with which people can use a stylus pen to draw directly on the drawing surface. Commercial software such as Adobe Photoshop [1] allows designers to use the writing pad as a tool for painting. There are also techniques that expand the degrees of freedom of the stylus pen on a tablet computer, such as Irpen by Han et al. [14].

There are studies on haptic-based interfaces that provide an interactive system for users to draw virtual art. Baxter et al. [3] presented a haptic-based interface using a virtual brush to enable artistic painting. McDonnell et al. [23] developed a haptic-based interface for a virtual clay sculpting system. There are also clay sculpting studies based on low-cost cameras. Dave et al. [10] presented a gesture interface using Kinect for clay sculpting, with which users can manipulate virtual clay with bare hands. zPot [27] used low-cost motion devices and presented a gesture-free application with bare-hand motion to shape and color virtual pots.

2.3. Particle system and sand simulation

Sand has liquid-like fluidity, but its shape is solid when it is static. Also, sand drifts in the air like a gas. Since sand has such interesting motion behaviors, methods for sand simulation have been studied extensively in graphics [16]. The methods can be categorized into particle-based methods [4, 35] and height-map-based methods [24, 25]. There are also hybrid methods that combine both a particle system and a height map to simulate sand. Zhu and Yang [34] used particles to represent the sand surface but employed a height map to simulate the sand near the ground. In this way, significant computation cost can be cut down. Particle systems are also employed in ice simulation [15, 21] and in the simulation of water drops and flows [5, 6].

There are techniques which are devoted to sand art. Kazi et al. [19] proposed a system called SandCanvas, which is a multi-touch medium for sand art drawing. The system supports a rich set of gestures to manipulate sand. Their user study indicates that the system is useful in creating interesting sand art pictures. Chen and Wong [8] developed a sand art system which can automatically generate stylized sand art. The sand drawing system combines a particle system and a height map. For the sand in the air, particles are used. When the sand piles up on the drawing plane, the sand is converted into a height value. Thus, the computation of the approach is efficient. Furthermore, a user can input a picture, and their system then computes the contours of the objects in the picture. The contours consist of a set of line segments. After that, the system produces the drawing sequences for the line segments. The system also decides the drawing actions, such as sand spilling, erosion, and sand leaking, that are used during the drawing process. For the regions that do not have enough sand, the system spills sand over those regions. The users can watch the picture while it is reproduced using sand in their system. Thus, users can enjoy the simulated sand art performance in real time.
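The hybrid particle/height-map idea can be sketched as follows. This is an illustrative toy, not the implementation of [8]: all names and constants (gravity, time step, per-grain height) are assumptions. Airborne sand is a list of particles; a particle that reaches the drawing plane is removed and converted into a height value of its grid cell.

```python
# Illustrative sketch of a hybrid particle/height-map sand step.
# Sand in the air is simulated as particles; when a particle reaches the
# drawing plane it is removed and its volume is added to a height map.
# GRAVITY, DT, and GRAIN_HEIGHT are hypothetical constants.

GRAVITY = -9.8
DT = 1.0 / 30.0          # one simulation step per captured frame
GRAIN_HEIGHT = 0.01      # height contributed by one settled particle

def step_sand(particles, height_map, cell_size):
    """Advance particles one step; settle landed ones into the height map."""
    airborne = []
    for (x, y, z, vz) in particles:
        vz += GRAVITY * DT
        z += vz * DT
        i = int(x / cell_size)
        j = int(y / cell_size)
        ground = height_map[i][j]
        if z <= ground:
            # Particle lands: convert it into a height value.
            height_map[i][j] = ground + GRAIN_HEIGHT
        else:
            airborne.append((x, y, z, vz))
    return airborne

# Usage: two particles dropped over cell (0, 0) of a 2x2 height map.
hmap = [[0.0, 0.0], [0.0, 0.0]]
parts = [(0.5, 0.5, 0.05, 0.0), (0.5, 0.5, 0.02, 0.0)]
while parts:
    parts = step_sand(parts, hmap, cell_size=1.0)
print(hmap[0][0])  # both grains have settled into the same cell
```

Once settled, the sand no longer costs per-particle simulation time, which is why the hybrid scheme is efficient.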

3. System Overview

In this section, we present an overview of our system. A user uses the gestures of one hand to select the action and the gestures of the other hand to manipulate sand in the virtual environment. Our system consists of three phases: image pre-processing, hand feature extraction, and motion/gesture management. The purpose of the image pre-processing phase is to eliminate the noise in the captured data and the regions that do not belong to a hand, such as the human face. After the image is filtered and the portion for a hand is kept, we compute the features of the hand, such as the hand position, and then determine the hand gesture. The motion/gesture management is the bridge between the user and the sand art drawing system. The extracted hand data, such as the position of the hand and the hand gesture, are passed to the sand art drawing system. Then, the virtual sand is drawn in the virtual environment. Fig. 2 and Fig. 3 show the flow chart and the user interface of our system, respectively.

Fig. 2. The system flow chart.

Fig. 3. The user interface of our system. Left: the command area. Right: the drawing area. The current action can be found at the bottom bar.

4. Image pre-processing

In the image pre-processing phase, our goal is to remove the redundant portions of the captured color and depth images that do not belong to hands. In this way, the search space for the hands can be reduced. To extract the portion of the hand in an image, we perform background subtraction, depth segmentation, and skin color detection. The details are described in the following sections.


4.1. Background subtraction

We assume that the camera system is fixed during the sand art performance. Thus, the camera has a fixed field of view and its position is fixed while the images are captured by Kinect. Therefore, we can apply a frame-differencing-based background subtraction method to obtain the foreground portion of an image. Since our skin color detection method is based on the HS (hue-saturation) color space, the problem induced by illumination change has little effect on our system. We employ the running average method by Piccardi [26] to remove the stationary objects in the scene.
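The running-average scheme can be sketched in a few lines. This is an illustrative toy over tiny 2D grids, not the paper's implementation; the blending factor `alpha` and the foreground threshold are hypothetical values. The background estimate B is updated per pixel as B ← (1 − α)·B + α·F, and a pixel is foreground when the current frame deviates from B by more than the threshold.

```python
# Minimal sketch of running-average background subtraction in the spirit
# of Piccardi's method. alpha and threshold are illustrative values.

def update_background(bg, frame, alpha=0.05):
    """Blend the current frame into the running-average background."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def foreground_mask(bg, frame, threshold=30.0):
    """1 where the frame deviates from the background model, else 0."""
    return [[1 if abs(f - b) > threshold else 0 for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

# Usage: a static scene (value 100) with a bright hand-like region
# (value 200) appearing in one pixel of the new frame.
bg = [[100.0, 100.0], [100.0, 100.0]]
frame = [[100.0, 200.0], [100.0, 100.0]]
bg = update_background(bg, frame)
mask = foreground_mask(bg, frame)
print(mask)  # only the changed pixel is foreground
```

Because stationary objects are slowly absorbed into B, only moving regions such as the hands survive in the mask.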

4.2. Depth segmentation

In the scene, there are some parts that are also skin-colored, such as the human face. Thus, if we only consider skin color to eliminate the parts that do not belong to hands, the part belonging to the face would remain. To tackle this problem, some techniques rely on face detection and then apply a face removal method to remove the face area from an image [2, 9]. However, it takes additional computation cost to track the face area. Another problem is that parts of the hands may be removed when there is occlusion between the face and the camera, such as when the user's hand lies in front of the face. Thus, our system uses not only skin color but also depth information to extract the regions of hands.

When a user performs sand art, his/her hands always move in front of the face. Therefore, we can apply a depth threshold to discard the parts that are farther away from the camera and keep the parts close to the camera. In this way, we can isolate the parts belonging to the hands from the parts that belong to the face. Of course, the user can stand farther from the camera and move their hands closer to the camera. This also helps our system effectively eliminate the face region from the image. Figure 4 shows our depth segmentation result when the hand lies in front of the user's face.

Fig. 4. The result of the depth segmentation. (a) The original image. (b) The color image of the depth. The green region indicates that it is close to the camera. The red regions indicate that they are farther from the camera.
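The depth thresholding step itself is a one-liner per pixel. The sketch below is illustrative: the depth values (in millimetres, as Kinect-style sensors typically report) and the threshold are assumed, not taken from the paper, and a depth of 0 is treated as an invalid reading.

```python
# Minimal sketch of depth segmentation: keep pixels closer than a depth
# threshold (the hands held in front of the face), discard the rest.
# All depth values and the threshold are illustrative.

def depth_segment(depth, max_depth_mm):
    """Keep (1) valid pixels closer than the threshold; discard (0) the rest."""
    return [[1 if 0 < d <= max_depth_mm else 0 for d in row]
            for row in depth]

# Usage: hand at ~500 mm, face at ~800 mm, background at ~2000 mm.
depth = [[500, 800], [2000, 520]]
mask = depth_segment(depth, max_depth_mm=600)
print(mask)  # only the hand-range pixels survive
```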


4.3. Skin color detection

After eliminating some redundant regions (e.g., the human face) from the image, the remaining portion belongs to the arm and hands of the user. We assume that the user wears a shirt/dress with long sleeves. Then, we can avoid a costly arm removal process and directly apply the skin color detection method to extract the hand region and then compute features for the hand. In our current implementation, we apply a pixel-based skin color detection method [30] in hue-saturation color space to extract the skin part from the image. As revealed by our results, this method is efficient and produces a segmentation of good quality for the hand region. Even when the illumination changes in our testing environment, we still obtain acceptable segmentation results. After extracting the skin part, we use a morphological closing operation and a Gaussian filter to smooth the skin segmentation result.
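A per-pixel hue-saturation classifier of this kind can be sketched as follows. The threshold ranges are illustrative guesses, not the ones from [30]; the point of the sketch is only that the value (V) channel is ignored, which is why illumination changes have little effect.

```python
# Minimal sketch of pixel-based skin detection in hue-saturation space.
# An 8-bit RGB pixel is converted to HSV and thresholded on H and S only.
# The ranges h_max, s_min, s_max are hypothetical, not from the paper.
import colorsys

def is_skin(r, g, b, h_max=0.14, s_min=0.15, s_max=0.75):
    """Classify an 8-bit RGB pixel as skin using hue/saturation ranges."""
    h, s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h <= h_max and s_min <= s <= s_max

# Usage: a typical skin tone versus a saturated blue pixel.
print(is_skin(220, 170, 140))  # skin-like: True
print(is_skin(30, 60, 200))    # blue: False
```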

5. Hand feature extraction

After the image pre-processing phase, the image should contain only the information of the hands of the user. In the hand feature extraction phase, the goal is to analyze the image and extract the information that is useful for the application of sand art drawing. Since our application is sand art drawing, the palm position, the fingertips, and the hand gesture are the basic information that should be computed in this phase. Furthermore, the stability of the position information is critical in such a drawing application. Thus, the extracted features should have high stability.

We extract the contours in the image, and these contours represent the shape of the hands. These contours are helpful for the feature extraction process. After performing the depth segmentation and the skin-based segmentation methods on the image, the image may still have redundant regions. For example, there are small areas that are incorrectly identified as skin parts. Therefore, after the contours are extracted, we perform a blob pruning method to remove contours with small areas, as these small areas probably do not belong to hands (see Figure 5). After that, we re-compute the contours. Then, we proceed to perform the hand feature extraction algorithm, which has three steps.

Fig. 5. Left: small regions may remain after depth segmentation and skin color detection. In this case, there are three small regions. The green lines are the contours of the regions. Right: after applying the blob pruning method.
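Blob pruning by area can be sketched with a flood fill over the binary mask. This is an illustrative stand-in, not the paper's contour-based implementation: it uses 4-connectivity, and the area threshold is a hypothetical parameter.

```python
# Minimal sketch of blob pruning: remove connected components of a binary
# mask whose pixel area is below a threshold, since tiny blobs are
# unlikely to belong to a hand. 4-connectivity flood fill, for simplicity.
from collections import deque

def prune_small_blobs(mask, min_area):
    """Zero out connected components smaller than min_area pixels."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # Flood-fill one component, collecting its pixels.
                blob, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    blob.append((cy, cx))
                    for ny, nx in ((cy-1,cx),(cy+1,cx),(cy,cx-1),(cy,cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(blob) < min_area:
                    for by, bx in blob:
                        mask[by][bx] = 0
    return mask

# Usage: a large hand-like blob and an isolated one-pixel speckle.
mask = [[1, 1, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
prune_small_blobs(mask, min_area=2)
print(mask)  # the single-pixel blob at (1, 3) is removed
```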


Step One: distance transform. We apply a distance transform to the image (see Figure 6(a)) to find the distance from each pixel to the hand contour. Figure 6(b) shows the result of the distance transform. We choose the pixel with the largest distance as the approximated palm center.

Fig. 6. Hand feature extraction result. (a) The extracted contour; (b) the distance transform result; (c) the extracted hand features, including the palm center and the finger point; (d) the time series curve. The pink line indicates the height threshold.
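Step One can be sketched with a brute-force distance transform over a binary hand mask. This is an illustrative O(n²) toy, not the production algorithm; a real system would use a linear-time distance transform. Each hand pixel receives its distance to the nearest non-hand pixel, and the deepest pixel is taken as the approximated palm center.

```python
# Minimal sketch of Step One: brute-force distance transform and
# palm-center selection over a small binary mask.
import math

def palm_center(mask):
    """Return ((y, x), distance) of the deepest pixel inside the mask."""
    h, w = len(mask), len(mask[0])
    outside = [(y, x) for y in range(h) for x in range(w) if not mask[y][x]]
    best, best_d = None, -1.0
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                d = min(math.hypot(y - oy, x - ox) for oy, ox in outside)
                if d > best_d:
                    best, best_d = (y, x), d
    return best, best_d

# Usage: a 5x5 filled square inside a 7x7 mask; its center is deepest.
mask = [[1 if 1 <= y <= 5 and 1 <= x <= 5 else 0 for x in range(7)]
        for y in range(7)]
center, dist = palm_center(mask)
print(center, dist)  # the square's center is the approximated palm center
```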

Step Two: finger extraction based on the time series curve. The time series curve records the distance between each point on the contour and a reference point. This kind of shape representation has been successfully used for the classification and clustering of shapes, for example in the work by Keogh et al. [20]. In our system, we set the reference point to the palm center. The horizontal dimension of the time series curve is the relative angle with respect to the vector (1, 0), and the vertical dimension is the distance between the contour points and the reference point.

In the time series curve, each peak corresponds to a finger. We remove the portion of the time series curve that lies underneath a user-defined height threshold. Then, the image can be decomposed into fingers. In our system, the height threshold is set to 1.6 times the distance at the approximated palm center. Figure 6(d) shows the time series curve, and the pink horizontal line indicates the height threshold. The part underneath the pink line is eliminated, leaving the parts that belong to fingers.
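Step Two can be sketched as follows: build the time series curve (distance from the palm center to each contour point, ordered by angle), normalize by the palm-center distance, cut at the 1.6 threshold, and count the runs that survive. The synthetic contour below is illustrative; a real contour would have many more samples.

```python
# Minimal sketch of Step Two: time series curve construction and
# threshold-based finger counting. Inputs are synthetic.
import math

def time_series_curve(contour, palm, palm_dist):
    """Return (angle, distance / palm_dist) samples sorted by angle."""
    px, py = palm
    samples = []
    for (x, y) in contour:
        angle = math.atan2(y - py, x - px)   # relative to the vector (1, 0)
        samples.append((angle, math.hypot(x - px, y - py) / palm_dist))
    return sorted(samples)

def count_fingers(curve, threshold=1.6):
    """Count runs of the curve that rise above the height threshold."""
    fingers, above = 0, False
    for _angle, dist in curve:
        if dist > threshold and not above:
            fingers += 1
        above = dist > threshold
    return fingers

# Usage: a synthetic contour around palm (0, 0) with palm distance 1.0;
# two protruding "fingertip" samples at distance ~3 among near samples.
contour = [(1, 0), (3, 1), (0, 1), (-1, 0), (0, -1), (1, -3)]
curve = time_series_curve(contour, palm=(0, 0), palm_dist=1.0)
print(count_fingers(curve))  # two distinct peaks -> 2
```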

Step Three: computation of the finger position. For gesture recognition, we can use the decomposition result to find the number of fingers in an image. However, in order to find stable finger feature points, we apply the height threshold to the original color image and then obtain the finger blobs. We do not extract fingertips directly because the fingertips are unstable: they may change slightly from frame to frame even though the fingers do not move. This kind of slight movement leads to unnatural sand art drawing. We attempted to implement a Kalman filter [18] to stabilize the positions of the fingertips. However, applying the Kalman filter to the fingertips may lead to an unnatural reaction in the drawing action. For example, the user may draw a curve with corners, but the system would return a smooth curve without any corners.


In our system, we first draw a black circle on the approximated palm center. The radius of the circle is equal to the height threshold used in Step Two. After that, the image contains only the finger blobs, which should have been smoothed in the image pre-processing phase. We compute an oriented bounding box to bound each finger blob. The middle point of the edge of the oriented bounding box that is farthest away from the palm center is the approximated finger point. Figure 6(c) shows the result of this step: the pink point at the fingertip is obtained. Experimental results show that this method achieves acceptable results.
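The masking part of Step Three can be sketched directly; for the finger point itself, the sketch below is a deliberately simplified stand-in: instead of the oriented-bounding-box edge midpoint described above, it averages the blob pixels farthest from the palm, which is merely one way to be more stable than a single extreme fingertip pixel. Both the `top_fraction` parameter and the blob representation (a list of pixel coordinates) are assumptions.

```python
# Simplified sketch of Step Three. mask_palm removes pixels inside the
# palm circle (the "black circle" of radius = height threshold), leaving
# finger blobs. finger_point is a rough stand-in for the OBB edge-midpoint
# rule: it averages the farthest fraction of the blob's pixels.
import math

def mask_palm(blob, palm, radius):
    """Remove pixels inside the palm circle."""
    px, py = palm
    return [(x, y) for (x, y) in blob
            if math.hypot(x - px, y - py) > radius]

def finger_point(blob, palm, top_fraction=0.3):
    """Approximate the finger point from a finger blob's pixel list."""
    px, py = palm
    ranked = sorted(blob, key=lambda p: math.hypot(p[0] - px, p[1] - py),
                    reverse=True)
    top = ranked[:max(1, int(len(ranked) * top_fraction))]
    return (sum(x for x, _ in top) / len(top),
            sum(y for _, y in top) / len(top))

# Usage: a vertical finger-shaped blob above the palm at (0, 0).
blob = [(0, y) for y in range(1, 11)]            # pixels y = 1..10
finger = mask_palm(blob, palm=(0, 0), radius=4)  # keeps y = 5..10
print(finger_point(finger, palm=(0, 0)))
```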

6. Motion/gesture management

We study users' habits in sand art drawing so that we can properly design the key gestures for the sand art drawing system. We note the following facts about sand drawing actions:

1. When users use their hands to erode the sand, they tend to use only one finger; the index finger is used the most.
2. In sand spilling and leaking, the users grab the sand and release a portion of the sand from their hands.
3. In area erosion, users may open their palm, use it to feel the sand, and erode the sand within the area.

For the first case, we can detect a single finger to trigger the single-point erosion. However, we face a problem for the other two cases: the user cannot feel the sand in such a non-touch interface. Therefore, we must find another way to perform the other two sand drawing actions.

Since we cannot feel the sand when doing area erosion, we use the two-finger gesture so that the user can see where to draw. On the other hand, if we spilled the sand in the same way as in the real world, the hand gesture would be the closed palm, and the amount of sand for spilling would depend on how close the fingers are to each other. However, this is hard to check, since the low-cost depth sensor does not have a high enough level of accuracy to capture such detail of the closeness of the fingers.

Based on the aforementioned discussion, we define the mapping from the key gestures to the drawing operations in Table 1. We also design the corresponding key gestures as shown in Figure 7. Although the closed palm gesture is not used in the spilling action, it is used for another important action in sand drawing: the halt action. Because we cannot feel the virtual sand, it is a problem to determine when the users stop drawing. With the halt action, a user can control when he/she wants to stop the drawing process. The opened palm is used when we want to spill the sand. The single finger determines the point at which we want to perform a single-point operation, such as single-point erosion and sand leaking. The two-finger gesture corresponds to the pinch erosion and pinch spilling in the sand art drawing system.

Although we have defined a set of key gestures, these key gestures are not applied at the same time while sand art drawing is performed. In other words, only one action can be activated at any moment. We must develop a subsystem that activates one drawing action at a time while keeping the other drawing actions deactivated. We consider each sand drawing process as a finite state machine (see Figure 8). The user must use the halt gesture before changing to another drawing action.

Table 1. The mapping between the four key gestures and the sand art drawing operations

Gesture in command area | Gesture in sand drawing area | Action
Closed palm             | -                            | Halt
Single-finger           | Single-finger                | Single-point erosion
Opened palm             | Single-finger                | Single-point spilling
Opened palm             | Two-finger                   | Pinch spilling
Two-finger              | Single-finger                | Leaking
Two-finger              | Two-finger                   | Pinch erosion

Fig. 7. Key gestures of the system. (a) closed palm. (b) opened palm. (c) single-finger. (d) two-finger.

Fig. 8. The finite state machine for controlling the drawing actions.
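The motion/gesture management logic can be sketched as the mapping of Table 1 plus the halt rule. This is an illustrative toy, not the paper's implementation: the class and method names are assumptions, and the state machine is reduced to two ideas from the text, namely that a closed palm in the command area halts the current action, and that a new action may only start from the halted state.

```python
# Minimal sketch of the drawing-action state machine: Table 1 as a lookup
# keyed by (command gesture, drawing gesture), with the halt-before-switch
# rule. Names are illustrative.

ACTIONS = {
    ("single-finger", "single-finger"): "single-point erosion",
    ("opened palm",   "single-finger"): "single-point spilling",
    ("opened palm",   "two-finger"):    "pinch spilling",
    ("two-finger",    "single-finger"): "leaking",
    ("two-finger",    "two-finger"):    "pinch erosion",
}

class DrawingStateMachine:
    """Only one action is active at a time; halt before switching."""

    def __init__(self):
        self.action = None  # None means halted

    def update(self, command_gesture, drawing_gesture):
        if command_gesture == "closed palm":
            self.action = None                     # halt action
        elif self.action is None:
            # A new action may only be activated from the halted state.
            self.action = ACTIONS.get((command_gesture, drawing_gesture))
        return self.action

# Usage: start erosion, try to switch without halting, halt, then switch.
fsm = DrawingStateMachine()
print(fsm.update("single-finger", "single-finger"))  # single-point erosion
print(fsm.update("opened palm", "two-finger"))       # still erosion
print(fsm.update("closed palm", "-"))                # None (halted)
print(fsm.update("opened palm", "two-finger"))       # pinch spilling
```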

Figure 9 shows the user interface of our system. The left area is the command area, which does not affect the virtual sand; there, a user gives command gestures to change the sand drawing actions. The right area is the sand drawing area, where the user actually manipulates the virtual sand.

7. EXPERIMENTAL RESULTS

In the following, we present our experimental results, including the experimental environment, the results of gesture recognition, the results of sand art drawing, and a user testing study.

Fig. 9. The integration of our system with a sand art system. The left area, marked in blue, is the command area. The right area, marked in green, is the sand drawing area. In this case, the two-finger mode, i.e., sand leaking, is used.

Fig. 10. The environment setting.

7.1. Experimental environment

In our experiments, a Microsoft Kinect sensor was used to capture the color image and the depth image. The resolution of an image was 480x640, and the images were captured at 30 fps. We used the OpenNI library to calibrate the depth image with the color image. The testing process was performed on a standard desktop PC with a 2.40 GHz Intel Core Quad CPU. The system works well when the user is around 40-60 cm in front of the Kinect sensor. The height of the sensor should be a little above the head so that the user can operate the system comfortably. The user can sit on a chair (see Figure 10). We integrated our system with the sand drawing system [8], which is implemented based on a height map and a particle system.


Fig. 11. Two failure cases. In case one, the gesture is single-finger, but no finger is extracted from the time series curve. In case two, it is not possible to recognize the correct gesture because the image is heavily blurred.

7.2. Results of gesture recognition

In order to evaluate the gesture recognition method, we collected around 500 frames of offline data for each gesture. In each trial, a user performed a specific key gesture (e.g., single-finger, two-finger, etc.) in front of the camera and then moved the hand around freely. Table 2 shows the performance of the method. The accuracy rate is above 90 percent for all the trials and all the gestures.

Table 2. The performance of the method for offline data

               Closed palm  Opened palm  One-finger  Two-finger
No. of frames  483          510          558         542
Incorrect      0            13           11          28
Accuracy       100.00%      97.45%       98.02%      94.83%

Most of the incorrect cases are due to the hand orientation problem and fast movement of the hands. Figure 11 illustrates two failure cases. In the first case, the fingers are inside the hand contour. Thus, our method does not correctly extract the portions of the fingers, and the time series curve is quite flat. In the second case, the hand moves too fast and the captured image is blurred. The peaks of the time series curve are not mapped exactly to the fingers of the hand.

7.3. Results of sand art drawing

The results of the basic drawing actions are shown in Figure 12. Figure 13 shows pictures which were drawn in real time. As can be seen, our approach can produce a variety of pictures. The integration of our system with the sand art system does not significantly affect the performance of the sand art system.


(a) single-point erosion. (b) pinch erosion. (c) single-point spilling.

(d) pinch spilling. (e) sand leaking.

Fig. 12. The sand drawing result of each action in the sand art drawing system. (a) Single-point erosion: sand is pushed away from the finger position. (b) Pinch erosion: sand is pushed away from the region formed by two fingers. (c) Single-point spilling: sand particles are spread around the finger position. (d) Pinch spilling: sand particles are spread around the region formed by two fingers and then fall down to the drawing surface. (e) Sand leaking: sand particles are created at the finger position and then fall down to the drawing surface.

In the spilling and area erosion actions, the action delay increases by only a few milliseconds. Thus, the extra computation for hand detection does not affect the drawing process. As the sand art drawing system is a real-time dynamic system, the users can also see the sand move naturally.

7.4. A user testing study

We carried out a user testing study of the sand art drawing system with five participants. In the user testing, we first described and demonstrated the system. Then, we let the users operate the system, perform some basic drawing operations, and get familiar with the system. At this stage, all of them could grasp the association between the key gestures and the sand art drawing actions within one to two minutes. Moreover, they could easily perform their desired operations.

In the next stage, we asked the participants to draw some paintings using our system. At this stage, all of them could use our system to draw pictures in a few minutes. After that, we consulted the participants to evaluate the accuracy, the challenge, and the applicability of our system. All of them were satisfied with the accuracy of our system. However, most of them agreed that the most challenging aspect is that they cannot feel the sand, so the three-dimensional interaction lacks fine control of the sand for detailed drawings. Our proposed interface also has another problem: users may feel uncomfortable if the drawing takes too long, because their hands hang in the air and they become tired. Overall, they found that our system was interesting and useful for drawing sand art pieces.


Fig. 13. The results of our system. All the pictures were drawn using Kinect within a few minutes. Top row (from left to right): music symbols, an atom, and a human face. Middle row: tableware, a robot, a castle, and a sunflower. Bottom row: text, a flower, a love symbol, and a night sky.

8. Conclusions

In this paper, we have presented an interactive sand art drawing system using one RGB-D sensor, such as Kinect. A vision-based bare-hand detection method is adopted to recognize the four key gestures. The finger positions are also computed using their bounding boxes. Then, the finger positions and hand gestures are injected into the sand art drawing system. Our system supports the basic sand drawing actions, including leaking, single-point spilling, single-point erosion, pinch spilling, and pinch erosion. Experimental results show that users can use bare hands to draw a variety of sand pictures.
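The three-stage per-frame flow summarized above (hand detection, gesture and position extraction, then sand manipulation) can be sketched as a simple pipeline; the class and function names below are hypothetical stand-ins, not the paper's actual implementation.

```python
# Illustrative sketch of the per-frame loop: detect the hands, extract the
# gesture and drawing position, then let the sand simulator apply the action.
# All names here are hypothetical.

class SandDrawingPipeline:
    def __init__(self, detector, simulator):
        self.detector = detector    # frame -> (gesture, position), or (None, None)
        self.simulator = simulator  # holds the particle-based sand state

    def process_frame(self, rgbd_frame):
        gesture, position = self.detector(rgbd_frame)
        if gesture is not None:
            self.simulator.apply(gesture, position)  # inject into the drawing system
        self.simulator.step()                        # advance the particle simulation

# Stand-in simulator that records calls, to show the control flow.
class RecordingSimulator:
    def __init__(self):
        self.actions, self.steps = [], 0
    def apply(self, gesture, position):
        self.actions.append((gesture, position))
    def step(self):
        self.steps += 1

sim = RecordingSimulator()
pipeline = SandDrawingPipeline(lambda frame: ("leaking", (10, 20)), sim)
pipeline.process_frame(None)  # one frame: one injected action, one simulation step
```

Keeping detection and simulation behind this narrow interface is what allows the detection cost to add only a few milliseconds of delay without disturbing the real-time drawing loop.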

There are several avenues for future work. We would like to enhance the interaction experience by introducing behavior models that make the drawing process more natural and comfortable. We would also like to enhance the drawing tools so that users can draw pictures with greater creativity. For example, the current system creates sand particles at the finger position or in the region formed by two fingers; we would like to extend our system so that the sand particles can be spread in specific or varied shapes as the hand moves around. Finally, the sand drawing system could be further improved so that a large number of particles can be simulated at a real-time rate.


9. Acknowledgements

We thank the anonymous reviewers for their useful and constructive comments. This work was supported in part by Construction of Intelligent Situation Systems by Advanced Computer Graphics, 3D Video, and Cloud Storage Technologies (NSC 103-2218-E-009-009) and by the National Science Council Taiwan under contract number NSC 102-2221-E-009-103-MY2.
