

The Dawn of Personal Drones

Abstract This paper argues that human interaction with unmanned flying vehicles (UAVs or “drones”) should be recognized as a CHI research area. The question of how to interact with drones may be approached empirically by experiments emphasizing performance measurements when a human controls a drone. We present our examination of a “fly-where-you-look” principle and map out other human control modalities for drone interaction. Finally, we suggest that a personal drone may have a significant impact in the future by amplifying vision and becoming a cornerstone in a community-based 3D mapping of the physical world.

Author Keywords Drones; UAV; gaze interaction; mobility; head-mounted displays

ACM Classification Keywords H.5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous.

General Terms Experimentation, Human Factors, Unmanned flying vehicles, UAVs, Drones, Eye tracking

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

CHI’13, April 27 – May 2, 2013, Paris, France.

Copyright 2012 ACM 978-1-XXXX-XXXX-X/XX/XX...$10.00.

John Paulin Hansen, IT University of Copenhagen, Rued Langgaards vej 7, Copenhagen, Denmark, [email protected]

Emilie Møllenbach, University of Copenhagen, Dept. of Computer Science, Njalsgade 128-132, DK-2300 København, [email protected]

Alexandre Alapetite, Dept. of Management Engineering, Technical University of Denmark, Produktionstorvet 424, DK-2800 Kongens Lyngby, [email protected]

I. Scott MacKenzie, Dept. of Computer Science and Engineering, York University, Toronto, Canada M3J 1P3, [email protected]

Fiona B. Mulvey, Eye Tracking Group, Humanities Laboratory, Lund University, 22362 Lund, Sweden, [email protected]


Introduction Unmanned flying vehicles (UAVs) or “drones” have a long history in military applications. They are able to carry heavy loads over long distances while being controlled remotely by an operator. However, low-cost drones are now entering the mainstream of computing with many non-military applications. By comparison, these drones are lightweight, fly only for a limited time (i.e., <20 minutes), and have limited range (i.e., <1 km). For instance, the Parrot AR.Drone costs around $400 and can be controlled from a PC, tablet, or smartphone (Figure 1). It has a front-facing camera transmitting live images to the pilot via Wi-Fi and a simple interface (Figure 2). There is a user community sharing videos, tips, and new software applications supported by the open API.

More advanced flyers may prefer to build their own drone. There is a growing community at places like diydrones.com where members show their prototypes and discuss improvements. Some of the drones are made of parts from electronics outlets and some are built using open-source Arduino parts intended for aviation (e.g., “Arducopters”). These drones may be equipped with cameras, GPS receivers, and sonar sensors for stabilization. Anderson [2], who initiated the community, sees this as an example of amateur makers potentially revolutionizing the industry by sharing, for instance, code for a 3D printout of a construction part.

This paper describes a variety of non-military uses of drones. Our motivation is human-computer interaction, with a particular view to the possibilities for empirical and experimental research in human interfaces for drone control and interaction. While most amateur drone pilots fly for fun, we argue that drones have a role in future everyday life because they afford a unique opportunity to see and record the world not bound by our body or gravity. We first present some professional, non-military uses of drones. Then we look into previous research within HCI. Our experiences with drones stem from our demonstration of gaze controlled flying at NordiCHI 2012 [1]. These experiences taught us that control of drones, especially with new modalities, is not as straightforward as some web videos of pilots flying with ingenious controllers or gestures might suggest (e.g., [9] or [10]). We will reflect on how to get a more scientific understanding of the problems in drone mastery, with a call to the HCI community. If this research endeavor succeeds, there may then be a basis for our idea of a personal drone as a new paradigm for social and physical interaction. Obviously, privacy, safety, and security are all important research questions when dealing with drones. Space constraints, however, prevent us from addressing them thoroughly in this paper.

Professional Civilian Drones Professional civilian drones have a payload of several kilos. They can carry batteries for longer flights and additional equipment, such as high-quality cameras, beyond what inexpensive amateur drones can lift.

Thousands of drone pilots are licensed to inspect and spray paddy fields in Japan, a practice that extends back more than 10 years [12]. Drones could become common agricultural machines on modern farms throughout the world since they can optimize the spreading of fertilizers and pesticides. They may also assist in field surveys and livestock care.

Figure 1: Commodity drone (Parrot AR.Drone) with front-facing camera.


Previously, movie shots from the air required expensive helicopters or film cranes. Today, professional photographers can mount a camera on a drone and deliver free-space video data at a modest cost. The view of the camera may be fixed in alignment with the direction of travel (i.e., “eyes forward” [16]). If the camera is mounted on a motorized frame, it can be turned independently of the drone, controlled by a second person, or locked on an object by a tracking system [8].

Industry uses drones to inspect power lines, pipelines, gas burners, and dangerous installations that might otherwise need to be taken out of operation before approach. Multiple drones in squads can explore large surface areas and collect map data with an accuracy of a few centimeters. Afterwards, special software allows multi-view 3D reconstruction by combining visible tie points on the ground with GPS data from the drones. Detailed instructions for how to do this were presented on diydrones.com, which prompted the following comment from a surveyor:

“I'm a licensed surveyor and I see this as being a game changer for the way surveyors create topographic surveys. No longer would we need to hire the services of a photogrammetrist using a Cessna Airplane with a million dollar camera, nor would we need to field survey with GPS or a total station to locate thousands of points.”

Comment by Ty Brady, September 30, 2012, on http://diydrones.com/forum/topics/uav-for-photogrammetry-surveys?commentId=705844%3AComment%3A989380

U.S. Customs and Border Protection is reported to have caught thousands of people illegally crossing the Mexican border with the help of drones [3]. Police and emergency personnel can use drones to search for missing persons, at night equipped with heat-sensitive cameras instead of regular video cameras. Management of road traffic after an accident or disaster is another area where public authorities may start to use drones [3].

The above examples of professional civilian drones have four things in common: they afford low-cost solutions compared to traditional aviation services, they are rather big, they work outdoors, and they are typically handled by a skilled operator. In contrast, we could imagine smaller, short-range drones controlled indoors and outdoors by ordinary people.

A new field – previous research Research in drone interaction is sparsely reported, probably because the technology was pioneered by the military, leaving most results undisclosed and classified. In 2003, Mouloua et al. [11] provided an early overview of human factors design issues in, for instance, automation of flight control, data-link delays, cognitive workload limitations, display design, and target detection. A recent review is offered by Cahillane et al. [4], including issues of multimodal interaction, adaptive automation, and individual performance differences – the latter examined in terms of spatial abilities, sense of direction, and computer game experience. Interaction with drones is also studied as a case of human-robot interaction (e.g., by Goodrich and Schultz [7]), motivating applied research in computer vision and robotics (e.g., by Oh et al. [13]).

In 2011 alt.chi hosted a session entitled “Look! Up in the Sky”. Two papers from that session have much in common with our idea of a personal drone. Higuchi et al. [8] proposed a drone-based system, “Flying Eyes”, to track a moving person. One application is creating real-world content for computer games or for sports broadcasting. Tobita et al. [15] created a telepresence system by projecting a live face image on the front of a flying blimp (with the projector mounted inside the blimp). In this way the avatar becomes a physical object that moves freely, speaks to people, and listens to nearby conversations.



The CHI community has only recently paid attention to the possibilities of drones, leaving most of the research to traditional human factors and aviation psychology. In the next section we map out human-drone control possibilities, including references to the few other studies that have addressed interaction.

Interaction with drones In our experience, interaction with drones is not easy. There are a number of highly dynamic parameters to monitor and control simultaneously, and if there are obstacles or turbulence in the air, reactions to changes must be quick. Often, delays on the data link complicate control. In particular, it is a challenge when the drone and the pilot are oriented in different directions. Flying straight and flying on instruments (including a live video stream from the nose camera) is less difficult than making turns and flying by direct sight. The fact that the drone may break if it crashes also puts extra stress on the pilot – this is not just a computer game!

In manual mode, there are several drone actions to monitor for effective control: speed (throttle up and down), pitch (elevator, by tipping forward or backward), yaw (rudder, by rotating around the Z-axis), and roll (aileron, by tipping left or right). In addition, there may be discrete controls for start (takeoff), emergency landing, video recording, or other actions depending on the payload. Some of the control may be automated, such as stabilizing the drone at a particular position or altitude even in strong wind. Some drones offer fully automated flight modes by setting GPS waypoints on a map or by offering a “return to launch” function. Figure 2 shows the smartphone interface to a Parrot AR.Drone superimposed on a live video image.
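To make the axis mapping concrete, the following minimal sketch represents one manual control frame as four continuous axes plus a few discrete commands. It is our own illustration, not any vendor's API; the type and field names are assumptions, and the normalized [-1, 1] range is simply a common convention for RC-style protocols.

```python
from dataclasses import dataclass
from enum import Enum, auto


class DiscreteCommand(Enum):
    """One-shot commands, separate from the continuous axes."""
    TAKEOFF = auto()
    LAND = auto()
    EMERGENCY = auto()
    START_RECORDING = auto()


def _clamp(v: float) -> float:
    """Keep an axis value inside the normalized range [-1, 1]."""
    return max(-1.0, min(1.0, v))


@dataclass
class ManualControlFrame:
    """One control update: four continuous axes, each in [-1.0, 1.0]."""
    throttle: float = 0.0  # speed up/down
    pitch: float = 0.0     # elevator: tip forward/backward
    yaw: float = 0.0       # rudder: rotate around the Z-axis
    roll: float = 0.0      # aileron: tip left/right

    def clamped(self) -> "ManualControlFrame":
        return ManualControlFrame(_clamp(self.throttle), _clamp(self.pitch),
                                  _clamp(self.yaw), _clamp(self.roll))


# Example: a gentle forward drift with a slow left turn.
print(ManualControlFrame(pitch=0.2, yaw=-0.3).clamped())
```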

The most common interface for civilian drones is the RC transmitter, a programmable remote control box with two joysticks, various mode switches, and pre-defined buttons. Newer models also include a small LCD display. Figure 2 shows an example of a tablet/smartphone interface using tilt and touch to control the drone. Many of the low-cost systems provide a PC interface, some of which can be built and configured by the user, and others use open web features such as Google Maps on which the user can set waypoints, see the current position, and review flight mission data. There are several online examples of drones controlled by hand and body gestures, accompanied by uploaded videos (e.g., [9] and [10]). Keyboard and speech commands have been compared in a simulated drone control task by Draper et al. [6], who found speech commands significantly better than key control. Yu et al. [17] demonstrated how a person using a wheelchair could control a drone by EEG signals and eye blinking using off-the-shelf components. They describe a “FlyingBuddy” system intended to augment human mobility and perceptibility for people with motor disabilities who cannot otherwise see nearby objects. Finally, we can imagine other assistive input technologies, like a tongue mouse or a sip-and-puff straw, that may be relevant for future research.

The output from the drone to the pilot can take several forms. The basic form is a third-person view of the drone from the ground (i.e., direct sight) and a 2D map view. A remote video stream from the drone camera to a monitor or handheld display is also common (see Valimont and Chappell [16] for an introduction to ego- and exocentric frames of reference in drone control). In our final section on future scenarios we elaborate on the idea of streaming the video from the drone to a head-mounted display (immersive or overlay/transparent). Some enthusiasts of radio-controlled model aircraft prefer to have the view from the airplane shown in display glasses when flying, because this blocks out sunlight and provides a more immersive experience.

Figure 2: Smartphone interface of the Parrot AR.Drone (FreeFlight 1.0). The throttle is controlled by touching and moving the left green circle up and down; altitude is controlled by moving the right circle up and down; turns are controlled by tilting the phone left or right. Tilting the phone forward/backward controls pitch and tilting the phone sideways controls aileron. In addition, there is a button for takeoff (bottom of display) and one for emergency landing (top). At the top right there is a battery status indicator. Several warnings (e.g., when the ultrasound stabilizer is not working) are given in red text across the center of the display (not shown in the figure).

Sound streaming from the drone is problematic because of the noise generated by the drone, but it is certainly important for telepresence applications like the one suggested by Higuchi et al. [8]. Perhaps noise filtering can help, since drone noise tends to be predictable, at least indoors where there is no additional wind noise.

Clearly, the design of good interfaces for drone control is a challenge. The broad set of input modalities and the different output options call for research that systematically combines and compares them. In the next section we suggest that the well-known Fitts’ law target acquisition test, commonly used to evaluate input devices, may also apply to drone interaction research.

Measurements and models Flying through an open or partly opened door is a good example of a target acquisition task performed when flying indoors (cf. Figure 2). The breadth of the passage relative to the size of the drone obviously affects the difficulty of this common task. A possible lag between the controller and the drone, the resolution of the camera (when flying on a live video stream only), the lighting in the room, and the angle of approach would most likely also have an effect. And of course the distance to the target plus the initial orientation with regard to the target would influence task time.

Several of the classical experimental paradigms within HCI could be relevant for the task just described, for instance Fitts’ law, the Hick-Hyman law, the steering law, or even keystroke-level modeling (KLM). To illustrate, we could think of a Fitts’ law experiment to compare, for instance, gaze versus joystick control. The independent variables would then be the two control methods (eye gaze, joystick), the distance to the target (e.g., 2, 4, 8 meters), and the width of the target (e.g., 0.75, 1.5 meters). Dependent variables would be task completion time, position and angle of the drone when passing through, and anything else that could be observed and counted (e.g., steering corrections). Throughput is another possible dependent variable; however, there are significant barriers to its calculation, such as the inherent multi-dimensionality of the task or the difficulty of applying the customary “adjustment for accuracy”.
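As a back-of-the-envelope illustration of this design space, the sketch below (our own illustration, not part of the proposed experiment) computes the Shannon-formulation index of difficulty, ID = log2(D/W + 1), for the suggested distance-width conditions, and a nominal throughput ID/MT for a hypothetical movement time. The 6.5 s trial time in the example is invented for illustration only.

```python
import math

# Proposed conditions from the text: distances and target widths in meters.
distances = [2.0, 4.0, 8.0]
widths = [0.75, 1.5]


def index_of_difficulty(d: float, w: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(d / w + 1.0)


for d in distances:
    for w in widths:
        print(f"D={d:>4} m, W={w:>4} m  ->  ID = {index_of_difficulty(d, w):.2f} bits")


def throughput(d: float, w: float, movement_time_s: float) -> float:
    """Nominal throughput for one condition; the customary 'adjustment for
    accuracy' (effective width) is omitted because, as noted above, it is
    hard to apply to a 3D flying task."""
    return index_of_difficulty(d, w) / movement_time_s


print(f"Example: TP = {throughput(8.0, 0.75, 6.5):.2f} bits/s for a hypothetical 6.5 s trial")
```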

For a standard task, we suggest using vertical black poles in a white room as targets, with a pair of poles at each end of the room. The drone is piloted through the first pair of poles to begin a trial and through the opposite pair to end it. The task clearly has elements of target selection, since the goal is to pass through the opposite set of poles. The task is also akin to path-following, since a successful trial will deviate minimally from the most direct path. Once the drone goes through the opposite set of poles, it is turned around using the disengage function, and the trial is then repeated in the opposite direction. The control methods should be tested using counterbalancing, and for each control method, distance-width conditions should be presented in random order (without replacement).
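A minimal sketch of how such a session could be scheduled: control methods are counterbalanced across participants (here with a simple rotation for two methods), and within each method the distance-width conditions are shuffled without replacement. All identifiers and the number of trials per condition are illustrative assumptions, not prescriptions from the experiment above.

```python
import random
from itertools import product

CONTROL_METHODS = ["gaze", "joystick"]
DISTANCES_M = [2.0, 4.0, 8.0]
WIDTHS_M = [0.75, 1.5]
TRIALS_PER_CONDITION = 4  # back-and-forth passes between the pole pairs (assumed)


def session_plan(participant_id: int, seed: int = 42) -> list[tuple]:
    """Build one participant's ordered list of (method, distance, width, trial)."""
    rng = random.Random(seed + participant_id)
    # Counterbalance method order across participants by rotation.
    offset = participant_id % len(CONTROL_METHODS)
    methods = CONTROL_METHODS[offset:] + CONTROL_METHODS[:offset]
    plan = []
    for method in methods:
        # Present distance-width conditions in random order without replacement.
        conditions = list(product(DISTANCES_M, WIDTHS_M))
        rng.shuffle(conditions)
        for distance, width in conditions:
            for trial in range(TRIALS_PER_CONDITION):
                plan.append((method, distance, width, trial))
    return plan


if __name__ == "__main__":
    for row in session_plan(participant_id=0)[:6]:
        print(row)
```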

At NordiCHI 2012 we tested a setup similar to this [1] when making a demo of gaze controlled flying, as seen in Figure 3. We expected this to be a very simple task, since the goal was just to fly straight ahead indoors with no wind and no obstacles. But the demo turned out to be rather difficult for the dozen or so people trying it. Only four subjects could get the drone through the target without touching the poles or colliding with the wall. We believe that there were at least three complicating factors: Some subjects could not help looking directly at the drone instead of looking at the video image (for this reason we later isolated the control station with an umbrella). Some subjects became victims of a variant of the “Midas touch” problem that is common in gaze interaction: Everything you look at will get activated. They would send the drone off in a wrong direction if they just looked away from the target. Since most of the participants were novices, they didn’t know about this. Finally, some subjects complained that there was a noticeable lag between their gaze shifts and the movement of the drone, which most likely also complicated control.

Scenario 1: Amplified vision The first idea that we invite the CHI community to consider is the development of a control paradigm for drones that functions as an extension of human vision, with the drone understood as an avatar with a human guide. The control of this device may be entirely through combinations of physiological inputs, with feedback in the form of augmented reality through a transparent display mounted on a pair of glasses. The drone sends visual information back to a heads-up, see-through display. In this way, the remote environment is visible to the user without occluding the user’s environment (see Figure 4).

A novel approach, which appeals especially to us, is to modify the camera view of the drone using eye movements. This approach is intuitively appealing, as visual input is naturally obtained by moving our eyes, and gaze data are sufficiently accurate and fine-grained. If successfully interpreted, control would thus be implicit, meaning we do not complicate the task through additional control mappings. We also suggest investigating the efficacy of binocular eye data to measure convergence as a means of automatically selecting among multiple interfaces at various distances. It then becomes necessary to explore, from a perceptual point of view, the viability of working with layered drone control interfaces within several concurrent user contexts, with a variety of eye, head, face, and muscle input combinations.
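For example, under the simplifying assumption of symmetric fixation straight ahead, the fixation distance can be estimated from the vergence angle between the two gaze vectors and the user's interpupillary distance, and then used to pick one of several depth-layered interfaces. The sketch below is ours; the default interpupillary distance and the depth thresholds are illustrative placeholders.

```python
import math


def fixation_distance_m(vergence_deg: float, ipd_m: float = 0.063) -> float:
    """Approximate fixation distance from the binocular vergence angle,
    assuming symmetric fixation straight ahead:
    tan(vergence / 2) = (ipd / 2) / distance."""
    half = math.radians(vergence_deg) / 2.0
    if half <= 0.0:
        return float("inf")  # (near-)parallel gaze: far fixation
    return (ipd_m / 2.0) / math.tan(half)


def select_interface(vergence_deg: float) -> str:
    """Map estimated fixation depth to one of several layered interfaces
    (thresholds are placeholders for illustration)."""
    d = fixation_distance_m(vergence_deg)
    if d < 1.0:
        return "near layer: overlay widgets on the glasses"
    elif d < 5.0:
        return "mid layer: drone camera view"
    return "far layer: direct view of the environment"


print(select_interface(3.0))  # vergence of 3 degrees -> roughly 1.2 m -> mid layer
```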

Gyroscopes and accelerometers are now extensively used in handheld devices and in game consoles. Applying this sensor technology to a head-mounted pair of glasses will enable the system to be sensitive to head movements. Facial movements are easy to control for most people and research suggests that they can be monitored by sensors embedded in the glasses [14]. Finally, yet another camera may be placed on the front frame of the glasses (see Figure 4, top). This can record hand gestures as input for the system. We suggest exploring the possibility of using gaze in conjunction with hand gestures, which enables richer interaction, and can afford an effective filter for accidental commands: Only when the user looks at her hands should the gesture be interpreted as input.
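The gaze-as-filter idea can be expressed as a simple gating rule: a recognized hand gesture is forwarded as a drone command only while the gaze point falls inside the hand region detected by the glasses-mounted camera. The sketch below is schematic; the hand detector and gesture recognizer are assumed to be provided elsewhere, and the command names are hypothetical.

```python
from typing import Optional, Tuple

BoundingBox = Tuple[float, float, float, float]  # x, y, width, height (normalized image coords)


def gaze_inside(gaze: Tuple[float, float], box: BoundingBox) -> bool:
    """True if the gaze point lies within the bounding box."""
    x, y, w, h = box
    gx, gy = gaze
    return x <= gx <= x + w and y <= gy <= y + h


def gated_gesture_command(gaze: Tuple[float, float],
                          hand_box: Optional[BoundingBox],
                          recognized_gesture: Optional[str]) -> Optional[str]:
    """Forward a gesture command only when the user is looking at her hands;
    otherwise treat the gesture as accidental and ignore it."""
    if hand_box is None or recognized_gesture is None:
        return None
    if not gaze_inside(gaze, hand_box):
        return None
    return recognized_gesture  # e.g., "ascend", "land", "turn_left"


# Example: gaze at (0.52, 0.61) with the hand detected in the lower centre of the image.
print(gated_gesture_command((0.52, 0.61), (0.4, 0.5, 0.25, 0.25), "ascend"))  # -> "ascend"
```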

Figure 3: Demo setup for gaze-controlled flying at NordiCHI 2012. The person controlling the drone with gaze was standing behind a PC with a gaze tracker, looking at a live video stream from the drone. When the drone was launched by the automatic takeoff function, the task was simply to fly through the target poles 3 meters in front of the drone.


What would people use this personal drone for – except for fulfilling an ever-fascinating dream of flying free and safe? Our belief is that personal drones are an opportunity for open-community data collection of the environment (indoors and outdoors). Previous recordings may be tagged with the steering commands and fixation points that the human guide produced when generating them. This constitutes a complementary new set of information about the recorded environment that no previous research has yet explored, with the potential for a breakthrough in providing vision robots with high-level human perceptual intelligence and behavioral knowledge.

The idea of exchanging information and images recorded by gaze-navigated drones would also allow people to share recordings of, for example, favorite sites. Stitching together images recorded at a famous castle, for instance, would allow building a photo-realistic 3D model of the castle. This model could eventually become an attraction in itself: it may find use as a virtual scenario for computer games, or it may offer an augmented experience for visitors walking in the castle, allowing them to see through walls or fly 50 meters above the castle.

Crandall and Snavely [5] have built large 3D models of the city of Rome from hundreds of thousands of Flickr photos. If videos recorded by drones are tagged with GPS information (i.e., time, latitude, longitude, and altitude) and attitude (i.e., absolute pitch, roll, and yaw, provided by e.g. an accelerometer and compass), these models may grow far above the ground. In short, it will be like having Google-car recordings of street views, expanded to contain everything people may find interesting, from all perspectives, and free for them to share. As is often the case with emerging technology of a foundational kind, this provokes a most important discussion on how to avoid hostile use of the technology and how to protect privacy. These considerations are important for future HCI research.
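Each video frame would only need a small, self-describing metadata record to be usable in such community 3D reconstruction. The sketch below shows one possible record combining GPS position, attitude, and (optionally) the pilot's gaze fixation, serialized as JSON. The field names, units, and the example values are our own illustrative assumptions, not a proposed standard.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional, Tuple


@dataclass
class FrameTag:
    """Metadata attached to one recorded video frame."""
    timestamp_utc: str        # ISO 8601 time of capture
    latitude_deg: float
    longitude_deg: float
    altitude_m: float         # e.g., above the launch point
    pitch_deg: float          # absolute attitude from accelerometer/compass
    roll_deg: float
    yaw_deg: float
    gaze_fixation: Optional[Tuple[float, float]] = None  # normalized image coordinates

    def to_json(self) -> str:
        return json.dumps(asdict(self))


# Hypothetical frame recorded over Copenhagen, with the pilot fixating near the image centre.
tag = FrameTag("2013-04-28T10:15:30Z", 55.6761, 12.5683, 42.0,
               -3.5, 1.2, 178.0, gaze_fixation=(0.48, 0.55))
print(tag.to_json())
```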

Scenario 2: Telepresence Our second future scenario is inspired by the FlyingBuddy project [17] and the flying blimp project by Tobita et al. [15]. We have noticed that people often use videoconference systems (e.g., Skype) with a distant participant on a laptop “seated” at the table. While this works well in the meeting room, the distant participant is left behind when the group goes for coffee or lunch. Systems like DoubleRobotics.com or WowWee.com offer a solution to the problem of immobility by mounting a camera on a driving robot. However, a robot driving on a floor cannot navigate staircases and may even get into trouble when facing a doorstep. A drone carrying a smartphone with a video link to the remote participant could get around more easily, we believe, and it could keep itself at eye height with the other participants. Figure 5 is a cartoon illustration of a day at the office when drones are around.

Figure 4: Glasses to control a drone by gaze and gestures (top) and sketch of a person flying his personal drone with gaze and gestures (bottom).


Figure 5: Future teleconference scenario with a drone.

1. An architecture firm in Copenhagen has been commissioned to build a new skyscraper in downtown New York. They want to show the initial model to one of their partners in New York. They place a phone in the drone and make a call to New York.

2. The hover drone has four rotors providing stability. An app on the mobile device links up to the drone and makes remote control possible.

3. In New York the partner is controlling the drone and receiving direct feedback through the camera in it.

4. They decide to continue discussing their plans while going to lunch. As they walk-and-talk down the hall, the partner in the drone continues to follow and participate in the conversation.


Conclusion Many different disciplines will be required to achieve the visions presented in this paper. The knowledge base includes specialists in physiology, computer science, interaction design, aviation, and microelectronics. If the ideas become a full success, they will constitute a fundamentally new paradigm in human-machine interaction, as well as bring into the world an intriguing hardware/software solution that will allow us to virtually fly around, using only our bodies and line of sight to guide us.

References
[1] Alapetite, A., Hansen, J.P. and MacKenzie, I.S. 2012. Demo of gaze controlled flying. Proceedings of the 7th Nordic Conference on Human-Computer Interaction - NordiCHI 2012: Making Sense Through Design (New York, NY, USA, 2012), 773–774.

[2] Anderson, C. 2012. Makers: The New Industrial Revolution. Cornerstone Digital.

[3] Argrow, B., Weatherhead, E. and Frew, E.W. 2009. Real-Time Participant Feedback from the Symposium for Civilian Applications of Unmanned Aircraft Systems. Unmanned Aircraft Systems. (2009), 87–103.

[4] Cahillane, M., Baber, C. and Morin, C. 2012. Human Factors in UAV. Sense and Avoid in UAS: Research and Applications. 61, (2012), 119.

[5] Crandall, D. and Snavely, N. 2012. Modeling people and places with internet photo collections. Commun. ACM. 55, 6 (Jun. 2012), 52–60.

[6] Draper, M., Calhoun, G., Ruff, H., Williamson, D. and Barry, T. 2003. Manual versus speech input for unmanned aerial vehicle control station operations. Proceedings of the Human Factors and Ergonomics Society Annual Meeting (2003), 109–113.

[7] Goodrich, M.A. and Schultz, A.C. 2007. Human-robot interaction: a survey. Foundations and Trends in Human-Computer Interaction. 1, 3 (2007), 203–275.

[8] Higuchi, K., Ishiguro, Y. and Rekimoto, J. 2011. Flying eyes: free-space content creation using autonomous aerial vehicles. Extended Abstracts of the ACM Conference on Human Factors in Computing Systems - CHI 2011 (2011), 561–570.

[9] Kinect controlled drone: http://www.youtube.com/watch?v=KBBCMudnAKM. Accessed: 2012-12-28.

[10] Leap and LabVIEW controlled drone: http://www.youtube.com/watch?feature=player_embedded&v=ZoiwL0SK3jA.

[11] Mouloua, M., Gilson, R. and Hancock, P. 2003. Human-centered design of unmanned aerial vehicles. Ergonomics in Design: The Quarterly of Human Factors Applications. 11, 1 (2003), 6–11.

[12] Nonami, K., Kendoul, F., Suzuki, S., Wang, W. and Nakazawa, D. 2010. Autonomous Flying Robots: Unmanned Aerial Vehicles and Micro Aerial Vehicles. Springer.

[13] Oh, H., Won, D.Y., Huh, S.S., Shim, D.H., Tahk, M.J. and Tsourdos, A. 2011. Indoor UAV Control Using Multi-Camera Visual Feedback. Journal of Intelligent & Robotic Systems. 61, 1 (2011), 57–84.

[14] Rantanen, V., Verho, J., Lekkala, J., Tuisku, O., Surakka, V. and Vanhala, T. 2012. The effect of clicking by smiling on the accuracy of head-mounted gaze tracking. Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA 2012 (2012), 345–348.


[15] Tobita, H., Maruyama, S. and Kuzi, T. 2011. Floating avatar: telepresence system using blimps for communication and entertainment. Proceedings of the ACM Conference in Human Factors in Computing Systems - CHI 2011 (2011), 541–550.

[16] Valimont, R.B. and Chappell, S.L. 2011. Look where I’m going and go where I’m looking: Camera-up map for unmanned aerial vehicles. Proceedings of the 6th ACM/IEEE Conference on Human-Robot Interaction (HRI) (2011), 275–276.

[17] Yu, Y., He, D., Hua, W., Li, S., Qi, Y., Wang, Y. and Pan, G. 2012. FlyingBuddy2: a brain-controlled assistant for the handicapped. Proceedings of the 2012 ACM Conference on Ubiquitous Computing (New York, NY, USA, 2012), 669–670.