
Epidermal Robots: Wearable Sensors That Climb on the Skin

ARTEM DEMENTYEV∗, MIT Media Lab, USA
JAVIER HERNANDEZ, MIT Media Lab, USA
INRAK CHOI, Stanford University, Department of Mechanical Engineering, USA
SEAN FOLLMER, Stanford University, Department of Mechanical Engineering, USA
JOSEPH PARADISO, MIT Media Lab, USA

Epidermal sensing has enabled significant advancements towards the measurement and understanding of health. Most of the existing medical instruments require direct manipulation by a doctor or other expert, measure a single parameter, and/or have limited sensing coverage. In contrast, this work demonstrates the first epidermal robot with the ability to move over the surface of the skin and capture a large range of body parameters. In particular, we developed SkinBot, a 2x4x2 centimeter-size robot that moves over the skin surface with two-legged suction-based locomotion. We demonstrate three of the potential medical sensing applications, which include the measurement of body biopotentials (e.g., electrodermal activity, electrocardiography) through modified suction cups that serve as electrodes, skin imaging through a skin-facing camera that can capture skin anomalies, and inertial body motions through a 6-axis accelerometer and gyroscope that can capture changes of body posture and subtle cardiorespiratory vibrations.

CCS Concepts: • Human-centered computing → Mobile devices; • Applied computing → Consumer health; • Hardware → Sensors and actuators;

Additional Key Words and Phrases: Wearable, health, sensors, robotics, epidermal robots, skin

ACM Reference Format:
Artem Dementyev, Javier Hernandez, Inrak Choi, Sean Follmer, and Joseph Paradiso. 2018. Epidermal Robots: Wearable Sensors That Climb on the Skin. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2, 3, Article 102 (September 2018), 22 pages. https://doi.org/10.1145/3264912

1 INTRODUCTION
Robots are widely used to explore and study challenging and remote settings such as the rubble of natural disasters, the bottom of the oceans, or distant planets such as Mars. While their main advantage is to provide access to these locations, such robots also enable systematic and objective exploration. With a similar philosophy in mind but a significant difference in scale, this work leverages the advantages of such robots to explore a more intimate and constrained setting: the human body.

∗This is the corresponding author.

Authors’ addresses: Artem Dementyev, MIT Media Lab, 75 Amherst St, Cambridge, MA, USA, [email protected]; Javier Hernandez, MIT Media Lab, 75 Amherst St, Cambridge, MA, USA, [email protected]; Inrak Choi, Stanford University, Department of Mechanical Engineering, 440 Escondido Mall, Stanford, CA, USA, [email protected]; Sean Follmer, Stanford University, Department of Mechanical Engineering, 440 Escondido Mall, Stanford, CA, USA, [email protected]; Joseph Paradiso, MIT Media Lab, 75 Amherst St, Cambridge, MA, USA, [email protected].

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
© 2018 Association for Computing Machinery.
2474-9567/2018/9-ART102 $15.00
https://doi.org/10.1145/3264912


Let us consider, for instance, a person requiring medical attention in a place where medical services are not readily available. Even if the person has the necessary medical equipment, s/he would need to know how to operate it well enough to provide meaningful measurements and how to interpret the data to infer some medical assessments. In an alternative scenario, however, one or several centimeter-sized robots could be shipped to his/her location and be teleoperated by a physician who uses them to carefully examine the body. In a similar fashion to other exploratory robots, the physician could design and submit missions to systematically extract different types of information. For instance, the robots could be instructed to arrange themselves around the heart to form a 10-electrode electrocardiogram (EKG), move to the arms to monitor muscle activity, scan the body in search of skin anomalies such as moles, locally examine certain wounds, and/or provide certain treatments such as small injections. The physician could also decide to leave the robots on the body to collect longitudinal data and gain a better understanding of certain evolving changes such as the appearance of moles and tumors, as well as physiological changes associated with different emotional and health conditions. The robots would autonomously navigate using a previous 3-D scan of the body and onboard sensors.

The above scenario shows important benefits of such robots: they provide whole-body coverage; they are small and therefore easy to transport and deploy; they provide continuous and repeatable sensor measurements; they can carry sophisticated sensors such as high-resolution cameras; they can cooperate in a swarm; and they can actuate their environment. None of these benefits can be fully realized together with current technologies. Some technologies (e.g., surgical robots) can provide some of the benefits, such as whole-body coverage, but not all of them at once.

Fig. 1. SkinBot design. A) The system diagram of the main components of SkinBot (microcontroller, pumps, pressure sensors, solenoids, servos, suction cups, and geared DC motor, split between the robot and the tethered master board and connected by an air hose). B) Robot attached to the arm. C) The front image of the robot. The robot combined four linear servo motors and a DC gear motor to allow translation and rotation. D) The side view of the robot.


We believe that the robots described in the previous scenario belong to a new family of robots called Epidermal Robots; they live on the human skin to enable systematic exploration and study of the human body. To perform their tasks successfully, these robots need to meet several requirements. First, the robots need to be lightweight and small (under 80 grams and centimeter-sized according to our experiments) to minimally disrupt the person while exploring the different parts of the body. Second, epidermal robots need to have direct access to the skin. Human skin is not only the largest organ of the human body but also offers a good proxy to capture relevant information about the outer skin (e.g., appearance, texture) and inner body responses such as physiological signals (e.g., electrocardiograms, electrodermal activity). Third, the robot needs the ability to adhere and locomote over the non-uniform skin, which contains many irregularities such as wrinkles, joints, and hair. Moreover, the locomotion should be robust to different robot orientations. Fourth, epidermal robots should offer multimodal sensing. The human body contains a large range of information that requires different types of sensing modalities. To ensure the robot can successfully digitize the human body, the sensing module should contain as many sensors as possible while still satisfying the previous considerations. Finally, epidermal robots should have the ability of accurate self-localization on the body (under 18 mm of error), which is key for autonomous operation and mapping of the human body.

To meet the previous requirements, we iteratively designed and built several prototypes which are shown in Fig. 2. The latest implementation, which we call SkinBot, consists of a 2-legged suction-based locomotion system (see Fig. 1). In terms of sensors, the robot incorporates three modalities: a 6-axis accelerometer and gyroscope, a skin-facing RGB camera-based microscope, and biopotential signal pickup using conductive suction cups. The remainder of the manuscript thoroughly describes the system design of SkinBot, and a series of experiments studying relevant factors associated with skin locomotion, adhesion, localization, energy consumption, and user perception. Finally, the paper discusses some of the potential applications of SkinBot and provides some concluding remarks.

1.1 Contributions
We make the following contributions in this paper:

(1) We introduce a novel area of Epidermal Robotics.
(2) As an instance of Epidermal Robots, we develop SkinBot, a robot that is capable of moving on the human skin. The robot uses inchworm locomotion and suction cups for attachment, and has limited on-body localization capabilities. We believe it is the first robot capable of moving directly on the skin.
(3) We develop example applications of Epidermal Robots for medical sensing.
(4) We conduct multiple studies to better understand skin locomotion, power requirements, and localization.

2 PREVIOUS WORK
Currently, there is a wide gamut of wearable sensors designed to contact the skin to monitor biosignals. These devices have taken many forms, such as commercial wearables [17, 34], electronic tattoos [18], or fabric-integrated sensors [28], but they are usually designed for a specific body location and can only sense a limited number of parameters. Providing continuous and repeatable skin-contact sensing of the whole body would require a full body suit with a large number of sensors (e.g., [29]). Adding multi-sensor capabilities and/or sophisticated sensors (e.g., cameras) can make the suit heavy, power intensive, and/or expensive. Furthermore, the suit would have to be custom designed to fit the body to make reliable contact with the skin. Large devices such as some surgical robots [30] offer the possibility to scan the whole body but do not usually provide long-term continuous measurements. In addition, those devices are too large and require support infrastructure, and thus are not adequate for remote and/or home scenarios. Some physiological parameters can be continuously sensed without skin contact, for example using radio signals or cameras to sense heart rate and respiration [1, 26], but such methods usually cannot provide as much information as contact-based electrodes.


Fig. 2. The iterative design process of SkinBot showing different prototypes. A) Rovables [9], a cloth-climbing robot; the initial designs were based on the Rovables platform. B) Prototype based on a wheel-leg climbing robot design in [8]. Tracks on the outside were used to synchronize the wheels. This prototype does not have enough contact area to reliably adhere to the skin. C) Similar wheel-leg design, but with tracks on the inside. This design also does not have enough contact area. D) Prototype with sticky tracks. E) Initial suction-based robot with two servo motors that allow the suction cups to extend and contract. We found that two motors are not enough for reliable movement, as the suction cups need to be pushed down to create a reliable seal. F) Current robot design, which is explained in detail in this paper.

In terms of locomotion, many of the existing studies have devised successful climbing mechanisms through pinching [22], magnetic wheels [9, 11], suction [6, 25], or gecko-like adhesion [8, 19, 32]. However, most of the projects consider locomotion on clothes, irregular surfaces, and/or flat surfaces, which offer a different set of challenges than the skin. Robots moving on clothing (e.g., [9]) do not provide reliable skin contact. One exception is HeartLander [25], a suction-based robot with the ability to locomote over the surface of the heart and deliver injections. While heart tissue offers a different set of challenges and applications than the epidermis, its design serves as an inspiration for this work and complements the solution presented here. We are not aware of any robots that can move directly on the skin.

To a limited extent, research in the human-computer interaction (HCI) field has explored the use of robots and actuators as wearable devices. For instance, some studies have explored the idea of robots that move on clothes [9, 16, 31, 36] for haptics, interactions, and fashion applications. A different line of work has explored the use of wearable actuators in various form factors such as watches (e.g., [15, 24, 37]). In the sensor context, the Parasitic Mobility [20] concept explored how sensor nodes can attach to and jump from one human host to another. To the best of our knowledge, no previous studies have explored direct skin climbing.

3 ROBOT DESIGN
For reference and reproduction, all the design files and software can be found in an online repository1.

3.1 Skin Adhesion
Human skin has complex mechanical behavior and is elastic at small loads. In particular, its Young's modulus ranges from about 0.1 MPa to 1.1 MPa [10], with a large dependence on the test subject's age and on the mechanical model and measurement instrument [2]. In addition, the human body has some degree of curvature and features such as hair which make adhesion even more challenging. Since skin is a complex surface, we conduct in vivo experiments as much as possible. However, in some cases, such as testing specific skin curvatures, we perform experiments on artificial skin created with silicone (Ecoflex 00-30, Smooth-On)2 that has a Young's modulus3 similar to the skin.

1 https://github.com/adementyev/SkinBot
2 https://www.smooth-on.com/products/ecoflex-00-30/
3 Young's modulus is a measure of a material's stiffness, defined by the relation between stress and strain.


Fig. 3. Test of different adhesives with the skin (release force in gf over 200 adhesion attempts). Testing of two adhesives for repeated sticking and releasing on the skin. Overlaying the data are lines of linear fitting. We tested a hydrogel and a sticky gel pad. The hydrogel worked for more adhesions than the sticky gel pad before losing adhesive properties. The release forces had large variations from sample to sample. The testing was conducted with a 20N force gauge, and peak force was recorded.

We designed and built a total of six robot prototypes that considered different locomotion systems (Fig. 2). On the one hand, some adhesion approaches such as pinching the skin were not practical and were excluded from the start. On the other hand, adhesive wheels and tracks did not provide consistent adhesion force. For instance, the adhesive force of two commercial adhesives (Katecho and Premium Fixate Cell Pads, CloudValley) decreased with each peel by about half a percent (Fig. 3) while having large variations between peels. With the continuous attachment and detachment cycles required for locomotion, the adhesive force quickly degrades. In addition, the hydrogel adhesive picks up dirt, oil, and dead epithelial cells from the skin, thus requiring periodic cleaning. After considering different methods, we finally selected a suction-based approach inspired by living organisms such as leeches and cephalopods (e.g., squid, octopus).

In suction-based locomotion, a suction force appears when a lower pressure is created inside a cup and the push of the atmospheric pressure against the cup produces an adhesion force. While suction cups used in industrial applications are usually made of soft rubber, we found that rigid cups can be used with the skin. Under vacuum, the flexible skin gets pulled into the cup to seal the skin-cup interface (shown in Fig. 4C). While rigid suction cups can be quickly prototyped with a standard 3D printer, flexible suction cups require a multi-part silicone mold. We found that a bell-shaped cup worked well with the skin. The bell provides a large inside volume into which the skin can expand. The same bell design is often used in cupping therapy, an alternative medicine in which suction cups are applied to reduce pain and swelling.

The final implementation of SkinBot uses two 9mm-diameter suction cups. To make the suction cups as well as all other mechanical parts, we used a 3D printer (Form 2, Formlabs, gray resin). The 9mm suction cup provided the best size-to-adhesion-force tradeoff and is analyzed in more detail in the results section. This configuration provided a maximum of 200gf (gram-force) adhesion force, which is enough to securely hold the 20-gram SkinBot. The minimum vacuum pressure required for skin adhesion was measured to be about -10kPa. However, we pull a maximum vacuum of -30kPa using a small membrane diaphragm pump (SC3101PM, Skookum Electronic co.). This pressure was determined by the construction of the pump, specifically by the piston displacement volume. When the vacuum pump is turned off, the pressure slowly returns to atmospheric pressure due to leakage. To speed up this process, we added a solenoid valve (S070C-SDG-32, SMC Pneumatics) that vents the vacuum line. Using a diaphragm pump also has some problems, such as its loudness (55 dB from 30cm) and its large size (32x8x18mm).


Fig. 4. Suction cup design for skin attachment. A) The cross section of the suction cup CAD model. B) The vacuum pressure changes during the suction cup attachment and detachment (pump off, attached, detached). C) Snapshots of suction cup attachment to the skin (0-622 ms).

We also considered piezoelectric pumps, but current commercial models (e.g., mp5, Servoflo) only provide a maximum of -10kPa, which does not leave any safety factor for air leaks, hair, and pump variations that can make adhesion less reliable.

3.2 Skin Locomotion
We wanted to achieve locomotion with the ability to turn using a minimum number of motors, as they are one of the largest components. The selected gait was inspired by an inchworm mechanism where climbing is achieved by creating an anchor point and pushing the body away from that point. At a minimum, this motion requires one actuator to extend and contract the body. In our case, however, we used two linear servo motors to extend and contract the body to allow independent left and right side control. In particular, we used linear servo motors (SPMSA2005, Spektrum) with a 9.1mm throw, commonly used in small radio-controlled airplanes, which can pull 100gf (gram-force) and weigh around 1.8 grams. Furthermore, at least two independent anchor points are required, which can be detached and attached on demand. Thus, two independent pumps and suction cups were used to provide controllable attachment. Also, we added two of the same linear motors to move the suction cups up and down. This prevented dragging of the end effector during extension and contraction and gave the robot the ability to attach to non-uniform surfaces. Finally, we added a planetary gear motor (TGPP06-D-136, TT Motor) and a motor controller (DRV8835, Texas Instruments) to add turning ability around one of the suction cups.

One of the key challenges with skin locomotion is to ensure reliable adhesion in the new end-effector position. To address this, we added an air pressure sensor (MPXV611, NXP) in each of the vacuum lines to detect if the suction cup is attached. The pressure data was collected at 100Hz and filtered by a moving average of 20 samples to remove oscillations of the pump. The attachment was ensured by moving the suction cup down in small increments and checking for adhesion each time. The whole locomotion was controlled by a finite state machine with 6 states and was implemented on an ARM Cortex M3 microcontroller (Teensy 3.6, PJRC). As shown in Fig. 5A, the transitions of the state machine were controlled by the pressure sensors.


We conducted the testing using a tethered prototype which contained the valves, pumps, and power and control electronics on a separate master board, shown in Fig. 6. The overall system architecture is shown in Fig. 1A.
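The full firmware for this state machine runs on the Teensy and is available in the project repository; as an illustration of the pressure-gated attachment logic described above, the sketch below simulates lowering and sealing one suction cup in Python. The -20 kPa attachment and -6 kPa release thresholds come from Fig. 5; the hardware callbacks (read_pressure_kpa, move_cup_down_mm, and so on) are hypothetical placeholders, not the actual firmware API.

```python
from collections import deque

ATTACH_KPA = -20.0    # cup considered sealed once the vacuum falls below this level (Fig. 5A)
RELEASE_KPA = -6.0    # cup considered free once the line rises above this level (Fig. 5A)
STEP_MM = 1.0         # vertical increment used while probing for a seal

class PressureFilter:
    """20-sample moving average over the 100 Hz pressure stream (removes pump ripple)."""
    def __init__(self, n=20):
        self.buf = deque(maxlen=n)

    def update(self, sample_kpa):
        self.buf.append(sample_kpa)
        return sum(self.buf) / len(self.buf)

def attach_cup(read_pressure_kpa, move_cup_down_mm, pump_on, max_travel_mm=9.0):
    """Lower a suction cup in 1 mm increments until the filtered pressure
    indicates a seal, mirroring states 1 and 4 of the locomotion FSM."""
    pump_on()
    filt = PressureFilter()
    travel = 0.0
    while travel <= max_travel_mm:
        if filt.update(read_pressure_kpa()) <= ATTACH_KPA:
            return True                 # vacuum reached: cup is sealed to the skin
        move_cup_down_mm(STEP_MM)       # probe further down and check again
        travel += STEP_MM
    return False                        # ran out of servo travel without sealing

def release_cup(read_pressure_kpa, pump_off, open_vent_valve):
    """Vent the vacuum line and wait until the cup is free (states 2 and 5)."""
    pump_off()
    open_vent_valve()
    filt = PressureFilter()
    while filt.update(read_pressure_kpa()) < RELEASE_KPA:
        pass                            # wait for the line to return toward atmospheric
```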

Fig. 5. Locomotion mechanism overview. A) Finite state machine diagram for the robot locomotion. The rounded rectangles and arrows represent the states and transitions, respectively. The red numbers next to the circles indicate specific states. The state machine is controlled by the pressure sensors (PL and PR, the left and right vacuum pressures). States 1 and 4 involve reattaching the suction cups, which is done by moving the suction cup down in increments and checking the pressure each time. B) Pressure changes during the locomotion sequence, measured independently on the right and the left suction cups. The diagram also shows the corresponding states of the finite state machine on the top. C) Model representing physical locomotion: 1) the left cup moves down and starts vacuuming; 2) once the left cup is fully attached to the skin, the right cup starts to release vacuum pressure; 3) the right cup moves up and forward; 4) the right cup moves down and starts vacuuming; 5) once the right cup is attached, the left cup releases vacuum pressure; 6) the left cup moves up and forward. The geared DC motor rotates if the robot needs to change its orientation.

3.3 Multimodal Sensing
Having a robot with direct access to the skin offers a unique opportunity to capture a wide range of physiological and behavioral cues while providing repeatable and complete coverage of the body. We designed SkinBot to support three types of sensing modalities but, in the future, we envision that different sensing platforms could be added depending on the specific needs.

Skin Electrical Properties. SkinBot contains an implementation of a circuit to monitor the electrical properties of the skin. To do so, we glued stainless steel washers (ID=9.0mm, OD=12mm, thickness=2mm) around the suction cups so they could also serve as electrodes. The electrodes can be used to capture electrocardiograms (EKG) from the chest, electromyographic (EMG) signals of the muscles, and electrodermal activity (EDA) from areas of the body where eccrine sweat glands are dense (e.g., wrists, upper arm) [5]. Fig. 7 shows EMG, EKG, and EDA traces captured as the robot moved to the upper arm, the chest, and the interior part of the wrist, respectively. The detailed biopotential circuit diagram is shown in Fig. 8.

To capture biopotentials, we used an instrumentation amplifier (INA114, Analog Devices) as a front end to reject the residual 60Hz noise. A 0.16Hz high-pass filter provided DC drift rejection, and a 30Hz low-pass filter provided further 60Hz noise rejection. We used quad op-amps (MCP6044, Microchip) for voltage reference and filtering.


Fig. 6. The master circuit board image (microcontroller, display, pressure sensors, vacuum pumps, and solenoids). This board contains all the components required to run the tethered robot. The master board connected to the computer through USB for communications and programming. The pumps, solenoids, and servo motors run off a separate power supply at 3.3V.

The data was digitized at 976 Hz with 13-bit resolution and further filtered in MATLAB (MathWorks). We used digital 60 Hz and 120 Hz Butterworth notch filters to remove mains noise. A detailed circuit diagram is shown in Fig. 8. The EDA was measured with a Q-sensor (Affectiva, Inc).

Skin Imaging. SkinBot also contains a small skin-facing camera with a magnifying lens to emulate a digital dermatoscope, a tool that physicians use to examine the skin. Dermatoscopes significantly increase the melanoma detection rate in comparison to naked-eye examination [23]. The camera module is appropriate for capturing close-up snapshots of areas of interest such as birthmarks, warts, scars, irritations, scratches, and other potential anomalies. The lens provides 10x magnification and a shallow depth of field, and thus has to be constantly refocused on the uneven skin. Using the existing vertical servo motors, the robot can automatically focus by adjusting its height, or alternatively an auto-focusing camera could be used. Fig. 7G, H shows examples of a birthmark and dense hair captured with the camera. The microscope was constructed using a 5-megapixel camera (OV5647, OmniVision), a silicone lens (MPL15x, Cell Focus), and an LED light (SM3527, Bivar) delivering 30 mW. The lens and LED were attached using hot glue. The camera was tethered to a single-board computer (Raspberry Pi 3). All video was captured at 2592 x 1944 resolution and 30 fps.

Inertial Measurements. SkinBot contains an inertial measurement unit (IMU) with a 6-axis accelerometer and gyroscope (MPU6050, Invensense), sampled at 100 Hz. Depending on their body location, IMUs have been used to track different activities (e.g., typing, steps, cycling), body posture, and physiological parameters (e.g., heart rate, breathing rate). Fig. 7L shows accelerometer data captured while SkinBot was on the chest of a person to capture different body postures and subtle cardiorespiratory vibrations from which heart and breathing rates are extracted [14].
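The offline biopotential processing above (976 Hz sampling, 60 Hz and 120 Hz notch filters, and the 15-point moving average shown in Fig. 8) was done in MATLAB; the snippet below is a rough SciPy equivalent rather than the authors' script. The notch quality factor q is an assumed value, since the paper does not report one.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 976.0  # biopotential sampling rate in Hz (13-bit ADC)

def clean_biopotential(raw, q=30.0):
    """Suppress mains interference in a raw biopotential trace:
    60 Hz and 120 Hz notch filters followed by a 15-point moving average.
    The quality factor q is an assumption, not a value from the paper."""
    x = np.asarray(raw, dtype=float)
    for f0 in (60.0, 120.0):
        b, a = iirnotch(f0, q, fs=FS)
        x = filtfilt(b, a, x)                       # zero-phase filtering
    return np.convolve(x, np.ones(15) / 15.0, mode="same")

# Example on a synthetic trace: a slow 1.2 Hz component buried in 60 Hz hum.
t = np.arange(0, 5, 1 / FS)
noisy = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 60.0 * t)
cleaned = clean_biopotential(noisy)
```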

3.4 Localization
To navigate autonomously, epidermal robots require the ability to self-localize on the body. If a person is stationary in a controlled room, navigation could be provided by external sensors such as IR tracking cameras, which are very accurate. However, assuming that these robots will be carried during daily life, localization needs to be performed with the onboard sensors. This section describes some of the main steps towards providing reliable localization on the skin. Figure 9 shows an overview of the process.


Fig. 7. Health sensing applications of epidermal robots. Biopotential (top row): A) Side view of SkinBot showing the modified suction cups to monitor the electrical properties of the skin. B) Electromyographic signal measured on the upper arm when closing the hand. C) Electrocardiographic (EKG) signal measured on the chest showing the QRS complex. D) Electrodermal activity signals on the wrist in response to an auditory stimulus. Visual Imaging (middle row): E) Bottom view of SkinBot showing the camera sensing module. F) SkinBot using the camera module with a white LED for illumination. G) Camera snapshot showing a birthmark and H) snapshot showing hair. Inertial (bottom row): I) IMU board mounted on the SkinBot, J) backside of the IMU board containing a microcontroller and a radio, K) the IMU board also provides a connector to attach different modules such as an EKG module, L) changes in accelerometer data during sitting and lying down positions captured on the chest, and M) cardiorespiratory motions captured on the chest (heart rate: 77 beats per minute; breathing rate: 16 breaths per minute).

To start the localization, SkinBot needs a partial or full 3D body scan as an internal representation model. This model can be obtained with commercial 3D scanners or even mobile phones (e.g., 123D Catch, Autodesk). In our case, we used the Sense 2 hand-held scanner by 3D Systems. Since the robot is always bound to the surface (skin) of the 3D scan, we used the texture map to correlate the robot's movement to the 3D map. The texture map is an X-Y image of the unfolded 3D shape, which contains the textures for the 3D shapes and is automatically generated with the 3D scan using the color camera. Using this information, localization of SkinBot works as follows: 1) the robot's coordinates are scaled to the texture map, 2) all the texture coordinates are searched to find X-Y coordinates in the vicinity of the robot's coordinates, 3) the Euclidean distance4 between the found X-Y coordinates and the robot's coordinates is computed, and 4) the coordinate with the smallest Euclidean distance to the robot is assumed to be the robot's location.
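A minimal sketch of steps 1-4, assuming the scan's texture coordinates are available as an N x 2 array paired with their corresponding 3D vertices; the variable names and the physical extent of the texture map are illustrative, not taken from the released code.

```python
import numpy as np

def locate_on_body(robot_xy_mm, texture_uv, vertices_xyz, texture_extent_mm):
    """Map a dead-reckoned 2D robot position onto the 3D body scan.

    robot_xy_mm       : (2,) estimated position in the texture-map plane, in mm
    texture_uv        : (N, 2) texture coordinates of the scan, in [0, 1]
    vertices_xyz      : (N, 3) 3D vertices corresponding to texture_uv
    texture_extent_mm : (2,) physical size represented by the texture map
    """
    # 1) Scale the robot's coordinates into texture-map units.
    robot_uv = np.asarray(robot_xy_mm, dtype=float) / np.asarray(texture_extent_mm, dtype=float)
    # 2-3) Compute the Euclidean distance to every texture coordinate.
    distances = np.linalg.norm(np.asarray(texture_uv) - robot_uv, axis=1)
    # 4) The closest texture coordinate gives the robot's location on the scan.
    nearest = int(np.argmin(distances))
    return np.asarray(vertices_xyz)[nearest], distances[nearest]
```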

To reliably correlate the 3D map of the body with the corresponding physical space, calibration markers need to be placed on the body. In particular, we printed markers on temporary tattoo paper (A4 Laser Printer, RoryTory), which does not interfere with the robot's suction on the skin and is easily removed. The markers are 2cm in diameter. We designed a visible guide for initial manual robot placement and fiducials to recalibrate the robot's dead-reckoning position, as shown in Fig. 9 (steps 3 and 4).

4 d(x, y) = √((x1 − x2)² + (y1 − y2)²)


Fig. 8. The biopotential pickup circuit diagram. EMG and EKG had the same analog front end: an instrumentation amplifier with a gain of 10. The amplifier was followed by a high-pass filter of 0.16 Hz, which removed the DC baseline. To capture the complete signal shapes, the circuit was referenced to 1.65V (half of the supply). The digitized signal (13-bit ADC) passed through 60 Hz and 120 Hz notch filters, followed by a 15-point moving average filter for the EKG output and an envelope follower for the EMG output.

Fig. 9. Main self-localization steps: 1) scanning the body with a 3D scanner, 2) positioning body markers with a smartphone visualization aid, 3) attaching the body marker to the skin, 4) peeling the body marker to reveal the navigation marker, 5) placing the robot on the marker as a starting position, 6) using the onboard robot camera and computer vision to calibrate the robot position, 7) navigating on the 3D body map using a dead-reckoning approach and skin markers. The close-up of the marker shows the fiducials, placement guide, and proximity markers.

The fiducials are recognized with the robot's camera and custom machine vision methods (OpenCV 3.4.1) that run on the Raspberry Pi 3. To facilitate consistent marker placement, we created a custom app that overlays contour guides on the mobile phone camera. The guides were manually extracted from the 3D map of the body.
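The paper's marker recognition is custom; purely as an illustration of this step, the sketch below uses OpenCV's ArUco module (available in opencv-contrib builds) as a stand-in fiducial detector operating on the skin-facing camera frames. The marker dictionary and helper names are assumptions, not the authors' pipeline.

```python
import cv2

# Stand-in fiducial detector: the paper uses custom OpenCV 3.4.1 methods,
# while this sketch relies on the ArUco module from opencv-contrib.
DICTIONARY = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
PARAMS = cv2.aruco.DetectorParameters_create()

def find_fiducial(frame_bgr):
    """Return (marker_id, center_in_pixels) for the first marker found, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY, parameters=PARAMS)
    if ids is None or len(ids) == 0:
        return None
    center = corners[0][0].mean(axis=0)     # average of the marker's four corners
    return int(ids[0][0]), center
```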

When the robot is moving between the markers, it needs to track its position using onboard sensors. To do so, we employ a dead-reckoning approach, which is popular in many mobile robot applications [4, 13] and can be performed with the existing onboard sensors.


In particular, the method estimates the position of the robot with the following equations:

x_n = x_{n−1} + q·h·cos(θ_n)
y_n = y_{n−1} + q·h·sin(θ_n)

where x and y indicate the position in a 2D plane, h is the linear traveled distance, θ_n is the rotation angle, and q is a scaling factor used to convert the sensor data into centimeters. The linear distance is obtained by simply counting each step of the robot; the step size is known from the servo motors, as they contain an integrated encoder. The rotation angle is obtained by integrating the gyroscope x-axis rotation rate using the following equation:

θ_n = θ_{n−1} + dθ_n − c·t

where c is the gyroscope drift constant, which we measured by logging the stationary gyroscope rotation angle, t is the elapsed time, and dθ_n is the gyro rotation rate. We sample the gyroscope at 100Hz.
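A compact sketch of the dead-reckoning update, assuming the heading is kept in degrees and the per-step travel distance is derived from the servo encoder; the step length, drift constant, and sample period used here are placeholders for the device-specific values described above.

```python
import math

GYRO_DT = 0.01       # gyroscope sample period (100 Hz)
STEP_MM = 9.1        # assumed travel per gait step, folding in the scaling factor q
DRIFT_C = 0.0        # gyroscope drift constant c, measured from a stationary log

def update_heading(theta_deg, gyro_rate_dps, elapsed_s):
    """theta_n = theta_{n-1} + d(theta)_n - c*t, with the gyro x-axis rate
    integrated over one sample period and the measured drift subtracted."""
    return theta_deg + gyro_rate_dps * GYRO_DT - DRIFT_C * elapsed_s

def update_position(x_mm, y_mm, theta_deg, steps=1):
    """x_n = x_{n-1} + q*h*cos(theta_n), y_n = y_{n-1} + q*h*sin(theta_n),
    with the travelled distance h obtained by counting gait steps."""
    h = steps * STEP_MM
    heading = math.radians(theta_deg)
    return x_mm + h * math.cos(heading), y_mm + h * math.sin(heading)
```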

One of the limitations of the dead-reckoning approach is that it drifts over time, requiring occasional recalibration with a known position. In our case, we use the temporary tattoo markers from step 2, which are placed at known locations. The motion of the body greatly affects the inertial sensors, therefore the current dead-reckoning can only be performed on a stationary body.

4 EVALUATION
This section evaluates the basic properties of SkinBot such as locomotion, adhesion, and power requirements. In addition, we conduct several experiments to understand some of the unique challenges of skin locomotion such as skin stretchability, its curvature, and hair presence. We also test the dead-reckoning localization method. Finally, we conduct a user study to assess the user perception of SkinBot.

4.1 Skin Locomotion
Fig. 10 shows snapshots of the movement of SkinBot on an inclined arm. Fig. 5B shows the details of the air pressure changes during locomotion. The achieved vertical climbing speed of the robot is 31cm/min with solenoid valves and 6.3cm/min without solenoid valves. However, adding the solenoid valves increases the power consumption by 50 mW and the weight of the robot by 10g. Without the valves, the vacuum release time is around 16sec, greatly limiting the climbing speed. In particular, the robot has to wait for the pressure to return to atmospheric through air leakage, as the servo motors are not strong enough to lift an attached cup. By opening the vacuum line to atmospheric air with a 3-way solenoid valve, the release time can be significantly reduced to 0.5sec. A potential future alternative would involve using mechanical cams driven by the existing actuators to break the vacuum.

Fig. 10. Robot locomotion on the arm. SkinBot climbing on the arm during 100 seconds (snapshots at 0, 20, 40, 60, 80, and 100 sec).


Fig. 11. Skin attachment experiments. A) Maximum adhesion forces with different suction cup diameters (4-18mm), as indicated by force at release, compared with the theoretical force. B) Presence of hair on the skin reduces the maximum adhesion force. C) Diagram illustrating the vertical displacement of the skin under suction. Suction cups of 6mm and 18mm diameters are shown in the illustration. The 6mm diameter cup creates a better cup-skin seal, as it has a larger seal area.


The robot can effectively walk backward by running the horizontal servos in reverse. To change direction, the robot can rotate 30° in 20 milliseconds. Due to the possible tangling of the vacuum tubes, the rotation range has been limited to between -30° and 30° per locomotion step. By combining multiple steps, the robot can potentially rotate to any angle.

4.2 Skin Adhesion
Suction provides a strong and reliable adhesion approach to the skin. In the case of SkinBot, the adhesion strength is between 150gf and 200gf when attached with a single cup and between 300gf and 400gf when attached with the two suction cups simultaneously. This range provides enough adhesion to sustain the current weight of the robot, which is 20g. Thus, when the robot is hanging upside down, there is at least a 7.5x safety factor that can be used to account for different skin types and irregularities.

To help study adhesion, we define adhesion force as the peak force at which a suction cup pulled in the direction normal to the skin detaches from the skin. In theory, the maximum adhesion force is directly proportional to the size of the suction cup, with the governing equation:

F = P·A, with A = π(d/2)²

where F is the maximum adhesion force, P is the vacuum pressure, A is the skin contact area of the suction cup, and d is the diameter of the circular contact area. Therefore, larger suction cups have higher adhesion forces. While the theory generally agrees with the in vivo experimental data (see Fig. 11A), the suction cups over 10mm in diameter consistently had lower adhesion forces than the theoretical ones. To help further understand this effect, we 3D-printed transparent suction cups with various diameters and video recorded the skin under vacuum. After careful examination, we believe this discrepancy occurred due to two main factors.


First, suction cups over 10mm have an inadequate skin-cup seal area, as shown in Fig. 11C. In other words, while the skin is displaced by the same amount for the small and large suction cups, the displacement is spread over a larger area for the larger cups. Second, larger suction cups require a larger seal area around their diameter. When the suction cup is being slowly pulled off, there are more chances for gaps in the seal before reaching the maximum theoretical adhesion force.
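As a quick consistency check of F = P·A for the cup that SkinBot actually uses, the snippet below evaluates the theoretical force for the 9mm cup at the -30kPa operating vacuum; it lands near the roughly 200gf measured at release.

```python
import math

def theoretical_adhesion_gf(diameter_mm, vacuum_kpa):
    """F = P*A with A = pi*(d/2)^2, converted from newtons to gram-force."""
    area_m2 = math.pi * (diameter_mm / 2.0 / 1000.0) ** 2
    force_n = abs(vacuum_kpa) * 1000.0 * area_m2
    return force_n / 9.81 * 1000.0          # 1 gf corresponds to 9.81 mN

print(theoretical_adhesion_gf(9, -30))      # about 195 gf, close to the measured ~200 gf
```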

In our experiments, we noticed that the presence of hair can impair adhesion performance. To study this effect, we measured adhesion forces on skin surfaces with different amounts of hair (see Fig. 11B). For each of the experimental skin locations, hair density was manually counted using a microscope. A moderate presence of hair on the skin reduced the adhesion force by around 30% to 150 gf but still allowed attachment. Beyond that, excessive hair (above 35 hairs per cm²) prevented the suction cups from making the necessary skin-cup seal.

For all adhesion measurements, we used a 20N digital force gauge (DFS20, Nextech). All measurements were done on the forearm and repeated five times at different locations, and the mean of the five trials was reported as the result. The attachment experiments were done on a 30-year-old male. The suction cups were pulled manually and the peak force was recorded as the maximum pull-off force. For the measurements, the suction cups were 3D-printed with a custom force gauge attachment so that they could be pulled in the direction normal to the adhesion surface.

4.3 Power Consumption
Using a digital multimeter (U1252B, Agilent) and averaging currents over five-minute periods, the mean power consumption of the robot during locomotion was found to be 1221mW. A significant part of the power is consumed by the pumps (817mW) and the rest is used by the five motors (404mW). When the robot is statically attached to the skin, the power consumption of the pumps is reduced to 30mW. This significant reduction is achieved by monitoring the pressure sensors and only activating the pumps when the vacuum pressure drops under -20kPa. The pumps run at a duty cycle of 2% to 5% at -20kPa. Fig. 12 illustrates the power consumption at different pressures and shows that -20kPa provides the most vacuum pressure while consuming the least energy. The current for each pressure threshold was measured at five different locations on the forearm.
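A minimal sketch of the pressure-triggered duty cycling described above: the pump is switched on only when the monitored vacuum decays above the -20kPa threshold. The small hysteresis band is an assumption to avoid rapid toggling; the paper only specifies the threshold itself.

```python
THRESHOLD_KPA = -20.0    # reactivate the pump once the vacuum decays above this level
HYSTERESIS_KPA = 2.0     # assumed band so the pump does not toggle every sample

def pump_should_run(pressure_kpa, pump_is_on):
    """Bang-bang control that keeps an attached cup near -20 kPa."""
    if pressure_kpa > THRESHOLD_KPA:                      # vacuum has leaked away
        return True
    if pressure_kpa <= THRESHOLD_KPA - HYSTERESIS_KPA:    # enough vacuum restored
        return False
    return pump_is_on                                     # inside the band: keep state
```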

To evaluate the feasibility of an untethered version of SkinBot, we built the prototype shown in Fig. 13. In particular, the PCB that held the original tether was replaced by a custom PCB that contained all the electronics required for operation.

Fig. 12. Power consumption during the adhesion experiment. A) The power consumption of the pumps required to keep the robot attached to the skin at different vacuum pressures. The pumps were duty cycled by turning on only when the vacuum goes under a certain threshold. Overall, the power consumption is an order of magnitude lower than running the pumps continuously at 1076mW. B) Sample pressure data showing duty cycling to keep the pressure at the threshold of -25kPa.


These electronics included a 2.4 GHz radio (nRF24L01+, Nordic), an ARM-based microcontroller (ATSAMD21G, Atmel), and an IMU (MPU6050, Invensense). This version of the robot is powered by a 100mAh lithium polymer battery. The vacuum pumps were also added to the robot, but the torque from the unbalanced weight made it unreliable for continuous vertical climbing. The electronics consumed a relatively small amount of energy (28.1mW) even with 2-way radio transmission at 10Hz. Based on our measurements, the untethered robot could move continuously for around 16 minutes or remain attached to the skin for around 10 hours.
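A back-of-the-envelope check of these runtime figures, assuming a nominal 3.7V cell voltage and ideal conversion (neither is stated in the paper): the 100mAh battery stores roughly 370mWh, which at the roughly 1.25W locomotion draw is on the order of the reported 16 minutes of continuous movement.

```python
BATTERY_MAH = 100.0
CELL_VOLTAGE = 3.7                       # assumed nominal LiPo voltage
ENERGY_MWH = BATTERY_MAH * CELL_VOLTAGE  # ~370 mWh, ignoring converter losses

def runtime_minutes(average_power_mw):
    """Idealized runtime: stored energy divided by the average power draw."""
    return ENERGY_MWH / average_power_mw * 60.0

# Continuous locomotion: ~1221 mW for pumps and motors plus ~28 mW of electronics.
print(runtime_minutes(1221 + 28.1))      # ~18 min, in line with the reported ~16 minutes

# The reported ~10 h of static attachment implies an average draw of about
# 370 mWh / 10 h = 37 mW, i.e. duty-cycled pumps plus low-power electronics.
```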

Fig. 13. A) Circuit board (radio, IMU, microcontroller, antenna, 100mAh battery, pump) and B) assembled untethered prototype of SkinBot.

Fig. 14. Attachment to curved skin. A) Robot attachment to a cylindrical surface with a 2.5cm radius. Left: robot before it engages the vertical servo motor to push the suction cup down. Right: the suction cup displaces the skin by about 2.7mm, which is not enough to make a reliable cup-skin seal. B) Robot attachment to a cylindrical surface with a 12cm radius. In this case, a reliable cup-skin seal is created.


Fig. 15. Rotation of the robot due to sagging of the skin. A) Free-body force diagram of the robot in a vertical position. One foot is detached, as the robot is taking a step. B) An example scenario where the robot is rotated into a position that does not allow attachment. This is caused by sagging of the skin due to the torque (M_skin) created by the weight of the robot. C) The experimental data from three locations on the arm. The data shows the relationship between the robot rotation angle (θ_robot − θ_skin) and its weight. Linear fitting lines are shown.

4.4 Skin Curvature
The human body has many degrees of curvature which may negatively affect the locomotion of SkinBot. To facilitate the analysis of curvature, previous studies have approximated the body with spheres and ellipsoidal cylinders [7]. In this work, we simplify each of the body parts with cylinders. In particular, we use cylinders with radii from 2.5cm (wrist-size) to 12cm (torso-size) to help cover some of the main adult-sized areas in which we envision SkinBot exploring. Ideally, the suction cups should always be normal to the skin to maximize attachment. However, this may be challenging for cylinders with a small radius (i.e., a high degree of curvature), as the suction cups cannot reliably create the skin-cup seal. In our design, the suction cup is pushed towards the skin, causing indentation and allowing attachment with some degree of skin curvature. In particular, the skin can compress by 2.7 mm (SD: ±0.71) when pushed by the linear servo motor before the motor stalls. Fig. 14 shows a visualization of this process. The skin compression distance can be affected by many factors such as skin thickness and elasticity. Theoretically, a 2.7mm compression distance allows attachment to cylinders with a minimum radius of 4.4 cm, or a 15° angle between the skin and the suction cup. To confirm this, we tested the robot on silicone (Ecoflex 00-30, thickness = 3.0 mm) placed over various 3D-printed cylinders and obtained similar results. In the future, the attachment to curved surfaces could be improved by adding the ability to pivot the suction cups by at least 37° so they can attach to 2.5 cm cylindrical surfaces. Alternatively, the robot locomotion mechanism could be shrunk by a factor of 2.2, from 9.1mm to 4.1mm between the centers of the suction cups, to help circumvent smaller skin features.

4.5 Skin Sagging
Skin is a stretchable and flexible surface, and these properties can affect the robot's locomotion and orientation by creating situations in which the skin sags, rotating the robot into unfavorable orientations. Fig. 15B, for instance, shows an example in which the robot is unable to attach the suction cup to the skin. Sagging is caused by the moment on the skin created by the weight of the robot. As determined in the previous section, the robot cannot reattach if the suction cup angle is larger than 15° in relation to the skin. To understand how the weight of the robot creates skin sagging, we measured the robot's rotation angle at different moments by varying its weight. In particular, we determined that to keep the angle under 15°, the weight of the robot should be under 80g (see Fig. 15C).


As shown in Fig. 15A, the moment on the skin (M_skin) caused by the robot is defined as:

M_skin = r·F_w, with F_w = m·g, so M_skin = r·m·g

where r is the distance to the center of mass of the robot, F_w is the force due to the weight of the robot, m is the mass of the robot, and g is the gravitational acceleration. The rotational stiffness is defined as:

k = M_skin/θ, where θ = θ_robot − θ_skin

and θ is the rotational angle of the robot in relation to the skin. It follows that the rotational angle is:

θ = M_skin/k = r·m·g/k

As the above equation predicts, the experimental data shows that θ changes linearly (r² = 0.96) with the weight of the robot. The slope depends on the rotational constant k. In turn, k depends on the skin's dimensions and structure, as well as its Young's modulus; k varies for the three tested locations from 0.13 to 0.24. We used a digital force gauge and a DSLR camera (Mark IV, Canon) to record and later analyze the rotation angle.
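Rearranging θ = r·m·g/k gives the heaviest robot that stays within a given rotation limit. The sketch below assumes k is expressed in N·m/rad and r in meters; the paper does not state the units of its fitted k values (0.13 to 0.24) or the lever arm it used, so the printed number is illustrative rather than a reproduction of the 80g limit.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_robot_mass_kg(k_nm_per_rad, r_m, theta_max_rad):
    """Invert theta = r*m*g/k to find the largest mass that keeps the
    robot's rotation at or below theta_max. Units here are assumptions."""
    return k_nm_per_rad * theta_max_rad / (r_m * G)

# Example with the lowest fitted stiffness and an assumed 2 cm lever arm.
print(max_robot_mass_kg(0.13, 0.02, math.radians(15)))
```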

4.6 Localization
To evaluate on-body robot localization, we collected data from SkinBot moving horizontally on the forearm for 20 cm and repeated the process three times. As described in Section 3.4, SkinBot was initially placed on a skin marker. Gold-standard localization was obtained by adding 4 infrared reflective markers to the robot and using a camera-based motion tracking system (Flex 13, OptiTrack). Over the three repetitions, the mean error of the dead-reckoning algorithm was 4.60mm (SD: ±4.1mm). Figure 16 shows an example of localization with the onboard sensors (blue) and the motion tracking system (red). One of the main sources of discrepancy between the two measures is the angle error from the gyroscope. In particular, the robot experienced some wobbling while moving (visible in the motion tracking data), which slightly affected its heading. As expected, there was also a cumulative error as the robot moved, at a rate of 0.63mm per step. To address this, SkinBot needs to occasionally recalibrate its position using the skin markers. To do so, the robot has to find at least a small piece of a marker in the camera's field of view. As the field of view is limited to about 18x18mm, the onboard localization error should stay under 18mm. Considering the position drift, the robot will need recalibration after 21 cm of locomotion.

Fig. 16. A) Example of SkinBot localization with the dead-reckoning approach (blue) and a motion capture system (red), plotted as X distance (mm) vs. Y distance (mm). B) SkinBot with four reflective markers.


Fig. 17. The aftereffect of the suction cups on the skin. Rows correspond to the duration the suction cup was applied to the skin (5 s, 10 s, 30 s, 60 s, 10 min, 30 min), and snapshots are shown at different time intervals after the suction cup was removed (0 s, 10 s, 60 s, ...). For example, when the suction cup was applied for 5 seconds, no visible marks remained after 10 seconds.

4.7 Suction Marks
We noticed that the suction left visible marks on the skin. We investigated this undesirable effect further, as shown in Fig. 17, by recording the marks with a camera. We believe the marks were caused by fluid displacement due to pressure from the suction cup rim [21]. How long the marks remained visible depended on how long suction was applied to the skin. In one participant, the marks disappeared in under 10 seconds for 5 seconds of suction, and in under 1 minute for 10 and 30 seconds of suction. Even after 10 minutes of continuous suction, the marks disappeared in about an hour. In practice, the pumps would not operate continuously but would be duty cycled to conserve energy.

4.8 User Study
To help better understand the potential use of epidermal robots, we performed a user study in which several participants had the opportunity to experience SkinBot and provide feedback. This section describes the experimental protocol as well as the results in terms of locomotion, user perception, and use-case applications.

4.8.1 Protocol. The experiment was divided into three main parts. During the first part, SkinBot was placed on the interior forearm of the participants and remained stationary for 30 seconds. After that, the robot walked upwards for a distance of 10 cm. During the second part, the same process was repeated on the posterior forearm of the participant, which usually has a higher hair density. For these two parts, the participant remained standing to make the adhesion and locomotion more challenging. During the final part, participants were asked to complete a survey about their experience during the stationary and moving conditions of the robot. We collected data from a total of 10 participants (3 female) aged from 20 to 39 years old from MIT. The BMI of participants ranged from 20.7 to 27.6 and their hair density from 0 to 36 hairs/cm². The duration of the experiment was around 10 minutes and participation was voluntary.

4.8.2 Adhesion and Locomotion. To study the potential impact of hair density on the adhesion and locomotion of the robot, we captured several skin images of the interior and posterior forearms with an overlaying clear ruler and manually counted the number of hairs within a 1-cm square area, reporting the mean of three patches. The mean hair density on the interior and posterior forearms was 4.98 (SD: ±5.06) and 17.5 (SD: ±11.95) hairs/cm², respectively. SkinBot successfully adhered to the skin of most participants but failed to do so on the forearm of the participant with the most hair (36 hairs/cm²). In the moving condition, SkinBot was able to climb upwards 10 cm, but the number of attachment attempts varied because the robot retried whenever the pressure sensor indicated that the suction cup and the skin were not appropriately sealed. On average, SkinBot performed 11.55 (SD: ±3.44) and 17.61 (SD: ±10.6) attachment attempts on the interior and posterior forearms, respectively. The Pearson correlation coefficient between the number of attachment attempts and hair density was 0.44 (p = 0.06), suggesting that higher hair density tends to require more attachment attempts.
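For reference, this correlation can be reproduced with a standard Pearson test. The sketch below uses made-up per-participant values in the same shape as our measurements, purely to show the computation, not the study's data.

```python
from scipy.stats import pearsonr

# Illustrative per-participant values (not the study's data): hair density in
# hairs/cm^2 and the number of attachment attempts for the 10 cm climb.
hair_density = [0, 2, 4, 5, 8, 12, 17, 20, 28, 36]
attachment_attempts = [11, 10, 13, 12, 14, 15, 16, 19, 23, 30]

r, p = pearsonr(hair_density, attachment_attempts)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```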

4.8.3 Perception. After experiencing the robot over the skin, participants were asked to answer “How much do you feel the robot?”, “How uncomfortable was it?”, and “How painful was it?” on 10-point Likert scales with endpoints “1 - Not at all” and “10 - Very much.” Participants answered these questions for both the stationary and moving conditions; however, no significant differences were observed across the two conditions. The mean ratings were 5.9 (SD: ±2.2), 3.1 (SD: ±2.54), and 1.5 (SD: ±1.42), respectively, indicating that participants were able to moderately feel the robot but did not perceive it as particularly uncomfortable or painful. In addition, participants were asked to describe how SkinBot felt on their skin. Two of the participants described the experience as a ticklish sensation, especially when moving. Three participants used animal analogies to describe their experience. For instance, one participant described it as “[SkinBot] feels a bit like (I imagine) a small frog or lizard crawling up my arm,” and another one as “[SkinBot feels] similar to as if an insect walked on my skin.” Finally, two of the participants remarked that SkinBot was most noticeable at the beginning of the adhesion but that it quickly became less noticeable while remaining stationary.

4.8.4 Applications. Participants were also asked to rate how likely they would be to consider wearing the robot for different use-case purposes: (1) medical examination (e.g., physiology monitoring, skin analysis), (2) activity tracking (e.g., step counting, body posture), (3) body care (e.g., skin hydration, hair shaving), and (4) fashion (e.g., robot as a garment, applying makeup). Similarly, we used 10-point Likert-scale questions with endpoints “1 - Not at all” and “10 - Very much.” The mean ratings were 8.7 (SD: ±2.16) for medical examination, 6.6 (SD: ±2.63) for activity tracking, 7.6 (SD: ±2.27) for body care, and 6.1 (SD: ±3.03) for fashion purposes. While all four cases showed above-average support across participants, medical examination and body care were the most positively supported. Finally, participants were (optionally) encouraged to suggest other use-case applications for which they thought epidermal robots could become useful. Seven of the participants provided ideas, and a common underlying theme was using the robots to provide massages and other forms of soothing interventions (e.g., cooling and heating biofeedback). In addition, two participants proposed using the robot to provide notifications (e.g., an on-body wake-up alarm, reminders).

5 DISCUSSION

5.1 Skin Locomotion
Irrespective of the adhesion mechanism, epidermal robots need three capabilities: a sensor to detect the adhesion state, the ability to enable and disable adhesion, and at least two degrees of freedom (preferably three) for the end effectors. To control skin locomotion, these three features need to be combined with a digital state machine. Commonly explored robot locomotion methods such as wheels and tracks do not perform well on skin. Skin locomotion is difficult but possible with the use of sensors, feedback, and digital electronics.

We determined that suction is an appropriate method for skin locomotion. Suction provides enough force to hold the robot, even with a moderate presence of hair; highly dense hair would have to be shaved beforehand. Also, suction can be energy efficient, given that the pumps are duty cycled. In particular, for suction-based robots, the weight should not exceed 80 g to minimize skin sagging. The diameter of the suction cups should be under 10 mm if -10 to -30 kPa vacuum pressure is used. In addition, the distance between the suction cups should ideally be less than 4.1 mm to facilitate locomotion over most curved body surfaces. Finally, adding a thin soft rim to the rigid suction cup will aid adhesion to skin with hair.
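The attach-verify-release cycle that such a state machine encodes could look roughly like the following sketch. The state names, the seal threshold, and the controller interface are illustrative assumptions rather than SkinBot's actual firmware.

```python
from enum import Enum, auto

class Gait(Enum):
    LIFT_FRONT = auto()
    SWING_FRONT = auto()
    PRESS_FRONT = auto()
    CHECK_SEAL = auto()
    RELEASE_REAR = auto()

def next_state(state, pressure_kpa, seal_threshold_kpa=-10.0):
    """One transition of a hypothetical two-cup suction gait.

    pressure_kpa: reading from the free cup's pressure sensor (negative = vacuum).
    Returns the next gait state. A sketch of the feedback idea, not the firmware.
    """
    if state is Gait.LIFT_FRONT:
        return Gait.SWING_FRONT        # raise the free cup off the skin
    if state is Gait.SWING_FRONT:
        return Gait.PRESS_FRONT        # move the free cup one step forward
    if state is Gait.PRESS_FRONT:
        return Gait.CHECK_SEAL         # push the cup into the skin and start the pump
    if state is Gait.CHECK_SEAL:
        # re-attempt attachment until the vacuum confirms a skin-cup seal
        return Gait.RELEASE_REAR if pressure_kpa <= seal_threshold_kpa else Gait.PRESS_FRONT
    if state is Gait.RELEASE_REAR:
        return Gait.LIFT_FRONT         # vent the rear cup and repeat with roles swapped
    raise ValueError(state)
```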

5.2 Applications
In contrast to existing medical instruments, the vision of epidermal robots offers several benefits. First, the robots are small and lightweight, so they can be easily transported to any location and can move over the body without creating discomfort. Second, epidermal robots are not limited to a single location and can therefore maximize the spatial resolution of the sensor; for example, a single robot with one camera can obtain a high-resolution image of the whole body. Third, epidermal robots can incorporate different sensing modules, enabling different medical applications. For instance, a remote medical practitioner could collect different body parameters to help diagnose conditions such as Parkinson's disease based on motion data, asthma based on respiratory patterns, or other neurological disorders based on touch perception. Fourth, the robots could perform autonomous tasks to aid the remote teleoperator with routine tests. For example, before providing an injection, the robot could image different areas to find the optimal location. In addition, the robot could monitor evolving changes of the body, such as moles that may turn into melanoma or skin irritations that may indicate episodes of psoriasis. Fifth, while not explored in this work, epidermal robots could exert forces on the body as high as their adhesion force. As a result, the robot could provide contact sensing, such as measuring the stiffness of the skin, and interventions such as injections (e.g., [12]) or microsurgery. Finally, several robots could be used simultaneously to provide more sophisticated sensing; for example, one robot could serve as a transmitter and another as a receiver to provide electrical impedance tomography.

Beyond medicine, epidermal robots could be used in other settings such as activity tracking, body care, and fashion. In terms of activity tracking, the robot can move to various locations depending on different just-in-time applications. For instance, it can move to the back to monitor body posture when sitting down, move to the waist to track steps when walking, and/or move to the chest to monitor respiratory signals during sleep. In terms of body care, some versions of the robot could be used to monitor skin hydration and apply body lotion when needed, while others could detect and remove excessive hair. Finally, we believe different versions of the robot could be used in the context of fashion to make tattoos or apply makeup on the skin, or the robot itself could become a garment.

5.3 Tethered vs. Untethered Applications
While the experiments in this work have mostly been performed with a tethered version of SkinBot, we believe our results generalize to potential untethered prototypes. However, as described in Section 4.3, there are some challenges in terms of additional weight and stable balancing of the robot that still need to be addressed, especially when climbing upwards. While we envision epidermal robots becoming completely untethered, there are some scenarios in which tethered versions may be more appropriate. For instance, a tethered epidermal robot may be well suited to applications that require high-power sensing and actuation (e.g., active camera sensing) and/or very accurate localization that cannot be easily matched with the onboard sensors. The tethered prototype would be used in stationary settings (e.g., sleeping or bed-ridden patients) or by people with limited mobility (e.g., confined to a room). On the other hand, untethered versions of the robot are particularly useful whenever the whole body needs to be covered (as the tether can easily become tangled) and/or the person is ambulatory. Furthermore, wearing clothing may be more challenging with tethered robots than with untethered ones.


5.4 Localization
Accurate body localization of epidermal robots proved to be more challenging than expected. We showed that body localization can be done with the onboard inertial tracking and vision markers, but it only allowed for around 21 cm of locomotion before the position drift became significant. While one marker should be sufficient to locomote on smaller areas such as the forearm, upper arm, or upper leg, multiple markers would be required to cover larger surfaces such as the torso. In addition, inertial navigation only works on a stationary body, which requires the robot to stop locomoting whenever large motion is detected.

While we hope the onboard navigation will improve in the future, whole-body localization is still better addressed with an external setup, such as the infrared tracking system that was used in our experiments. Alternatively, epidermal robots could be teleoperated with an external camera view.

6 LIMITATIONS AND FUTURE WORK
Untethered. This work has designed and evaluated different tethered prototypes and has created a first encouraging version of an untethered one. The main challenge of the untethered prototype is the miniaturization of motors and pumps, as there are no off-the-shelf parts that can be readily used. Future work will explore creating a more robust untethered version by exploring novel actuators that can more evenly distribute the weight, such as electroactive polymers [33] and shape memory actuators [3].

Autonomy. We envision future epidermal robots to be fully autonomous, which will require the ability to self-navigate and localize. This work demonstrates how dead reckoning with onboard sensors and skin markers can provide the robot's position. In the future, we hope to increase the accuracy by combining multiple sensors such as optical flow and inertial sensors, as well as by using natural skin landmarks (e.g., scars, moles) for calibration purposes.

Utility. This work has described several applications in which epidermal robots could be useful. However, long-term and real-life deployments are needed to better assess which ones are the most promising. To do so, we plan to collaborate with medical doctors who can explore the use of the robots in different healthcare applications. In addition, we plan to explore collaborations in a variety of fields such as fashion and body care to explore applications beyond medicine.

Locomotion. This work demonstrated that skin locomotion was possible in healthy individuals in the 20 to 39-year-old range. However, the studies only considered forearm locomotion of stationary participants. Future research will need to consider different populations, body locations, and participants' movement to fully assess the generalization of our findings regarding skin climbing. In addition, the current version of the robot is not appropriate for locomotion on highly irregular body parts. To address this, we recommend a smaller design and suction cups with at least 3 degrees of freedom. Future work will explore pneumatic actuators as well as soft robotics (e.g., [27, 35]), as they can match the elasticity of the skin.

7 CONCLUSIONS
This work demonstrates the first epidermal robot with the ability to move over the surface of the skin and capture a large range of body parameters. We identified and met five critical design considerations for epidermal robots: being lightweight, being small, having access to the skin, being able to adhere and locomote, and providing multimodal sensing. We found that suction-based locomotion worked better than adhesive-based methods. The main challenge of skin climbing was the adhesion of the end effector (suction cup) to a new position; our solution used a feedback approach, with pressure sensors and servo motors, to attach to complex surfaces. The robot was able to traverse curved body surfaces with a curvature radius greater than 4.4 cm, allowing locomotion on an adult torso, hands, and legs. The suction cups provided controllable adhesion of 150-200 gf, even in the presence of hair. The power consumption was 30 mW during static adhesion and 1221 mW during locomotion, which is reasonable for an untethered robot. The robot can move at a speed of 31 cm/min and can change direction. The robots did not leave long-term skin marks and were perceived as non-invasive in a small user study with 10 participants. The robot weight of 20 grams did not cause any significant sagging of the skin. Finally, we explored a number of applications in which the robot served as a mobile on-body sensor. We look forward to a future in which swarms of epidermal robots become our intimate companions and are used to promote long-term care and maintenance of our bodies as well as to enhance our daily life.

ACKNOWLEDGMENTS
This work is supported by the MIT Media Lab Consortium.

REFERENCES
[1] Fadel Adib, Hongzi Mao, Zachary Kabelac, Dina Katabi, and Robert C Miller. 2015. Smart homes that monitor breathing and heart rate. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 837–846.
[2] PG Agache, C Monneur, JL Leveque, and J De Rigal. 1980. Mechanical properties and Young's modulus of human skin in vivo. Archives of Dermatological Research 269, 3 (1980), 221–232.
[3] Hu Bing-Shan, Wang Li-Wen, Fu Zhuang, and Zhao Yan-zheng. 2009. Bio-inspired miniature suction cups actuated by shape memory alloy. International Journal of Advanced Robotic Systems 6, 3 (2009), 29.
[4] Johann Borenstein, HR Everett, and Liqiang Feng. 1996. Navigating Mobile Robots: Systems and Techniques. AK Peters, Ltd.
[5] W. Boucsein. 2012. Electrodermal Activity. Springer US. https://books.google.com/books?id=6N6rnOEZEEoC
[6] Leoncio Briones, Paul Bustamante, and Miguel A Serna. 1994. Wall-climbing robot for inspection in nuclear power plants. In Robotics and Automation, 1994. Proceedings., 1994 IEEE International Conference on. IEEE, 1409–1414.
[7] Charles E Clauser, John T McConville, and John W Young. 1969. Weight, volume, and center of mass of segments of the human body. Technical Report. Antioch College, Yellow Springs, OH.
[8] Kathryn A Daltorio, Andrew D Horchler, Stanislav Gorb, Roy E Ritzmann, and Roger D Quinn. 2005. A small wall-walking robot with compliant, adhesive feet. In Intelligent Robots and Systems, 2005 (IROS 2005). 2005 IEEE/RSJ International Conference on. IEEE, 3648–3653.
[9] Artem Dementyev, Hsin-Liu Cindy Kao, Inrak Choi, Deborah Ajilo, Maggie Xu, Joseph A Paradiso, Chris Schmandt, and Sean Follmer. 2016. Rovables: Miniature on-body robots as mobile wearables. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, 111–120.
[10] S Diridollou, F Patat, F Gens, L Vaillant, D Black, JM Lagarde, Y Gall, and M Berson. 2000. In vivo model of the mechanical properties of the human skin under suction. Skin Research and Technology 6, 4 (2000), 214–221.
[11] Markus Eich and Thomas Vögele. 2011. Design and control of a lightweight magnetic climbing robot for vessel inspection. In Control & Automation (MED), 2011 19th Mediterranean Conference on. IEEE, 1200–1205.
[12] Kevin Fok, Nathan A Wood, and Cameron N Riviere. 2012. Improved locomotion for the HeartLander robot for injection of an anti-remodeling hydrogel. In Bioengineering Conference (NEBEC), 2012 38th Annual Northeast. IEEE, 207–208.
[13] Yasutaka Fuke and Eric Krotkov. 1996. Dead reckoning for a lunar rover on uneven terrain. In Robotics and Automation, 1996. Proceedings., 1996 IEEE International Conference on, Vol. 1. IEEE, 411–416.
[14] Javier Hernandez, Daniel J McDuff, and Rosalind W Picard. 2015. BioPhone: Physiology monitoring from peripheral smartphone motions. In Engineering in Medicine and Biology Society (EMBC), 2015 37th Annual International Conference of the IEEE. IEEE, 7180–7183.
[15] Alexandra Ion, Edward Jay Wang, and Patrick Baudisch. 2015. Skin drag displays: Dragging a physical tactor across the user's skin produces a stronger tactile stimulus than vibrotactile. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 2501–2504.
[16] Hsin-Liu Cindy Kao, Deborah Ajilo, Oksana Anilionyte, Artem Dementyev, Inrak Choi, Sean Follmer, and Chris Schmandt. 2017. Exploring interactions and perceptions of kinetic wearables. In Proceedings of the 2017 Conference on Designing Interactive Systems. ACM, 391–396.
[17] Arvid Kappas, Dennis Küster, Christina Basedow, and Pasquale Dente. 2013. A validation study of the Affectiva Q-Sensor in different social laboratory situations. In 53rd Annual Meeting of the Society for Psychophysiological Research, Florence, Italy.
[18] Dae-Hyeong Kim, Nanshu Lu, Rui Ma, Yun-Soung Kim, Rak-Hwan Kim, Shuodao Wang, Jian Wu, Sang Min Won, Hu Tao, Ahmad Islam, et al. 2011. Epidermal electronics. Science 333, 6044 (2011), 838–843.
[19] Sangbae Kim, Matthew Spenko, Salomon Trujillo, Barrett Heyneman, Daniel Santos, and Mark R Cutkosky. 2008. Smooth vertical surface climbing with directional adhesion. IEEE Transactions on Robotics 24, 1 (2008), 65–74.
[20] Mathew Laibowitz and Joseph A Paradiso. 2005. Parasitic mobility for pervasive sensor networks. In Pervasive Computing. Springer, 255–278.
[21] Y Lanir, S Dikstein, A Hartzshtark, and V Manny. 1990. In-vivo indentation of human skin. Journal of Biomechanical Engineering 112, 1 (1990), 63–69.
[22] Yuanyuan Liu, Xinyu Wu, Huihuan Qian, Duan Zheng, Jianquan Sun, and Yangsheng Xu. 2012. System and design of ClothBot: A robot for flexible clothes climbing. In Robotics and Automation (ICRA), 2012 IEEE International Conference on. IEEE, 1200–1205.
[23] Henrik Lorentzen, Kaare Weismann, Carsten Sand Petersen, Frederik Grønhøj Larsen, Lena Secher, and Vera Skødt. 1999. Clinical and dermatoscopic diagnosis of malignant melanoma: assessed by expert and non-expert groups. Acta Dermato-Venereologica 79, 4 (1999).
[24] Ken Nakagaki, Artem Dementyev, Sean Follmer, Joseph A Paradiso, and Hiroshi Ishii. 2016. ChainFORM: A linear integrated modular hardware system for shape changing interfaces. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, 87–96.
[25] NA Patronik, MA Zenati, and CN Riviere. 2005. Preliminary evaluation of a mobile robotic device for navigation and intervention on the beating heart. Computer Aided Surgery 10, 4 (2005), 225–232.
[26] Ming-Zher Poh, Daniel J McDuff, and Rosalind W Picard. 2010. Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Optics Express 18, 10 (2010), 10762–10774.
[27] Panagiotis Polygerinos, Zheng Wang, Kevin C Galloway, Robert J Wood, and Conor J Walsh. 2015. Soft robotic glove for combined assistance and at-home rehabilitation. Robotics and Autonomous Systems 73 (2015), 135–143.
[28] Ivan Poupyrev, Nan-Wei Gong, Shiho Fukuhara, Mustafa Emre Karagozler, Carsten Schwesig, and Karen E Robinson. 2016. Project Jacquard: Interactive digital textiles at scale. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, 4216–4227.
[29] Daniel Roetenberg, Henk Luinge, and Per Slycke. 2009. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors. Xsens Motion Technologies BV, Tech. Rep. 1 (2009).
[30] Jacob Rosen, Mitchell Lum, Mika Sinanan, and Blake Hannaford. 2011. Raven: Developing a surgical robot from a concept to a transatlantic teleoperation experiment. In Surgical Robotics. Springer, 159–197.
[31] Tamami Saga, Nagisa Munekata, and Tetsuo Ono. 2014. Daily support robots that move on the body. In Proceedings of the Second International Conference on Human-Agent Interaction. ACM, 29–34.
[32] Giuseppe Tortora, Paul Glass, Nathan Wood, Burak Aksak, Arianna Menciassi, Metin Sitti, and Cameron Riviere. 2012. Investigation of bioinspired gecko fibers to improve adhesion of HeartLander surgical robot. In Engineering in Medicine and Biology Society (EMBC), 2012 Annual International Conference of the IEEE. IEEE, 908–911.
[33] Francesca Tramacere, Lucia Beccai, Fabio Mattioli, Edoardo Sinibaldi, and Barbara Mazzolai. 2012. Artificial adhesion mechanisms inspired by octopus suckers. In Robotics and Automation (ICRA), 2012 IEEE International Conference on. IEEE, 3846–3851.
[34] Matthew P Wallen, Sjaan R Gomersall, Shelley E Keating, Ulrik Wisløff, and Jeff S Coombes. 2016. Accuracy of heart rate watches: Implications for weight management. PLoS One 11, 5 (2016), e0154420.
[35] Michael Wehner, Ryan L Truby, Daniel J Fitzgerald, Bobak Mosadegh, George M Whitesides, Jennifer A Lewis, and Robert J Wood. 2016. An integrated design and fabrication strategy for entirely soft, autonomous robots. Nature 536, 7617 (2016), 451.
[36] Adam Whiton. 2013. Sartorial Robotics: Electronic-textiles and Fiber-electronics for Social Soft-architecture Robotics. Ph.D. Dissertation. Massachusetts Institute of Technology, Cambridge, MA.
[37] Lining Yao, Ryuma Niiyama, Jifei Ou, Sean Follmer, Clark Della Silva, and Hiroshi Ishii. 2013. PneUI: Pneumatically actuated soft composite materials for shape changing interfaces. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology. ACM, 13–22.

Received February 2018; revised May 2018; accepted September 2018
