panavi chi2012 presentation


TRANSCRIPT

Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts

Daisuke Uriu, Mizuki Namai, Satoru Tokuhisa, Ryo Kashiwagi, Masahiko Inami, Naohito Okude

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

cooking/recipe medium

Julia Child, “The French Chef” (1963-1973) - the transition of recipe media from “text” to “video” -

Her cooking show, which translated traditional print and verbal recipes to television, was a great hit with domestic audiences in the US.


cooking/recipe medium

Cooking and recipe media have entered a transitional period with regard to HCI.

Silver Spoon (text-based recipe book)

The French Chef (TV show)

Vita Craft IHIQ (automatic cooking system, product)

Digital Thermometer Frying Pan (product)

Personal Trainer: Cooking by NINTENDO (product)


concept: title of this paper

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts


concept: design overview

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts

display: digital recipe application

sensors-embedded frying pan, wirelessly connected with the display

programmed with the ways of a professional chef's cooking

video
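To make the overview concrete, here is a minimal sketch (not the authors' actual implementation) of how a display application might interpret readings sent wirelessly from the pan. The message format, step names, and temperature ranges are assumptions made for illustration only.

```python
import json

# Hypothetical target states programmed from a professional chef's procedure.
# Each step names a pan-temperature range (degrees C) and the advice to show.
RECIPE_STEPS = [
    {"name": "saute pancetta", "temp_range": (140, 170), "hint": "keep a medium heat"},
    {"name": "finish egg sauce", "temp_range": (60, 75), "hint": "low heat so the egg does not scramble"},
]

def feedback(step_index, reading):
    """Compare one sensor reading from the pan against the current step's target."""
    low, high = RECIPE_STEPS[step_index]["temp_range"]
    temp = reading["temperature_c"]
    if temp < low:
        return "raise the heat"
    if temp > high:
        return "lower the heat"
    return "good: " + RECIPE_STEPS[step_index]["hint"]

# Example message as it might arrive from the pan over the wireless link.
message = json.loads('{"temperature_c": 182.5, "tilt_deg": 3.0}')
print(feedback(0, message))  # -> "lower the heat"
```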

research contribution


uncertain environment

This research attempts to obtain ways of designing computer-mediated artifacts that support uncertain domestic environments: kitchens.

fire / water

uncertain domestic environment

All activities in kitchens are based on situated actions; activities in kitchens cannot be planned as a static model.

Users must manage unexpected troubles.


Introduction and Motivation Cooking advanced recipes can be complicated, and requires following detailed instructions to achieve the desired result. Textual instructions like paper recipes are often difficult to comprehend; they use technical terms or instructions that require prior experience to be understood (“How much is a handful of chopped basil?” or “How brown are caramelized onions?”). These types of information can be transferred more easily using images or videos, just as cooking shows do. However, cooking shows are not designed to be followed in real-time. Although the increasing audience of cooking shows in European television [4] indicates that there is a growing interest in this format, no participant of our preliminary studies followed a recipe while watching a cooking show. They rather liked the inspiring aspect of the visual aesthetics. Typically, they would follow textual instructions when cooking, or not use any kind of instructions at all.

The final outcome of a recipe depends on many factors and can fail at any point during the entire process. This can be demanding and stressful for beginners, perhaps discouraging them. We are developing PersonalChef, an interactive kitchen counter (see Fig.1), to avoid disappointing hobby chefs when preparing unfamiliar recipes and to increase their confidence. Related Work Several systems have been developed to bring technical innovations into the domestic kitchen environment.

Cooking Navi [7] considers cooking as a time optimization problem when preparing a menu consisting of multiple recipes, and tries to optimize single cooking processes in order to have all dishes finished at the right time. Semantic Cookbook [10] and Kitchen of the Future [11] installed various electronic devices into the kitchen to record and share cooking sessions. Living Cookbook [12] offers these possibilities as well, but it further focuses on the social experience of cooking and collaboration. eyeCOOK [3] and Smart Kitchen [8] focus on the actual user input. They try to facilitate active user input or reduce it by tracking user’s actions in the kitchen using eye-gaze, speech commands, and foot switches. CounterActive [9] integrates step-by-step cooking instructions using multimedia invisibly into the kitchen counter. Besides research prototypes, there are also commercial systems. Nintendo’s personal cooking instructor (nintendo.com) on the small-screen NintendoDS device is an interactive cookbook and provides live cooking demonstrations.
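As a rough illustration of the "cooking as a time optimization problem" idea mentioned at the start of the paragraph above (a naive sketch, not Cooking Navi's actual method), the following back-schedules each dish's steps so that every dish finishes at the same target time; the menus and durations are invented.

```python
# Naive backward scheduling: start each dish just late enough that all dishes
# finish together at `finish_time` (minutes from now).
menus = {
    "pasta": [("boil water", 10), ("cook spaghetti", 9), ("toss with sauce", 2)],
    "salad": [("wash greens", 3), ("dress and plate", 2)],
}
finish_time = 60

for dish, steps in menus.items():
    start = finish_time - sum(duration for _, duration in steps)
    t = start
    for step, duration in steps:
        print(f"t+{t:>2} min  {dish}: {step} ({duration} min)")
        t += duration
```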

While other work often focused on efficiency [1,5], our system aims to support the entertaining and social

Figure 1. PersonalChef consists of two modules: counter screen and stove screen.

CHI 2010: Work-in-Progress (Spotlight on Posters Days 1 & 2) April 12–13, 2010, Atlanta, GA, USA


Figure 1. HeatSink illuminates the stream of water according to its temperature, becoming red when hot and blue when cold.

SeeSink One impediment to automatic faucets in bathroom and kitchen sinks is the lack of control over temperature and flow. Nevertheless, their simple application of automation conserves water and promotes hygiene by allowing for 'hands-free' operation and automatically shutting off. The infrared technology used in these systems has limited sensing ability and cannot account for the variety of scenarios possible in a sink. SeeSink is a project that seeks to combine the advantages of faucet automation with context-aware sensing and actuation to be useful in kitchen sinks. A CCD camera mounted to the faucet serves to interpret a variety of tasks and provide the proper flow and temperature of water automatically (See Figure 2). When a user presents his hands, the sink dispenses warm water for washing. When a user presents vegetables, the sink dispenses cold water. A pasta pot calls for filling with cold water, whereas a dish sponge indicates the need for hot water to wash the dishes. A PC interprets the video stream using Microsoft Vision SDK and dispenses the proper temperature and flow of water through an instantaneous heater and pumps. In order to communicate the temperature of the water to a user, a version of HeatSink is installed in the faucet that colors the water according to its temperature. Because of the multitude of possible scenarios for such a system, a provision exists in the software for training new tasks such as setting custom hand-washing temperatures. SeeSink helps to make the sink more functional while improving hygiene and water consumption.
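A toy version of the decision step described above may help: the camera-based recognition is replaced by a given label, and the temperatures and flows are invented, so this is only a sketch of the rule mapping, not SeeSink's code.

```python
# Map a recognized object at the faucet to a water temperature and flow.
WATER_RULES = {
    "hands":       {"temp_c": 38, "flow": "medium"},  # warm water for washing
    "vegetables":  {"temp_c": 15, "flow": "gentle"},  # cold rinse
    "pasta_pot":   {"temp_c": 15, "flow": "full"},    # fill with cold water
    "dish_sponge": {"temp_c": 50, "flow": "medium"},  # hot water for dishes
}

def dispense(recognized_object):
    # Fall back to a safe lukewarm trickle for unrecognized scenes.
    return WATER_RULES.get(recognized_object, {"temp_c": 30, "flow": "low"})

print(dispense("pasta_pot"))  # {'temp_c': 15, 'flow': 'full'}
```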

CleanSink Whereas SeeSink is optimized for situations where water conservation can be effected at the point of use, there exist conditions where conservation of water needs to be discouraged in order to promote thorough hand-washing. Hospitals, restaurants and industrial clean rooms often need to install invasive systems that monitor hand-washing compliance so that non-compliance can be punished.

Figure 2. SeeSink can interpret a variety of tasks being performed by the user to provide useful hands-free control of water temperature and flow.

Unfortunately these systems do not directly prevent non-compliance (which is estimated at 50% in hospitals, for example [30]). Dirty hands are the primary cause for infection, and certainly very easy to prevent [3].

CleanSink seeks to motivate critical behavior change by augmenting the role of the sink as part of the larger context in which hand-washing compliance is necessary. Several versions of the system have been prototyped. In its most basic form, the CCD camera used for SeeSink confirms the presence of hands under the stream of water and HeatSink provides a subtle prompt by flashing illumination in the stream of water when sufficient time has passed. In a more typical setting, the same system is combined with an RFID reader that logs the identity and compliance to a central database (See Figure 3). More persuasive techniques allow the sink to connect with automation in the space around the sink. In a medical examination room scenario, CleanSink was connected to a relay that controls the room lights and so that they only brighten once the staff washes their hands. For an industrial clean room, on the other hand, we have prototyped an electric door lock that impedes access until hands are clean. Used in combination, these techniques can directly impact hand-washing compliance at the point of use with broad impact on health and safety.
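The compliance logic lends itself to a small sketch as well; the 20-second threshold, the sample format, and the RFID identifier are assumptions, and the actual CleanSink prototypes are hardware installations rather than a script like this.

```python
WASH_SECONDS = 20  # assumed minimum washing time before the prompt clears

def monitor_handwash(samples, user_id, log):
    """samples: (seconds, hands_present) pairs from the camera over the sink.
    Returns True once hands have been under the water long enough, logging the
    event against the user's RFID identity (e.g., to unlock a door or relay)."""
    start = None
    for t, present in samples:
        if present and start is None:
            start = t
        elif not present:
            start = None
        if start is not None and t - start >= WASH_SECONDS:
            log.append({"user": user_id, "compliant_at": t})
            return True
    return False

log = []
samples = [(s, True) for s in range(25)]  # hands present for 25 seconds
print(monitor_handwash(samples, "staff-042", log), log)
```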

Figure 3. CleanSink showing indicator (left), RFID reader (middle), and sink (right).

CHI 2005 · PAPERS: Technology in the Home · April 2–7 · Portland, Oregon, USA


research through design

This paper follows “Research through Design.”

How we evaluate whether our work could contribute to the HCI community: process, invention, relevance, and extensibility.

related studies


traditional text based recipes

Traditional media for recipes describe steps toward a perfect completion of cooking.

Silver Spoon: a very famous recipe book

SPAGHETTI ALLA CARBONARA (SPAGHETTI WITH CARBONARA SAUCE)

Ingredients for 1 person. Preparation time: 15' (preparation: 10', cooking: 5')

14 oz (100 g) spaghetti; 1.5 oz (45 g) pancetta; 1 egg; 1 egg yolk; 1 oz (100 g) Pecorino cheese; 1 tsp (5 ml) extra virgin olive oil; salt; pepper

Method: Prepare the egg sauce, beating the egg and the egg yolk with Pecorino cheese and a little pepper in a bowl. Cut the pancetta into small strips. Place a large skillet onto a medium heat and slowly saute the pancetta. Cook the spaghetti in a pot of lightly salted boiling water and drain when al dente. Turn the pasta into the skillet with the pancetta, toss and turn off the heat. Add the egg sauce and a little of the cooking water. Stir for 30 seconds or so with a little heat. Turn off the heat, stir again, and serve immediately.

But it is usually difficult for non-professional users to reproduce the taste of meals.


“planned” recipe medium

The problem with the “planned” traditional recipe is close to Lucy Suchman's notion of “situated actions.”


She revealed that programmed instructions do not support situated actions: the errors and problems users actually face are not anticipated by the system.

The “big green button” came from her research: ethnographic research on copy machines.

“planned” text recipe


“situated” cooking support

Cooking support systems corresponding to users' situated actions have already been considered.

Bradbury et al.: an interactive cookbook (eyeCOOK) supporting situated actions, using eye tracking and speech recognition.

NATURAL INPUTS

eyeCOOK is designed specifically to use natural input modalities: those that humans use in human to human, non mediated communication [5]. To reduce the need for users to provide explicit input, or change their behavior to accommodate interface constraints, implicitly provided attentional cues are observed and interpreted. We believe that this approach improves the learnability and intuitiveness of interfaces designed for novice users.

Voice Commands

eyeCOOK uses context-sensitive, localized grammars. This allows more synonyms for a given speech recognition command, reducing the chance of misinterpreting a word.

Eye Gaze Commands

When the user is in range of the eye tracker, eyeCOOK substitutes the object of the user’s gaze for the word ‘this’ in a speech command. For example, ‘Define this’ will trigger the define operation on the current eye gaze target. Since current eye trackers are spatially fixed and offer limited mobility to users, the user will not always be in a location where eye tracker input is available. Our speech grammar is designed so that system functionality is not reduced when using speech without eye tracking. This is accomplished by the user stating “define boil”, instead of saying “define this” while looking at the word boil.

TOWARDS AN ATTENTIVE KITCHEN

Interfaces that recognize and respond to user attention, and understand how it relates to the overall task within the kitchen environment, can help the user efficiently engage in her/his task. To achieve this, we must increase sensing capability [3], improve coordination among appliances [5], and give appliances the ability to affect the environment [3,5].

Environmental Sensors

Temperature sensors used to keep track of the status of the oven and the elements of the stove can increase the system’s ability to guide the user’s cooking experience and could be synchronized with electronic timers.

Appliance Information Integration

Integrating knowledge of the environment can result in improved functionality, taking up less of the user’s time and effort. For example, user recipe preferences, timing constraints, as determined by the user’s electronic schedule, and currently available ingredients, communicated by food storage areas, can be combined to suggest recipes. As well, selecting a recipe can result in the addition of necessary ingredients to an electronic shopping list stored on the user’s Personal Data Assistant (PDA).

Active Environmental Actions

The kitchen should not only be aware of its environment, but it should also be able to affect it. Thus, it should be able to take actions which increase efficiency, and reduce the user’s action load, like automatically preheating an oven.

CONCLUSIONS

We have presented eyeCOOK, a gaze and speech enabled multimodal Attentive User Interface. We have also presented our vision of an Attentive Kitchen in which appliances, informed by sensors, coordinate their behavior, and have the capability to affect the environment. This can reduce the user’s workload, and permit rationalizing requests for user attention.

REFERENCES

1. Ju, W. et al. (2001). CounterActive: An Interactive Cookbook for the Kitchen Counter. Extended Abstracts of CHI 2001 (Seattle, April 2001) pp. 269-270

2. Norman, D. A. The Invisible Computer, MIT press, 1999

3. Schmidt, A. et al. How to Build Smart Appliances. IEEE Personal Communications 8(4), August 2001. pp. 66-71.

4. Selker, T., et al. (2000). Context-Aware Design and Interaction in Computer Systems. IBM Systems Journal (39) 3&4, pp. 880-891.

5. Shell, J. et al. Interacting with Groups of Computers. Commun. ACM 46(3) March, 2003.

6. Tran, Q., et al. (2002). Cook’s Collage: Two Exploratory Designs. Position paper for Families Workshop at CHI 2002. Minneapolis, MN.

Figure 1. eyeCOOK in Page Display Mode

Interactive & Student Posters: Computers Everywhere CHI 2003: NEW HORIZONS

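The gaze-for-"this" substitution described in the eyeCOOK excerpt above can be sketched in a few lines; the function name and the idea of receiving the gaze target as a plain string are illustrative assumptions, not eyeCOOK's actual interfaces.

```python
from typing import Optional

def resolve_command(spoken: str, gaze_target: Optional[str]) -> str:
    """Substitute the gazed-at word for 'this' in a spoken command, falling
    back to the speech-only form when no eye-tracker input is available."""
    if gaze_target is not None and "this" in spoken.split():
        return spoken.replace("this", gaze_target)
    return spoken

print(resolve_command("define this", gaze_target="boil"))  # -> "define boil"
print(resolve_command("define boil", gaze_target=None))    # -> "define boil"
```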

Olivier et al.: a situated coaching system (the Ambient Kitchen) integrating projection systems, RFID, accelerometers, and under-floor pressure sensing technologies.


Figure 1. The Ambient Kitchen is a lab-based high fidelity pervasive computing prototyping environment. The kitchen is situated in the main research space in Culture Lab, Newcastle University and uses modified IKEA units and standard laminate flooring installed within a wooden structure (see top figures). Significant care was taken that the underlying technology is not apparent to the people using the kitchen – even the wall projection is achieved using “up-lighter” style projection onto mirrors below the overhead cabinets.

Figure 2. Sample off-the-shelf and custom technologies integrated in the Ambient Kitchen: (a) 4 u DLP projectors for situated displays; (b) 4 u Micaz zigbee motes and sensor boards for object motion sensing; (c) 200 x custom capacitive sensors for floor pressure measurement; (d) 8 u Feig long range RFID readers (and sample tag).

Figure 3. Using an RFID tagged control object the state of all the Ambient Kitchen sensors can be examined on the main display: floor pressure map (left); accelerometers (center-top); RFID (center-bottom); and video feeds (right).

4. CASE STUDIES The Ambient Kitchen has been developed to support a range of research activities around the problem of providing situated support for people with dementia, and situated services associated with food planning, preparation and cooking. As a high fidelity prototyping environment it allows us to support these activities in a number of different ways, as an experimental space for designers, for explaining new technologies to users, for collecting sensor data in benchmark development for activity recognition algorithms, and for the evaluation of complete solutions in a naturalistic setting.

Figure 4. A current design scenario in which the pages of a cookbook have integrated RFID tags. The workbench can detect the current page and adapts the ambient display according to the page's contents.

4.1 A design tool for designers Developing design ideas for pervasive computing applications usually requires a significant effort on the part of designers to imagine what interacting in a fully instrumented environment might be like. In our kitchen scenario, there are no keyboards, mice or conventional input devices, and the ability to physically


explore different configurations of sensor-dependent display behavior significantly helps in the exploration and crafting of design ideas. Figure 4 shows a prototype under development in which the ambient displays (on the wall above the main worktop) respond to the turning of the pages of a specially designed cookbook. The cookbook has an RFID tag embedded in each page allowing the kitchen to detect the page that the book is currently open at. Responding to this, the ambient displays present relevant food information and even personal media related to the recipe and past times in the cook’s life when the corresponding meal was prepared.

4.2 A design tool for users Another significant challenge in designing pervasive computing applications is the involvement of users in the design process. We have conducted a number of participatory design exercises involving older users and found that a significant barrier to exploring design concepts is adequately explaining the scope of the technologies involved. By demonstrating simple mappings between sensors and display in demonstration applications in the Ambient Kitchen we have found that we can greatly improve lay users' understanding of the potential functionality. For example, figure 5 shows one such commonly used application in which sample recipes are projected in response to the ingredients placed on the bench. Traffic light indicators on the display also show which of the recipe's ingredients are in the kitchen's cupboards. Though a simple demonstration of how sensor data can be mapped to information sources (and then to information displays), our experience is that such illustrations can help users think about both more mundane and adventurous (and useful) applications of such “invisible” technologies.

Figure 5. The Ambient Kitchen has been used to facilitate discussions and focus groups on the topic of pervasive computing as part of a wider participatory design process looking at ICT and nutrition for older people.

4.3 An observatory to collect sensor data Realizing situated services that are responsive to both actions and intentions requires significant development of activity recognition algorithms themselves. As such, multi-sensor benchmarks of everyday activities are not widely available, and as part of our own research, and to support the research of collaborators and the wider research community, we are

developing a number of such benchmarks by capturing data for multiple subjects and activities, and hand annotating these datasets. Activities range from gaze data for head pose tracking algorithm development, primarily using video streams alone (see figure 6), to naturalistic data sets relating to multi-step food preparation for which RFID, accelerometer, pressure and video data is collected and hand annotated.

4.4 An evaluation test bed Evaluation means different things to different people. For an engineer the question is "does it work?" That is, are the functional requirements met; does it complete certain tests accurately and without failing. For the human factors engineer the question is "does it perform a useful function in the context in which it is intended to be used?" The latter question can only really be answered by installing the technology in a range of real contexts. In the case of kitchen technologies, this means real lived-in homes. Constraints such as household routines and the different uses that different members of a family make of different rooms at different times can be critical in the success or failure of home technologies. These constraints are only really apparent in the context of real home use. Laboratory-based facilities such as the Ambient Kitchen are thus of limited use in this respect. However it is possible to use them to do more limited evaluations of functional requirements that still have face validity.

Figure 6. Simultaneously captured data from the embedded cameras allows us to develop a benchmark for attention detection algorithms based on tracking head pose and position from multiple viewpoints.

We have yet to complete any evaluations of this kind but we are planning to do so. For example, we will use actors trained using video recordings of people with dementia carrying out simple kitchen tasks such as making a hot drink. These recordings were made of people doing these tasks, which were of their own choosing, in their own kitchens. The intention is to make initial tests of algorithms to detect when these people need prompting because they have made an error or stopped in the middle of the task. It would be disorienting and therefore unrealistic to bring people with dementia into the lab, but these existing videos can be used to configure the Ambient Kitchen to match the kitchen in the video and then to get an actor to perform the sequence of actions taken. Thus the teabags might be in a container on the work surface,
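As a minimal sketch of the RFID-cookbook behavior described in the excerpt above (tag identifiers, recipes, and the cupboard inventory are invented for illustration, not the Ambient Kitchen's real data), a page-turn event could be mapped to display content and a traffic-light check of available ingredients roughly like this:

```python
PAGE_TAGS = {
    "tag:0xA1": {"recipe": "carbonara", "ingredients": ["spaghetti", "pancetta", "egg", "pecorino"]},
    "tag:0xA2": {"recipe": "fish pie", "ingredients": ["fish", "potato", "milk"]},
}
CUPBOARD = {"spaghetti", "egg", "pecorino", "potato"}  # what the kitchen reports as in stock

def on_page_turn(tag_id):
    """Map a detected RFID page tag to the content the ambient display should show."""
    page = PAGE_TAGS.get(tag_id)
    if page is None:
        return {"show": "ambient default"}
    # Traffic-light style indicator: which of the recipe's ingredients are at hand.
    status = {i: ("green" if i in CUPBOARD else "red") for i in page["ingredients"]}
    return {"show": page["recipe"], "ingredient_status": status}

print(on_page_turn("tag:0xA1"))
```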

Chi et al.: a kitchen providing real-time feedback, tracking the number of calories in food ingredients.

Enabling Nutrition-Aware Cooking in a Smart Kitchen

Abstract We present a smart kitchen that can enhance the traditional meal preparation and cooking process by raising awareness of the nutrition facts in food ingredients that go into a meal. The goal is to promote healthy cooking. Our smart kitchen is augmented with sensors to detect cooking activities and provides digital feedbacks to users about nutritional information on the used food ingredients. We have created a preliminary prototype for evaluation, and the result is promising.

Keywords Context-Aware Computing, Ubiquitous Computing / Smart Environments, Home Computing, Interaction Design, Kitchen, Nutrition

ACM Classification Keywords H5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous.

Introduction A kitchen at home is a place to prepare and cook meals for a whole family. Food preparation can be considered as an act of caring for family members. Through cooking healthy foods for their beloved family members, they receive self-satisfaction in promoting health and reduce risks of chronic diseases in the family. For example, if a

Copyright is held by the author/owner(s). CHI 2007, April 28–May 3, 2007, San Jose, California, USA. ACM 978-1-59593-642-4/07/0004.

Pei-yu (Peggy) Chi, Jen-hao Chen, Hao-hua Chu, and Bing-Yu Chen. National Taiwan University, 1, Sec. 4, Roosevelt Rd., Taipei 106, Taiwan.

figure 1. Smart Kitchen promotes healthy cooking awareness and cooking interaction to the cook (components labeled: LCD display, smart cabinet, smart stove, smart counter).


To simulate food ingredient recognition, we currently rely on a human observer to manually type in the name of the food ingredient appearing in the system for the first time. When the system detects new weight increase on the counter, the image of new ingredient is taken from an overhead camera and then shown to the human observer for input (shown in Figure 4). We are currently testing other possible methods for automated or manual food ingredient recognition: (1) a voice dialog-system, (2) RFID tags/reader, (3) computer vision recognition.

After the sensor infrastructure detects low-level cooking elements and weights, our system can apply an inference rule engine to infer high level food ingredient transfer activities described as follows.

Food Ingredient Transfer Recognition This involves tracking the path of each food ingredient from a starting container (e.g., taken from a fridge and placed on the smart counter) to an ending container holding the final cooked meal. This transfer path is made up of a sequence of ingredient transfer events from one container to another container. To track this path, a weight matching algorithm [2] from our previous work is used. That is, the weight sensors underneath the counter, cabinet, and stove can detect weight changes. By matching the equal amount of the weight decrease (e.g., a food container) and the weight increase (e.g., the food mixer pan), we can infer an ingredient transfer from the food container to the food mixer pan. Using a public nutritional database that provides nutrition values for each food ingredient type [10], our system can then calculate and display overall nutrition facts in each container.

figure 5. User interface of our smart kitchen system. We display nutrition facts about latest ingredients on the left hand side, and overall ingredient information on the right hand side.
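A simplified sketch of the weight-matching step described above (the tolerance, event format, and container names are assumptions; the cited algorithm [2] is more involved):

```python
TOLERANCE_G = 5.0  # assumed tolerance when pairing a weight loss with a gain

def match_transfers(weight_events):
    """weight_events: (container, delta_grams) changes observed in one time window.
    A decrease on one scale that matches an increase on another (within the
    tolerance) is interpreted as an ingredient transfer between containers."""
    decreases = [(c, -d) for c, d in weight_events if d < 0]
    increases = [(c, d) for c, d in weight_events if d > 0]
    transfers = []
    for src, lost in decreases:
        for dst, gained in increases:
            if abs(lost - gained) <= TOLERANCE_G:
                transfers.append({"from": src, "to": dst, "grams": gained})
    return transfers

events = [("bacon container", -120.0), ("skillet", +118.5), ("counter", +0.3)]
print(match_transfers(events))
# -> [{'from': 'bacon container', 'to': 'skillet', 'grams': 118.5}]
# A per-100 g nutrition table could then be used to accumulate calories per container.
```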

After determining nutrition facts of the food ingredients, our system provides awareness feedbacks to raise user’s awareness on healthy quality of food ingredients.

User interface and providing awareness We make use of a LCD display to show the composition of food ingredients in each container on the counter and its nutrition facts. Since users are typically busy during their cooking process, the design of this interface should be as simple and intuitive as possible without demanding high cognitive load on users. Our interface is shown in Figure 5, with nutrition facts of current container and overview of food ingredients information in the system.

Preliminary Evaluation In our preliminary evaluation, we invited one experienced household cook to participate in our smart kitchen system to cook spaghetti for 4 main-course servings with the following recipe:

1. Slice bacon into small strips. Heat the oil in a deep skillet over medium flame. Add the bacon for about 3 minutes until it is crisp and the fat is rendered.

For every container in the system, the interface shows weight information about food ingredients the container has.

figure 4. Dialog window asking the user to input the name of a new food item in the system.



“situated” cooking support

The panavi system provides a new user experience design through situated suggestions: real-time sensing and feedback.

The combination of the pan and the interactive recipe application enables users to challenge professional culinary arts while enjoying daily cooking in domestic kitchens.
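A sketch of how this differs from a "planned" recipe: rather than stepping through a fixed list, the next suggestion is chosen from the sensed state of the pan. The thresholds, state fields, and messages here are invented for illustration, not panavi's actual rules.

```python
def suggest(state):
    """Pick guidance from the current sensed state rather than a fixed step order."""
    temp, stirring = state["temp_c"], state["stirring"]
    if temp > 80 and not stirring:
        return "the sauce is overheating: stir now or lift the pan off the heat"
    if temp < 55:
        return "return the pan to the heat and keep stirring gently"
    return "hold this temperature; the sauce is thickening as intended"

print(suggest({"temp_c": 85, "stirring": False}))
```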

NATURAL INPUTS eyeCOOK is designed specifically to use natural input modalities: those that humans use in human to human, non mediated communication [5]. To reduce the need for users to provide explicit input, or change their behavior to accommodate interface constraints, implicitly provided attentional cues are observed and interpreted. We believe that this approach improves the learnability and intuitiveness of interfaces designed for novice users. Voice Commands eyeCOOK uses context-sensitive, localized grammars. This allows more synonyms for a given speech recognition command, reducing the chance of misinterpreting a word. Eye Gaze Commands When the user is in range of the eye tracker, eyeCOOK substitutes the object of the user’s gaze for the word ‘this’ in a speech command. For example, ‘Define this’ will trigger the define operation on the current eye gaze target. Since current eye trackers are spatially fixed and offer limited mobility to users, the user will not always be in a location where eye tracker input is available. Our speech grammar is designed so that system functionality is not reduced when using speech without eye tracking. This is accomplished by the user stating “define boil”, instead of saying “define this” while looking at the word boil. TOWARDS AN ATTENTIVE KITCHEN Interfaces that recognize and respond to user attention, and understand how it relates to the overall task within the kitchen environment can help the user efficiently engage in

her/his task. To achieve this, we must increase sensing capability [3], improve coordination among appliances [5], and give appliances the ability to affect the environment [3,5]. Environmental Sensors Temperature sensors used to keep track of the status of the oven and the elements of the stove can increase the system’s ability to guide the user’s cooking experience and could be synchronized with electronic timers. Appliance Information Integration Integrating knowledge of the environment can result in improved functionality, taking up less of the user’s time and effort. For example, user recipe preferences, timing constraints, as determined by the user’s electronic schedule, and currently available ingredients, communicated by food storage areas, can be combined to suggest recipes. As well, selecting a recipe can result in the addition of necessary ingredients to an electronic shopping list stored on the user’s Personal Data Assistant (PDA).

Active Environmental Actions The kitchen should not only be aware of its environment, but it should also be able to affect it. Thus, it should be able to take actions which increase efficiency, and reduce the user’s action load, like automatically preheating an oven. CONCLUSIONS We have presented eyeCOOK, a gaze and speech enabled multimodal Attentive User Interface. We have also presented our vision of an Attentive Kitchen in which appliances, informed by sensors, coordinate their behavior, and have the capability to affect the environment. This can reduce the user’s workload, and permit rationalizing requests for user attention. REFERENCES 1. Ju, W. et al. (2001). CounterActive: An Interactive

Cookbook for the Kitchen Counter. Extended Abstracts of CHI 2001 (Seattle, April 2001) pp. 269-270

2. Norman, D. A. The Invisible Computer, MIT press, 1999

3. Schmidt, A. et al. How to Build Smart Appliances. IEEE Personal Communications 8(4), August 2001. pp. 66-71.

4. Selker, T., et al. (2000). Context-Aware Design and Interaction in Computer Systems. IBM Systems Journal (39) 3&4, pp. 880-891.

5. Shell, J. et al. Interacting with Groups of Computers. Commun. ACM 46(3) March, 2003.

6. Tran, Q., et al. (2002). Cook’s Collage: Two Exploratory Designs. Position paper for Families Workshop at CHI 2002. Minneapolis, MN.

Figure 1. eyeCOOK in Page Display Mode

Interactive & Student Posters: Computers Everywhere CHI 2003: NEW HORIZONS

Posters: Computers Everywhere CHI 2003: NEW HORIZONS

997

explore different configurations of sensor-dependent display behavior significantly helps in the exploration and crafting of design ideas. Figure 4 shows a prototype under development in which the ambient displays (on the wall above the main worktop) respond to the turning of the pages of a specially designed cookbook. The cookbook has an RFID tag embedded in each page allowing the kitchen to detect the page that the book is currently open at. Responding to this, the ambient displays present relevant food information and even personal media related to the recipe and past times in the cook’s life when the corresponding meal was prepared.

4.2 A design tool for users Another significant challenge in designing pervasive computing applications is the involvement with users in the design process. We have conducted a number participatory design exercises involving older users and found that a significant barrier to exploring design concepts is adequately explaining the scope of the technologies involved. By demonstrating simple mappings between sensors and display in demonstration applications in the Ambient Kitchen we have found that we can greatly improve lay users' understanding of the potential functionality. For example, figure 5 shows one such commonly used application in which sample recipes are projected in response to the ingredients placed on the bench. Traffic light indicators on the display also show which of the recipes ingredients are in the kitchen’s cupboards. Though a simple demonstration of how sensor data can be mapped to information sources (and then to information displays) our experience is that such illustrations can help users think about both more mundane and adventurous (and useful) applications of such “invisible” technologies.

Figure 5. The Ambient Kitchen has been used to facilitate

your

different people. For an

discussions and focus groups on the topic of pervasive computing as part of a wider participatory design process looking at ICT and nutrition for older people.

4.3 An observatory to collect sensor data Realizing situated services that are responsive to both actions and intentions requires significant development of activity recognition algorithms themselves. As such, multi-sensor benchmarks of everyday activities are not widely available, and as part of our own research, and to support the research of collaborators and the wider research community, we are

developing a number of such benchmarks by capturing data for multiple subjects and activities, and hand annotating these datasets. Activities range from gaze data for head pose tracking algorithm development, primarily using video streams alone (see figure 6), to naturalistic data sets relating to multi-step food preparation for which RFID, accelerometer, pressure and video data is collected and hand annotated.

4.4 An evaluation test bed Evaluation means different things to engineer the question is "does it work?" That is, are the functional requirements met, does it complete certain tests accurately and without failing. For the human factors engineer the question is "does it perform a useful function in the context that it is intended to be used". The latter question can only really be answered by installing the technology in a range of real contexts. In the case of kitchen technologies, this means real lived-in homes. Constraints such as household routines and the different uses different members of a family use different rooms for at different times can be critical in the success or failure of home technologies. These constraints are only really apparent in the context of real home use. Laboratory-based facilities such as the ambient kitchen thus are of limited use in this respect. However it is possible to use them to do more limited evaluations of functional requirements that still have face validity.

Figure 6. Simultaneously captured data from the embedde

y evaluations of this kind but we are

taken. Thus the teabags might be in a container on work surface,

dcameras allows us to develop a benchmark for attention detection algorithms based on tracking head pose and position from multiple viewpoints.

We yet have to complete anplanning to do so. For example, we will use actors trained using video recordings of people with dementia carrying out simple kitchen tasks such as making a hot drink. These recordings were made of people doing these tasks, which were of their own choosing, in their own kitchens. The intention is to make initial tests of algorithms to detect when these people need prompting because they have made an error or stopped in the middle of the task. It would be disorienting and therefore unrealistic to bring people with dementia into the lab but these existing videos can be used to configure the Ambient Kitchen to match the kitchen in the video and then to get an actor to perform the sequence of actions

Figure 1. The Ambient Kitchen is a lab-based high fidelity pervasive computing prototyping environment. The kitchen is situated in the main research space in Culture Lab, Newcastle University and uses modified IKEA units and standard laminate flooring installed within a wooden structure (see top figures). Significant care was taken that the underlying technology is not apparent to the people using the kitchen – even the wall projection is achieved using “up-lighter” style projection onto mirrors below the overhead cabinets.

Figure 2. Sample off-the-shelf and custom technologies integrated in the Ambient Kitchen: (a) 4 u DLP projectors for situated displays; (b) 4 u Micaz zigbee motes and sensor boards for object motion sensing; (c) 200 x custom capacitive sensors for floor pressure measurement; (d) 8 u Feig long range RFID readers (and sample tag).

Figure 3. Using an RFID tagged control object the state of all the Ambient Kitchen sensors can be examined on the main display: floor pressure map (left); accelerometers (center-top); RFID (center-bottom); and video feeds (right).

4. CASE STUDIES The Ambient Kitchen has been developed to support a range of research activities around the problem of providing situated support for people with dementia, and situated services associated with food planning, preparation and cooking. As a high fidelity prototyping environment it allows us to support these activities in a number of different ways, as an experimental space for designers, for explaining new technologies to users, for collecting sensor data in benchmark development for activity recognition algorithms, and for the evaluation of complete solutions in a naturalistic setting.

Figure 4. A current design scenario in which the pages of a cookbook have integrated RFID tags. The workbench can detect the current page and adapts the ambient display according the page’s contents.

4.1 A design tool for designers Developing design ideas for pervasive computing applications usually requires a significant effort on the part of designers to imagine what interacting in a fully instrumented environment might be like. In our kitchen scenario, there are no keyboards, mice or conventional input devices, and the ability to physically

Figure 1. The Ambient Kitchen is a lab-based high fidelity pervasive computing prototyping environment. The kitchen is situated in the main research space in Culture Lab, Newcastle University and uses modified IKEA units and standard laminate flooring installed within a wooden structure (see top figures). Significant care was taken that the underlying technology is not apparent to the people using the kitchen – even the wall projection is achieved using “up-lighter” style projection onto mirrors below the overhead cabinets.

Figure 2. Sample off-the-shelf and custom technologies integrated in the Ambient Kitchen: (a) 4 u DLP projectors for situated displays; (b) 4 u Micaz zigbee motes and sensor boards for object motion sensing; (c) 200 x custom capacitive sensors for floor pressure measurement; (d) 8 u Feig long range RFID readers (and sample tag).

Figure 3. Using an RFID tagged control object the state of all the Ambient Kitchen sensors can be examined on the main display: floor pressure map (left); accelerometers (center-top); RFID (center-bottom); and video feeds (right).

4. CASE STUDIES The Ambient Kitchen has been developed to support a range of research activities around the problem of providing situated support for people with dementia, and situated services associated with food planning, preparation and cooking. As a high fidelity prototyping environment it allows us to support these activities in a number of different ways, as an experimental space for designers, for explaining new technologies to users, for collecting sensor data in benchmark development for activity recognition algorithms, and for the evaluation of complete solutions in a naturalistic setting.

Figure 4. A current design scenario in which the pages of a cookbook have integrated RFID tags. The workbench can detect the current page and adapts the ambient display according the page’s contents.

4.1 A design tool for designers Developing design ideas for pervasive computing applications usually requires a significant effort on the part of designers to imagine what interacting in a fully instrumented environment might be like. In our kitchen scenario, there are no keyboards, mice or conventional input devices, and the ability to physically

explore different configurations of sensor-dependent display behavior significantly helps in the exploration and crafting of design ideas. Figure 4 shows a prototype under development in which the ambient displays (on the wall above the main worktop) respond to the turning of the pages of a specially designed cookbook. The cookbook has an RFID tag embedded in each page allowing the kitchen to detect the page that the book is currently open at. Responding to this, the ambient displays present relevant food information and even personal media related to the recipe and past times in the cook’s life when the corresponding meal was prepared.

4.2 A design tool for users Another significant challenge in designing pervasive computing applications is the involvement with users in the design process. We have conducted a number participatory design exercises involving older users and found that a significant barrier to exploring design concepts is adequately explaining the scope of the technologies involved. By demonstrating simple mappings between sensors and display in demonstration applications in the Ambient Kitchen we have found that we can greatly improve lay users' understanding of the potential functionality. For example, figure 5 shows one such commonly used application in which sample recipes are projected in response to the ingredients placed on the bench. Traffic light indicators on the display also show which of the recipes ingredients are in the kitchen’s cupboards. Though a simple demonstration of how sensor data can be mapped to information sources (and then to information displays) our experience is that such illustrations can help users think about both more mundane and adventurous (and useful) applications of such “invisible” technologies.

Figure 5. The Ambient Kitchen has been used to facilitate

your

different people. For an

discussions and focus groups on the topic of pervasive computing as part of a wider participatory design process looking at ICT and nutrition for older people.

4.3 An observatory to collect sensor data Realizing situated services that are responsive to both actions and intentions requires significant development of activity recognition algorithms themselves. As such, multi-sensor benchmarks of everyday activities are not widely available, and as part of our own research, and to support the research of collaborators and the wider research community, we are

developing a number of such benchmarks by capturing data for multiple subjects and activities, and hand annotating these datasets. Activities range from gaze data for head pose tracking algorithm development, primarily using video streams alone (see figure 6), to naturalistic data sets relating to multi-step food preparation for which RFID, accelerometer, pressure and video data is collected and hand annotated.

4.4 An evaluation test bed Evaluation means different things to engineer the question is "does it work?" That is, are the functional requirements met, does it complete certain tests accurately and without failing. For the human factors engineer the question is "does it perform a useful function in the context that it is intended to be used". The latter question can only really be answered by installing the technology in a range of real contexts. In the case of kitchen technologies, this means real lived-in homes. Constraints such as household routines and the different uses different members of a family use different rooms for at different times can be critical in the success or failure of home technologies. These constraints are only really apparent in the context of real home use. Laboratory-based facilities such as the ambient kitchen thus are of limited use in this respect. However it is possible to use them to do more limited evaluations of functional requirements that still have face validity.

Figure 6. Simultaneously captured data from the embedde

y evaluations of this kind but we are

taken. Thus the teabags might be in a container on work surface,

dcameras allows us to develop a benchmark for attention detection algorithms based on tracking head pose and position from multiple viewpoints.

We yet have to complete anplanning to do so. For example, we will use actors trained using video recordings of people with dementia carrying out simple kitchen tasks such as making a hot drink. These recordings were made of people doing these tasks, which were of their own choosing, in their own kitchens. The intention is to make initial tests of algorithms to detect when these people need prompting because they have made an error or stopped in the middle of the task. It would be disorienting and therefore unrealistic to bring people with dementia into the lab but these existing videos can be used to configure the Ambient Kitchen to match the kitchen in the video and then to get an actor to perform the sequence of actions

Enabling Nutrition-Aware Cooking in a Smart Kitchen

Abstract We present a smart kitchen that can enhance the traditional meal preparation and cooking process by raising awareness of the nutrition facts in food ingredients that go into a meal. The goal is to promote healthy cooking. Our smart kitchen is augmented with sensors to detect cooking activities and provides digital feedbacks to users about nutritional information on the used food ingredients. We have created a preliminary prototype for evaluation, and the result is promising.

Keywords Context-Aware Computing, Ubiquitous Computing / Smart Environments, Home Computing, Interaction Design, Kitchen, Nutrition

ACM Classification Keywords H5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous.

Introduction A kitchen at home is a place to prepare and cook meals for a whole family. Food preparation can be considered as an act of caring for family members. Through cooking healthy foods for their beloved family members, they receive self-satisfaction in promoting health and reduce risks of chronic diseases in the family. For example, if a

Copyright is held by the author/owner(s). CHI 2007, April 28–May 3, 2007, San Jose, California, USA. ACM 978-1-59593-642-4/07/0004.

Pei-yu (Peggy) Chi National Taiwan University 1, Sec. 4, Roosevelt Rd., Taipei 106, Taiwan [email protected] Jen-hao Chen National Taiwan University 1, Sec. 4, Roosevelt Rd., Taipei 106, Taiwan [email protected] Hao-hua Chu National Taiwan University 1, Sec. 4, Roosevelt Rd., Taipei 106, Taiwan [email protected] Bing-Yu Chen National Taiwan University 1, Sec. 4, Roosevelt Rd., Taipei 106, Taiwan [email protected]

LCD display

smart cabinet

figure 1. Smart Kitchen promotes healthy cooking awareness and cooking interaction to the cook.

smart stove

smart counter

CHI 2006 · Work-in-Progress April 28-May 3, 2007 • San Jose, CA, USA

2333

CHI 2007 • Work-in-Progress

To simulate food ingredient recognition, we currently rely on a human observer to manually type in the name of each food ingredient appearing in the system for the first time. When the system detects a new weight increase on the counter, an image of the new ingredient is taken from an overhead camera and shown to the human observer for input (shown in Figure 4). We are currently testing other possible methods for automated or manual food ingredient recognition: (1) a voice dialog system, (2) RFID tags/reader, (3) computer vision recognition.

After the sensor infrastructure detects low-level cooking elements and weights, our system can apply an inference rule engine to infer high-level food ingredient transfer activities, described as follows.

Figure 5. User interface of our smart kitchen system. We display nutrition facts about the latest ingredients on the left-hand side, and overall ingredient information on the right-hand side.

Food Ingredient Transfer Recognition
This involves tracking the path of each food ingredient from a starting container (e.g., taken from a fridge and placed on the smart counter) to an ending container holding the final cooked meal. This transfer path is made up of a sequence of ingredient transfer events from one container to another. To track this path, a weight matching algorithm [2] from our previous work is used. That is, the weight sensors underneath the counter, cabinet, and stove can detect weight changes. By matching an equal amount of weight decrease (e.g., from a food container) and weight increase (e.g., in the food mixer pan), we can infer an ingredient transfer from the food container to the food mixer pan. Using a public nutritional database that provides nutrition values for each food ingredient type [10], our system can then calculate and display overall nutrition facts for each container.
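As a rough illustration of this weight-matching idea (not the authors' implementation from [2]), the sketch below pairs a weight decrease on one surface with a near-equal, near-simultaneous increase on another; the event format, tolerance, and time window are assumptions.

```python
# Minimal sketch of the weight-matching idea: when one weighing surface
# loses roughly the same weight that another gains at about the same time,
# infer that an ingredient was transferred between containers.
# The event format and tolerance values here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class WeightEvent:
    surface: str      # e.g. "counter", "cabinet", "stove"
    container: str    # container identifier on that surface
    delta_grams: float
    timestamp: float  # seconds

def match_transfers(events, tolerance_g=5.0, window_s=10.0):
    """Pair each weight decrease with a near-equal, near-simultaneous increase."""
    decreases = [e for e in events if e.delta_grams < 0]
    increases = [e for e in events if e.delta_grams > 0]
    transfers = []
    for dec in decreases:
        for inc in increases:
            if abs(abs(dec.delta_grams) - inc.delta_grams) <= tolerance_g \
                    and abs(dec.timestamp - inc.timestamp) <= window_s:
                transfers.append((dec.container, inc.container, inc.delta_grams))
                increases.remove(inc)
                break
    return transfers

# Example: about 120 g leaves the bacon container and appears in the pan.
events = [
    WeightEvent("counter", "bacon_box", -120.0, 5.0),
    WeightEvent("stove", "pan", +118.0, 6.5),
]
print(match_transfers(events))  # [('bacon_box', 'pan', 118.0)]
```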

After determining the nutrition facts of the food ingredients, our system provides feedback to raise the user's awareness of the nutritional quality of the food ingredients.

User interface and providing awareness We make use of an LCD display to show the composition of food ingredients in each container on the counter and its nutrition facts. Since users are typically busy during the cooking process, the design of this interface should be as simple and intuitive as possible without placing a high cognitive load on users. Our interface is shown in Figure 5, with nutrition facts of the current container and an overview of food ingredient information in the system.
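To illustrate how per-container nutrition facts could be aggregated for such a display, here is a minimal sketch; the nutrition values and field names are placeholders, not entries from the database cited as [10].

```python
# Illustrative sketch of per-container nutrition tracking: accumulate the
# inferred ingredient transfers and sum nutrition facts from a lookup table.
# The per-100 g values below are placeholders, not taken from the database in [10].

NUTRITION_PER_100G = {
    "bacon":     {"kcal": 541, "fat_g": 42.0, "protein_g": 37.0},
    "olive oil": {"kcal": 884, "fat_g": 100.0, "protein_g": 0.0},
}

def container_nutrition(contents):
    """contents: list of (ingredient_name, grams) currently in one container."""
    totals = {"kcal": 0.0, "fat_g": 0.0, "protein_g": 0.0}
    for name, grams in contents:
        facts = NUTRITION_PER_100G.get(name, {})
        for key, per_100g in facts.items():
            totals[key] += per_100g * grams / 100.0
    return totals

pan = [("bacon", 118.0), ("olive oil", 5.0)]
print(container_nutrition(pan))  # summed calories, fat and protein for the pan
```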

Preliminary Evaluation In our preliminary evaluation, we invited one experienced household cook to use our smart kitchen system to cook spaghetti for 4 main-course servings with the following recipe:

1. Slice bacon into small strips. Heat the oil in a deep skillet over a medium flame. Add the bacon and cook for about 3 minutes until it is crisp and the fat is rendered.

For every container in the system, the interface shows weight information about the food ingredients the container holds.

Figure 4. Dialog window asking for the name of a new food item in the system.


interactive cookbook situated coaching system

kitchen providing the number of calories

previous “situated” cooking support systems

design process

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
Carbonara: initial menu

We chose a recipe for Roman-style Carbonara, an Italian pasta dish, as the initial menu.

This recipe requires several cooking techniques, including sensitive temperature control, which is difficult to master.

We adopted a recipe by Tsutomu Ochiai, a famous Japanese chef of Italian cuisine who sometimes appears on TV.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
iterative design process

sketching

experiment: cooking

sketching

preliminary user study

prototyping

user study

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
iterative design process

Tinkering: Sketching a circuit sensing temperature

Tinkering: attaching a circuit with wireless communication and temperature sensing functions to an iron frying pan

Tinkering: attaching a circuit with wireless communication and temperature sensing functions to an iron frying pan

Tinkering: A thermocouple sensor attached to the pan with aluminum tape

Tinkering: Cooking while monitoring the temperature

Tinkering: Cooking while monitoring the temperature

Tinkering: Cooking while monitoring the temperature

Tinkering: Trial and error on where to attach the sensor (But it burns toride TOXI)

Tinkering: Repeating a cycle of making a prototype and cooking with it, once a week over 90 days

Tinkering: testing user interface designs while hardware sketching

Tinkering: Repeating a cycle of making a prototype and cooking with it, once a week over 90 days

Tinkering: Repeating a cycle of making a prototype and cooking with it, once a week over 90 days

Tinkering: Sketching a case for a circuit using the laser cutter after the board design is fixed

Tinkering: Sketching a case for a circuit using the laser cutter after the board design is fixed

Tinkering: Considering a system that projects the temperature onto the frying pan to address visibility problems

Tinkering: Considering a system that projects the temperature onto the frying pan to address visibility problems

User Study: Having two people actually use the prototype

User Study: Having two people actually use the prototype

User Study: Understanding the need for navigation

User Study: The need to understand differences in skill and cognition between individuals

Prototyping: Preparing to create Firm Prototype

Prototyping: Using Aluminum Molding

Prototyping: Using Aluminum Molding

Prototyping: Casting a mold to seal sensor

Prototyping: Creating Circuit using Manufacturer

Prototyping: Creating Circuit using Manufacturer

Prototyping: Cutting the frying pan handle using the MODELA

Prototyping: Completed prototype (separated into pieces)

Prototyping: Completed Prototype

Exhibition: JST/CREST Ubiquitous Contents Showcase

Exhibition: JST/CREST Ubiquitous Contents Showcase

Exhibition: JST/CREST Ubiquitous Contents Showcase

design

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
architecture overview

Figure 5. Three Kinds of Setting Displayed in the Monitor; (a) Summary, (b) Detail, (c) Condition / Name of each Area [See (b)]; (b-1) Main Panel, (b-2) Current Step Panel and Checkbox, (b-3) Comment Panel, (b-4) Temperature Panel, (b-5) Timer Panel

elapsed time, and the timer for boiling the pasta, which is also a supplemental function to Condition mode.

IMPLEMENTATION & ARCHITECTURE
The panavi system mainly consists of three parts: the special kitchen utensil (frying pan) with embedded sensors and actuators, the display system connecting the computer with the touch monitor and the projection system, and the software system showing cooking sequence information (Figure 6).

Figure 6. System Architecture. The panavi display packages the touch monitor, projector, mirror, and speaker; the computer runs the panavi OS with the Original Cooking Sequence and is connected to the display by USB (serial) and VGA cables; electronic circuit MOXA-B at the computer communicates via XBee wireless with MOXA-A in the special pan, which carries the sensors (thermocouple, acceleration) and the actuators (vibration motor, LEDs).

Special Pan
The main body of the pan is formed of cast aluminum, with a handle made of Bakelite that can be clamped to the body (Figure 4-c, e). The pan is embedded with two sensors. One is a J-type thermocouple sensor measuring the current temperature with a fast response speed that can sense temperature shifts continuously. A small stainless steel pipe is implanted in the pan from one end of the handle to the center of the pan in order to embed the sensor. The other is an acceleration sensor, also embedded in the handle, sensing the movement of the pan. A MOXA [23] (MOXA-A in Figure 6) is embedded in the pan's handle, which wirelessly communicates with the computer system by sending the sensors' values to another MOXA (MOXA-B in Figure 6) connected to the computer as a server client. On the surface of the pan, the proper temperature and the instructions regarding movements are displayed from the projector. Actuators, LEDs and vibration motors are embedded in the handle, controlled by signals from the computer system via MOXA-B.
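For illustration only, the following sketch shows how a computer-side client might read sensor packets if the MOXA-B module appeared as a USB serial port and each packet were a CSV line of temperature and acceleration values; the port name, baud rate, and packet format are assumptions, since the actual panavi protocol is not described here.

```python
# Sketch of reading pan sensor values over a serial link, assuming MOXA-B
# appears as a USB serial port and each packet is a CSV line
# "temperature,ax,ay,az". Port, baud rate and format are illustrative only.

import serial  # pyserial

def read_pan_samples(port="/dev/ttyUSB0", baudrate=9600):
    with serial.Serial(port, baudrate=baudrate, timeout=1.0) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                temp, ax, ay, az = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed packets
            yield {"temp_c": temp, "accel": (ax, ay, az)}
```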

Display and Computer
The computer is connected with the 'panavi display', which packages the touch panel monitor and projector, and the 'panavi OS' with the 'Original Cooking Sequence' runs as an Adobe Flash application on the computer (as shown in Figure 6). The panavi OS displays the instructions by analyzing the sensors' values against parameters programmed in the system.

Original Cooking Sequence
The original cooking sequence models the recipe of Carbonara, consisting of videos and photos in addition to the general text recipes. For the panavi OS, this cooking sequence was reconstructed by the development team and its procedure is divided into 13 steps (as shown in Table 2). Each step is programmed with settings: temperature, sound, and vibration settings, etc. (as shown in Table 1).

Preparing Pasta (Table 2, Steps 1-4) When the checkbox of Step 1 is touched, the elapsed-time counter from the beginning (see Figure 5-b-5) starts. The normal cooking time is set to 20 minutes. When Step 4 is checked, the pasta timer shown in Detail mode starts. The user should complete cooking the pancetta by the end of the time limit. The boiling time for the pasta is set to 6 minutes.

Frying Pancetta (Table 2, Steps 5-7) At Step 5, temperature setting A (Figure 1-A, 165°C) is applied. In the Comment Panel (Figure 5-b-3) an instruction message is displayed: "Stop Heating" if the current temperature is too high, "A little hotter" if slightly higher, "Keep at this temperature" if it is at the proper setting, or "Heat up a little more" if it is slightly lower. The user is required to follow the instructions and keep the proper temperature.
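A minimal sketch of how such Comment Panel messages could be chosen from the current temperature, assuming band widths similar to the 5- and 10-degree zones used for the LED colors; the exact thresholds in panavi are not stated in this passage.

```python
# Sketch of selecting a Comment Panel message from the current temperature.
# Band widths follow the 5- and 10-degree zones described for the LED colors;
# the exact thresholds used by panavi are assumptions.

def comment_for_temperature(current_c, target_c=165.0):
    diff = current_c - target_c
    if diff > 10:
        return "Stop Heating"
    if 5 < diff <= 10:
        return "A little hotter"   # slightly above the proper temperature
    if -5 <= diff <= 5:
        return "Keep at this temperature"
    return "Heat up a little more"

print(comment_for_temperature(180))  # "Stop Heating"
```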

Cooling Down (Table 2, Steps 8-9) From Steps 8 to 11, temperature setting B (Figure 1-B, 70°C) is applied, because this setting is optimized to cool the pan. The instruction messages are displayed as follows: "Cool the pan on the wet washcloth" if the current temperature is higher than the proper temperature, or "Go to the next step" if lower.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
display


There are three kinds of settings displayed on the monitor

(1) Main Panel (2) Current Step Panel and Checkbox (3) Comment Panel (4) Temperature Panel (5) Timer Panel

Summary / Detail / Condition. Users can use any setting anytime.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
original cooking sequence

Step 1. Pasta (1) - Boil Water
Step 2. Making Egg Sauce
Step 3. Pasta (2) - Start Cooking Pasta
Step 4. Pasta (3) - Start Timer
Step 5. Pancetta (1) - Fry Pancetta [A]
Step 6. Pancetta (2) - Add Wine
Step 7. Pancetta (3) - Seasoning
Step 8. Pancetta (4) - Cooling [B]
Step 9. Pasta (4) - Drain Water
Step 10. Finish (1) - Add Pasta to Pan
Step 11. Finish (2) - Dress with Egg Sauce
Step 12. Finish (3) - Heat Egg Sauce [C]
Step 13. Serve

The original cooking sequence models the recipe of Carbonara.

Each step is programmed with settings: temperature, sounds, vibrations, etc.
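One plausible way to represent such a step sequence as data is sketched below (abridged to a few of the 13 steps); the field names and structure are illustrative, not the panavi OS's internal format.

```python
# Illustrative data structure for the cooking sequence: each step may carry a
# temperature setting (A/B/C) and timers. Abridged; not the panavi OS format.

STEPS = [
    {"id": 1,  "title": "Pasta (1) - Boil Water"},
    {"id": 4,  "title": "Pasta (3) - Start Timer",      "timer_min": 6},
    {"id": 5,  "title": "Pancetta (1) - Fry Pancetta",  "temp_setting": "A", "target_c": 165},
    {"id": 8,  "title": "Pancetta (4) - Cooling",       "temp_setting": "B", "target_c": 70},
    {"id": 12, "title": "Finish (3) - Heat Egg Sauce",  "temp_setting": "C", "target_c": 85},
    {"id": 13, "title": "Serve"},
]

def setting_for_step(step_id):
    for step in STEPS:
        if step["id"] == step_id:
            return step.get("temp_setting"), step.get("target_c")
    return None, None

print(setting_for_step(5))  # ('A', 165)
```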

Figure 3. Left: The Prototype Used in the Preliminary User Study; Right: Setting of the Preliminary Study

in the form of stirring, shaking, and moving, by embedding an accelerometer in the handle of the pan that senses the pan's movements. In addition, we decided to provide video instructions to support the text recipes on the navigation display. After this trial-and-error process, the current design of the system described in the following section was finalized.
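A simple way to detect shaking from accelerometer samples, for illustration, is to threshold the variance of the acceleration magnitude over a short window; the authors do not describe their detection method here, so the window and threshold below are assumptions.

```python
# Sketch of detecting whether the pan is being shaken from accelerometer
# samples by thresholding the variance of the acceleration magnitude.
# Window size and threshold are illustrative assumptions.

import math
from statistics import pvariance

def is_shaking(samples, threshold=0.05):
    """samples: recent (ax, ay, az) tuples in g; True if magnitude variance is high."""
    magnitudes = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    return len(magnitudes) > 1 and pvariance(magnitudes) > threshold

still = [(0.0, 0.0, 1.0)] * 20
shaken = [(0.0, 0.0, 1.0), (1.5, 0.2, 0.3)] * 10
print(is_shaking(still), is_shaking(shaken))  # False True
```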

DESIGN: FUNCTIONS AND INSTRUCTIONS

Special Frying Pan
The special frying pan (Figure 4-a), which can be used to cook ingredients over the heated stove, provides instructions via LED lamps, vibration, and projected images. Five-color LED lamps indicate the temperature condition, and a vibration motor runs when the user's temperature control is good; both are embedded within the handle of the pan. The instructional graphics from the projection can also be checked on the display monitor, to support recognition of the projection when the pan is removed from the stove or when the projection cannot be recognized because there are many ingredients in the pan. If part of the system fails, other parts can still help the user (e.g., if the projector fails, the display and the LED indication still work). See Table 1. The temperature color changes from white to blue, green, yellow, and red depending on the value, which is also synchronized with the LEDs. White is the default color and also indicates that the temperature is too low, blue indicates 5-10 degrees lower than the proper temperature setting, green indicates that the temperature is within 5 degrees of the proper setting, yellow indicates 5-10 degrees higher, and red indicates over 15

Table 1. Parameters to Cook the Carbonara Programmed in the System.

Color          | White | Blue                       | Green                      | Yellow                     | Red
TEMP (°C), A   | -149  | 150-159                    | 160-169                    | 170-179                    | 180-
TEMP (°C), B   | -54   | 55-64                      | 65-74                      | 75-84                      | 85-
TEMP (°C), C   | -69   | 70-79                      | 80-89                      | 90-99                      | 100-
Vibration      |       |                            | ON                         |                            |
Sound          |       | the second hand of a clock | the second hand of a clock | the second hand of a clock | warning tone

degrees too high. When the pan is being heated, the system makes a sound like the second hand of a clock at 1-second intervals. When the color is red, it makes a warning tone. The user is required to stay within the green zone by controlling the stove, turning it off, or cooling the pan. In addition, instructions about actions such as shaking or stirring appear. When the pan should be shaken, an animation of an arrow is displayed on the surface, as shown in Figure 4-b. At the same time, the user is required to observe the specific instruction by referring to the texts on the display.
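Following the color bands above (and the ranges reconstructed in Table 1), a temperature reading could be mapped to a color zone roughly as in the sketch below; the exact boundary handling is an assumption.

```python
# Sketch mapping a temperature reading to the LED/projection color zone,
# following the band rules above (blue: 5-15 degrees low, green: within 5,
# yellow: 5-15 high, red: 15 or more above). Boundary handling is assumed.

def color_zone(current_c, target_c):
    diff = current_c - target_c
    if diff >= 15:
        return "red"      # warning tone
    if diff >= 5:
        return "yellow"
    if diff >= -5:
        return "green"    # vibration motor ON
    if diff >= -15:
        return "blue"
    return "white"        # default / far too low

for t in (150, 158, 165, 173, 185):
    print(t, color_zone(t, target_c=165))  # blue, blue, green, yellow, red
```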

Figure 4. Special Frying Pan; (a) Current Temperature is Projected, (b) Temperature and Action are Indicated, (c) The Body and the Handle Separated, (d) Small Stainless Steel Pipe, (e) Inner Circuit of the Handle

Display
There are three modes displayed on the main panel (Figure 5-b-1) that can be loaded at any time by touching a particular tab: 'Summary,' 'Detail,' and 'Condition.' The Summary mode (Figure 5-a) displays an overview of the text recipes and the current progress. Detail mode (Figure 5-b) displays the detailed culinary instructions of the text recipe with video instructions. Condition mode (Figure 5-c) displays the boiling time, the current temperature, and the pan's movement, like the dashboard of a car. The Current Step Panel and Checkbox (Figure 5-b-2) displays the title of each step (Table 2) and the checkbox that manages the procedure. The user touches the box when the current step's tasks are completed, and the system recognizes it and loads the new navigation setting for the next step. The Comment Panel (Figure 5-b-3) indicates instruction messages including advice, caution, or alert texts depending on the current condition, which the user should notice continuously during the cooking process. The Temperature Panel (Figure 5-b-4) is an auxiliary tab used when the user wants to check the temperature without referring to Condition mode. The Timer Panel (Figure 5-b-5) displays the target time to finish cooking, the

Its procedure is divided into 13 steps.

user study

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
user study

We focused not only on general or common findings but also on each user's individuality: their respective backgrounds and cooking experiences.

We observed how the system affects the users' experience while cooking.

User A User B User C & D

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
method

1. prior interview 2. cooking with panavi 3. posterior interview

The authors interviewed each user about his/her cooking experience.

First, we introduced how to use the system. Next, we showed what the perfect Carbonara looks like using a photo. During cooking, we did not help the users except when troubles occurred (e.g., system errors).

We interviewed the users while watching a video of their cooking. They were asked to explain their intentions and impressions for each action, activity, and process.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
setting

Finishing (Table 2, Steps 10-12) At Steps 10 and 11, the instruction messages are displayed as follows: "Cool the pan on the wet washcloth" if the current temperature is much higher than the proper temperature, "Wait a moment till the temperature drops" if slightly higher than the proper temperature, or "Put the egg sauce" if lower. The users are urged to follow the instructions because the sauce will burn if the temperature is too high. After Step 12 begins, temperature setting C (Figure 1-C, 85°C) is applied, which is optimized for heating the egg sauce with the pasta. This step models "stirring the sauce constantly until it comes to a boil and has thickened while shaking and swinging the pan," which requires high-level skills. The instruction messages are displayed as follows: "Remove the pan from fire and shake" if the temperature is higher than the proper degree, or "Heat up a little more" if lower. In addition, the messages also include instructions about movements fed from the acceleration sensor: "Shake the pan and stir inside" when the pan is not shaken. The user must keep the temperature at 85°C while stirring the sauce very quickly.
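The Step 12 behavior described above combines the temperature reading with the shaking check; a hypothetical selection of the quoted messages might look like the sketch below, with the decision order being an assumption.

```python
# Sketch of how the Step 12 instructions could be chosen by combining the
# temperature reading with a shaking flag. Message texts come from the
# passage above; the decision order and thresholds are assumptions.

def step12_message(current_c, shaking, target_c=85.0):
    if current_c > target_c:
        return "Remove the pan from fire and shake"
    if not shaking:
        return "Shake the pan and stir inside"
    return "Heat up a little more" if current_c < target_c else "Keep at this temperature"

print(step12_message(92, shaking=False))  # "Remove the pan from fire and shake"
```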

USER STUDY
This section describes the user experience of cooking with the panavi system. In order to obtain detailed user experiences, we observed how the system affects the users' behavior while cooking, verified the usability of the current prototype, and identified its problems. The user study was conducted in a kitchen-like environment specially constructed in our research laboratory, as shown in Figure 7 and Figure 8-A. A total of four beginner- or intermediate-level participants in three groups, none with experience of making Carbonara, were selected, because experts or professional chefs are able to cook without the system.

Figure 7. Layout of the User Study. Utensils: rubber spatula, ladle, tongs, tablespoon, fork. Ingredients: olive oil, white wine, pancetta, eggs, pasta, salt, black pepper. The layout shows the display, pot, pan, bowls, two wet washcloths, plate, and table, the user's position, and three cameras (Camera 1-3).

In this study, we consider not only general or common findings but also each user's individuality: their respective backgrounds and cooking experiences. Not only does the system improve the users' skills, but it provides fruitful experiences such as taking on difficult recipes, enjoying cooking actions, and tasting delicious dishes even at home. Design researchers should not only evaluate the

Step 1. Pasta (1) - Boil Water: Boil 8 cups of water in the pot.
Step 2. Making Egg Sauce: Beat a whole egg with an extra egg yolk in a bowl. Then add 15 grams of grated cheese and a small amount of freshly ground black pepper, and beat them well again.
Step 3. Pasta (2) - Start Cooking Pasta: Put one and a third tablespoons of salt and 70 grams of pasta into the pot.
Step 4. Pasta (3) - Start Timer: Start the timer for boiling pasta. (Keep boiling for 6 minutes.)
Step 5. Pancetta (1) - Fry Pancetta: Heat a teaspoon of olive oil in a pan. Add 30 grams of diced pancetta and fry over high heat to the proper temperature. Keep cooking until crispy and brown.
Step 6. Pancetta (2) - Add Wine: Turn off the stove temporarily and add 10 ml of white wine. Heat it again at the proper temperature until almost no liquid remains in the pan.
Step 7. Pancetta (3) - Seasoning: Turn off the stove and move the pan onto a wet washcloth. Add 50 ml of boiled water from the pasta pot to the pan along with a tablespoon of fresh water. Shake the pan until oil and water are mixed well.
Step 8. Pancetta (4) - Cooling: Keep the pan under the proper temperature.
Step 9. Pasta (4) - Drain Water: After the pasta timer has rung, drain the water from the pasta well.
Step 10. Finish (1) - Add Pasta to Pan: Add pasta and mix so it is dressed with the pancetta.
Step 11. Finish (2) - Dress with Egg Sauce: Add the egg sauce into the pan. Stir with a spatula for about 15 seconds until the yolk and the white of the egg are uniformly combined.
Step 12. Finish (3) - Heat Egg Sauce: Heat the pan at the proper temperature. Continue to shake the pan and stir inside until smooth and creamy.
Step 13. Serve: Serve immediately to avoid carryover heat.

Table 2. Original Cooking Sequence's Texts Displayed in the Main Panel (Figure 5-b-1) of the Detail Mode (Figure 5-b).

effectiveness of the system itself, but must also verify the real experience of users with their systems [12]. Because our work is not a scientific experiment system, we considered that quantitatively measuring user experience and effectiveness was not suitable, and therefore focused on how each user's experience changed from their everyday cooking lives. This report describes the detailed experience, including each user's own context and differences among individuals: each user's character, previous experiences, and creative impressions through the user studies.

Method
The user study was conducted in the following sequence. At first, the authors interviewed each user about his/her cooking experience, home environment, and other matters related to cooking and diet for 5-10 minutes as a prior interview. Before starting the cooking, we introduced how to use the system in about 5 minutes and then gave the user about 5 minutes to learn the functions of panavi, actually heating the pan and showing the changes in the temperature reading, and touching the display to operate the system. In addition, we explained the end goal of the process by showing the photo of the completed Carbonara (Figure 8-a). During the cooking, we did not help the user with any task except when troubles such as system errors occurred. After the cooking,

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts

User A

User A (Study 1)

User A: 23-year-old male. He has experience of living on his own for a brief time, but now lives with his parents. Although he has tried basic cooking, he could not do it well. As a result, he has not done any cooking recently.

not enough experience in cooking

has not done any cooking recently

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts

User B

User B (Study 2)

User B: 24-year-old female. She lives with her parents and helps with their cooking. The family is interested in food and cooking and keeps its own home garden. Raised in such a family environment, she has basic cooking knowledge, but has not had a chance to attempt authentic menus.

has basic knowledge of cooking

does simple cooking once or twice a week

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
User C&D (Study 3)

User C: 24-year-old male. User C lives with his parents and brothers. He has various cooking experiences, as a Boy Scout member and in part-time jobs at a restaurant. He cooks (mainly breakfast) 3 or 4 times a week.

User D: 22-year-old female. User D has lived on her own for 5 years and cooks habitually. She mastered the beginners' recipe book her mother gave her, and looks at websites to expand her cooking repertoire.

basic knowledge of cooking

User C / User D

various knowledge of cooking

cooking 3-4 times a week.

prefers to expand her cooking repertoire

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts Result (Study 1: User A)

He sometimes could not understand the instruction texts. As a result, he spent a lot of time finishing the cooking. His pasta was overboiled and soft...

At the final stage, he could not stop heating even after the sauce was thick and creamy. His sauce was slightly overcooked compared to the perfect example. "I could not imagine the finished dish, because I had never eaten this menu."

Even though he faced many difficulties, he finally got a fine dish!

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts Result (Study 2: User B)

She enjoyed the process like playing a game, and completed a delicious dish except that it was slightly scorched. "I am very satisfied to have made a delicious dish. I felt the process was easier than I had imagined."

She easily finished the first 5 steps, just following the instructions without any difficulties. After that, she checked the instruction texts precisely and carefully carried out the later steps. "At first, I could not understand the meaning of the task on this step. It was difficult, so I tried carefully, different from the previous steps." - Step 6

She enjoyed cooking with panavi. “What an easy cooking!”

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts Result (Study 3: User C&D)

User D mainly cooked by reading the recipe texts many times, while User C frequently checked the proper and current temperatures and supported her activities.

They did not check the recipe texts or videos very much, since they relied on their experience. But, in fact, they said "We'd forgotten about watching the video." Therefore, their cooking was finished very quickly, but their Carbonara was slightly sloppy (not heated well). Both User C and D were disappointed with the result, and said "I want to cook it again using the system."

They quickly finished cooking through their collaborative work.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts Reflections

This study revealed differences in how each user understood the instructions.

User A: finished in 32 min. / User B: finished in 18 min. / User C & D: finished in 15.5 min.

The users had distinctive prior knowledge and skills about cooking.

(normal cooking time: 20 min.)

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts Reflections

This system could take care of each user's situated actions, even for the beginner user: User A.

User A sometimes could not understand the instruction texts.

He could keep the proper temperature and did not fail at the last stage.

He did not know how to prepare pasta...

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts Reflections

But at some points the system could not respond to the users' situated actions, because the text instructions and video tutorials are fixed and not changeable.

User C&D sometimes skipped instructions.

User A: “I could not judge between the important steps and the omissible ones, because it was too much information for me to understand and follow.”

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts Reflections

The current system cannot guide the timing of judging when heating should be stopped. All users could keep a good temperature, but nobody made the perfect dish.

User A's dish was slightly overcooked. User B's was slightly scorched. User C & D's was slightly sloppy.

conclusion

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
smart daily commodities

This research provided a model for designing daily commodities (e.g. kitchen utensils)

as smart objects enriching our everyday life.


technical framework × methodology: this framework senses and analyzes the users' state and generates feedback suitable for each user's situation. These integrated technical elements should be designed through the prototyping process in real domestic environments.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
smart daily commodities

The design of panavi enables users to concentrate on cooking without noticing that it is a computing device. It looks the same as normal consumer products. This design model requires no massive equipment to construct the system.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
future visions

We would like to measure and model chefs' cooking methods, apply these data to navigation, and produce various kinds of menus in our future prototype.

CHI 2012 “EATING + COOKING” Monday, May 7th, 2012

panavi: Recipe Medium with a Sensors-Embedded Pan for Domestic Users to Master Professional Culinary Arts
future visions

We will continue to develop and improve this system to be more suitable for real domestic contexts, encouraging people's daily cooking to become more enjoyable.

“Interactivity” Please come to our demo booth!

We will introduce panavi with a real cooking show.

MON 18:00-20:00 TUE 16:00-19:00

WED 13:00-14:30

Today’s Reception Highlight on Interactivity Interactivity Encore

Maybe and so on...

thanks!