What’s cookin’?

Dhairya Dand, Christian Ervin, Robert Hemsley, David Nuñez, Laura Perovich

What’s cookin’? is a collaborative cooking system that helps people create meals together even when they’re apart. It is a collection of augmented kitchen tools, surfaces, and computational representations of meals.

Video

Cooking is a social, shared, and sensory experience. The best meals bring friends together and engage all of our senses, from the sizzling of sautéing onions to the texture of toasted bread.

However, sometimes it’s inconvenient for people to come together to share the creative process; kitchen space can be limited, heavy cooking tools may be difficult to transport, and commitments in the home, such as childcare, may discourage people from relocating.

Our tool allows people to cook together in a remote, synchronous, collaborative environment by mirroring the social, shared, and sensory nature of the cooking experience. It is particularly suited for collaborations that involve multiple parallel processes that proceed in isolation but intersect at various points, most notably at the end, when the meal is produced.

We demonstrate example interactions through a series of video sketches and physical prototypes based on friends coming together to share a meal. Our tool can also be employed in other environments, such as the industrial kitchen or food television. Our model of collaboration extends to larger crafts, such as fashion or woodworking, and serves as a step toward a broader framework in which embedding knowledge in objects through interaction creates a Wikipedia for the physical world.

From a Table Away

In our collaborative kitchen environment the tabletop is the mediator that brings remote collaborators together and bridges our tangible utensils, our physical interactions, and the environment. Its embedded ambient displays serve as the communication hub between chefs and render augmented information about objects placed on the surfaces.

To reduce information overload between the local and remote collaborators, we use an ambient display that enables glanceable awareness of the remote user’s tabletop activity. Through this, users can be made aware of the remote party’s progress through a recipe or be alerted when a collaborator is struggling with a specific task. If a user wishes to interact directly with the remote workspace, they can adjust the focus of their countertop, bringing an impression of the remote environment into the foreground of their own workspace.

The surfaces create a shared workspace that enables collaborative teaching and fosters a sense of co-location and teamwork. The tabletop provides ambient notifications of progress on the entire meal, but does not prevent any individual cook from working on their own tasks. Through this we ensure a shared awareness of the ingredients, of the tasks being undertaken, and of how these interrelate, which together creates the experience of co-located interaction on one shared tabletop.

Spice Spice Baby

The use of spices within this recipe extends the collaboration from the tabletop into the physical environment. Objects that have shared meaning and affordances are mirrored into the remote physical location, allowing users to seamlessly share interactions and knowledge.

When Robert buys a bottle of wine with particularly robust notes, **What’s Cookin’?** helps the chefs collaborate to improve the meal by altering the shared recipe to better match the wine. The system alerts the chefs about the wine choice, and Laura, an expert on wine and food pairings, indicates to David that he should adjust the spices in his marinara sauce. Laura taps the bottles she recommends, and the necessary types and quantities are mirrored into David’s environment. David sees his own spice bottles glow, and as he adds each spice to his pot, the bottle’s glow slowly fades until he has deposited the correct amount.

This tool enables more natural collaboration: users physically use their bodies to interact with the objects just as they would if they were performing the same task locally. This allows users to draw on their existing mental models and kinesthetic memory to recall which spices to use. By re-using existing tools and objects, we enable the seamless continuation of existing practices while sharing knowledge between the mirrored objects.

The self-aware bottle also records its interactions, allowing the local user to capture their own interactions and replay them at a later time.

We created a physical prototype that demonstrated this interaction with a spice bottle: it glowed to indicate the need to add a spice. A tilt sensor inside the device tracked when the user successfully added spice; the color of the glow changed and gradually faded in response to the correct number of shakes.
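The fading-glow behavior can be sketched in a few lines. The following Python simulation is only our illustration of the logic, not the prototype’s actual firmware; the class name, the shake count, and the linear fade are all assumptions:

```python
class SpiceBottleGlow:
    """Simulates the augmented spice bottle's LED feedback:
    each detected shake dims the glow until the target amount
    of spice has been added."""

    def __init__(self, shakes_needed):
        self.shakes_needed = shakes_needed
        self.shakes_done = 0

    def register_shake(self):
        """Called when the tilt sensor detects one shake of spice."""
        if self.shakes_done < self.shakes_needed:
            self.shakes_done += 1

    @property
    def brightness(self):
        """Glow fades linearly from 1.0 (full) to 0.0 (done)."""
        return 1.0 - self.shakes_done / self.shakes_needed

    @property
    def done(self):
        return self.shakes_done >= self.shakes_needed


# Four shakes requested; the glow dims with each shake.
bottle = SpiceBottleGlow(shakes_needed=4)
for _ in range(4):
    bottle.register_shake()
print(bottle.brightness, bottle.done)  # 0.0 True
```

In the real prototype the brightness would drive a PWM output to the LED rather than a print statement.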

Knead you to knead me

David sees that Laura is frustrated kneading her dough, so he sends a tangible video of his hands kneading, which Laura sees overlaid on her dough. After observing David, she slides his hands aside and practices kneading in her newly learned style.

Video conferencing is the status quo of remote collaboration today. We asked whether it would be more meaningful and personal if the video could be overlaid on the context, in this case the dough, while also providing tangible feedback. The result is a co-located collaboration experience that remains remote.

Hack the knife

David is struggling to use his knife, so Laura helps him learn to cut by mimicking the cutting action on her side. David’s learning is augmented by audio and haptic feedback.

The kitchen has been the birthplace of tools: knives, utensils, and spoons have existed from the Stone Age to our age, yet beyond their basic functionality they have hardly evolved. Each of our tools carries a body of knowledge about how to use it and its best practices. We asked what it would mean to have tools that are self-aware, tools that teach and learn from you. Such tools not only connect to you but connect you to the other people who use them.

We created a working prototype, Shared Spoons, to explore what it means to embed knowledge in the tools we use. We instrumented two wooden spoons with six-degree-of-freedom accelerometer/gyroscope IMUs, which allowed us to determine the orientation of the spoons in 3D space. The spoons were connected to a 3D rendering package that provided visual feedback showing spoon orientation. As a master chef moves the spoon around, sensor data can be recorded so that the gesture of “whisk” can be differentiated from “stir,” for example. As master chefs stir many, many spoons, a knowledge repository of physical interactions with tools is collaboratively generated. The spoons can also be used synchronously. We demonstrated a scenario where a master chef stirs one of the Shared Spoons while the apprentice stirs the other. As the apprentice “matches” the velocity and orientation of the master’s spoon, the software generates a pleasant tone; when the spoons are not in harmony, a discordant tone sounds.
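The “matching” test can be illustrated with a simplified Python sketch. This is our own reconstruction, not the project’s actual code; the vector representation of orientation, the 15° tolerance, and the function names are all assumptions:

```python
import math

def angle_between(v1, v2):
    """Angle in radians between two 3-D orientation vectors
    (e.g. the spoon's handle axis estimated from the IMU)."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def spoons_in_harmony(master, apprentice, tolerance=math.radians(15)):
    """True when the apprentice's spoon tracks the master's closely,
    which would trigger the pleasant tone; otherwise the discordant one."""
    return angle_between(master, apprentice) <= tolerance

# The master tilts the spoon forward; the apprentice nearly matches.
master = (0.0, 0.7, 0.7)
close = (0.05, 0.68, 0.72)
far = (1.0, 0.0, 0.0)
print(spoons_in_harmony(master, close))  # True  -> pleasant tone
print(spoons_in_harmony(master, far))    # False -> discordant tone
```

A full implementation would also compare angular velocities from the gyroscopes, since the demo matched velocity as well as orientation.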

Time on my side

By interacting with the handle of the skillet, David sees the remaining cooking time. When David’s side is nearly ready, Laura’s kitchen buzzer goes off, telling her to leave for David’s home.

Temporal coordination is a key aspect of collaboration, and time is something that we humans aren’t good at keeping. What we have here is a collective workspace – the pan, the kitchen buzzer, even the cellphone – working in tandem. These objects, spread across distances, collaborate with each other so that we don’t need to actively worry about time, allowing us to focus on what matters most: cooking.
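The buzzer timing reduces to a small calculation. A minimal sketch, assuming the system knows the skillet’s remaining cook time and an estimated travel time (both the function name and the inputs are our assumptions):

```python
def buzzer_delay(cook_seconds_remaining, travel_seconds):
    """Seconds to wait before sounding Laura's buzzer so that her
    arrival coincides with the food being ready (never negative:
    if the trip takes longer than the cooking, buzz immediately)."""
    return max(0, cook_seconds_remaining - travel_seconds)

# 40 minutes left on David's skillet, a 25-minute trip to his home:
print(buzzer_delay(40 * 60, 25 * 60))  # 900 -> buzz in 15 minutes
```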

Related Work & Prior Art

Surfaces and Spaces
* ClearBoard (Ishii, 1992)
* Tangible Bits (Ishii, 1997)
* Double DigitalDesk (Wellner, 1993)

Cooking
* CounterActive (Ju, 2001)
* Home of the Future (Microsoft, 2003)
* Counter Intelligence Group (MIT Media Lab)

Objects
* Cooking with Elements (Bonanni)
* IntelligentSpoon (Cheng)
* ChameleonMug (Selker)

Individual Contributions

We assert that all team members shared work equally and fairly, collaborated on group efforts with enthusiasm, and also provided focused support in areas of expertise.

Laura was instrumental as Project Manager for the team and drove the development of our presentation along with the video script. Dhairya worked with Laura on the script for the video and also was responsible for shooting film. Christian was primarily responsible for editing the project video and creating the user interface simulations. David and Robert worked on the physical, working prototypes with David taking lead on the design and development of the Shared Spoons and Robert designing and implementing the augmented spice bottle.

All team members shared responsibilities on project direction and implementation.

[Presentation Slides]

 

This entry was posted in 2nd Project by jonathanspeiser.

About jonathanspeiser

Jonathan Speiser – website N/A
MIT Media Lab, Viral Spaces / MS1

Experience
I have considerable experience with Python and Java programming, and I also have some experience working with C. I have built some small-scale electronics projects for fun (e.g. a simple electric toy car). I am currently learning the Arduino platform to create more sophisticated projects and explore ideas.

Why
I am eager to learn, and I am motivated by the idea of creating more intuitive, physical interfaces that improve the user experience. I am interested in the areas of health care and communication, and my hope is that the class will serve as an inspiration to spur my creativity in these domains.

Art: ★★★★
Architecture: ★★★★
Craft/Fabrication: ★★★★
Design: ★★★★
DIY Electronics: ★★★
Electrical Eng.: ★★★★
Mech. Eng.: ★★★★
Programming/CS: ★★★

3 thoughts on “What’s cookin’?”

  1. Great presentation. It is tough to keep the delicate balance between visual augmentation and digital clutter when adding overlays to real objects. I liked it for showing the remote person’s actions when providing advice, but found the visual feedback of recognized food throughout your video futuristic-looking but kind of gimmicky.
    Augmented (hyper)Reality: Domestic Robocop https://vimeo.com/8569187

  2. I really liked the video overlay you showed of the remote person preparing the food; people who are learning to cook (like me) often watch cooking tutorial videos on youtube -while they cook- and it’d be great to have it be interactive, personalized, and reactive to my pace by means of having a live, remote collaborator.

  3. Thanks for the good presentation. Your video was very insightful in that it demonstrated how people can cooperate in a cooking context between remote agents. Now I am also wondering how we can establish a collaborative environment where people can transfer qualities that are hard to digitize, such as smells and flavors.
