MAS.834, MIT Media Lab, Fall 2012 (posts by jonathanspeiser)
http://mas834.media.mit.edu

Radical Textiles
http://mas834.media.mit.edu/2012/12/12/radical-textiles/
Posted Thu, 13 Dec 2012 by jonathanspeiser

Christian Ervin, David Nunez, Laura Perovich

Video (password protected): https://vimeo.com/55501151

Presentation slides: RadicalTextiles

Final Write-up: Radical Textiles Final Write-up

What if your shirt could change its form, color, and elasticity instantly?  Or your pile of dirty clothes could morph into sparkly shoes or a bicycle helmet?  Could fabric react to what’s happening in the world… or to what’s going on in our heads?

In our world of radical textiles, each stitch is a radical atom.  Conceptually, we can think of these stitches as a kind of stem cell of textiles.  They multiply and divide, merge, and change depending on where they are on the body.  They are also “perfect” in that they have ideal knowledge of what they should be and how to act in different situations.

What might this world look like?  In this future, a man could pull a tie out of his shirt collar if he were underdressed, shirts would grow into jackets as soon as we walked out into the cold, and we could design an outfit and fold it into our pocket to wear later.

Though these scenarios may seem far away, there has been growing interest in computational textiles in recent years, leading to a variety of projects, especially around movement, interpersonal relationships, and empowerment.  These include projects from the Media Lab such as PillowTalk, which allows people to connect remotely through soft objects, and DressCode, which uses textiles as a way to teach programming skills, especially to young girls.

We break down the world of radical textiles along a few dimensions, including use cases, modes of interaction, and types of textile changes.  We also explore the implications of a future that includes radical textiles.  They have the potential to improve our world through the conservation of materials, the democratization of design, and increased creativity.  Yet this substantial shift, a transition similar in scale to the recent move from landlines to cell phones, may have unforeseen consequences and impacts on society.

We assert that all team members shared work equally and fairly, collaborated on group efforts with enthusiasm, and also provided focused support in areas of expertise.  All team members shared responsibilities on project direction and implementation.  Laura had primary responsibility for the development of our presentation, David for the physical prototypes, and Christian for the video editing.

What’s cookin’?
http://mas834.media.mit.edu/2012/10/31/whats-cookin/
Posted Wed, 31 Oct 2012 by jonathanspeiser

Dhairya Dand, Christian Ervin, Robert Hemsley, David Nuñez, Laura Perovich

What’s cookin’? is a collaborative cooking system that helps people create meals together even when they’re apart. It is a collection of augmented kitchen tools, surfaces, and computational representations of meals.

Video

Cooking is a social, shared, and sensory experience. The best meals bring friends together and engage all of our senses, from the sizzling of sautéing onions to the texture of toasted bread.

However, sometimes it’s inconvenient for people to come together to share the creative process; kitchen space can be limited, heavy cooking tools may be difficult to transport, and commitments in the home, such as childcare, may discourage people from relocating.

Our tool allows people to cook together in a remote, synchronous, collaborative environment by mirroring the social, shared, and sensory nature of the cooking experience. It is particularly suited for collaborations that involve multiple parallel processes that work in isolation but also intersect at various times throughout the collaboration, most notably at the end when a meal is produced.

We demonstrate example interactions through a series of video sketches and physical prototypes based on friends coming together to share a meal. Our tool can also be employed in other environments, such as the industrial kitchen or food television. Our model of collaboration extends to larger crafts, such as fashion or woodworking, and serves as a step toward a broader framework in which embedding knowledge in objects through interaction creates a Wikipedia for the physical world.

From a Table Away

In our collaborative kitchen environment, the tabletop is the mediator that brings the remote collaborators together and bridges the interaction between our tangible utensils, our physical actions, and the environment. Its embedded ambient displays serve as the communication hub between chefs and render augmented information about objects placed on the surface.

To reduce information overload between the local and remote collaborators, we use an ambient display that enables glanceable awareness of the remote user’s tabletop activity. Through this, users can be made aware of the remote party’s progress through a recipe or be alerted when a collaborator is struggling with a specific task. If a user wishes to interact directly with the remote workspace, they can adjust the focus of their countertop, bringing the impression of the remote environment into the foreground of their own workspace.

The surfaces help to create a shared workspace that enables collaborative teaching and fosters a sense of co-location and teamwork. It provides ambient notifications of progress on the entire meal, but does not prevent any individual cook from working on his or her own tasks. Through this we ensure a shared awareness of the ingredients, the tasks being undertaken, and how these interrelate, which together creates the experience of co-located interaction on one shared tabletop.
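
As a rough sketch of the kind of shared state that could sit behind such an ambient display, the snippet below models each cook’s progress and reduces the remote cook’s state to a single glanceable line. The `CookProgress` fields and the summary format are illustrative assumptions, not the system’s actual design.

```python
# A minimal sketch of the shared state behind the ambient display
# (data model and summary text are assumptions, not the real system).
from dataclasses import dataclass
from typing import Optional

@dataclass
class CookProgress:
    name: str
    steps_total: int
    steps_done: int = 0
    struggling_with: Optional[str] = None  # set when a task needs help

    def fraction_done(self) -> float:
        return self.steps_done / self.steps_total

def ambient_summary(remote: CookProgress) -> str:
    """Reduce the remote cook's state to one glanceable line."""
    if remote.struggling_with:
        return f"{remote.name} needs help with: {remote.struggling_with}"
    return f"{remote.name} is {remote.fraction_done():.0%} through their steps"

# Example: David's countertop glances at Laura's progress.
laura = CookProgress(name="Laura", steps_total=8, steps_done=5)
print(ambient_summary(laura))                     # "Laura is 62% through their steps"
laura.struggling_with = "tempering the chocolate"
print(ambient_summary(laura))                     # alert shown on the ambient display
```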

Spice Spice Baby

The use of spices within this recipe extends the collaboration from the tabletop into the physical environment. Objects that have shared meaning and affordances are mirrored into the remote physical location, allowing users to seamlessly share interactions and knowledge.

When Robert buys a bottle of wine that has particularly robust notes, What’s cookin’? helps the chefs collaborate to improve the meal by altering the shared recipe to better match the wine. The system alerts the chefs about the wine choice, and Laura, an expert on wine and food pairings, indicates to David that he should adjust the spices in his marinara sauce. Laura locally taps the bottles she recommends, and the necessary types and quantities are mirrored into David’s environment. David sees his own spice bottles glow, and as he adds the spices to his pot, each bottle’s glow slowly fades until he has deposited the correct amount.

This tool enables more natural collaboration, as users are able to physically use their bodies to interact with the objects as they would if they were performing the same task locally. This allows users to draw upon their existing mental models and kinetic memory to help recall which spices to use within the interaction. We re-use existing tools and objects, enabling the seamless continuation of existing practices while sharing knowledge between these mirrored objects.

The self-aware bottle also records its interactions, allowing the local user to capture their own actions and replay the information at a later time.

We created a physical prototype of this interaction that demonstrated the experience with a spice bottle; it glowed to indicate the need to add a spice. A tilt sensor inside the device tracked when the user successfully added spice; the color of the glow changed and eventually faded in response to the correct number of shakes.
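
As a rough illustration of that shake-and-fade behavior, here is a minimal Python sketch. The `read_tilt` and `set_led_brightness` functions and the `TARGET_SHAKES` threshold are hypothetical stand-ins, not the prototype’s actual code.

```python
# A minimal sketch of the spice-bottle glow logic (hypothetical hardware API).
import random
import time

TARGET_SHAKES = 6  # assumption: shakes needed to deposit the "correct amount"

def read_tilt():
    # Stand-in for the prototype's tilt-sensor read; simulated here so
    # the sketch runs on its own.
    return random.random() < 0.2

def set_led_brightness(value):
    # Stand-in for driving the bottle's LED.
    print(f"LED brightness: {value}")

def run_bottle():
    shakes = 0
    set_led_brightness(255)  # full glow: spice still needs to be added
    while shakes < TARGET_SHAKES:
        if read_tilt():  # a shake was detected
            shakes += 1
            # fade the glow in proportion to progress toward the target
            set_led_brightness(int(255 * (1 - shakes / TARGET_SHAKES)))
        time.sleep(0.05)  # debounce / polling interval

if __name__ == "__main__":
    run_bottle()
```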

Knead you to knead me

David sees that Laura is frustrated kneading her dough, so he sends a tangible video of his hands kneading, which Laura can see overlaid on her dough. After observing David, she slides his hands aside and practices kneading with her newly learned technique.

Video conferencing is the status quo of remote collaboration today. The question we asked was whether it would be more meaningful and personal if the video could be overlaid on the context, in this case the dough, while at the same time providing tangible feedback. This leads to a co-located collaboration experience while still being remote.

Hack the knife

David is struggling to use his knife, so Laura helps him learn to cut by mimicking the cutting action at her side. David’s learning is augmented by audio and haptic feedback.

The kitchen has been the birthplace of tools. Knives, utensils, and spoons have existed from the Stone Age to our age; aside from their functionality, they have hardly evolved. Each of our tools is associated with a body of knowledge about how to use it and its best practices. We asked what it would be like to have tools that are self-aware, tools that teach and learn from you. These tools not only connect to you but connect you to other people who use them.

We created a working prototype, Shared Spoons, to explore what it means to embed knowledge in the tools we use. We instrumented two wooden spoons with six-degree-of-freedom accelerometer/gyroscope IMUs, which allowed us to determine the orientation of the spoons in 3D space. The spoons were connected to a 3D rendering package that provided visual feedback showing spoon orientation. As a master chef moves the spoon around, sensor data can be recorded so that the gesture of “whisk” can be differentiated from “stir,” for example. As master chefs stir many, many spoons, a knowledge repository of physical interactions with tools is collaboratively generated. The spoons can also be used synchronously. We demonstrated a scenario where a master chef stirs one of the Shared Spoons while the apprentice stirs the other. As the apprentice matches the velocity and orientation of the master’s spoon, the software generates a pleasant tone; when the spoons are not in harmony, a discordant tone sounds.
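
As a rough sketch of that matching step, the snippet below compares a master and an apprentice reading (orientation plus angular speed) and chooses a consonant or clashing tone. The field names, tolerances, and tone frequencies are assumptions for illustration, not the prototype’s actual implementation.

```python
# Minimal sketch of the Shared Spoons "harmony" check (names and thresholds
# are hypothetical). Each IMU sample is reduced to an orientation
# (roll, pitch) plus an angular speed derived from the gyro.
import math

def in_harmony(master, apprentice, angle_tol_deg=15.0, speed_tol_dps=30.0):
    """master/apprentice: dicts with 'roll', 'pitch' (degrees) and
    'speed' (degrees per second)."""
    angle_err = math.hypot(master["roll"] - apprentice["roll"],
                           master["pitch"] - apprentice["pitch"])
    speed_err = abs(master["speed"] - apprentice["speed"])
    return angle_err < angle_tol_deg and speed_err < speed_tol_dps

def feedback_tone(harmonious):
    # Stand-in for the audio feedback: a consonant vs. clashing pitch.
    return 440.0 if harmonious else 466.2  # A4 vs. a dissonant A#4

# Example: the apprentice is close in orientation but stirring too fast.
master = {"roll": 10.0, "pitch": -5.0, "speed": 90.0}
apprentice = {"roll": 14.0, "pitch": -2.0, "speed": 150.0}
print(feedback_tone(in_harmony(master, apprentice)))  # prints the clashing tone
```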

Time on my side

By interacting with the handle of the skillet, David sees the remaining cooking time. When David’s side is about ready, Laura’s kitchen buzzer goes off, telling her to leave for David’s home.

Temporal coordination is a key aspect of collaboration, and time is something that we as humans aren’t good at keeping. What we have here is a collective workspace: the pan, the kitchen buzzer, even the cellphone work in tandem. These objects, spread across distances, collaborate with each other so that we don’t need to actively worry about time, allowing us to focus on what’s most important: cooking.

Related Work & Prior Art

Surfaces and Spaces
* ClearBoard (Ishii, 1992)
* Tangible Bits (Ishii, 1997)
* Double DigitalDesk (Wellner, 1993)

Cooking
* CounterActive (Ju, 2001)
* Home of the Future (Microsoft, 2003)
* Counter Intelligence Group (MIT Media Lab)

Objects
* Cooking with Elements (Bonanni)
* IntelligentSpoon (Cheng)
* ChameleonMug (Selker)

Individual Contributions

We assert that all team members shared work equally and fairly, collaborated on group efforts with enthusiasm, and also provided focused support in areas of expertise.

Laura was instrumental as Project Manager for the team and drove the development of our presentation along with the video script. Dhairya worked with Laura on the script for the video and was also responsible for shooting film. Christian was primarily responsible for editing the project video and creating the user interface simulations. David and Robert worked on the physical, working prototypes, with David taking the lead on the design and development of the Shared Spoons and Robert designing and implementing the augmented spice bottle.

All team members shared responsibilities on project direction and implementation.

[Presentation Slides]

 

IdeasInMotion
http://mas834.media.mit.edu/2012/09/26/ideasinmotion/
Posted Thu, 27 Sep 2012 by jonathanspeiser

Slides PDF: Perovich_presentationFinal

Movement-based ideation can be difficult to capture since motion is fleeting and experiential.  IdeasInMotion is a system that supports dance choreographers developing routines by helping them collect their ideas, restructure the moves into a routine, experience the new choreography, and share the results with remote collaborators.  The system consists of sensors and haptic feedback devices in clothing that are used to document, re-experience, and share motion, paired with a computer-based interface for sorting and re-structuring the moves.  The sensors document the choreographer’s movements as he brainstorms, and a board collects the data and wirelessly sends it to the computer, where it is intelligently divided into segments.  Using the computer interface, the choreographer can rearrange the moves to create a new combination.  He can then experience the resulting routine through haptic feedback in the clothing.  Once he’s pleased with the result, he uses the computer interface to create a mirror image of the routine that represents the follower’s experience.  He sends this to his dance partner so she can learn the routine “naturally” by touch, as she would if they were collaborating in person.
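
As a sketch of how the division into segments might work, the snippet below splits a stream of per-sample movement energy into candidate moves wherever activity pauses for a while. The energy measure, threshold, and pause length are assumptions, not the system’s actual algorithm.

```python
# A minimal sketch of pause-based motion segmentation (assumed approach).
def segment_moves(energy, threshold=0.1, min_pause=10):
    """energy: per-sample movement magnitude (e.g. summed accelerometer
    deltas). Returns (start, end) index pairs for each detected move."""
    moves, start, quiet = [], None, 0
    for i, e in enumerate(energy):
        if e >= threshold:
            if start is None:
                start = i           # movement begins
            quiet = 0
        elif start is not None:
            quiet += 1
            if quiet >= min_pause:  # pause is long enough: close the segment
                moves.append((start, i - quiet + 1))
                start, quiet = None, 0
    if start is not None:
        moves.append((start, len(energy)))  # last move runs to end of stream
    return moves

# Example: two bursts of motion separated by a pause.
stream = [0.0] * 5 + [0.8] * 20 + [0.0] * 15 + [0.6] * 25 + [0.0] * 5
print(segment_moves(stream))  # -> [(5, 25), (40, 70)]
```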
