MAS.834 » Student Projects
MAS.834, MIT Media Lab, Fall 2012 (http://mas834.media.mit.edu)

Wave Alchemy
Posted 14 Dec 2012 by Jacqueline Kory (http://mas834.media.mit.edu/2012/12/13/wave-alchemy/)

Life is full of moments that come with obvious or subtle expressions of energy, and as human beings we commonly attach emotions to those expressions. Yet when we want to capture and interact with them, we are often constrained to flat, 2D encapsulations: video, audio, or photographic recordings. Moreover, in this digital age we usually look back at a memory through a screen, with hundreds of files stored away, which further removes us from the emotion of the event. What if we could experience this emotional energy again and interact with it dynamically, in infinitely varied ways? Here we present a concept and prototype that explores a novel physical-visual language of dynamic, emotionally expressive waveforms, designed to transform the way we perceive different forms of energy as we go about our daily lives. With the power of computation hidden within the physical materials of the interface, we create an interactive form made of Radical Atoms that can take one form of energy and transmute it into a waveform as its output: Wave Alchemy.

[Images: waveform created by connecting the beads; water; user interaction; music; dance; crowd cheer; construction; clapping; ambience and surrealism]
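As a rough illustration of the transmutation idea (a sketch only, not the prototype code), the snippet below maps the amplitude envelope of a sampled input signal, such as music or a cheering crowd, onto target heights for a row of beads. The bead count, travel range, and demo signal are hypothetical.

```python
# Minimal sketch (not the course prototype): map the amplitude envelope of a
# sampled input signal to target heights for a row of actuated beads.
# NUM_BEADS, MAX_HEIGHT_MM, and the demo signal are hypothetical placeholders.
import numpy as np

NUM_BEADS = 16          # assumed number of beads in the physical waveform
MAX_HEIGHT_MM = 80.0    # assumed travel range of each bead actuator

def signal_to_bead_heights(samples, num_beads=NUM_BEADS):
    """Split the signal into one window per bead and use each window's
    RMS amplitude as that bead's target height."""
    windows = np.array_split(np.asarray(samples, dtype=float), num_beads)
    rms = np.array([np.sqrt(np.mean(w ** 2)) for w in windows])
    peak = rms.max() if rms.max() > 0 else 1.0
    return (rms / peak) * MAX_HEIGHT_MM

if __name__ == "__main__":
    # Stand-in for a second of captured energy (music, clapping, a crowd cheer).
    t = np.linspace(0, 1, 8000)
    samples = np.sin(2 * np.pi * 3 * t) * np.random.uniform(0.5, 1.0, t.size)
    print(np.round(signal_to_bead_heights(samples), 1))  # one height per bead, in mm
```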

Team

Dan Sawada – Ideation, physical prototyping, programming, interaction design, video production and editing
Anirudh Sharma – Ideation, video production and editing, programming, partial implementation of a concept that was abandoned
Sujoy Kumar Chowdhury – Ideation, physical prototyping, programming, research, video production
Christine Hsieh – Ideation, research, presentation, physical prototyping, video production, interaction design for a concept that was abandoned
Andrea Miller – Ideation, research, presentation, physical prototyping, video production

Final Paper

CHI-extended-abstract_wave FINAL

Final Presentation

Wave Alchemy presentation slides

 

 

(re)place
Posted 14 Dec 2012 by sophiachang (http://mas834.media.mit.edu/2012/12/13/replace/)

STATEMENT

 

The tendency in the creation of digital technologies has been to focus on designing tools that allow digital information to augment the physical: physical first, digital as an added layer. This direction of thinking is evident in early examples such as the garage door opener, which creates convenience by automating a previously existing physical object. The same trend also continues in more recent explorations of ways to integrate digital tools into the experience of our physical world.

Augmented reality, a growing area within human-computer interaction, is a distinct example of digital augmentation, as the objects being augmented are often separate from the augmenting tool. Using screens or glasses, digital information can be overlaid on any object viewed in the physical world, augmenting it with additional digital information.

As another example, the Tangible Media Group at MIT has researched means of augmentation that are more closely tied to the manipulation of physical form. In the Relief project [3], the digitally projected image of a mountain adds visual information for understanding the topographical forms that can be displayed and manipulated on the project's shape-changing surface.

Both of these examples overlay a digital image on the physical, enabling access to additional information that cannot be obtained directly from the objects themselves. The inclination to explore the digital as an added layer of augmentation follows the development of all new technologies, which are always created in relation to what already exists, and overlay is one of the easiest ways to bring digital affordances into the world. However, now that we are increasingly familiar with the digital, it is also important to reverse the question and ask how physical means might augment a body of digital information in space.

This paper introduces an interface that selectively materializes digital information within space, allowing one to tangibly work with a specific portion of a larger body of digital information. The name (re)place refers both to the replacement of abstract digital information by physical tangibility and to the adjustable placement of a physical slice within space as a way of deciding what area to materialize.
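To make the idea of selective materialization concrete, here is a minimal sketch under assumed data structures (not the project's implementation): a larger body of digital information is stored as a 3D array, and only an adjustable slice of it is extracted for tangible display.

```python
# Minimal sketch (assumed data structures, not the (re)place code): extract the
# slice of a 3D "body of digital information" at an adjustable position, so only
# that portion is handed to the physical display.
import numpy as np

def extract_slice(volume, position, axis=2):
    """Return the 2D slice of `volume` at the given index along `axis`,
    clamped to the volume bounds."""
    position = int(np.clip(position, 0, volume.shape[axis] - 1))
    return np.take(volume, position, axis=axis)

if __name__ == "__main__":
    volume = np.random.rand(64, 64, 64)       # stand-in digital information in space
    tangible = extract_slice(volume, position=20)
    print(tangible.shape)                      # (64, 64): the portion made physical
```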

 

Replace_(CHI-Formatted Paper)

(re)place Video

 

Contributors:

Sophia Chang -  Ideation, Presentation, Content Generation, Physical Prototyping

Jifei Ou - Ideation, Presentation, Physical Prototyping

Sheng Kai Tang - Ideation, Presentation, Software Prototyping, Physical Prototyping

 

 

synchroLight
Posted 13 Dec 2012 by sophiachang (http://mas834.media.mit.edu/2012/12/13/synchrolight/)

 

STATEMENT

In current video communication scenarios, the shared working spaces of the participants are physically detached. This discontinuity leads to many inconveniences and problems, one of which is how to point at or indicate physical objects across the remote sites. We were inspired by the phenomenon of light penetrating one side of a pane of glass and illuminating objects on the other side. In this case, the penetrating light becomes a medium that facilitates better communication.

 

SYSTEM DESCRIPTION

For this assignment, we sought to re-create this experience in a remote communication scenario and demonstrate a vision in which light can travel through the virtual world and reach out into the physical one. In our system, a user in a video call takes a flashlight and points at any position on the screen. The 2D coordinates of that position are captured and transmitted to the remote side, where a projector simulates a light beam at the same coordinates. In this way, participants can intuitively point at and annotate content in the physical world.
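One way such a pipeline could be assembled is sketched below (an assumption-laden sketch, not our implementation): the brightest spot in a camera view of the local screen is taken as the flashlight position, and its normalized coordinates are sent to the remote side, where a projector would render a beam at the same position. The remote address and brightness threshold are placeholders.

```python
# Sketch of the pointing pipeline, assuming a webcam watching the local screen
# and a UDP link to the remote site. The address, port, and threshold are
# hypothetical; the projection code on the remote side is not shown.
import json
import socket
import cv2

REMOTE = ("remote-site.example", 9000)   # placeholder address of the remote projector PC
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (11, 11), 0)
    _, max_val, _, (x, y) = cv2.minMaxLoc(gray)   # brightest spot ~ flashlight position
    if max_val > 200:                             # assumed brightness threshold
        h, w = gray.shape
        coords = {"x": x / w, "y": y / h}         # normalized 2D coordinates
        sock.sendto(json.dumps(coords).encode(), REMOTE)
        # The remote side would map these coordinates into projector space
        # and render a simulated light beam at the same position.
```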

 

APPLICATION

The metaphor of penetrating light can also be applied to other scenarios. We envisioned two potential applications.

1. Urp++

A remote version of Urp, an earlier project from the Tangible Media Group, Urp++ lets users place a phicon as a light source to synchronize the ambient lighting across two workbenches. When the light source changes position, the simulated shadows on both sides respond to the new position. Users can also pass the light source to each other remotely (see the sketch below).

 

2. X-ray

In this application, the simulated light beam not only points at the remote physical object but can also reveal hidden information inside or behind it. The simulated light becomes an X-ray.

 

PRESENTATION SLIDES

synchroLight Video

 

Contributors:

Jifei Ou - Ideation, Presentation, Physical Prototyping

Sheng Kai Tang - Ideation, Presentation, Software Prototyping

Obake
Posted 13 Dec 2012 by Jesse Austin-Breneman (http://mas834.media.mit.edu/2012/12/13/obake/)

Inspired by the tactility of solids and the fluidity of liquids, we have created Obake, which embodies a solid, shape-changing form with a liquid exterior affording fluid-like interactions. Obake objects are 3D wireframe shapes with the capacity to morph into new solid forms while maintaining a fluid exterior, enabling new interaction opportunities.

We have outlined the key interactions and gestures for an Obake surface, as illustrated in our presentation: intrude, extrude, prod, pull, push, friction, compress, expand, warp, stitch, and s-bend.

We have currently implemented a 2.5D display based on the Obake principles, which we use to demonstrate a geographic data viewer. Current features include exploring layers in the terrain both laterally (soil and water profile) and linearly (vegetation profile), pulling out shapes (buildings) and prodding them (a mountain) to explore detail, using the elastic texture to create and customize paths (a river and its thickness), simulating processes through friction (fire), and morphing display affordances according to the shapes (physical extrusion of terrain from a 2D surface into an elevated 3D environment with a depth-based soil profile).
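As a simple illustration of how the viewer's gesture set could be wired to these behaviours (the handler names below are hypothetical and not taken from our implementation), a dispatch table maps each recognized surface gesture to the corresponding geographic action.

```python
# Illustrative sketch only: dispatch recognized Obake surface gestures to the
# geographic-viewer behaviours described above. All handlers are placeholders.
def pull(x, y):
    print(f"extrude building at ({x}, {y})")

def prod(x, y):
    print(f"reveal detail of mountain at ({x}, {y})")

def stretch(x, y):
    print(f"widen river near ({x}, {y})")

def friction(x, y):
    print(f"simulate fire spreading from ({x}, {y})")

GESTURE_HANDLERS = {
    "pull": pull,          # pull out shapes such as buildings
    "prod": prod,          # prod a mountain to explore detail
    "stretch": stretch,    # use the elastic texture to customize a river
    "friction": friction,  # rub the surface to simulate fire
}

def on_gesture(name, x, y):
    handler = GESTURE_HANDLERS.get(name)
    if handler:
        handler(x, y)

on_gesture("prod", 12, 34)
```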

Presentation: Obake

Paper: Obake

Contributions:
Rob Hemsley – Ideation, Presentation, Physical Prototyping
Dhairya Dand –  Ideation, Presentation, Software Prototyping

PURSeus
Posted 13 Dec 2012 by zacharybarryte (http://mas834.media.mit.edu/2012/12/13/purseus/)

PURSeus is a bag that can change its size, shape, and material, split into multiple bags, automatically deliver items, and understand the user's schedule and needs. These features eliminate the need to own several bags for different uses, because PURSeus understands its contents and transforms itself into an all-purpose bag. The theoretical material that forms PURSeus could be extended to any type of smart container the user requires.

PURSeus Presentation

PURSeus (CHI-Formatted Paper)

Contributions:
Jason Gao – Physical prototyping, presentation, ideation
Anjali Muralidhar – Physical prototyping, video presentation, presentation, ideation
Samvaran Sharma – MATLAB-based voice recognition, prototyping, ideation
Henry Skupniewicz – Physical prototyping, ideation
Helena Hayoun Won – Video presentation, presentation, ideation

ArTouch
Posted 13 Dec 2012 by lieber (http://mas834.media.mit.edu/2012/12/13/artouch/)

In ArTouch, we explore how Radical Atoms can change the paradigms through which we experience art and paintings.  We challenge the convention of simply looking at a painting by inviting the visitor to touch and interact with our art in new, tangible ways, creating a richer experience.  In this way, we create not only evolving pieces of art, but also art that becomes collaborative as different visitors interact with the piece over the course of the day.

The video below presents our vision for a few of the interactions made possible by our system.

In making the video, we concentrated on three main "implements" for interacting with the art: the hands, the face, and the mouth.  We feel that these implements make the interactions natural and ubiquitous.  In the video we highlighted three main interactions.  The first was the ability to reveal hidden layers of a painting by blowing away the top layers; these underlying layers could reveal an artist's method for creating a work, or specially created textures left there by the artist for the visitor to discover.  Second, visitors participated in collaboratively creating a work by changing its physical texture: using their hands, visitors could smooth out sections of the piece, transitioning from rough rock to smooth clay and finally to the shiny bottom layer.  Over time, the roughness returns, indicating how long it has been since the last interaction and giving new visitors an opportunity to interact with the piece.  Finally, the last piece of art invited the visitor to explore their own emotional state through abstract textures: by sensing the visitor's mood, the painting could change its texture and physical resistance to motion accordingly.

Physical prototype showing underlying mechanisms used to create textures on the left. The right side shows an enlarged area of the image for closer examination of local texture. This area is remotely linked to textures in the first panel as seen in the second image.

In our physical prototype, seen in the images below, we wanted to highlight some ideas that were not presented in the vision video, or only hinted at.  The first was the idea of remote interactions: using a set of textured wheels coupled by an elastic belt, we created a dynamic texture that responds to changes in either picture.  We also explored interactions that change based on previous input, through the use of a Peltier device that heats or cools depending on the state that the previous user determined.
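A tiny sketch of the state-dependent Peltier idea follows (the driver call is a placeholder stub, not our prototype code): the thermal response a new visitor feels depends on the state the previous visitor left behind.

```python
# Sketch of state-dependent thermal feedback. drive_peltier() is a hypothetical
# stand-in for the real H-bridge / driver hardware; it only prints here.
LAST_STATE = {"mood": "warm"}   # persisted from the previous interaction

def drive_peltier(direction):
    # Placeholder for an actual driver call; prints instead of actuating.
    print(f"Peltier current direction: {direction}")

def on_new_visitor():
    # The surface heats or cools according to what the previous user determined.
    drive_peltier("heat" if LAST_STATE["mood"] == "warm" else "cool")

def on_visitor_input(mood):
    LAST_STATE["mood"] = mood    # the next visitor will feel this choice

on_new_visitor()
on_visitor_input("cool")
```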

We feel that ArTouch will not only revolutionize how art is experienced in museums, but will also bring more art into personal spaces through remote collaboration.  This collaboration will create a more personal piece of art that can be enjoyed in new ways every day, making for a more exciting, tangible experience.

For more information regarding our project, please take a look at our presentation slides (PDF) or paper (PDF).

Contributions:
Jessie Austin-Breneman – Physical prototyping, presentation
Zachary Barryte – Physical prototyping, presentation
Eric Jones – Physical prototyping, presentation
Woong Ki Sung – Vision video making, presentation
Wenting Guo – Vision video, presentation
Trygve Wastvedt – Vision video, presentation

Radical Textiles
Posted 13 Dec 2012 by jonathanspeiser (http://mas834.media.mit.edu/2012/12/12/radical-textiles/)

Christian Ervin, David Nunez, Laura Perovich

Video (password protected): https://vimeo.com/55501151

Presentation slides: RadicalTextiles

Final Write-up: Radical Textiles Final Write-up

What if your shirt could change its form, color, and elasticity, instantly?  Or your pile of dirty clothes could morph into sparkly shoes or a bicycle helmet?  Could fabric react to what's happening in the world… or to what's going on in our heads?

In our world of radical textiles, each stitch is a radical atom.  Conceptually, we can think of these stitches as a kind of stem cell of textiles.  They multiply and divide, merge, and change depending on where they are on the body.  They are also “perfect” in that they have ideal knowledge of what they should be and how to act in different situations.

What might this world look like?  In this future, a man could pull a tie out of his shirt collar if he were underdressed, shirts would grow into jackets as soon as we walked out into the cold, and we could design an outfit and fold it into our pocket to wear later.

Though these scenarios may seem far away, there has been growing interest in computational textiles in recent years, leading to a variety of projects, especially around movement, interpersonal relationships, and empowerment.  These include Media Lab projects such as PillowTalk, which allows people to connect remotely through soft objects, and DressCode, which uses textiles as a way to teach programming skills, especially to young girls.

We break down the world of radical textiles along a few dimensions, including use cases, modes of interaction, and types of textile changes.  We also explore the implications of a future that includes radical textiles.  They have the potential to improve our world through the conservation of materials, the democratization of design, and increased creativity.  Yet this substantial shift, a transition similar in scale to the recent shift from landlines to cell phones, may have unforeseen consequences and impacts on society.

We assert that all team members shared work equally and fairly, collaborated on group efforts with enthusiasm, and also provided focused support in areas of expertise.  All team members shared responsibilities on project direction and implementation.  Laura had primary responsibility for the development of our presentation, David for the physical prototypes, and Christian for the video editing.

PersonaBench
Posted 12 Dec 2012 by samvaran (http://mas834.media.mit.edu/2012/12/12/personabench/)

PersonaBench Extended Abstract

Description: We explore the design of socially dynamic furniture that adapts its form to maximize interaction in public spaces. Individuals in today's public spaces are increasingly isolated by their technological devices: MP3 players, smartphones, and tablets erect social barriers that inhibit interpersonal interaction. By rewarding and promoting emergent cooperative behavior, our furniture is designed to foster and catalyze connections between people.

Final presentation: PersonaBench

Stop-motion video: http://youtu.be/1NtT74SaQh8

Team:
Shawn Conrad
Research, Presentation, Happy/sad bench storyboard, Abstract
Lauren Kim
Research, Presentation, Stop motion video [photoshop], Bench fabrication, Abstract
Jacqueline Kory
Research, Presentation, Stop motion video [compilation, music], View bench storyboard, Abstract
Adina Roth
Research, Presentation, Stop motion video [photoshop], Bench fabrication, Abstract
Jonathan Speiser
Research, Presentation, Bench fabrication, Abstract

Magonote
Posted 10 Nov 2012 by Jacqueline Kory (http://mas834.media.mit.edu/2012/11/10/magonote/)

Magonote is a concept for a collaborative scratching experience. 'Magonote' (〜孫の手〜) is a Japanese word for a backscratcher tool. The system we propose comprises a Magonote-enabled chair and a stuffed animal through which a remote person can participate in scratching.

[Images: Magonote; social grooming; koala; Magonote chair; koala with blinking LED; Magonote-enabled chair]

Scratching is a reflex response to an itch, and one for which we do not usually depend on others, so it is easy to miss the casual bonding that can happen through scratching. In fact, scratching is a common social grooming activity among a number of primates. From this observation, we wanted to design a novel experience that uses scratching as a social object and a medium for reconnecting with friends and family.

The concept video above demonstrates an example scenario. Arun could really use a good scratch. He remembers that his friend Bill used to help him out, particularly when Arun's hands could not easily reach the itch. Unfortunately, Bill no longer lives in his city. Arun knows what to do in this situation: he comes across a Magonote chair nearby and decides to give it a try. At the same time, in a different city, Bill is reading a book. The stuffed koala that Arun gave him suddenly starts nodding its head. Bill brings it closer to see why and notices that the LEDs on the back of the koala are blinking. From the blinking pattern, he recognizes an incoming 'scratch request'. He acknowledges the request by giving the koala a good scratch, and the Magonote attached to Arun's chair is immediately activated. Arun realizes that it is Bill. Arun likes the way Bill scratches, but the scratch location is slightly off from the itch, so he signals this by rubbing his back against the chair. Bill notices that a single LED at the top-center of the koala is now fading in and out, and he starts scratching around that particular LED; the Magonote arm changes the scratch position accordingly. Once Arun is satisfied with the scratch session, he leaves the chair. At Bill's end, no more lights are blinking, and Bill puts the koala back in its original position.

Here we have used the scratch interaction as a metaphor for casual bonding. The stuffed animal is a ghost representation of someone dear to us and a metaphor for attention seeking. The dyadic interaction between the remote users takes place in their personal physical spaces. The capabilities of the chair include transmission of initial presence information, actuation of the robotic Magonote, and scratch-location gesture detection through pressure sensors. The features of the stuffed animal are presence notification through nodding of the head, scratch-intent notification through LED blinking, scratch-position notification through LED fading, and a scratch-sensing surface. We implemented the LED array controls using an embedded Arduino and simulated the rest using the 'Wizard of Oz' technique.
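As a rough sketch of how the LED notifications could be driven from a host computer (the serial port and the one-line command protocol below are hypothetical, not the firmware interface we used), a small script could blink the whole array for a scratch request and fade a single LED to indicate the scratch position.

```python
# Sketch of the koala's LED notifications, assuming the embedded Arduino listens
# for simple text commands over serial. Port name and protocol are placeholders.
import time
import serial  # pyserial

ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)  # assumed Arduino port

def notify_scratch_request():
    """Blink the whole LED array to signal an incoming scratch request."""
    for _ in range(5):
        ser.write(b"ALL ON\n")
        time.sleep(0.3)
        ser.write(b"ALL OFF\n")
        time.sleep(0.3)

def notify_scratch_position(led_index):
    """Fade a single LED in and out to indicate where the itch is."""
    for level in list(range(0, 256, 16)) + list(range(255, -1, -16)):
        ser.write(f"FADE {led_index} {level}\n".encode())
        time.sleep(0.02)

notify_scratch_request()
notify_scratch_position(4)   # e.g. the top-center LED
```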

Team

Dan Sawada
Research, Design and laser-cutting of Magonote, Prototyping (programming, electronics), Concept video production
Anirudh Sharma
Research, Concept video production, Prototyping (programming, electronics)
Sujoy Kumar Chowdhury
Research, Prototyping (programming, electronics), Interaction Design, Concept video production

Related work

  • Scratch Input by Chris Harrison, Scott Hudson, UIST 2008
  • inTouch by Scott Brave, Andrew Dahley, and Professor Hiroshi Ishii, 1998
  • Hug Shirt by CuteCircuit, 2006

Constellation
Posted 31 Oct 2012 by champika (http://mas834.media.mit.edu/2012/10/31/constellation/)

Project Description:

The goal of our project, Constellation, is to demonstrate an interface that motivates and guides collaborative motion. Collaborative motion covers a cornucopia of activities, from cooking to dancing to swimming to yoga. We focus specifically on dance, and particularly on flash mobs, as an application ripe for our interface. We examined the natural behavior of swarms, nature's collaborative motion, and looked to the motion cues of animals such as bees and ants as the basis of our prototype. We developed a system that tracks the synchronization of movement among proximal users: as more users move their limbs in sync, the corresponding movement indicators (LEDs) become increasingly brighter. This approach creates an incentive-based reward system that encourages users to move in synchrony (e.g., in a flash mob), and it also creates an artistic effect that enhances the overall aesthetic experience.
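One plausible way to compute such a synchronization score is sketched below (assumed data shapes, not the prototype's actual firmware): recent accelerometer windows from nearby users are correlated pairwise, and the mean correlation drives the LED brightness.

```python
# Illustrative sketch: estimate how synchronized proximal users' limb movements
# are by correlating recent accelerometer windows, then map the group score to
# an LED PWM brightness. Data shapes and the demo signal are assumptions.
import numpy as np
from itertools import combinations

def sync_score(windows):
    """windows: array of shape (n_users, n_samples) of recent accelerometer
    magnitude readings. Returns the mean pairwise correlation in [0, 1]."""
    scores = []
    for a, b in combinations(range(len(windows)), 2):
        r = np.corrcoef(windows[a], windows[b])[0, 1]
        scores.append(max(r, 0.0))   # treat anti-phase motion as unsynchronized
    return float(np.mean(scores)) if scores else 0.0

def led_brightness(windows, max_pwm=255):
    return int(sync_score(windows) * max_pwm)

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 100)
    users = np.vstack([np.sin(t + np.random.normal(0, 0.2)) for _ in range(4)])
    print(led_brightness(users))   # brighter as the group moves more in sync
```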

Prototype States

Prototype Development

Example Scenario

Group Members

Shawn Conrad
Research, Stop Motion Video, Material Logistics
Lauren Kim
Research, Stop Motion Video [image editing], Presentation
Jacqueline Kory
Research, Stop Motion Video [music, compilation]
Adina Roth
Research, Stop Motion Video [photography], Prototype [fabrication, hardware], Presentation
Jonathan Speiser
Research, Stop Motion Video, Prototype [programming, electronics]

Final PDF: Assignment 2_constellation
