Dan Sawada – Ideation, physical prototyping, programming, interaction design, video production and editing
Anirudh Sharma – Ideation, video production and editing, programming, partial implementation of a concept that was abandoned
Sujoy Kumar Chowdhury – Ideation, physical prototyping, programming, research, video production
Christine Hsieh – Ideation, research, presentation, physical prototyping, video production, interaction design for a concept that was abandoned
Andrea Miller – Ideation, research, presentation, physical prototyping, video production
CHI-extended-abstract_wave FINAL
Wave Alchemy presentation slides
STATEMENT
The tendency in the creation of digital technologies has been to focus on the design of tools that allow digital information to augment the physical: physical first, digital as an added layer. This direction of thinking is evident in early examples such as the garage door opener, which creates convenience by automating a previously existing physical object. The same trend continues, however, in more recent explorations of ways to integrate digital tools into our experience of the physical world.
Augmented reality, a growing area within human-computer interaction, is a distinct example of digital augmentation, as the objects being augmented are often separate from the augmenting tool. Using screens or glasses, digital information can be overlaid on any object viewed in the physical world, augmenting it with additional information.
As another example, the Tangible Media Group at MIT has also researched means of augmentation that are more connected to the manipulation of the physical. In the Relief project [3], a digitally projected image of a mountain adds visual information for understanding the topographical forms that can be displayed and manipulated on the project's shape-changing surface.
Both of these examples overlay a digital image on the physical, enabling access to additional information that cannot be obtained directly from the objects themselves. The inclination to explore the digital as an added layer of augmentation follows the development of all new technologies, which are always created in relation to what already exists, with overlay being one of the easiest ways to bring digital affordances into the world. Now that we are increasingly familiar with the digital, however, it is also important to reverse the question and ask how physical means might augment a body of digital information in space.
This paper introduces an interface that selectively materializes digital information within space, allowing one to tangibly work with a specific portion of a larger body of digital information. The name (re)place refers both to the replacement of abstract digital information with physical tangibility and to the adjustable placement of a physical slice within space as a way of deciding which area to materialize.
Contributors:
Sophia Chang - Ideation, Presentation, Content Generation, Physical Prototyping
Jifei Ou - Ideation, Presentation, Physical Prototyping
Sheng Kai Tang - Ideation, Presentation, Software Prototyping, Physical Prototyping
STATEMENT
In current video communication scenarios, the shared working spaces of the participants are physically detached. This discontinuity leads to many inconveniences and problems, one of which is how to point at or indicate physical objects between remote participants. We were inspired by the phenomenon that light can penetrate one side of a pane of glass and illuminate objects on the other side. In this case, the penetrating light becomes a medium that facilitates better communication.
SYSTEM DESCRIPTION
For this assignment, we sought to re-create this experience in a remote communication scenario and demonstrate a vision in which light can travel through the virtual world and reach into the physical one. In our system, a user in a video call takes a flashlight and points it at any position on the screen. This 2-D coordinate is captured and transmitted to the remote side, where a projector simulates a light beam at the same coordinate. In this way, participants can intuitively point at and annotate content in the physical world.
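The capture-and-transmit step described above can be sketched in a few lines. None of this code comes from our actual implementation; it is a minimal illustration assuming the flashlight spot is simply the brightest pixel in a grayscale camera frame, and that the coordinate travels to the remote side as a small JSON datagram over UDP (the function names and message format are hypothetical).

```python
import json
import socket

import numpy as np


def brightest_point(frame):
    """Return the (x, y) pixel of the brightest spot in a grayscale frame."""
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    return int(x), int(y)


def normalize(x, y, width, height):
    """Map the pixel coordinate into the unit square so the remote side
    can rescale it to its own projector resolution."""
    return x / width, y / height


def send_coordinate(sock, addr, nx, ny):
    """Transmit the normalized coordinate as a JSON datagram."""
    sock.sendto(json.dumps({"x": nx, "y": ny}).encode(), addr)


# Example on a synthetic 480x640 frame with one bright spot:
frame = np.zeros((480, 640), dtype=np.uint8)
frame[120, 320] = 255                 # simulated flashlight spot
px, py = brightest_point(frame)       # -> (320, 120)
nx, ny = normalize(px, py, 640, 480)  # -> (0.5, 0.25)
```

The remote side would invert the normalization for its projector resolution and draw a light-beam sprite at that position; a real system would also threshold and smooth the detection so camera noise does not make the beam jitter.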
APPLICATION
The metaphor of penetrating light can also be applied to other scenarios. We envisioned two potential applications.
1. Urp++
As a remote version of Urp, a previous Tangible Media Group project, Urp++ lets users employ a phicon as a light source to synchronize the environmental lighting across the two workbenches. When the light source changes position, the simulated shadows on both sides respond at the correct positions. Users can also pass the light source to each other remotely. (See the sketch below.)
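The shadow synchronization described above reduces to simple similar-triangle geometry. Nothing here comes from the actual Urp implementation; it is a sketch under the assumption that the light phicon is treated as a point light at a known height above the table, and that its position is what the two workbenches exchange.

```python
def shadow_tip(base_x, base_y, height, light_x, light_y, light_height):
    """Where the shadow of a building's top corner lands on the table,
    for a point light hovering at light_height above (light_x, light_y).
    By similar triangles, the shadow extends away from the light by
    height / (light_height - height) times the light-to-building distance."""
    scale = height / (light_height - height)
    return (base_x + (base_x - light_x) * scale,
            base_y + (base_y - light_y) * scale)


# A 5 cm building at (5, 0), lit from (0, 0) at a height of 10 cm:
tip = shadow_tip(5, 0, 5, 0, 0, 10)   # -> (10.0, 0.0)
```

Because the shadows are computed locally from the light position, each workbench only needs to receive the phicon's coordinate when the other side moves it; both tables then recompute their own shadows and stay consistent without streaming any imagery.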
2. X-ray
In this application, the simulated light beam can not only point at a remote physical object but also reveal hidden information inside or behind it. The simulated light becomes an X-ray.
Contributors:
Jifei Ou - Ideation, Presentation, Physical Prototyping
Sheng Kai Tang - Ideation, Presentation, Software Prototyping
We have outlined eleven key interactions and gestures for an Obake surface, as illustrated in our presentation: intrude, extrude, prod, pull, push, friction, compress, expand, warp, stitch, and s-bend.
We have currently implemented a 2.5D display based on the Obake principles, which we use to demonstrate a geographic data viewer. Current features include exploring layers in the terrain both laterally (soil and water profile) and linearly (vegetation profile); pulling out shapes (buildings) and prodding them (a mountain) to explore detail; using the elastic texture to create paths (a river) and customize them (river thickness); simulating with friction (fire); and morphing display affordances according to shape (physical extrusion of terrain from a 2D surface into an elevated 3D environment with a depth-based soil profile).
Presentation: Obake
Paper: Obake
Contributions:
Rob Hemsley – Ideation, Presentation, Physical Prototyping
Dhairya Dand – Ideation, Presentation, Software Prototyping
Contributions:
Jason Gao – Physical prototyping, presentation, ideation
Anjali Muralidhar – Physical prototyping, video presentation, presentation, ideation
Samvaran Sharma – MATLAB-based voice recognition, prototyping, ideation
Henry Skupniewicz – Physical prototyping, ideation
Helena Hayoun Won – Video presentation, presentation, ideation
The video below presents our vision for a few of the interactions made possible by our system.
In making the video, we concentrated on three main objects for use in interacting with the art: the hands, face, and mouth. We feel that these “implements” make the interactions natural and ubiquitous. In the video we highlighted three main interactions. The first was the ability to reveal hidden layers of a painting by blowing away the top layers; these underlying layers could reveal an artist's method for creating a work, or specially created textures left there by the artist for the visitor to discover. Second, visitors participated in collaboratively creating a work by changing its physical texture. Using their hands, visitors could smooth out various sections of the piece, transitioning from rough rock to smooth clay and finally to the shiny bottom layer. Over time, the roughness returns, indicating how long it has been since the last interaction and giving new visitors an opportunity to interact with the piece. Finally, the last piece of art invited the visitor to explore their own emotional state through abstract textures: by sensing the visitor's mood, the painting could change its texture and physical resistance to motion accordingly.
Physical prototype showing the underlying mechanisms used to create textures (left). The right side shows an enlarged area of the image for closer examination of local texture; this area is remotely linked to the textures in the first panel, as seen in the second image.
In our physical prototype, seen in the images below, we wanted to highlight some ideas not presented in the vision video, or only hinted at there. The first was the idea of remote interactions: using a set of textured wheels coupled by an elastic belt, we created a dynamic texture that responds to changes in either picture. We also explored interactions that change based on previous input, through the use of a Peltier device that heats or cools depending on the state the previous user left it in.
We feel that ArTouch will not only revolutionize how art is experienced in museums, but will also bring more art into personal spaces through remote collaboration. This collaboration will create a more personal piece of art that can be enjoyed in new ways every day, making for a more exciting, tangible experience.
For more information regarding our project, please take a look at our presentation slides (PDF) or paper (PDF).
Contributions:
Jessie Austin-Breneman – Physical prototyping, presentation
Zachary Barryte – Physical prototyping, presentation
Eric Jones – Physical prototyping, presentation
Woong Ki Sung – Vision video making, presentation
Wenting Guo – Vision video, presentation
Trygve Wastvedt – Vision video, presentation
Video (password protected): https://vimeo.com/55501151
Presentation slides: RadicalTextiles
Final Write-up: Radical Textiles Final Write-up
What if your shirt could change its form, color, and elasticity, instantly? Or your pile of dirty clothes could morph into sparkly shoes or a bicycle helmet? Could fabric react to what's happening in the world… or to what's going on in our heads?
In our world of radical textiles, each stitch is a radical atom. Conceptually, we can think of these stitches as a kind of stem cell of textiles. They multiply and divide, merge, and change depending on where they are on the body. They are also “perfect” in that they have ideal knowledge of what they should be and how to act in different situations.
What might this world look like? In this future, a man could pull a tie out of his shirt collar if he was underdressed, shirts would grow into jackets as soon as we walked out into the cold, and we could design an outfit and fold it into our pocket if we wanted to wear it later.
Though these scenarios may seem far away, there has been growing interest in computational textiles in recent years, leading to a variety of projects, especially around movement, interpersonal relationships, and empowerment. These include projects from the Media Lab such as PillowTalk, which allows people to connect remotely through soft objects, and DressCode, which uses textiles as a way to teach programming skills, especially to young girls.
We break down the world of radical textiles along a few dimensions, including use cases, modes of interaction, and types of textile changes. We also explore the implications of a future that includes radical textiles. They have the potential to improve our world through the conservation of materials, the democratization of design, and increased creativity. Yet this substantial shift, a transition similar in scale to the recent move from landlines to cell phones, may have unforeseen consequences and impacts on society.
We assert that all team members shared work equally and fairly, collaborated on group efforts with enthusiasm, and also provided focused support in areas of expertise. All team members shared responsibilities on project direction and implementation. Laura had primary responsibility for the development of our presentation, David for the physical prototypes, and Christian for the video editing.