Tangible Interfaces (MAS.834) https://courses.media.mit.edu/2016fall/mas834 Sun, 05 Feb 2017 17:11:53 +0000

SleepScape – subtle interventions in sleep (Group 6) https://courses.media.mit.edu/2016fall/mas834/2016/12/12/sleepscape/ Tue, 13 Dec 2016 02:41:52 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6706 Project Abstract: click to download

sleepscape

We spend roughly one-third of our lives sleeping, yet the technology that supports this daily necessity has evolved very little over time. The vast and varied technologies we interact with during our waking hours have evolved significantly, improving our daily lives and expanding our capabilities. While the communication devices and vehicles we use each day would be unrecognizable to people who lived just two hundred years ago, the beds and blankets we use would be familiar to people who lived thousands of years ago, both in the materials they are made of and in their capabilities.

SleepScape seeks to improve your sleep experience by learning your habits and adapting to them over time. Perhaps you are cold when you first get in bed each night, but wake up overheated a few hours later. SleepScape can increase airflow, or adjust its position to help regulate your body temperature. It can comfort you when you are restless by hugging your body or gently wake you in the morning by nudging you. We’ve also imagined extended capabilities, like dream monitoring and interpretation.


SleepScape is a blanket that contains a flexible mesh of electroactive polymers/muscle wires. Embedded in this mesh are sensors that track the user’s temperature as well as the ambient temperature in the room. It also contains accelerometers, proximity sensors and pressure sensors, to gather information about the user’s movement and posture, relative to the blanket. All of these can be tracked specific to different areas of the grid.

Shape-changing capabilities are activated by electrical impulse. The network of muscle wire is strategically activated in order to produce movement. Different areas gradually contract or expand, allowing for purposeful, yet subtle movements, like the blanket rolling off the user when he/she is too hot.

Over time, the blanket learns things like the user’s optimum sleep temperature, and will regulate this by acting before the user has become too hot or too cold.
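As a sketch of how this regulation loop might work, the code below compares each grid zone's temperature against a learned setpoint and issues per-zone contract/expand commands to the mesh. The class, the setpoint, the deadband, and the "learning" rule are all our invention for illustration, not part of the actual prototype:

```python
# Hypothetical SleepScape control loop; all names and values are illustrative.

class SleepScapeController:
    def __init__(self, target_temp_c=32.0, deadband_c=0.75):
        self.target = target_temp_c   # learned preferred skin-contact temperature
        self.deadband = deadband_c    # tolerance before the blanket acts

    def update_target(self, zone_temps, restlessness):
        # Naive learning rule: nudge the setpoint away from temperatures
        # that coincide with restless movement.
        if restlessness > 0.5:
            avg = sum(zone_temps.values()) / len(zone_temps)
            self.target += 0.1 if avg < self.target else -0.1

    def step(self, zone_temps, restlessness):
        """Return per-zone commands: +1 expand/vent, -1 contract/cover, 0 hold."""
        self.update_target(zone_temps, restlessness)
        commands = {}
        for zone, temp in zone_temps.items():
            if temp > self.target + self.deadband:
                commands[zone] = +1   # open the mesh to shed heat
            elif temp < self.target - self.deadband:
                commands[zone] = -1   # contract the mesh to cover and warm
            else:
                commands[zone] = 0
        return commands
```

Acting inside a deadband, zone by zone, is what would keep the intervention subtle: many small local adjustments instead of one whole-blanket move.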

Below is an image of the flexible mesh. The colored dots represent sensors embedded in the mesh. As stated above, the blanket's movement is controlled by strategically activating muscle wires to choreograph specific motions, like the blanket rolling off the user. The center image is a time-lapse of muscle wire contracting. We’ve played around with the idea of the blanket becoming more or less dense to facilitate airflow, as in the animation on the right, but there are easier ways to create heat-activated vents. This capability can be built into the cloth instead of requiring a Pneuduino to pump air in and out.


Extended Capabilities: See Dreamscape

Dreamscape

Perfect Fit: A Magnetic Thread Textile (Group 2) https://courses.media.mit.edu/2016fall/mas834/2016/12/08/perfect-fit-a-magnetic-thread-textile-group-2/ Fri, 09 Dec 2016 03:48:45 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6676 Final Video



Project 2 – Hydrogels (Amos, Dan, Karishma, Laya, Nassia) https://courses.media.mit.edu/2016fall/mas834/2016/12/07/project-2-hydrogels-amos-dan-karishma-laya-nassia/ Wed, 07 Dec 2016 21:13:48 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6642 Our group worked on using the interactive properties of hydrogels. Instead of hacking a material to come up with new properties, we decided to hack interactions/use the properties that hydrogel currently has. The properties we tested out include: 

  • Tunable Thermal Behavior
  • Edible
  • Conducts Electricity
  • Acoustically Transparent
  • Refraction index of water

Hydrogel is 97% water. To make it, we boiled water, added a gelling agent, and poured the mixture into containers of different shapes and sizes. The gel solidified over about 5–10 minutes, after which it could be lifted out of the moulds. We used petri dishes, test tubes and ice cube trays. Below are a few pictures of the process.


The Edible Ice Cubes – When these food-colored gel ice cubes are added to water, the coloring naturally begins to change the color of the drink. We imagined interactions here such as noticing whether someone has tampered with your drink and notifying the drinker.


Based on these initial explorations, we decided to focus our prototypes on the following:

  •  Edible Interaction
  • Augmented Perception
  • Material Logic

The three prototypes/interactions we came up with include:

  • Yum! Gelecriticity – an interaction that lets children build a relationship with food that can interact back with them.
  • Open Gela-me – interacting with someone over a meal or conversation through a panel that also reads as a mosaic piece of art.
  • I/O Gel – using the gel to ‘save’ specific properties of foods, so that a variety of tastes and flavors can be experienced in gel form. Could this be the future of food?

All videos for these interactions can be found in our presentation here.

emobject https://courses.media.mit.edu/2016fall/mas834/2016/12/07/emobject/ Wed, 07 Dec 2016 20:41:07 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6657 Difei Chen, Jiabao Li, Ali Shtarbanov, Siya Takalkar, Qi Xiong


Emobject as…

  • An instant personal expression reflector
  • An average personal expression reflection
  • A social scene reflector
  • An emotional telepresence device
  • An interaction device that reacts to your actions
  • A learning device for autistic patients
  • An interactive toy
  • A controller for lights, appliances, and temperature
  • A platform for creative expression


“Art Alive” https://courses.media.mit.edu/2016fall/mas834/2016/12/07/art-alive/ Wed, 07 Dec 2016 20:38:32 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6654 Our team was interested in making art that is more dynamic: Art that responds to the unique behaviors of the people viewing it at any time (the number of people present, their proximity to the piece, where they’re looking within the frame etc.). This creates new affordances between viewers and the art itself. Our hope is that we could eventually bring art – and its contents – to life.

We picked Andy Ryan’s photograph of Marvin Minsky’s home to illustrate some of these concepts:

We then explored ways to use computer vision to capture information about where people were looking and where they were positioned, relative to the piece. This is a short demo of how we might capture and process viewer data, using facial recognition software and the front-facing camera on an iPhone (the circle is following the movement of Tamer’s face).
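The face-to-region mapping can be sketched in a few lines. The face bounding boxes are assumed to come from any off-the-shelf detector (e.g. OpenCV's `detectMultiScale`, which returns `(x, y, w, h)` boxes in pixels); the function names here are ours, not from the demo:

```python
# Sketch: map detected viewer faces to a region of the artwork to animate.
# Face boxes are assumed to come from an external detector; names are ours.

def face_centers(face_boxes, frame_w, frame_h):
    """Normalize (x, y, w, h) face-box centers to [0, 1] in both axes."""
    return [((x + w / 2) / frame_w, (y + h / 2) / frame_h)
            for (x, y, w, h) in face_boxes]

def region_to_reveal(centers, n_columns=3):
    """Pick which vertical slice of the piece to bring alive, based on
    the average horizontal position of the viewers' faces."""
    if not centers:
        return None   # nobody watching: leave the piece static
    avg_x = sum(cx for cx, _ in centers) / len(centers)
    return min(int(avg_x * n_columns), n_columns - 1)
```

The same per-frame region index could then drive whichever output layer is in use, from on-screen animation to the thermochromic paint described below.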

Finally, we created a small-scale prototype (of Ryan’s photograph) with a layer of clear thermochromic paint, which revealed different parts of the image based on where the viewers were situated. This is a short demo of how this paint could help us change the colors dynamically and direct focus to different areas.

(Last audio sample in this clip was Minsky himself)

Team: Tamer Deif, Shruti Dhariwal, Clara Lee, Jeremy Finch
Thanks to Prof Hiroshi Ishii, Penny Webb, Dan Fitzgerald and Udayan Umapathi

Jellyfish https://courses.media.mit.edu/2016fall/mas834/2016/12/03/jellyfish/ Sat, 03 Dec 2016 17:17:59 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6601 Lucas Cassiano, Alethea Campbell, Poseidon Ho, and Lily Gabaree

Inspiration

In nature, jellyfish do not have brains. They process information via sensitive nerve nets that underlie their epidermis, allowing for full radial sensation. We were inspired by their sensitivity, compositional simplicity, and the many affordances of their radial design.


Like jellyfish, we rely on touch in our natural environments. The skin is the largest organ of the human body, approximately 22 square feet of densely packed receptors. The human hand alone contains approximately 100,000 nerves. Jellyfish is an interface that makes full use of our capacity to sense through touch.

Mechanism

Jellyfish is a proposed dynamic interface that transforms flat, screen-based information into three-dimensional, mutable material, using a programmable topology.

3D Viewer
Place Jellyfish over a GUI, and move it around like a puck. The topology of Jellyfish changes according to the detected screen content, to create correlating textures. The base of the puck is a solid ring, which glides easily on surfaces; the top is a translucent skin, stretched over shape-changing wires, that can bend up to 90 degrees at each node, allowing for the creation of a variety of shapes.

Pressing on a node allows the user to deform the shape, and this input also affects the screen content, allowing for hands-on CAD modeling and other applications.
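A minimal sketch of this two-way loop is below. The 90-degree bend limit comes from the design above; everything else (sampling the screen region as a brightness grid, the press interaction) is our assumption for illustration:

```python
# Hypothetical mapping from screen content under the puck to node bend
# angles; only the 90-degree limit is from the actual design.

MAX_BEND_DEG = 90   # each node can bend up to 90 degrees

def patch_to_bends(patch):
    """patch: 2D list of brightness values (0-255) sampled under the puck.
    Brighter screen content becomes taller topology."""
    return [[MAX_BEND_DEG * v / 255 for v in row] for row in patch]

def press_node(bends, r, c, pressure):
    """Pressing a node flattens it; the change is echoed back to the GUI
    (e.g. deforming the on-screen CAD mesh by the same amount)."""
    delta = min(bends[r][c], MAX_BEND_DEG * pressure)
    bends[r][c] -= delta
    return delta   # feed this back into the screen content
```

Returning the delta to the GUI is what closes the loop: the puck both renders the model and edits it.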


 

Applications

Jellyfish can transform any typical GUI interaction into a tangible experience.

Applications include: modeling in CAD software; examining datasets; GIS mapping; game controls, and more. [expand]


Process

Our original brainstorms spanned a variety of possibilities: stress-based tongue interfaces; ants as actuators/fabricators; plant-based interactions and personal growth gardens. We decided to focus on a later idea – a tangible interface puck, loosely inspired by the Microsoft Surface Dial – because it would have a wide range of possible applications for productivity and expression.

Unlike the Dial, our puck would be more than an advanced mouse; it would be a direct and tangible connection to the original content. We were inspired by the Radical Atoms discussion and Bret Victor’s talk about the underutilization of many “modes of understanding,” particularly our capacity for tactile understanding. And to achieve this understanding, we would use programmable matter, in the form of changeable topology.

We decided to look to nature for inspiration on how best to realize our vision, and focused on the jellyfish, which has a simple, radial design that affords fluid and rapid shape-changing. A trip to the New England Aquarium provided additional inspiration.

When designing the interface, we focused on usability: the puck would fit in one’s hand, glide easily over any screen, and would be manipulatable by all fingers. Inspired by the jellyfish’s fluid-filled hood and underlying musculature, we decided to use a rigid structure in the bottom layer, with a gel-filled encasement on top. This would allow for more dramatic shape shifts in the rigid structure, including sharp edges, but would also afford smooth, organic surfaces if needed, by altering the amount of gel present in the topology.

There was a delay in getting the shape-changing wires we hoped to use for the rigid structure, so we used 3D-printed models to represent different topologies that could be rendered.

 

 

The tops snap interchangeably into the puck. We used gel and a plastic film to create a malleable surface atop the underlying structure.


Once the wires arrived, we tested their performance moving a gel layer. We did not achieve the dynamic node structure desired, but did produce movement in the test layer.


 

 

Hongliang Wang_Light Induced Shape Changing Material https://courses.media.mit.edu/2016fall/mas834/2016/11/24/light-induced-shape-changing-material/ Thu, 24 Nov 2016 15:12:18 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6589 Light-induced shape-changing material could be used as a television screen that transforms a flat image into a real 3D object. Going further, it could be placed in a 3D space with a holographic projection and change shape continuously, following the holographic signal.


Tanuja – From cold to hot https://courses.media.mit.edu/2016fall/mas834/2016/11/22/group-2-individual-proposal/ Wed, 23 Nov 2016 02:54:49 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6574 individual_tanuja

Expanding our Umwelt https://courses.media.mit.edu/2016fall/mas834/2016/11/18/expanding-our-umwelt/ Fri, 18 Nov 2016 19:45:21 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6507 Concept: Using programmable material to expand the umwelt of an organism

What is Umwelt?

Each functional component of an umwelt has a meaning and so represents the organism’s model of the world. These functional components correspond approximately to perceptual features. The umwelt is also the semiotic world of the organism, including all the meaningful aspects of the world for that particular organism: water, food, shelter, potential threats, or points of reference for navigation. An organism creates and reshapes its own umwelt when it interacts with the world. This is termed a ‘functional circle’. The umwelt theory states that the mind and the world are inseparable, because it is the mind that interprets the world for the organism. Consequently, the umwelten of different organisms differ, which follows from the individuality and uniqueness of the history of every single organism.

Semiosphere

When two umwelten interact, this creates a semiosphere.

What if we could use different programmable material to expand the unwelt of an organism? Further, what if we could have different organisms connect in a semiosphere through a material?

 


Very powerful idea in expanded Umwelt. But to what end/why would we want to? Obviously it’s cool to experience more, but we also note that an animal’s affordances are perceived using its umwelt. Does an expanded umwelt imply an expanded “palette” of affordances available to us? Which are the most compelling? How is it bidirectionally interactive? Are these materials that we perceive directly, that we use as tools to perceive other elements of the world, or wearable materials that expand our senses all the time?

-Dan

 

 

Penny: I like the idea of being able to dynamically adjust your perspective, as a kind of empathy machine or tool for connectivity. I would suggest thinking about it in terms of HCI; for instance, what might a computer’s umwelt be like?

Part II: Stress TestOUT by Alethea Campbell https://courses.media.mit.edu/2016fall/mas834/2016/11/17/part-ii-stress-testout-by-alethea-campbell/ Thu, 17 Nov 2016 23:48:19 +0000 https://courses.media.mit.edu/2016fall/mas834/?p=6497

 

Concept:

When we are stressed, our bodies release stress hormones. Popular Science found that “Changes in cortisol and other hormones register in your saliva, indicating not only stress but according to a recent study, possibly also how well you respond to it.” Short-term stress can be healthy. Long-term stress is dangerous: constantly telling our bodies that we are in danger, or keeping different systems activated, is taxing and exhausts our bodies.

I am wondering how some kind of interface could remind our bodies what kind of stress we are experiencing. Or, how we could create a system to help us determine whether the stress we are experiencing is healthy or damaging.

 

 

How:
(Brainstorm)

Create something with many kinds of input: haptics, heat sensors, perspiration, heart beat and more.

AND

The output would be a scent that helps you calm down in some way, or a reaction based on what you determine as feedback. Then, you input your feelings before and after.

OR

Output: Lights to communicate to your brain, if it is good stress or bad stress. And, then cognitively acknowledge your feelings.
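One brainstorm-level sketch of the sensing-to-light pipeline is below. The choice of signals, the weights, and the good/bad threshold are all placeholders, not validated values:

```python
# Brainstorm sketch: fuse several normalized biometric inputs into a rough
# stress score, then map it to a light color. Weights are placeholders.

WEIGHTS = {"heart_rate": 0.4, "perspiration": 0.4, "skin_temp": 0.2}

def stress_score(readings):
    """readings: dict of sensor values already normalized to [0, 1]."""
    return sum(WEIGHTS[name] * readings[name] for name in WEIGHTS)

def light_output(score, good_threshold=0.5):
    """Below the threshold, treat it as 'good' (energizing) stress."""
    return "green" if score < good_threshold else "red"
```

The feelings the user inputs before and after could serve as labels for tuning these weights over time, rather than fixing them by hand.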

 


There is a wide body of previous work around wearables for biometrics-based mood detection/augmentation/modulation, so be careful about navigating and positioning within it. Smell is a compelling aspect, if you can clarify the argument that it is the best sense for modulating stress. Combining many biometric monitors to estimate stress levels (especially good vs. bad stress) is a huge task, probably outside the scope of this class. Is there other data we have access to that could be a proxy for stress level? What feedback does the user give and how is it used?

-Dan

 

Penny: Think about what it really means to be stressed: what do you do, how do you respond, do you try to hide away from the stress, or do you go for a walk, or do you just ignore it? Have a think about some of these natural responses that people already have when they are stressed, and what we already do to try to ‘de-stress’. Perhaps the answer isn’t creating a technology that is aware of you by monitoring, but something you turn to when you are stressed.
