alims – Tangible Interfaces (MAS.834)
https://courses.media.mit.edu/2016fall/mas834

Ali Shtarbanov – Sensory Substitution/Augmentation Ideas
https://courses.media.mit.edu/2016fall/mas834/2016/11/16/sensory-substitutionaugmentation-ideas/
Thu, 17 Nov 2016

Overarching Theme:

Our perception of reality is limited by the sensory organs our bodies come equipped with, and by the bandwidth of each organ. Our eyes, for example, have receptors sensitive to electromagnetic waves between 400 nm and 700 nm, a tiny fraction of all the EM radiation that hits our bodies every second, most of which we are completely unaware of. There are gigabytes of WiFi data and hundreds of cell phone conversations passing through you right now, and you are completely blind to them. In the case of sound, the bandwidth of our ears spans roughly 20 Hz to 20 kHz, and we are oblivious to whatever happens outside the audible range. Similar bandwidth limitations apply to the other senses. We can build machines to pick up these signals, but we cannot perceive them directly because humans don't come equipped with the proper sensors. Our experience of reality is thus constrained by our biology: our brains are sampling just a little bit of the world.

But other animals have different experiences of reality. Snakes can see infrared, and honeybees include UV in their view of the world. For the blind and deaf tick, the important signals are temperature and odor. For the black-ghost knifefish, the world is colored by electric fields. And for the bat, reality is primarily ultrasonic. There is a word for this: umwelt, the slice of the world that a particular organism can pick up on. Each animal presumably assumes that its umwelt is everything there is, and humans are probably the only species that knows its umwelt is an insignificant fraction of what is out there. What would it be like to experience the world beyond the confines of our umwelt? Can we create new senses that let us experience more of reality?

Idea 1:

Before going into the topic of new senses for humans, let's consider the problem with our current senses and how we experience information. In today's digital world we are overutilizing the visual sense while underutilizing the haptic sense. Enter any public space and you will see people staring at their phones, completely unaware of what is happening outside the confines of a six-inch screen. Can we change that? We could create a handheld device that fits in one's pocket or attaches to the wrist and delivers information from the cell phone as haptic sensation. Several modalities could be used for haptic feedback: change of angular momentum, change of shape, vibration, change of center of mass, and change of temperature. Into each of these modalities one can encode information such as the time or notifications. But if we consider all the modalities together, and all their possible combinations, we can imagine creating an entire haptic language. Imagine "reading" a text message or an e-mail through touch, just by holding a device that provides tactile stimuli corresponding to the content of the message we just received. Although this may seem far-fetched, it is already happening. Blind people do just that when they read braille: they don't think about the little bumps as they move a finger across them; the information comes right off the page without conscious thought. Moreover, David Eagleman has demonstrated a device, the ExtrasensoryVEST, that encodes speech into vibrotactile stimulation on the user's back. After a few weeks of wearing it, users are able to "hear" through the sense of touch, again without consciously thinking about the patterns of stimulation.
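As a rough illustration of how such a haptic language might be prototyped, here is a minimal sketch. The HapticMotor driver is hypothetical (on real hardware it would wrap a PWM pin driving a vibration motor), and the three-character pattern table and its timings are invented for illustration:

```python
import time

# Pattern format: a sequence of (intensity 0.0-1.0, duration_s) pulses.
# These three patterns are invented for illustration; a real haptic
# alphabet would need a full, learnable character set.
HAPTIC_ALPHABET = {
    "a": [(1.0, 0.10)],
    "b": [(1.0, 0.30), (0.5, 0.10)],
    "c": [(0.5, 0.10), (1.0, 0.30)],
}

class HapticMotor:
    """Hypothetical stand-in for a real vibration-motor driver."""
    def set_intensity(self, level: float) -> None:
        print(f"motor intensity -> {level:.1f}")

def play_message(text: str, motor: HapticMotor, gap_s: float = 0.15) -> None:
    """Render a text message as a series of vibrotactile pulses."""
    for char in text.lower():
        for intensity, duration in HAPTIC_ALPHABET.get(char, []):
            motor.set_intensity(intensity)
            time.sleep(duration)
            motor.set_intensity(0.0)
            time.sleep(0.05)   # short gap between pulses
        time.sleep(gap_s)      # longer gap between characters

play_message("cab", HapticMotor())
```

As with Morse code or braille, the mapping is arbitrary at first; the interesting question is whether, with practice, the patterns stop being decoded consciously and start being read.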

Idea 2:

Use the haptic feedback device described in the previous paragraph, but change the input. Rather than information coming from a cell phone, the input would be information in the environment around us that our biological sensors cannot pick up. One example is WiFi networks. What if we could create a sense that lets us walk through a building and experience the distribution of WiFi signals throughout it? For example, in the Media Lab I want to be able to take a walk and feel how the received power of a particular network varies as I move from place to place, without having to look at a screen. The same applies if I am walking around campus.
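A minimal sketch of that sensing loop, assuming a Linux host (where /proc/net/wireless reports per-interface signal level in dBm) and reusing the hypothetical HapticMotor stub from the previous sketch:

```python
import time

class HapticMotor:
    """Same hypothetical driver as in the previous sketch."""
    def set_intensity(self, level: float) -> None:
        print(f"motor intensity -> {level:.2f}")

def read_rssi_dbm(interface: str = "wlan0") -> float | None:
    """Parse the signal level (dBm) for one interface from
    /proc/net/wireless, which Linux maintains for associated links."""
    with open("/proc/net/wireless") as f:
        for line in f:
            if line.strip().startswith(interface + ":"):
                return float(line.split()[3].rstrip("."))  # "level" column
    return None

def rssi_to_intensity(dbm: float, floor: float = -90.0,
                      ceil: float = -30.0) -> float:
    """Linearly map a dBm reading into a 0.0-1.0 vibration intensity."""
    return min(1.0, max(0.0, (dbm - floor) / (ceil - floor)))

motor = HapticMotor()
while True:
    rssi = read_rssi_dbm()
    if rssi is not None:
        motor.set_intensity(rssi_to_intensity(rssi))
    time.sleep(0.5)  # poll twice a second while walking
```

The -90 to -30 dBm range is an assumption; in practice the mapping would be tuned so that typical indoor signal variation spans the perceivable intensity range.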

Idea 3:

“Emotions” for objects. A person can be modeled as a system whose inputs are senses and past experiences and whose outputs are emotions, actions, and behaviors, which in turn feed back into the system. The same model applies to animals. But what about objects? They are quite static. Can we make objects more emotional and dynamic in response to environmental inputs? What if an object could show some kind of expression depending on the temperature or humidity of the environment in which it is placed? Or what if it could sense its proximity to its “friends” and behave in a way that indicated happiness, but express sadness when placed by itself? Or what if an object dropped on the floor could sense vibrations in the floor, such as those of an approaching train, and behave in a certain way based on them? Or what if an object produced dynamic haptic feedback in response to music? And if I put the object on ice or place it in the fridge, can it show an emotional response?
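A toy version of that input-to-emotion-to-expression loop, with stubbed sensor values and invented thresholds, might look like this:

```python
from dataclasses import dataclass

@dataclass
class SensorState:
    temperature_c: float    # ambient temperature
    friends_nearby: int     # count of "friend" objects in range
    floor_vibration: float  # 0.0-1.0 vibration magnitude

def infer_emotion(s: SensorState) -> str:
    """Map environmental inputs to a coarse emotional state.
    Thresholds are invented for illustration."""
    if s.temperature_c < 5.0:
        return "shivering"   # e.g. placed in the fridge or on ice
    if s.floor_vibration > 0.6:
        return "startled"    # e.g. a train approaching
    if s.friends_nearby > 0:
        return "happy"
    return "lonely"

# Each emotion maps to a physical expression the object would perform.
EXPRESSIONS = {
    "shivering": "rapid small trembles",
    "startled":  "sudden jump, then freeze",
    "happy":     "slow rhythmic pulsing",
    "lonely":    "drooped, slow breathing motion",
}

state = SensorState(temperature_c=22.0, friends_nearby=2, floor_vibration=0.1)
emotion = infer_emotion(state)
print(emotion, "->", EXPRESSIONS[emotion])  # happy -> slow rhythmic pulsing
```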


The idea of augmenting our senses is powerful, but how does this go beyond wearables with slightly different I/O than we see today? Our phones are powerful dev platforms, to be sure, already loaded with many different sensors, screens, and vibration motors; if you can sit down today and write an app for it, it's probably not a fundamentally new interface or interaction. More importantly, why does it make sense to feel WiFi through touch rather than through sight, smell, or any other sense? What is the correspondence/methodology/justification in the sensory mapping? And, as an interaction, what input does the user give to the system in the first place? What is the feedback loop? Why?

The emotional objects idea has an interesting new aspect – objects with personality/agency – but how is this more than an extension of the emotive objects in OBMG?

-Dan

Emotional Telepresence via Shape-Changing Interface
https://courses.media.mit.edu/2016fall/mas834/2016/10/11/emotional-telepresence-via-shape-changing-interface/
Wed, 12 Oct 2016

[Image: shape-changing interface for emotional telepresence]

We are spending far more hours with our computers and smartphones than with people, and our digital experiences are very often devoid of emotion. Even when we communicate with people through indirect means, most of the emotion is lost in cyberspace. Thus, the most effective medium for emotional expression still remains face-to-face communication. What if we could change that paradigm? What if we had a way to be constantly emotionally connected with all the people we care about, without having to meet each of them in person every time we wished to find out how they are doing?

What if some of the objects we use every day served as interfaces reflecting the emotional state of the person we associate with each object? Say, for example, the mug you are holding was not just a mug but an interface that changed its shape, color, or another property in a way that indicates how the person you associate with that mug is feeling. Or what if furniture, or any other object, changed one or more of its properties to reflect someone's emotions?

To build an emotional telepresence system, we could use a webcam that captures a person in their natural environment, process the video with Affectiva's emotion-tracking software, and extract only the emotion content while discarding the video content (thereby maintaining the user's privacy; the user could always choose what to share, but the most they could share is the emotion data). The emotion data is then sent to the cloud, and a shape-changing device at another spatial, and possibly temporal, location changes its properties in response to the emotional content. In other words, we are mapping emotional content onto dynamic change of shape. The questions that need to be asked, though, are what mapping scheme we should use between emotion and shape, and what kind of shape-changing device we should build. Pip Mothersill explored the relationship between form and emotion in her Master's thesis, so we could refer to that document for guidance.
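A sketch of the sender side of this pipeline is below. The extract_emotions() function is a stand-in for Affectiva's SDK (not its real API), and the endpoint URL is an assumption; the key point is that only the emotion scores ever leave the machine:

```python
import time
import cv2        # pip install opencv-python
import requests   # pip install requests

def extract_emotions(frame) -> dict:
    """Placeholder for Affectiva's emotion classifier (not its real
    API): would return per-emotion scores for the face in the frame."""
    return {"joy": 0.0, "sadness": 0.0, "surprise": 0.0}  # stubbed

ENDPOINT = "https://example.com/emotion/alice"  # assumed cloud endpoint

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if ok:
        scores = extract_emotions(frame)  # keep only the emotion scores...
        del frame                         # ...the video itself is discarded
        requests.post(ENDPOINT, json=scores, timeout=5)
    time.sleep(1.0)  # one emotion sample per second
```

On the receiving end, a shape-changing display would subscribe to this stream and map the scores onto form parameters, for instance joy onto rounder, rising geometry and sadness onto drooping curvature; this is exactly where Mothersill's form-emotion mappings could guide the design.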

Group 5: Soldering Tip Cleaner
https://courses.media.mit.edu/2016fall/mas834/2016/09/27/5863/
Tue, 27 Sep 2016

Ali Shtarbanov
Preunky Akther
Alethea Campbell
Tamer Deif
Karishma Galani
Kristin Osiecki


Ali Shtarbanov
https://courses.media.mit.edu/2016fall/mas834/2016/09/20/ali-shtarbanov/
Tue, 20 Sep 2016

I am a Master's student at the MIT Media Lab with research interests in new interface technologies, human-computer interaction, and the Internet of Things. After earning degrees in Electrical Engineering and Physics, I came to the Media Lab to learn about a more diverse range of topics and to pursue a more interdisciplinary research trajectory.

Previously, I worked on projects spanning much of the Electrical Engineering spectrum, from signal processing to digital design to electromagnetism. More recently, my focus has become more product-oriented, with current projects in the areas of HCI and IoT. For my latest project, I am also starting to explore the fields of materials science and polymer science.
