julialc4 – Tangible Interfaces (MAS.834), http://mas834.media.mit.edu

P2: Computing with Clay
http://mas834.media.mit.edu/2015/12/11/p2computing-with-clay/
Sat, 12 Dec 2015

 

 

Computing with Clay:

 

I am interested in ways to use a material like clay as a computational tool for modelling and navigating 3D virtual spaces. I am also thinking about tangible interfaces in relation to augmented reality, and about using a haptic mapping to contribute to the augmentation, so I think of this as an augmentation process as well as an interface. Potential methods for computing the volume of the clay include magnetic fields/magnets, capacitive/proximity sensing, computer vision, and depth imaging using infrared.
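As a rough sketch of the depth-imaging option: an infrared depth camera mounted above the work surface gives a per-pixel distance map, and the clay's volume can be estimated by integrating the height of the clay above the empty table. The function name, calibration values, and pixel-area constant below are illustrative assumptions, not part of any existing system.

```python
import numpy as np

def clay_volume(depth_map, baseline_depth, pixel_area_cm2):
    """Estimate clay volume (cm^3) from an overhead depth image.

    depth_map       -- 2D array of camera-to-surface distances (cm)
    baseline_depth  -- camera-to-empty-table distance (cm)
    pixel_area_cm2  -- table area covered by one pixel (cm^2)
    """
    # Clay height per pixel; clip so sensor noise below the table reads as zero.
    height = np.clip(baseline_depth - depth_map, 0.0, None)
    # Integrate height over the clay's footprint.
    return float(height.sum() * pixel_area_cm2)

# Example: camera 30 cm above the table, clay rising 5 cm under each of 4 pixels
depth = np.full((2, 2), 25.0)
volume = clay_volume(depth, baseline_depth=30.0, pixel_area_cm2=1.0)  # 20.0 cm^3
```

Re-running this estimate every frame would also give a live signal for how the user is adding or removing clay, which the virtual scene could respond to.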

 

 

Pneumatic Jamming for Physics Simulation
http://mas834.media.mit.edu/2015/09/28/pneumatic-jamming-for-material-simulation/
Tue, 29 Sep 2015

Jamming Matters

What would these images of a protein feel like to the touch?

What would these versions of a protein feel like to the touch?

How about this abstract shape?

Inspired by the paper on Jamming User Interfaces, and by jamming-skin-enabled locomotion in soft robotics, this proposal aims to harness the ability to shift matter gradually from very fluid to very stiff for the purpose of material simulation in haptic feedback.

Haptic feedback, or haptic simulation, is often part of video gaming and 3D modelling: the user gets some form of physical feedback from a digital system. This proposal suggests placing jamming packages in a grid-like pattern, each with its own micro-pneumatic tube or chamber. One can then multiplex along the individual chambers, setting the softness as a gradient. This mapping of softness/hardness across the cells could work in tandem with a digital physics simulation of materials, such as a simulation of a piece of wood or of moving water. For example, when the user sees a CG rendering of wood, they could also feel a hard object, like a piece of wood. When the user sees water flowing, the pneumatically jammed cells could soften in a sinusoidal undulation to mimic water. The grid of cells could also take the form of a large glove and simulate other physics across the hand, responding as the user interacts with a 3D world.
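The water example above can be sketched as a simple control mapping: each cell gets a stiffness target between 0 (fully soft, chamber at ambient pressure) and 1 (fully jammed, full vacuum), and a travelling sine wave across the grid produces the undulation. The function name and parameters here are illustrative assumptions about how such a controller might be organized.

```python
import numpy as np

def water_stiffness_map(rows, cols, t, wavelength=4.0, speed=1.0):
    """Per-cell stiffness targets in [0, 1] for a rows x cols jamming grid.

    A travelling sine wave along the grid's x-axis mimics flowing water:
    0 means fully soft (ambient pressure), 1 means fully jammed (full vacuum).
    """
    x = np.arange(cols)
    # Phase advances with time t, so the wave appears to flow across the grid.
    wave = 0.5 + 0.5 * np.sin(2.0 * np.pi * (x / wavelength - speed * t))
    # Every row follows the same profile; a 2D wave could vary this per row.
    return np.tile(wave, (rows, 1))

# Stiffness targets for a 4x8 grid at t = 0; each value would be multiplexed
# out to set one cell's vacuum level.
grid = water_stiffness_map(4, 8, t=0.0)
```

Swapping in a different field of targets (e.g. all 1.0 for the wood case, or values sampled from a physics engine) would reuse the same per-cell multiplexing path.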

The most interesting aspect of this isn’t just simulating digital physics for games, nor the ‘new and unusual’ dissonant sensations that may emerge (though those motivations are also fine with me), but seeing at what point a simulation hits the uncanny valley; in other words, how perceptible is it that this is a simulation rather than a real object? What causes the suspension of disbelief to be sustained or to break down, given that the visual and haptic stimuli are coupled and correlated? Is the coupling of different types of information enough to create a sense of causal coherency? Does the level of realism of a rendered visual simulation really matter, or will our perceptual systems’ “filling-in” capabilities work with any graphic as long as the information is paired?

 

Chambered jamming, glove version

 

WebGL Water simulation!

 

 
