Tangible Things
alhajri, MAS.834 Tangible Interfaces (https://courses.media.mit.edu/2016fall/mas834), Thu, 19 Nov 2015

Abdulla Alhajri
Lia Bogoev
Amy Loomis
Nono Martínez Alonso

Idea 1: UnderCut

This project is an extension of the current shape display system. UnderCut will allow for an extra degree of freedom when compared to the current shape display capabilities.
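As a rough illustration (not part of the proposal itself), the extra degree of freedom can be thought of as each pin gaining a tilt in addition to its height: a tilted pin overhangs its own base, producing the undercuts that a height-only shape display cannot express. All names here are hypothetical.

```python
from dataclasses import dataclass
import math

@dataclass
class Pin:
    """One actuated pin: height as in a standard shape display, plus a tilt."""
    height: float        # vertical extension, in mm
    tilt_deg: float = 0  # tilt from vertical; nonzero tilt creates an overhang

    def tip_offset(self) -> float:
        """Lateral displacement of the pin tip caused by tilting."""
        return self.height * math.sin(math.radians(self.tilt_deg))

# A vertical pin has no lateral reach; a tilted one reaches past its base,
# which is the "undercut" a height-only display cannot produce.
straight = Pin(height=50.0)
tilted = Pin(height=50.0, tilt_deg=30.0)
```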


Idea 2: Braille Stop

Braille

Idea 3: Hey you! Get off that chair!!

Chair

 

Idea 4: Tangible Modeling Environment

In a quest to find more screen-free time in our days as designers, we propose ways to interact with computational data in real time, in the context of physical elements and materials.

Nowadays, we have an enormous number of design tools that allow us to generate complex digital models from extremely simple input (e.g. visual algorithmic environments such as Dynamo or Grasshopper).
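To make the "complex output from simple input" idea concrete, here is a toy sketch (ours, not taken from any of those tools): four scalar parameters are enough to generate every vertex of a twisted tower, the kind of mapping environments like Grasshopper make routine.

```python
import math

def parametric_tower(sides: int, radius: float, floors: int,
                     floor_height: float, twist_deg: float):
    """Generate the vertices of a twisted polygonal tower from four
    scalar inputs: a simple-input / complex-output parametric mapping."""
    verts = []
    for f in range(floors + 1):
        angle0 = math.radians(twist_deg) * f  # each floor rotates slightly
        for s in range(sides):
            a = angle0 + 2 * math.pi * s / sides
            verts.append((radius * math.cos(a),
                          radius * math.sin(a),
                          f * floor_height))
    return verts

# Five numbers in, 66 coordinated vertices out.
tower = parametric_tower(sides=6, radius=4.0, floors=10,
                         floor_height=3.5, twist_deg=5.0)
```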

Hand-made architectural design “working model.”

What if we could use physical elements as inputs (paper, pens, cardboard), manipulated with our hands, so that a modeling environment reacts and "generates" for us?

Teaching a computer to read physical sketches isn't novel: algorithms for automatically interpreting a rough architectural sketch as a consistent 3D digital model have already been created [Interpreting Physical Sketches as Architectural Models, by Barbara Cutler and Joshua Nasman; see images a, b, c, d]. Still, the feedback loop dies when the sketches are used only as input for the computer to generate things on the screen, inside existing CAD software.

Our approach aims to close that feedback loop and bring the generated information back into the physical world through intelligent materials, so the user doesn't need to interact with a computer at all, but can simply play with her hands.

This example shows a simulated digital environment. Our device would let you perform, in the real world, the operations that normally happen inside a CAD program, by interacting with physical paper and other elements while the device displays extrusions and other properties (such as line lengths at scale).

Moving away from outputting the generated information with a projector, due to the limitations that technology presents, we would instead try to embed certain behaviors into an "intelligent" device or material.

Sample algorithm to convert physical objects captured by a camera into lines.
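A toy stand-in for such an algorithm (not the authors' pipeline, and far simpler than a real vision system): binarize a captured frame, then keep the foreground pixels that touch the background, which traces the outline of each physical object as a set of edge pixels.

```python
def binarize(image, threshold=128):
    """Threshold a grayscale frame into foreground (1) and background (0)."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

def outline(binary):
    """Edge pixels: foreground pixels with at least one background
    (or out-of-frame) 4-neighbor."""
    h, w = len(binary), len(binary[0])
    edges = []
    for y in range(h):
        for x in range(w):
            if not binary[y][x]:
                continue
            neighbors = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            if any(not (0 <= ny < h and 0 <= nx < w) or not binary[ny][nx]
                   for ny, nx in neighbors):
                edges.append((x, y))
    return edges

# Synthetic "camera frame": a bright 4x4 square on a dark 8x8 background.
frame = [[200 if 2 <= x < 6 and 2 <= y < 6 else 20 for x in range(8)]
         for y in range(8)]
edge_pixels = outline(binarize(frame))
```

A real implementation would instead run edge and line detectors (e.g. Canny plus a Hough transform) on actual camera frames, but the input/output shape is the same: an image in, line geometry out.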



 

ken

Undercut seems to be an interesting exploration. The cap on each pin of CooperForm can easily be removed, and you can attach different materials. We can think of a variety of extensions to add to each pin to create different materiality or functions.

The architecture one is hard to imagine. Draw sketches to convey the idea. I remember you were saying there would be a projection on your drawing…? How does it actuate? Can this go beyond Sandscape, or is it just similar?

 
