jfinch – Tangible Interfaces (MAS.834)
https://courses.media.mit.edu/2016fall/mas834
Feed last updated: Sun, 05 Feb 2017 17:11:53 +0000

“Art Alive”
https://courses.media.mit.edu/2016fall/mas834/2016/12/07/art-alive/
Wed, 07 Dec 2016 20:38:32 +0000

Our team was interested in making art more dynamic: art that responds to the unique behaviors of the people viewing it at any given moment (the number of people present, their proximity to the piece, where they’re looking within the frame, etc.). This creates new affordances between viewers and the art itself. Our hope is that we could eventually bring art – and its contents – to life.

We picked Andy Ryan’s photograph of Marvin Minsky’s home to illustrate some of these concepts:

We then explored ways to use computer vision to capture information about where people were looking and where they were positioned relative to the piece. This is a short demo of how we might capture and process viewer data using facial recognition software and the front-facing camera on an iPhone (the circle follows the movement of Tamer’s face).
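The mapping from a detected face to a region of the artwork can be sketched in a few lines. This is a hypothetical illustration, not the team’s actual code: it assumes a face detector (such as the iOS facial-recognition software the demo used) has already returned the face’s center in camera pixel coordinates, and it bins that position into one of several vertical strips of the image. The function name, number of strips, and mirroring convention are all assumptions.

```python
def region_for_face(face_center_x, frame_width, n_regions=3):
    """Map a face's horizontal position in the camera frame to one of
    n_regions vertical strips of the artwork.

    A front-facing camera sees a mirror image of the scene, so the
    horizontal coordinate is flipped before binning.
    """
    # Normalize the pixel coordinate to [0, 1].
    u = min(max(face_center_x / frame_width, 0.0), 1.0)
    # Mirror left/right so strips match the viewer's point of view.
    mirrored = 1.0 - u
    # Bin into one of n_regions strips (0 = leftmost, from the viewer's side).
    return min(int(mirrored * n_regions), n_regions - 1)


print(region_for_face(320, 640))  # face in the center -> strip 1 (middle)
```

Re-running this per video frame, and for every detected face, would give the piece a continuously updated picture of where its viewers are standing.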

Finally, we created a small-scale prototype (of Ryan’s photograph) with a layer of clear thermochromic paint, which revealed different parts of the image depending on where viewers were situated. This is a short demo of how this paint could let us change colors dynamically and direct focus to different areas.

(The last audio sample in this clip is Minsky himself.)

Team: Tamer Deif, Shruti Dhariwal, Clara Lee, Jeremy Finch
Thanks to Prof. Hiroshi Ishii, Penny Webb, Dan Fitzgerald, and Udayan Umapathi

Jeremy Finch – Project 2 Brainstorming
https://courses.media.mit.edu/2016fall/mas834/2016/11/16/jeremy-finch-project-2-brainstorming/
Wed, 16 Nov 2016 18:10:20 +0000

Two rough concepts: one about biofeedback, the other about 4D printing and flowers.

(Concept sketches: tangible2_concepta, tangible2_conceptb)


There have been many projects that connect heart rates or other biometric readings to wearable “tangible” outputs, or to other people. See:

http://www.thetouchx.com/index.html#Home

The idea of discretizing interactions with 4D printed devices by freezing cubes of reactants/solvents is interesting. Keep developing.

P.S. I wrote the code to print the Harvard flowers – I can help with 4D printing.

-Dan


Udayan’s comment: Enough work has been done in the past on connecting heart rate to a physical output, and so on. But you could build on other interaction scenarios, and that could be interesting. Take a look at this inflatable jacket and this helmet. Maybe you can build on top of these existing objects by defining more concrete scenarios within the space of transformable objects.

The morphing-over-time idea is a good direction, but why ice cubes? This direction, like the previous one, needs a more concrete scenario.

P1 Concepts (Jeremy Finch)
https://courses.media.mit.edu/2016fall/mas834/2016/10/05/p1-concept-proposals-jeremy-finch/
Wed, 05 Oct 2016 18:45:24 +0000

(Concept sketches: spices, bench, heels)

Jeremy Finch
https://courses.media.mit.edu/2016fall/mas834/2016/09/19/jeremy-finch/
Mon, 19 Sep 2016 17:59:53 +0000

I have a background in UX research, ergonomics, and usability testing for consumer and medical product development. I love drawing, comic books, and film editing. Lately, I’ve been teaching myself animation and motion graphics. I’m currently a second-year MBA student at Sloan.

Website: www.jeremyafinch.com
Email: finchj@mit.edu
