Project I (Oct. 11~Nov. 1)

Explorations in I/O coincidence: Creating a Synchronized Physical Object

The screen and GUI have become the go-to paradigm for communication, yet they engage only a few of our senses. The way we perceive and experience physical objects has far greater depth, shaped by the affordances and physical limitations of the object. As our world becomes more connected, we will try to break away from the closed paradigm of adding a screen to everything, before the world breaks us!
Current systems for real-time remote collaboration and communication are largely rooted in traditional GUI and voice/video inputs and outputs. With these approaches, the shared experience is mostly limited to projections of the digital world through screens, making use of only a few of the users’ senses and tactile abilities.
On the other hand, interactions with physical objects or the environment let us be more of our natural selves, in all their complexity and full use of the human senses. In addition, these interactions are shaped by the affordances and physical limitations of the objects themselves.
In this class project, you will explore the concepts of I/O coincidence and Synchronized Distributed Physical Objects presented by the Tangible Media Group in the past, but with a twist:
In groups of three, students will be randomly assigned two actuators and two sensors, and will create a set of synchronized objects in the spirit of the Tangible Media project inTouch, using the technical framework given in class.
The input and the output of your systems need to be coupled in a way that makes physical sense, and preferably be poetic, beautiful, and meaningful. Since you will be using a standard framework given by the TAs, and since your devices will have the capability to easily interact with other devices, your systems are expected to be able to interact with other groups’ devices and to produce understandable output. This creates another conceptual challenge, as the affordance of one object needs to be translated to the affordance of another group’s object.
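To make the coupling concrete, here is a minimal Python sketch of the two ideas above: I/O coincidence between a synchronized pair (each object's sensor reading drives the other object's actuator at the same physical point), and translating one object's affordance onto another group's object by mapping between sensor and actuator ranges. This is an illustrative toy, not the class framework; the class names and ranges are assumptions for the example.

```python
class SyncObject:
    """Toy physical object with one sensor and one actuator (hypothetical)."""
    def __init__(self, name):
        self.name = name
        self.sensor = 0.0    # last local sensor reading (e.g. knob angle, 0..1)
        self.actuator = 0.0  # current actuator position

def synchronize(a, b):
    """One exchange step: each object's input becomes the other's output,
    so the point of input and the point of output coincide on each object."""
    a_reading, b_reading = a.sensor, b.sensor
    a.actuator = b_reading
    b.actuator = a_reading

def translate(value, in_range, out_range):
    """Map a reading from one object's sensor range onto another group's
    actuator range (e.g. squeeze force 0..1 -> rotation angle 0..180)."""
    lo, hi = in_range
    out_lo, out_hi = out_range
    t = (value - lo) / (hi - lo)
    return out_lo + t * (out_hi - out_lo)

# Example: a person turns object A's knob; its twin B mirrors the motion.
alpha, beta = SyncObject("A"), SyncObject("B")
alpha.sensor = 0.75            # simulated user input on A
synchronize(alpha, beta)
print(beta.actuator)           # B reproduces A's input: 0.75

# Cross-group example: A's 0..1 squeeze drives another group's 0..180 servo.
angle = translate(alpha.sensor, (0.0, 1.0), (0.0, 180.0))
print(angle)                   # 135.0
```

In a real build, `sensor` would be fed by hardware reads and `actuator` by motor commands, with the exchange step running over whatever transport the class framework provides.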
Key qualities to keep in mind:

  • Does your interface provide clear affordances for how people will interact with it?
  • Is your interface able to provide understandable information as output?
  • Can both parties, through the simplicity of your implementation of I/O coincidence, coherently interact with each other’s telepresence?

Suggested timeline:

  • 1 week of brainstorming, sketching, and discussions with your group. By the end of the week you should know what you plan on doing!
  • 1 week of preliminary research, conceptual + technical, and initial prototyping
  • 1 week of prototyping, testing, implementing, and documenting (documentation is important!)

Each group will give a short presentation on its project on November 1st.

Deliverables:

  • Short video showing synchronized interaction of two people with the final project.
  • 5-minute presentation + 5-minute discussion in which you address the following about your project: which sensors and actuators you used, how you channeled them to capture a specific interaction with your object, how the physical affordances of your object are used to convey information/feelings/experiences to someone else, and how you imagine these affordances being passed on to someone with a different object that can communicate with yours.
  • Project overview page uploaded to WordPress with any related media, animations, or photographs.
  • Demo of your system working with your second object and with other groups’ objects.

Suggested readings for Project I:

Victor Su. The Design and Implementation of inTouch: A Distributed, Haptic Communication System. Thesis (M.Eng. and S.B.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999.