We spend roughly one-third of our lives sleeping, yet the technology that supports this daily necessity has evolved very little. The vast and varied technologies we interact with during our waking hours have advanced significantly, improving our daily lives and expanding our capabilities. While the communication devices and vehicles we use each day would be unrecognizable to people who lived just two hundred years ago, the beds and blankets we use would be familiar to people who lived thousands of years ago, both in the materials they are made of and in their capabilities.
SleepScape seeks to improve your sleep experience by learning your habits and adapting to them over time. Perhaps you are cold when you first get in bed each night, but wake up overheated a few hours later. SleepScape can increase airflow, or adjust its position to help regulate your body temperature. It can comfort you when you are restless by hugging your body or gently wake you in the morning by nudging you. We’ve also imagined extended capabilities, like dream monitoring and interpretation.
SleepScape is a blanket that contains a flexible mesh of electroactive polymers/muscle wires. Embedded in this mesh are sensors that track the user’s temperature as well as the ambient temperature in the room. It also contains accelerometers, proximity sensors, and pressure sensors to gather information about the user’s movement and posture relative to the blanket. All of these readings can be tracked for specific areas of the grid.
Shape-changing capabilities are activated by electrical impulse. The network of muscle wire is strategically activated in order to produce movement. Different areas gradually contract or expand, allowing for purposeful yet subtle movements, like the blanket rolling off a user who is too hot.
Over time, the blanket learns things like the user’s optimum sleep temperature, and will regulate this by acting before the user has become too hot or too cold.
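One way to picture this sense→learn→act loop is the sketch below. The post doesn’t specify an implementation, so all of the class and method names, thresholds, and the running-mean “learning” rule are hypothetical placeholders; a real system would weight readings by sleep quality and comfort feedback.

```python
from collections import deque

class BlanketZone:
    """One cell of the sensor/muscle-wire grid (hypothetical model)."""
    def __init__(self):
        self.history = deque(maxlen=1000)  # recent skin-temp readings (deg C)

    def record(self, skin_temp):
        self.history.append(skin_temp)

    def learned_setpoint(self, default=33.0):
        # The learned "optimum" is sketched here as a simple running mean
        # of past readings; a real system would learn far more carefully.
        return sum(self.history) / len(self.history) if self.history else default

    def action(self, skin_temp, margin=0.5):
        """Act *before* the user is uncomfortable by comparing the current
        reading against the learned setpoint plus a small margin."""
        target = self.learned_setpoint()
        if skin_temp > target + margin:
            return "contract"   # pull back / increase airflow
        if skin_temp < target - margin:
            return "expand"     # settle closer to retain heat
        return "hold"

zone = BlanketZone()
for t in [32.8, 33.1, 33.0, 32.9]:
    zone.record(t)
print(zone.action(34.2))  # warmer than the learned baseline -> "contract"
```

Running one such loop per grid cell is what would let different areas of the blanket respond independently.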
Below is an image of the flexible mesh. The colored dots represent sensors embedded in the mesh. As stated above, the movement of the blanket is controlled by strategically activating muscle wires to choreograph specific movements, like the blanket rolling off the user. The center image is a time-lapse of muscle wire contracting. We’ve played around with the idea of the blanket becoming more or less dense to facilitate airflow, as in the animation on the right, but there are easier ways to create heat-activated vents. This capability can be built into the cloth itself, instead of requiring a Pneuduino to pump air in and out.
Extended Capabilities: See Dreamscape
Dreamscape is a customizable, modular sleep surface. Inspired by the way the subconscious assimilates outside stimuli during sleep, Dreamscape seeks to manipulate dreams. Various sensors monitor the user’s sleep state, and once REM sleep has been reached, Dreamscape comes alive. Perhaps you want to feel as though you’re lying in a field, sleeping under the stars: you’ll hear crickets chirping and feel a light breeze. Or maybe you want to feel as though you’re riding Falkor: a gentle rising and falling simulates the breath of the giant dragon.
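The REM-gated behavior could be sketched as below. This is illustrative only: the sensor inputs, thresholds, and scene names are assumptions (real sleep staging combines EEG, heart-rate variability, and actigraphy), and the "stimuli" are just labels standing in for actuators.

```python
def classify_sleep_state(heart_rate, movement):
    """Toy classifier with made-up thresholds; real systems use
    EEG/actigraphy-based sleep staging, not two scalar cutoffs."""
    if movement > 0.5:
        return "awake"
    if heart_rate > 62:  # REM tends to raise heart rate relative to deep sleep
        return "rem"
    return "non_rem"

def dreamscape_tick(heart_rate, movement, scene="field"):
    """Trigger scene stimuli only once REM is detected; stay quiet otherwise."""
    state = classify_sleep_state(heart_rate, movement)
    if state == "rem":
        return {"field": ["cricket audio", "light breeze fan"],
                "falkor": ["slow surface rise and fall"]}[scene]
    return []

print(dreamscape_tick(heart_rate=68, movement=0.1, scene="field"))
```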
Proposal 2: Materiality seeks to build capabilities into a material and let the user determine its function by manipulating its form in a variety of ways. See this short animation.
Interaction with dreams is a very powerful idea. There are strong ideas for I/O, with REM state as input and sensory stimuli to influence dreams as output. The interesting aspect is that, although we are interacting with the device, it is only subconscious. Does the device then need to take some additional role as decision-maker or driver of the interaction? Talk to Dan Novy (guest speaker next class) about hallucinations. Talk to me about smart pillows.
-Dan
The goal of this piece is to create a means of fluid collaborative painting between two individuals in different locations. Each individual’s experience will yield a physical piece of artwork. The two will be similar, but not exactly the same. Their similarity represents a collaboration/connection, while their differences acknowledge the space between and illustrate the organic nature of the materials.
In my vision of this, the actions of the individual painting appear on the remote surface as ink bleeding through paper. As one person paints, the brush strokes of the other person slowly appear. There is a choreography to the way each person responds to the other.
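The slow "ink bleeding through" effect amounts to capturing a stroke locally and replaying it gradually on the remote surface. A minimal sketch, assuming a list stands in for the network channel and a callback stands in for the drawing mechanism (both hypothetical):

```python
import time

def send_stroke(points, channel):
    """Capture a local brush stroke and push it to the remote canvas.
    `channel` is a stand-in for whatever transport connects the two sites."""
    channel.append({"points": points, "t": time.time()})

def replay_stroke(stroke, draw, bleed_delay=0.05):
    """Redraw a received stroke point by point, pausing between points so it
    appears to bleed through the paper rather than snap into place."""
    for x, y in stroke["points"]:
        draw(x, y)
        time.sleep(bleed_delay)

channel, drawn = [], []
send_stroke([(0, 0), (1, 2), (2, 3)], channel)
replay_stroke(channel.pop(0), lambda x, y: drawn.append((x, y)), bleed_delay=0)
print(drawn)  # the remote copy of the stroke: [(0, 0), (1, 2), (2, 3)]
```

Differences between the two physical pieces would then come for free: the replay mechanism, pigment, and paper all introduce their own variation.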
I have several concerns. One is that the drawing mechanism might be loud or clumsy and distract from the experience. Another is that the two individuals might not be able to paint at the same time, because the sensor recording the act of painting could get in the way of the mechanism recreating the remote painter’s brush strokes.
-Kristin Osiecki
Kristin Osiecki
I’m a Designer, Educator and Maker whose work focuses on the intersection of the visual arts and technology, exploring electronics and their potential for self-expression, as well as their ability to make subjects like physics, programming and math both tangible and interactive. This work led me to pursue an Ed.M. through HGSE’s Technology, Innovation and Education program.
I hold a BFA in Graphic Design from Rhode Island School of Design, as well as an MAT. Prior to starting at Harvard in August, I spent five years teaching a broad range of visual arts courses at the high school level.
The best way to contact me is via kosiecki@alumni.risd.edu