“Emperor Constantine VII received foreign guests while seated on a throne flanked by golden lions that ‘gave a dreadful roar with open mouth and quivering tongue’ and switched their tails back and forth. Next to the throne stood a life-sized golden tree, on whose branches perched dozens of gilt birds, each singing the song of its particular species. When Liudprand performed the customary prostration before the emperor, the throne rose up to the ceiling, potentate still perched on top. At length, the emperor returned to earth in a different robe, having effected a costume change during his journey into the rafters.”
My midterm project was a visual illusion that takes advantage of the limits of human perception. In a video that flashes through a deck of shuffled cards at one frame per card, the typical viewer can’t pick out any single card, since each is shown for only 4 milliseconds. A card shown for 2-3 frames is visible to some viewers, and 4 frames is enough for the majority of viewers to see it.
The illusion is inspired by a trick popularized in the movie Now You See Me, featuring Jesse Eisenberg. Here is a clip of him performing it in the movie.
I experimented with many variants of the video to ensure that it seemed smooth and homogeneous while still maintaining a high success rate for the illusion. This meant showing the desired card for various numbers of frames and experimenting with blur to make the non-target cards harder to detect.
The video from Now You See Me had extreme blurring on every card except the 7 of Diamonds, which made sense given the dynamic action of riffling through cards. Because my video simply showed still pictures of cards, there was no believable reason why cards in the middle of the video should be heavily blurred.
My final video combined longer exposure to the target card with blurring of the non-target cards. The cards became steadily more blurred over the first half-second of the video, from zero blur to a 16-pixel Gaussian blur. About two thirds of the way through the video, the desired card was shown for four frames. The first and fourth frames were blurred (20px Gaussian), and the middle two frames were entirely unblurred. This allowed clear viewing of the card for two full frames while still giving the impression that the scene was constantly shifting, since the jump from the blurred card to the unblurred card was mistaken for a jump to a completely different card. In contrast, simply showing the desired card for four frames was always recognized as a “blip” in the video by viewers.
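For clarity, the blur schedule can be expressed as a small function. This is an illustrative sketch only: the frames were actually assembled by hand in the editor, and the 30 fps timeline and function name here are assumptions, not values from the project files.

```javascript
// Illustrative sketch of the blur schedule described above.
// Assumes a 30 fps timeline; the real frames were built by hand.
var FPS = 30;

function blurForFrame(frame, targetStart) {
  var rampFrames = FPS / 2; // first half-second: ramp from 0 to 16 px
  if (frame < rampFrames) {
    return 16 * (frame / rampFrames);
  }
  var offset = frame - targetStart; // four-frame target window
  if (offset === 0 || offset === 3) return 20; // blurred bookend frames
  if (offset === 1 || offset === 2) return 0;  // two clear frames
  return 16; // steady-state blur for every other card
}
```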
To add the “Inception” component of the trick, I had to find a way to tell the computer which card to force on the spectator. I made an mp4 video for each non-face heart card (in addition to the seven of diamonds). By embedding the video in a webpage, I was able to process keyboard input via JavaScript: on that page, pressing a number 1-9 (call it x) automatically changes the video’s src attribute to the video with the x of Hearts as the target. This lets me take a card suggestion from the audience (using a plant to guarantee a heart card) and then force that card on the spectator.
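The key handler itself is only a few lines. A minimal sketch of the idea (the element id and file-naming scheme here are illustrative placeholders, not the actual project files):

```javascript
// Swap the target video when the performer presses a digit key.
// The id "trick" and the file names are hypothetical placeholders.
document.addEventListener('keydown', function (event) {
  if (event.key >= '1' && event.key <= '9') {
    var video = document.getElementById('trick');
    video.src = 'videos/hearts-' + event.key + '.mp4';
    video.load(); // reload the element with the new source
  }
});
```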
I used Final Cut Pro for all the editing, because iMovie doesn’t allow frame-by-frame editing. I downloaded a Zip of the card images from http://www.jfitz.com/cards/.
Here is the video of the final trick performed (courtesy of Jon Bobrow).
A presentation that points out that color is more than simply a science of numbers and convoluted terms: rather, the existence of color makes our world the wonderful place we perceive.
From dull & dreary to gorgeously beautiful.
Ex) [two before-and-after image pairs]
Project Description
Two guest lecturers, Professor Winberg and Professor Landers (a tribute to the horrible & excellent Introduction to Biology professor pair, Weinberg & Lander), give a presentation on “Color as We See It”. Winberg befuddles the audience with a decisively unnecessary use of excessively long words and other absurdities of speech, including details on what will not be covered and the importance of understanding minuscule details rather than the idea as a whole. Landers then takes the scene and shows why color is an amazing thing in our world, one that provides life and vitality in its simple existence.
With color, the dull becomes interesting, the usual becomes beautiful, and our world, magical.
The initial idea for my Midterm presentation was to expand on my Trick++, which was based on the idea of an abrasive professor character with a finicky presentation that acts up against my will. I was very excited by the response to my Trick++ character and excited about the directions I could take the personality I had invented.
However, when I tried to write actual content for the extension of the character in my Midterm presentation, I ran into a wall: I couldn’t think of any explanation for WHY my PowerPoint would be acting up behind me. This lack of a practical explanation drove me away from the crazy professor persona and left me in need of a presentation topic.
Idea Potentials
TED talk-like
sailing
colors (history of?)
my life
crazy/weird teaching/guest lecturer
Possible Technological Components
Face recognition component
(Color changing?)
Reveal in pre-sent email
Reveal on website
Class color know-er
Unused Coding Sketches (can include code on request?)
SendEmail
Play Video
Input GUI
Current Plan for Final
Return to the abrasive, hard-nosed professor persona and the absurdly changing slides. =) The cheeky, misbehaving slides will be the fault of a TA who is finally getting back at the mean, unfriendly, and unpopular prof. Initial idea-generating phrase: the professor’s area of expertise will be her downfall? Or just absurdity.
My midterm project was an extension of my Trick++, and used the same deck of NFC-tagged cards I created for that project. This time, I developed the trick to move the reveal away from my phone and onto a volunteer’s: having the selected card appear on their smartphone instead of simply showing it on mine.
This was accomplished mostly with software development on my Android app. Since the application can know which card is selected long before the reveal, it isn’t limited to just displaying the image of the card. For this midterm I added functionality to automatically text the card to any phone, or send the card as an email to any address.
On the technical side, this was not difficult to implement. Sending SMS is trivial within Android apps and required just a few lines of code. Email was only slightly more difficult, since it needs to authenticate with some sort of account, so I decided to just use a Parse app and a free Mailgun account. I also tried to get the app to post to the group Slack, but since I’m a restricted user I don’t have the right access privileges.
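For reference, the email path can be a single Parse Cloud Code function calling the Mailgun module. A rough sketch along the lines of the old Parse Cloud Modules docs (the domain, API key, addresses, and function name are all placeholders):

```javascript
// Parse Cloud Code sketch: email the revealed card via Mailgun.
// Domain, API key, addresses, and the function name are placeholders.
var Mailgun = require('mailgun');
Mailgun.initialize('example.mailgun.org', 'key-xxxxxxxxxxxx');

Parse.Cloud.define('emailCard', function (request, response) {
  Mailgun.sendEmail({
    to: request.params.to,
    from: 'magic@example.mailgun.org',
    subject: 'Your card',
    text: 'Your card was the ' + request.params.card + '.'
  }, {
    success: function () { response.success('Sent.'); },
    error: function (err) { response.error(err); }
  });
});
```

The Android app can then invoke a function like this with the Parse SDK’s ParseCloud.callFunctionInBackground, passing the recipient address and selected card as parameters.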
The code for the app is available here; it is a little better organized than the version from the Trick++. The app has also picked up some additional features, such as card history/deck tracking, in case I decide to integrate those into a performance someday as well.
In review: although the in-class performance didn’t quite work out (the failure came down to 1) my phone’s WiFi being off and 2) T-Mobile’s poor data coverage in the depths of the Media Lab), I think this trick was an intuitive and natural extension of what I had already built. I wish I had had more time to develop more export destinations (Slack, Facebook, Twitter… anything!), but I think the trick played well as is. This midterm will probably also wrap up the career of my NFC deck, since I have a few separate ideas I’d like to pursue for my final.
Here is a video of the in-class performance. Unfortunately, the iPhone 5s was not playing, which both took away from the overall effect and distracted me during the second half of the video. Check back soon for a staged (but better!) performance of the trick.
Three Videos At Once
Here is a video that shows all three of the clips playing simultaneously. The top clip is the video displayed on the Projector (by means of an iPhone 4S), the bottom left video is played on the iPad, and the bottom right video is played on the iPhone 5s.
Main Idea
The main idea was to tell the story of the computer using multiple displays (the projector, an iPad, and an iPhone) plus some physical devices.
The inspiration for this trick was Marco Tempest’s iPod TED Talk trick. I love the way he augments his storytelling with “tricks” spread throughout. I wanted to build on this idea by holding the screens vertically (instead of laying them on a table) as well as using multiple screen sizes. I tried multiple approaches (see “Expanding on the Idea”) to get this effect. After trying many methods, Marco’s MultiVid software ended up being the best way to perform the trick.
How It Was Done
Here is a diagram of the system in place. The trick uses 4 videos playing at the same time: one for the iPad, one for the iPhone, one for the projector (which is fed from a second iPhone), and a 4th “teleprompter” video that only the magician can see (so he/she knows where they are in the trick). All 4 playback devices are connected to a MacBook, and all 5 devices are running MultiVid (available for free online).
I purchased an iPad case that allows for holding it with your palm, so my hand would not get in the way of the screen. I could not find a similar product for the iPhone; I tried to make a “handle” for the back of the iPhone, which failed miserably. I also used a 30-pin-to-VGA adapter to send the iPhone’s video to the projector.
The videos were all created using a 30-day free trial of Final Cut Pro X [FCPX] (some of the animations and “slides” were created in Keynote.app, then exported as QuickTime files for use in FCPX). All of the audio across the 3 video displays the audience saw was pushed to a single device (the projector-connected iPhone) to get the audio through the room’s speaker system, and to prevent slightly out-of-sync audio (which is more obvious than out-of-sync video).
Audio: The audio was created in the same way as in my previous trick (see: Siri++), except that I augmented Siri’s voice with additional sounds and music. The music included instrumentals of The Beatles & Bob Dylan (Jobs’s favorite bands… but mine too, so there’s not too much of a connection there 😉). Most of the sound effects came from the default iLife sound effects (which come with Garageband.app or iMovie.app). All audio editing was done within FCPX.
Video: After some failed attempts to get After Effects up and running (I hope to have it figured out by the final presentation), I defaulted to Keynote animations and the FCPX free trial.
Expanding on the Idea
I would love to get interactive elements working on multiple screens; however, MultiVid only supports video. It would be great to write software to make interactivity work across screens, along the lines of the sketch below. (I am going to try to play around with this, but for it to work, I need to figure out how to do it in a way that doesn’t take as long as a PhD thesis [see: THAW].)
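As a starting point, the cross-screen cueing could be as simple as a tiny relay server that every device connects to. A speculative sketch using Node.js and the “ws” npm package (none of this is part of the performed trick; port and message format are made up):

```javascript
// Speculative sketch: a cue-relay server for multi-screen interactivity.
// Requires Node.js and the "ws" npm package; not part of the actual trick.
var WebSocketServer = require('ws').Server;
var server = new WebSocketServer({ port: 8080 });

server.on('connection', function (socket) {
  socket.on('message', function (message) {
    // Relay cues like {"action":"play","time":12.5} to every screen,
    // so each device can seek its local video to the same point.
    server.clients.forEach(function (client) {
      client.send(message.toString());
    });
  });
});
```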
Also, after working on the trick, I see why Marco did it on a flat surface. You are limited by what you can hold if you want to do the trick vertically. I will have to see how this can be improved.