Projection Mapping
Interactive Head
This is an interactive, projection-mapped head. The animations cycle between hand-animated faces and AI-generated human faces (courtesy of @fanchyart on IG). When users stand in front of the camera, their faces are automatically projected onto the head; when no faces are detected by the camera, the animations resume cycling at random.
Build Process:
HEAD: I 3D-modeled the head in Blender, then exported an unfolded paper model as an SVG, which I carved into corrugated plastic with my X-Carve CNC. I zip-tied the 2D pieces together to form the 3D head.
ANIMATIONS: All background animations are from Beeple and STVinMotion. The animated faces were made by me in Blender and Resolume Arena. The human faces were generated in collaboration with fanchyart.
ARDUINO: An Arduino with a time-of-flight sensor detects whether users are standing in front of the camera. When no one is present, the Arduino randomly chooses a new animation to play every 20 seconds.
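The idle-mode behavior can be sketched like this (a Python stand-in for the Arduino firmware logic; the clip count and interval constant are illustrative, not values from the actual build):

```python
import random

IDLE_INTERVAL_S = 20   # seconds between animation changes (per the write-up)
NUM_ANIMATIONS = 8     # hypothetical number of clips loaded in Resolume

def next_animation(current, num_clips=NUM_ANIMATIONS, rng=random):
    """Pick a random clip index that differs from the one currently playing,
    so the head never appears 'stuck' on the same animation."""
    choices = [i for i in range(num_clips) if i != current]
    return rng.choice(choices)
```

On the real hardware this choice would run inside the Arduino loop, gated on the time-of-flight reading and a 20-second timer, with the chosen index sent out as a MIDI note.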
PROCESSING 3.0: When the Arduino detects users, a Processing sketch runs an OpenCV face-detection program that finds faces, then scales and centers them so that they match the dimensions of the head sculpture.
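The scale-and-center step reduces to a little bounding-box math. A minimal sketch (Python for illustration; in the actual project this runs in Processing, and the head dimensions and face box here are made-up example values):

```python
def fit_face_to_head(face, head_w, head_h):
    """Given a detected face bounding box (x, y, w, h) in camera pixels,
    return the uniform scale and the (dx, dy) translation that center the
    face inside a head projection region of size head_w x head_h."""
    x, y, w, h = face
    scale = min(head_w / w, head_h / h)   # uniform scale, preserves aspect ratio
    new_w, new_h = w * scale, h * scale
    # translation that moves the scaled face box to the center of the region
    dx = (head_w - new_w) / 2 - x * scale
    dy = (head_h - new_h) / 2 - y * scale
    return scale, dx, dy
```

Applying this transform to the camera frame before sending it to the projector keeps the face aligned with the sculpture regardless of where the user stands.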
VJ: Resolume Arena holds all of the pre-rendered animations and video effects. Each animation is assigned a MIDI note number that the Arduino triggers to start it.
PURE DATA: A Pure Data patch runs in the background to manage MIDI communication between the Arduino, Processing, and Resolume Arena.
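The routing itself is simple: each source is mapped to a destination, and raw MIDI messages are forwarded along that mapping. A minimal sketch of the idea (Python stand-in for the Pure Data patch; the route names are hypothetical):

```python
def route_midi(source, status, data1, data2, routes):
    """Forward a raw 3-byte MIDI message (status, data1, data2) from
    `source` to the destination named in `routes`, or drop it if the
    source has no route."""
    dest = routes.get(source)
    if dest is None:
        return None  # unknown source: message is discarded
    return (dest, (status, data1, data2))

# hypothetical routing table for this installation
ROUTES = {
    "arduino": "resolume",     # animation-trigger notes
    "processing": "resolume",  # live-face layer on/off
}
```

In Pure Data this is a handful of [notein]/[noteout] objects wired through a [route]; the table above just makes the data flow explicit.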
GIF 1: CAMERA TURNS ON WHEN FACE DETECTED
GIF 2: FACE AUTOMATICALLY CENTERS AND SCALES CORRECTLY
IMG 3: CAMERA DETECTION MODULE
PROJECTION MAPPED PYRAMID
PROJECTION MAPPED FACE
I am in the process of building a system that projects images onto faces (GIF 2). I use the Kinect's IR camera (GIF 1) to track faces because regular cameras cannot track faces in low light, and they create unintended (albeit very cool) feedback loops with the projected images (GIF 3).