After the wiimote headtracking experiments, I felt that having two extra accessories (wiimote and a sensor bar) just wouldn't cut it for any practical application of a deep UI. Instead, I downloaded OpenCV (an open source computer vision library) to track the user's eyes and perspective-transform the UI based on that.
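For the curious, here's a minimal sketch of the tracking half, assuming a standard webcam and OpenCV's bundled frontal-face Haar cascade; the actual build may track eyes or use a different detector, and the names and parameters here are just illustrative.

```python
# Rough sketch: grab webcam frames, find the largest face, and report its
# center as a normalized offset the UI layer can consume.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)

    if len(faces) > 0:
        # Take the largest detection as the user's face.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        cx, cy = x + w / 2.0, y + h / 2.0

        # Normalize to [-1, 1] so the UI layer can treat it as a view offset
        # (roughly the "blue dot" position in the recording below).
        fh, fw = gray.shape
        nx = (cx / fw) * 2.0 - 1.0
        ny = (cy / fh) * 2.0 - 1.0
        print(nx, ny)

    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
```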
Sorry that there’s no live camera feed with this recording; you’ll just have to trust that I sat here and moved my face back and forth and magically made the UI swing around to match. I did a slow pan from right to left, followed by a slow, low pan from left to right, and then some miscellaneous movements. Use the green and blue dots as references for the perspective transform: the blue dot is where my face is (in the camera image), and the green dot is the canvas center, translated along the positive Z axis to show the transform more clearly.
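And here's one plausible way to turn that face offset into the swing you see in the video: tilt the UI plane toward the viewer and re-project it onto the screen. This is only a sketch of the general technique, not the exact math behind the recording; the angles, distances, and sign conventions are assumptions and may need flipping for a given camera setup.

```python
# Sketch: rotate the UI quad by an angle proportional to the face offset,
# project the corners with a simple pinhole model, and warp the rendered UI.
import numpy as np
import cv2

def swing_ui(ui, eye_x, eye_y, max_angle=0.35, eye_dist=2.0):
    """Warp a rendered UI image as if tilted toward a viewer whose face sits
    at normalized offset (eye_x, eye_y) in [-1, 1]."""
    h, w = ui.shape[:2]
    half_w, half_h = w / 2.0, h / 2.0

    # Tilt angles (radians) from the tracker output; signs depend on your
    # camera/screen conventions and may need flipping.
    yaw = -eye_x * max_angle
    pitch = eye_y * max_angle

    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    # Rotation about the vertical axis (yaw), then the horizontal axis (pitch).
    R = np.array([[ cy, 0, sy],
                  [  0, 1,  0],
                  [-sy, 0, cy]]) @ np.array([[1,  0,   0],
                                             [0, cp, -sp],
                                             [0, sp,  cp]])

    corners = np.array([[-half_w, -half_h, 0],
                        [ half_w, -half_h, 0],
                        [ half_w,  half_h, 0],
                        [-half_w,  half_h, 0]], dtype=float)
    rotated = corners @ R.T

    # Pinhole projection with the virtual camera eye_dist half-widths in
    # front of the canvas center; corners rotated toward it appear larger.
    d = eye_dist * half_w
    dst = []
    for x, y, z in rotated:
        s = d / (d - z)
        dst.append([x * s + half_w, y * s + half_h])

    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(src, np.float32(dst))
    return cv2.warpPerspective(ui, M, (w, h))
```

Feeding the normalized face position from the tracker into something like `swing_ui(frame, nx, ny)` each frame is enough to reproduce the basic effect; a real deep UI would instead shift each layer by its depth, which is what the blue and green dots are hinting at.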