Head Tracking on the iPhone

For those of you who haven’t seen head-tracking displays, let me direct you to this video which describes the basics of what’s going on.

Basically, what’s shown on the screen updates in relation to the angle at which you view the screen. The end result turns the screen of your TV or iPhone into a window into a virtual world. It’s a virtual-virtual-reality. Awesome!

I would love to play an FPS on my iPhone, tilting and turning the phone to shoot from behind objects or peek around corners. Racing games could be equally cool, letting you glance left or right into your rear-view mirrors as you race through LA.

Technology-wise, the YouTube video linked above relies on IR sensors to track your head position relative to the TV – but that’s because the TV is stationary and your head is moving. With the iPhone, the same effect can be achieved by rotating the phone in your hands while keeping your head stationary. The iPhone’s accelerometer may be able to provide all the information needed to determine the phone’s orientation during the game.
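As a rough sketch of what that orientation estimate might look like, here’s the standard trick of recovering tilt from the gravity vector the accelerometer reports. The function name and axis convention are my own assumptions for illustration, not iPhone SDK calls:

```python
import math

def orientation_from_gravity(ax, ay, az):
    """Estimate pitch and roll (in radians) from a 3-axis
    accelerometer reading, assuming the phone is held roughly
    still so the reading is dominated by gravity.

    Assumed axis convention: x points right across the screen,
    y points up toward the earpiece, z points out of the screen.
    A phone lying flat, screen up, reads (0, 0, 1) in units of g.
    """
    # Pitch: tilt toward/away from the viewer.
    pitch = math.atan2(-ay, math.sqrt(ax * ax + az * az))
    # Roll: tilt left/right in the viewer's hands.
    roll = math.atan2(ax, az)
    return pitch, roll

# Flat on a table, screen up: both angles come out ~0.
print(orientation_from_gravity(0.0, 0.0, 1.0))
```

Note the caveat baked into the docstring: gravity alone can’t distinguish a tilt from a sideways shove, which is one reason the accelerometer “can (might?)” rather than definitely can do the job.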

If anyone knows of any stub code for head trackers written in Objective-C, please leave a note in the comments. My gut tells me that if virtual-virtual-reality is ever to find its way onto the iPhone, it’s going to have to be written largely from scratch. Please, someone, prove me wrong!

3 thoughts on “Head Tracking on the iPhone”

  1. Have you seen iHologram on youtube? That’s basically what you’re talking about. The problem is that while you can detect movement of the iPhone, you cannot detect the location of a person’s head/eyes.

    You’d need to know where that point is relative to the position of the iPhone; from there, the iPhone’s movement could be tracked relative to the person’s head using the accelerometer.

    What you might be able to do is, on launch of the game/app, run through an initializer that asks the user to do a few things in order to establish a known position for the iPhone, which movement could then be calculated from using the accelerometer.

    The code you’re looking for isn’t so much head-tracking code as it is anamorphic 3D rendering code. The “head tracking” would really just be iPhone tracking relative to a static point (the person’s head) using the accelerometer, which I believe there is already enough code out there for.

  2. That iHologram looks pretty cool – I hadn’t seen it before.

    What you’re saying is spot on – it can’t be true head tracking, but the principles should be the same as Labyrinth: just assume the phone is two feet or whatnot from the eyes, and calibrate the ‘home’ angle of the phone. From there, you should have all the info you need to fake what amounts to head tracking.

    The result should end up looking even more 3D than iHologram, since you can assume a starting angle of 35 degrees and infer from there.

    Thanks for the comment!

    Cheers
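Putting the two comments together, a minimal sketch of the “calibrate a home angle, assume a fixed eye distance” idea might look like this. The constant and function names are hypothetical, chosen for illustration – the two-foot viewing distance is the assumption from the thread above:

```python
import math

# Assumed fixed viewing distance: roughly two feet, per the comments.
EYE_DISTANCE = 0.6  # metres

def fake_eye_position(pitch, roll, home_pitch, home_roll,
                      eye_distance=EYE_DISTANCE):
    """Fake head tracking by inverting the problem: the head is
    assumed fixed at a known distance, and the phone rotates
    beneath it. Returns the eye's position in the phone's frame,
    which a renderer could use as its virtual camera position.

    All angles are in radians; (home_pitch, home_roll) is the
    orientation captured during the calibration step.
    """
    dp = pitch - home_pitch   # how far the phone has tipped since calibration
    dr = roll - home_roll
    x = eye_distance * math.sin(dr)
    y = eye_distance * math.sin(dp)
    z = eye_distance * math.cos(dp) * math.cos(dr)
    return x, y, z

# At the calibrated home angle, the eye sits straight ahead of the screen.
print(fake_eye_position(0.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.6)
```

Feeding that eye position into an off-axis (anamorphic) projection is then the rendering half of the problem the first commenter describes.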
