Today, the Google I/O conference is happening here in San Francisco. The talk “Voiding your Warranty: Hacking Glass” included the above video, which features our cofounder Ryan showing off his hacker skills with Glass. Here’s the story behind the video.
Using an avatar as a proxy for communication has many benefits. Your avatar can always look good, be well lit, and appear in an interesting location. However, even the most immersive virtual worlds fall flat when trying to convey the emotional data carried by real-world facial expressions and body language.
From video game controllers to tracking your sleep behavior, there is a good deal of experimentation being done with wearable sensor hardware right now. In addition to soldering together our own creations, we have been checking out work done by others as fast as we can, all with the goal of enabling rich emotional avatar communication.
As you can imagine, when we received our beautiful new Google Glass as part of the Explorer Program, we were eager to see if we could access its sensors and drive our avatar’s head movement (caveat: Google Ventures is one of our investors).
Being the only white guy with a beard here at High Fidelity, working with Glass fell to me 😉 This was a great exercise because it gave us an opportunity to abstract the input layer for multiple device support (we also got Oculus working! Stay tuned for that blog post).
We had previously created an Android app that grabbed all of the phone’s sensor data and sent it over UDP to a configurable port. Imagine holding your phone and being able to twist and move your avatar’s hand: kinda like turning any phone (with sensors) into a Wii controller. Lo and behold, when we plugged our Glass in and tried to run the Android app from our IDE, Glass showed up as a device and it “just worked.” We could not edit the fields in the GUI on Glass, but we could see from the log that it was transmitting the data.
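The sending side of an app like that can be sketched in plain Java (rather than the Android SDK, so it runs anywhere). The `SensorStreamer` name, the nine-float payload layout, and port 9000 are our placeholders for illustration, not the actual app’s wire format:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class SensorStreamer {
    // Hypothetical wire format: nine big-endian floats
    // (accelerometer x/y/z, gyroscope x/y/z, magnetometer x/y/z).
    static byte[] pack(float[] readings) {
        ByteBuffer buf = ByteBuffer.allocate(readings.length * Float.BYTES);
        for (float r : readings) {
            buf.putFloat(r);
        }
        return buf.array();
    }

    // Fire one datagram at the configured host/port. On an Android device
    // the readings would arrive via SensorManager callbacks; here they are
    // placeholder values so the sketch is self-contained.
    public static void main(String[] args) throws Exception {
        float[] readings = {0.1f, 9.8f, 0.2f, 0f, 0f, 0.5f, 30f, 0f, -45f};
        byte[] payload = pack(readings);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getLoopbackAddress(), 9000));
        }
        System.out.println("sent " + payload.length + " bytes");
    }
}
```

Because UDP is connectionless, the receiver (the avatar client) just listens on that port and interprets each datagram as one sensor frame; lost frames are simply skipped, which is fine for continuous motion data.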
For obvious reasons, Glass has some pretty aggressive energy-saving behavior, which made it tricky to keep the transmission alive. We ended up moving the sensor-data transmission into a service layer. To stop transmission, we just turn Glass off.
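The shape of that change can be sketched in plain Java rather than as an actual Android `Service` (the `TransmitLoop` name, the payload, and the ~50 Hz rate are our placeholders): the transmit loop lives on a background thread that keeps running until explicitly stopped, which is roughly what moving it out of the activity and into a service buys you when the screen sleeps.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.util.concurrent.atomic.AtomicBoolean;

public class TransmitLoop {
    private final AtomicBoolean running = new AtomicBoolean(false);
    private Thread worker;

    // Analogous to Service.onStartCommand(): start a background loop
    // that keeps sending frames until explicitly stopped.
    public void start(int port) {
        running.set(true);
        worker = new Thread(() -> {
            try (DatagramSocket socket = new DatagramSocket()) {
                byte[] payload = "sensor-frame".getBytes();
                while (running.get()) {
                    socket.send(new DatagramPacket(payload, payload.length,
                            InetAddress.getLoopbackAddress(), port));
                    Thread.sleep(20); // placeholder rate, roughly 50 Hz
                }
            } catch (Exception e) {
                // a socket error or interrupt simply ends the loop
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    // Analogous to Service.onDestroy(): signal the loop to exit and wait.
    public void stop() throws InterruptedException {
        running.set(false);
        worker.join();
    }
}
```

On Glass itself you would still need to hold a wake lock (or keep the device plugged in) for the service to survive the display sleeping; this sketch only shows the lifecycle split.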
You can see in the video that we have a very low latency connection between human and avatar head movement using Glass!