
High Fidelity received many wonderful submissions for its STEM VR grant challenge and has selected two recipients. They are, in no particular order:

TCaRs – An awesome racing game where you edit JavaScript to customize your car’s handling, create unique power-ups, and optimize performance, modifying the program code through the Blockly API.

TCars_VR_Challenge

Planet Drop – A networked multiplayer game that leverages the benefits of social VR through “cooperative asymmetrical gaming”: the players share the virtual environment, but each receives information specific to his or her chosen STEM specialty via an individualized HUD. Unlike a traditional “information asymmetry” game, such as a card game, the goal is not to exploit unshared information to win, but to share that information as quickly and effectively as possible so the team can solve challenges and advance through a story arc of increasingly impressive accomplishments.

FTL Labs PlanetDrop-VR

Thank you to all who submitted proposals. We look forward to playing and sharing these games when they are delivered, and to doing something like this again in the future.


Inside an animal cell

High Fidelity recently had the pleasure of showing off our open source virtual reality platform to educators and technical integrators at the ISTE conference in Philadelphia.

Though educators have been using virtual worlds like Second Life and Minecraft to teach a wide range of subjects for years, until now being fully immersed in those worlds was not technically possible. Head-mounted displays like the Oculus, Vive, and Gear VR will change that, and when used with a VR platform like High Fidelity’s, we believe they will transform the way we learn.

To demonstrate one way educators can use our platform, High Fidelity worked with DynamoidApps to develop an interactive model of an animal cell that can be explored on one’s own or with an entire class. The vast, alien-looking environment goes beyond just showing the parts of the cell; it also shows some of the processes taking place. Traveling around with your classmates and teacher allows for real-time questions and answers and sharing of ideas.

If you want to visit this animal cell, log in and go to cellscience/start, and fly towards any cell you see to begin your journey. Hitch a ride on a motor protein and jump off at one of the huge mitochondria along the way!

This cell ‘unit’ will also be free to anyone who wants to use or modify it. We hope it is just the first of many such units educators will be able to take advantage of; to jump-start that catalog, we are offering up to three $5,000 grants to teams or individuals who want to build educational content. For more information on rules and deadlines, visit eduvr.org.

We look forward to receiving your submissions and seeing more examples of great EduVR content.


Today, the Google I/O conference is happening here in San Francisco. The talk “Voiding your Warranty: Hacking Glass” included the above video, which features our cofounder Ryan showing off his hacker skills with Glass. Here’s the story behind the video.

Using an avatar as a proxy for communication has many benefits. Your avatar can always look good, be well lit, and be in an interesting location. However, even the most immersive virtual worlds fall flat when trying to deliver the emotional data carried by real-world facial expressions and body language.

From video game controllers to tracking your sleep behavior, there is a good deal of experimentation being done with wearable sensor hardware right now. In addition to soldering our own creations together, we have been checking out work done by others as fast as we can, all with the goal of enabling rich emotional avatar communication.

As you can imagine, when we received our beautiful new Google Glass as part of the Explorer Program, we were eager to see if we could access its sensors and drive our avatar’s head movement (caveat: Google Ventures is one of our investors).

Being the only white guy with a beard here at High Fidelity, working with Glass fell to me 😉 This was a great exercise because it gave us an opportunity to abstract the input layer for multiple device support (we also got the Oculus working! Stay tuned for that blog post).

We had previously created an Android app that grabbed all the phone’s sensor data and sent it over UDP to a configurable port. Imagine holding your phone and being able to twist and move your avatar’s hand. Kinda like turning any phone (with sensors) into a Wii controller. Lo and behold, when we plugged our Glass in and tried to run the Android app from our IDE, Glass showed up as a device and it “just worked”. We could not edit the fields in the GUI on Glass, but we could see from the log that it was transmitting the data.
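For the curious, here is a minimal sketch of that kind of app, assuming the rotation-vector sensor and a hand-rolled UDP packet format. The class name, host, and port below are illustrative placeholders, not what our actual app used:

```java
// Minimal sketch: forward each rotation-vector reading as a UDP datagram.
// Host, port, and packet layout are illustrative placeholders.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class UdpSensorForwarder implements SensorEventListener {
    private final InetAddress destination;
    private final int port;
    private final DatagramSocket socket;

    public UdpSensorForwarder(String host, int port) throws Exception {
        this.destination = InetAddress.getByName(host);
        this.port = port;
        this.socket = new DatagramSocket();
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Pack the sensor values into a datagram for the listening client.
        final ByteBuffer buffer = ByteBuffer.allocate(event.values.length * 4);
        for (float v : event.values) {
            buffer.putFloat(v);
        }
        // Send off the UI thread; Android forbids network I/O on the main thread.
        new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    byte[] payload = buffer.array();
                    socket.send(new DatagramPacket(payload, payload.length, destination, port));
                } catch (Exception e) {
                    // Dropped packets are acceptable for a best-effort sensor stream.
                }
            }
        }).start();
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```

An Activity (or, as described below, a Service) would register this forwarder for the rotation-vector sensor with SensorManager.registerListener() and unregister it when transmission should stop.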

For obvious reasons, Glass has some pretty aggressive energy-saving behavior, which made it tricky to keep the transmission alive. We ended up moving the sensor data transmission to a service layer. To stop transmission, we just turn Glass off.
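Sketched in the same hypothetical terms, the service-layer version simply registers that forwarder from a started Service instead of an Activity, so the transmission no longer depends on the UI staying awake:

```java
// Rough sketch of hosting the transmission in a started Service so it keeps
// running while the Glass display sleeps. Names and defaults are illustrative.
import android.app.Service;
import android.content.Intent;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.IBinder;

public class SensorStreamService extends Service {
    private SensorManager sensorManager;
    private UdpSensorForwarder forwarder; // the UDP-sending listener sketched above

    @Override
    public void onCreate() {
        super.onCreate();
        try {
            forwarder = new UdpSensorForwarder("192.168.1.100", 9876); // hypothetical receiver
        } catch (Exception e) {
            stopSelf();
            return;
        }
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        // Registering in the Service, not an Activity, keeps readings flowing for
        // as long as the service process is alive.
        sensorManager.registerListener(forwarder, rotation, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // START_STICKY asks Android to restart the service if the system kills it.
        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        if (sensorManager != null && forwarder != null) {
            sensorManager.unregisterListener(forwarder);
        }
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // started (not bound) service
    }
}
```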

You can see in the video that we have a very low latency connection between human and avatar head movement using Glass!