To make controllers easier to add and maintain, I moved several devices (Hydras, PlayStation controllers, and Xbox 360 controllers) over to the new UserInputMapper, which provides a set of default actions that work out of the box, so individual devices don’t need their own special handling. These actions include things like YAW_LEFT, LATERAL_FORWARD, VERTICAL_UP, and so on. Adding a new type of device is a lot easier now, and the existing device managers are good examples to follow.
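To make the action layer concrete, here’s a minimal sketch of reading one of those default actions from a script (the script bindings themselves are covered in the next paragraph). Treat it as illustrative: Controller.findAction() and Controller.getActionValue() are assumed names for the bindings, not confirmed API.

```javascript
// Minimal sketch: read the combined state of a default action each frame.
// Controller.findAction() / Controller.getActionValue() are assumed names.
var lateralForward = Controller.findAction("LATERAL_FORWARD");

Script.update.connect(function (deltaTime) {
    // The same action fires whether a Hydra stick, a gamepad stick, or the
    // keyboard is driving it.
    var amount = Controller.getActionValue(lateralForward);
    if (amount > 0) {
        print("Moving forward: " + amount);
    }
});
```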
I exposed all of the input mappings and action states to JavaScript. This means you can remap inputs from a script (which will eventually power a key-binding screen) and check the current state of any action. There are also a few actions with no built-in behavior that scripts are free to use: ACTION1, ACTION2, and SHIFT. For examples, check out hmdControls.js, the new, slightly modified squeezeHands.js, and toybox.js (which may not have been checked in yet and is still in development). As a proof of concept, I created mouseLook.js, which lets you look around without holding down the right mouse button. It’s definitely still under development, but it might eventually become the default camera mode.
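As a rough sketch of that script-facing side, the snippet below lists the exposed actions and treats ACTION1 as a free, script-defined button. The names Controller.getAllActions(), the .actionName field, findAction(), and getActionValue() are assumptions here, not confirmed bindings.

```javascript
// Hedged sketch of the script-facing action API: enumerate the exposed
// actions and poll one of the "spare" actions that has no built-in behavior.
// getAllActions(), .actionName, findAction(), getActionValue() are assumed.
var allActions = Controller.getAllActions();
allActions.forEach(function (action) {
    print("Available action: " + action.actionName);
});

var action1 = Controller.findAction("ACTION1");

Script.update.connect(function (deltaTime) {
    if (Controller.getActionValue(action1) > 0) {
        print("ACTION1 is active - do something script-specific here");
    }
});
```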
I began work on a system for menu and world interaction with hand controllers via the LEFT_HAND_CLICK and RIGHT_HAND_CLICK actions. Each of these triggers a left click, and holding the SHIFT action together with either of them triggers a right click instead. This makes it possible to click menus and interact with the world (with edit.js, for example) using any hand controller!
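Here’s a small hedged sketch of watching those click actions from a script; the action names come from above, while findAction() and getActionValue() are, again, assumed names for the bindings.

```javascript
// Hedged sketch: observe the hand-click actions from a script.
// findAction() / getActionValue() are assumed names for the JS bindings.
var leftHandClick = Controller.findAction("LEFT_HAND_CLICK");
var rightHandClick = Controller.findAction("RIGHT_HAND_CLICK");
var shift = Controller.findAction("SHIFT");

Script.update.connect(function (deltaTime) {
    var clicking = Controller.getActionValue(leftHandClick) > 0 ||
                   Controller.getActionValue(rightHandClick) > 0;
    if (clicking) {
        // With the SHIFT action held, the click is treated as a right click.
        print(Controller.getActionValue(shift) > 0 ? "right click" : "left click");
    }
});
```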
This new system is controller agnostic, meaning it doesn’t depend on any specific type of device. For example, toybox.js and squeezeHands.js now work with any hand controller, with no extra work!
I also played with some camera features that users had requested. I added zooming and a Center Player In View mode, and condensed several menu options into the View->Camera Modes menu, which lets you switch between camera modes. Independent mode untethers the camera from the avatar so a script can move it separately. While in independent mode, your avatar won’t move (unless a script moves it).
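As an illustration, here’s a hedged sketch of a script that switches to independent mode and orbits the camera around the avatar. It assumes Camera.mode, Camera.setPosition()/setOrientation(), and the usual Vec3/Quat helpers; treat the exact calls as approximate rather than definitive.

```javascript
// Hedged sketch: detach the camera in independent mode and orbit it around
// the avatar. Exact Camera/Vec3/Quat calls are assumptions and may differ.
var angle = 0;
var RADIUS = 3.0;                  // metres from the avatar
var previousMode = Camera.mode;

Camera.mode = "independent";       // untether the camera from the avatar

Script.update.connect(function (deltaTime) {
    angle += deltaTime * 30;       // degrees per second
    var offset = Vec3.multiplyQbyV(Quat.fromPitchYawRollDegrees(0, angle, 0),
                                   { x: 0, y: 1, z: RADIUS });
    var newPosition = Vec3.sum(MyAvatar.position, offset);
    Camera.setPosition(newPosition);
    Camera.setOrientation(Quat.lookAtSimple(newPosition, MyAvatar.position));
});

Script.scriptEnding.connect(function () {
    Camera.mode = previousMode;    // restore the previous camera mode on exit
});
```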
I also set up our new Vive and worked with it extensively! I got the hand controllers working, did a lot of testing of the (recently merged!) “plugins” branch, and helped out on some related sprints with both teams. As part of the plugins work, I created the concepts of “InputDevices” and “InputPlugins”: InputDevices communicate with the UserInputMapper, and InputPlugins are toggleable managers of those devices. You can find your available InputPlugins under Avatar->Input Modes. If you don’t have any of the optional SDKs installed, you’ll only see the keyboard. If you’re trying to use Hydras or gamepads, make sure the corresponding option is listed there and checked.
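If you want to check or enable one of those input plugins from a script, something along these lines should be close. It leans on the Menu scripting API (menuItemExists, isOptionChecked, setIsOptionChecked); the menu path and item label used below are placeholders, not confirmed labels.

```javascript
// Hedged sketch: verify that an input plugin's menu option exists and is
// checked. The menu path and item label here are assumptions.
var INPUT_MODES_MENU = "Avatar > Input Modes";   // assumed menu path
var HYDRA_ITEM = "Sixense Controllers";          // assumed item label

if (Menu.menuItemExists(INPUT_MODES_MENU, HYDRA_ITEM)) {
    if (!Menu.isOptionChecked(HYDRA_ITEM)) {
        Menu.setIsOptionChecked(HYDRA_ITEM, true);   // enable the plugin
        print("Enabled " + HYDRA_ITEM);
    }
} else {
    print(HYDRA_ITEM + " is not available - is the SDK installed?");
}
```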
It’s been an absolutely amazing summer getting to work on this stuff. Unfortunately, I have to head back to school soon, but I still have a long to-do list, including support for the Perception Neuron suit and the Oculus Touch, a better, customizable system for keyboard shortcuts that warns you if a script tries to add a duplicate mapping, and continued development of mouseLook.js and toybox.js, among tons of other things. I’ll be offline for the next couple of weeks, but don’t hesitate to message me with any questions or suggestions about any of this (or anything at all)!