Today during the main Oculus Connect 6 keynote, Mark Zuckerberg announced that hand tracking will be coming to the Oculus Quest.
This is significant because the Quest is already untethered from a PC; hand tracking will let you become even more immersed in experiences without having to use any controllers.
“What began as a research project at Facebook Reality Labs has been brought to life through close collaboration with our product and design teams to bring about a new paradigm for VR input,” said the Oculus team. “Our computer vision team developed a new method of using deep learning to understand the position of your fingers using just the monochrome cameras on Quest today—no active depth-sensing cameras, additional sensors, or extra processors required. This technology approximates the shape of your hand and creates a set of 3D points to accurately represent your hand and finger movement in VR.”
How it works
The Facebook team uses deep neural networks to predict the location of a person’s hands as well as landmarks, such as the joints of the fingers. These landmarks are then used to reconstruct a 26 degree-of-freedom pose of the person’s hands and fingers. The result is a 3D model that includes the configuration and surface geometry of the hand. APIs will enable developers to use these 3D models to build new interaction mechanics into their apps or to drive a user interface.
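To make the landmark-to-pose step concrete, here is a minimal sketch of the general idea: given 3D landmark points predicted per joint, reduce them to joint angles that make up a degree-of-freedom pose vector. The function names and landmark layout are assumptions for illustration only, not the actual Oculus pipeline.

```python
import math

def angle_between(a, b, c):
    """Flexion angle (radians) at joint b, formed by segments b->a and b->c."""
    v1 = tuple(x - y for x, y in zip(a, b))
    v2 = tuple(x - y for x, y in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point drift outside acos's domain.
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def finger_flexion(landmarks):
    """Reduce a chain of 4 landmarks (base..tip) to its 2 flexion angles."""
    return [angle_between(landmarks[i - 1], landmarks[i], landmarks[i + 1])
            for i in range(1, 3)]

# A fully extended finger: every joint sits at ~180 degrees.
straight = [(0, 0, 0), (0, 1, 0), (0, 2, 0), (0, 3, 0)]
print([round(math.degrees(a)) for a in finger_flexion(straight)])  # → [180, 180]
```

Repeating this across all fingers plus the wrist orientation is roughly how a small set of landmarks can be turned into a full low-dimensional hand pose.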
The system uses a novel tracking architecture that produces accurate, low-jitter estimates of hand pose across a wide range of environments, paired with an efficient, quantized neural network framework that enables real-time hand tracking on a mobile processor without compromising resources dedicated to user applications.
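The core idea behind a quantized network can be sketched briefly: store weights as 8-bit integers plus a scale factor, cutting memory and compute at inference time. This is a generic per-tensor quantization sketch under assumed names; Oculus has not published the details of its framework.

```python
def quantize(weights):
    """Map float weights to int8 values with a single per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]  # each value fits in [-127, 127]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return [qi * scale for qi in q]

weights = [0.5, -1.27, 0.03, 1.0]
q, scale = quantize(weights)
recovered = dequantize(q, scale)
```

Each weight now needs one byte instead of four, and integer arithmetic is typically much cheaper on mobile processors, which is what makes real-time on-device inference feasible.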
Zuckerberg stated that this technology will be coming to the Oculus Quest in early 2020. This will allow people to move around more freely without having to lug around controllers.
More news to come.