High-fidelity hand tracking lets you pinch, grab, and zoom in AR.
Nreal’s 3D system, Nebula, is getting a big upgrade that brings hand tracking to Nreal’s Light AR glasses. Users will be able to interact with 3D digital objects through multiple hand gestures, making a pair of Nreal Light glasses even more useful for day-to-day work, learning environments, and staying connected with friends and family.
High-fidelity hand tracking comes to Nreal’s Nebula system through a partnership with Clay AIR, a hand-tracking and gesture-recognition company that has built highly accurate, hardware-agnostic interaction software to support optical hand-control solutions for clients such as Lenovo, Bose, Qualcomm, and Thales.
With hand tracking already available on devices such as the Magic Leap 1, Microsoft’s HoloLens 2, and now the Oculus Quest, Nreal understands that if it wants to remain competitive in this market, it will need to offer hand tracking to its consumers and to the 3,500 creators developing world-building AR experiences for Nreal Light glasses.
To bring hand tracking to Nreal Light glasses, Clay AIR took advantage of the spatial cameras embedded in the headset to overlay a 3D hand skeleton model in full HD on top of your physical hand. Your hand motion is then tracked and rendered in real time using state-of-the-art deep learning algorithms.
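For a sense of what that per-frame skeleton data looks like, here is a rough Kotlin sketch of a typical 21-joint hand model of the kind optical hand-tracking systems reconstruct from camera frames. The names and structure below are illustrative assumptions, not Clay AIR’s or Nreal’s actual API.

```kotlin
// Illustrative only: a simplified hand skeleton of the kind most optical
// hand-tracking systems reconstruct per frame. Hypothetical names, not an
// actual Nreal or Clay AIR API.

// A single tracked joint: a position in 3D space plus a confidence score
// produced by the deep learning model.
data class HandJoint(val x: Float, val y: Float, val z: Float, val confidence: Float)

// One frame of tracking data for a single hand.
data class HandSkeleton(
    val isLeftHand: Boolean,
    val joints: List<HandJoint>   // typically 21 keypoints: wrist + 4 per finger
)

// A per-frame update: a tracker infers the skeleton from the glasses'
// embedded camera frame, and the renderer overlays it on the physical hand.
fun onCameraFrame(
    tracker: (ByteArray) -> HandSkeleton?,
    frame: ByteArray,
    render: (HandSkeleton) -> Unit
) {
    tracker(frame)?.let { skeleton -> render(skeleton) }
}
```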
If you own a pair of Nreal Light AR glasses or you’re a developer, one of your first questions about native hand tracking is how much of an impact the feature will have on battery life. The answer, surprisingly, is not much at all. You shouldn’t see any additional strain on the battery of your glasses or on a tethered Qualcomm Snapdragon 855 5G smartphone.
Nreal will launch hand tracking through its own SDK in the near future. The feature can be easily deployed on native Android apps running on Nreal’s 3D system, such as Facebook, Spotify, Netflix, and Instagram. As a developer, you’ll be able to build out customized hand tracking experiences that include 3D hand models, separate fingers, a bounding box, cursors, and customized skins.
The library of gestures provided by Clay AIR will include pinch, point, grab, swipe, and zoom, among many other customizable options.
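As a rough illustration of how an app might consume that gesture library, here is a short Kotlin sketch. Nreal had not yet published the SDK surface at the time of writing, so the gesture enum, event shape, and handler names below are hypothetical.

```kotlin
// Hypothetical sketch: mapping the gestures named above to app actions.
// The SDK's real callback shape may differ; this only shows the idea.

enum class GestureType { PINCH, POINT, GRAB, SWIPE, ZOOM }

// A gesture event carries the gesture plus an optional magnitude
// (for example, a zoom factor or swipe distance).
data class GestureEvent(val type: GestureType, val magnitude: Float = 0f)

class SceneController {
    fun onGesture(event: GestureEvent) = when (event.type) {
        GestureType.PINCH -> selectObjectUnderCursor()
        GestureType.GRAB  -> beginDrag()
        GestureType.SWIPE -> scrollContent(event.magnitude)
        GestureType.ZOOM  -> scaleObject(event.magnitude)
        GestureType.POINT -> moveCursor()
    }

    // Stubs standing in for app-specific behavior.
    private fun selectObjectUnderCursor() {}
    private fun beginDrag() {}
    private fun scrollContent(distance: Float) {}
    private fun scaleObject(factor: Float) {}
    private fun moveCursor() {}
}
```

Whatever shape the real SDK takes, the work for developers will likely come down to this kind of mapping: deciding which gesture triggers which action in their AR scene.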
In an official Nreal press release, Chi Xu, CEO and founder of Nreal, said, “By offering hand tracking with Clay AIR, Nreal offers its developers and users of the Nreal Light Consumer Kit an added degree of flexibility in how they might want to interact with their MR environment,” adding, “whether it’s hand tracking, their 5G smartphone as a controller, a third-party 6DoF controller, or even in the near future eye tracking in partnership with 7invensun.”
Bringing this new feature to the Nreal Light Nebula system is about creating a consumer-friendly hardware and software experience that can bring AR to the masses without making it overly complex. “Seamless hand tracking and gesture recognition technology is a necessary catalyst to make AR accessible and practical on a global scale,” said Thomas Amilien, CEO of Clay AIR.
Ultimately, this partnership marks an important inflection point: it is the first integration of hand tracking that relies on the embedded monochrome cameras already used for spatial tracking in AR glasses, and it arrives alongside an expanded carrier partnership with Deutsche Telekom.
In the end, this new feature is a big win for Nreal, Clay AIR, and, more importantly, consumers. It’s consumer adoption that will drive AR forward, and we just need to reach a point where users can access robust AR experiences without having to wear heavy, cumbersome devices.
This partnership just may get us there.
Feature Image Credit: Nreal
The post Hand Tracking Comes To Nreal AR Glasses appeared first on VRScout.