Quest’s ‘Body Tracking’ API isn’t what it looks and sounds like.
The Body Tracking API was released on Thursday as part of the Movement SDK, which also includes the Eye Tracking API and Face Tracking API for Quest Pro.
The official Oculus Developers Twitter account announced the release with an illustration from the documentation showing a user’s full body pose being tracked. This was widely shared – leading many to believe Quest just got body tracking support – but both the name of the API and the illustration are misleading.
Meta’s Hand Tracking API provides the actual position of your hands & fingers, tracked by the outward-facing cameras. Its Eye Tracking API & Face Tracking API provide your actual gaze direction and facial muscle movements, tracked by Quest Pro’s inward-facing cameras. But the “Body Tracking” API only provides a “simulated upper-body skeleton” based on your head and hand positions, a Meta spokesperson confirmed to UploadVR. It’s not actual tracking, and it doesn’t include your legs.
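To make that distinction concrete, here’s a toy sketch in Python. The function name, joint names, and layout are invented for this article – they are not Meta’s actual interface – but they capture the shape of the thing: three measured inputs go in, everything else comes out inferred, and there is nothing below the hips at all.

```python
# Invented illustration, not Meta's API: the only *measured* inputs are
# the three tracked points; every other joint must be inferred from them.
def simulate_upper_body(head, left_hand, right_hand):
    """Naive stand-in for an IK + ML model: guess a chest position
    25cm below the head. Note the output simply has no leg joints."""
    chest = (head[0], head[1] - 0.25, head[2])
    return {
        "head": head,              # measured (headset tracking)
        "chest": chest,            # inferred, never observed by a camera
        "left_hand": left_hand,    # measured (Hand Tracking)
        "right_hand": right_hand,  # measured (Hand Tracking)
    }

print(simulate_upper_body((0, 1.7, 0), (-0.3, 1.2, 0.2), (0.3, 1.2, 0.2)))
```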
A better name for the API would be Body Pose Estimation. The spokesperson described the technology as a combination of inverse kinematics (IK) and machine learning (ML). IK refers to a class of equations for estimating the unknown positions of parts of a skeleton (or robot) based on the known positions. These equations power all full-body VR avatars in apps today. Developers don’t need to implement (or even understand) the mathematics behind IK, as game engines like Unity & Unreal have IK built-in, and packages like the popular Final IK offer fully fleshed-out implementations for less than $100.
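To give a flavor of the math engines handle for you, here’s a minimal sketch of the core step in a two-bone IK solve – illustrative Python with made-up bone lengths, not how Unity, Unreal, or Final IK actually implement it. Given the shoulder and wrist positions plus the two bone lengths, the law of cosines yields the elbow bend angle:

```python
import math

def elbow_bend_angle(shoulder, wrist, upper_len, fore_len):
    """Analytic two-bone IK step: solve the elbow's interior angle
    (radians) from known endpoint positions via the law of cosines."""
    d = math.dist(shoulder, wrist)
    # Clamp to the reachable range so out-of-reach targets still solve.
    d = max(abs(upper_len - fore_len), min(upper_len + fore_len, d))
    cos_elbow = (upper_len**2 + fore_len**2 - d**2) / (2 * upper_len * fore_len)
    return math.acos(max(-1.0, min(1.0, cos_elbow)))

# Shoulder at origin, wrist 50cm away, 30cm upper arm, 28cm forearm:
print(math.degrees(elbow_bend_angle((0, 0, 0), (0.5, 0, 0), 0.30, 0.28)))  # ~119
```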
Unless you’re using body tracking hardware such as HTC’s Vive Trackers, IK for VR tends to be janky – for any given set of head and hand positions there are many plausible body poses. Meta’s pitch here is that its machine-learning model can produce a more accurate body pose for free. Though without the lower half of the body, and with support limited to Quest headsets, it’s unclear how many developers will take up this offer.
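You can see why the problem is underdetermined by extending the sketch above: the bend angle fixes the elbow’s distance from each endpoint, but the elbow itself can sit anywhere on a circle around the shoulder–wrist axis. Every position printed below is an equally “correct” solution for one and the same hand pose (again illustrative Python with invented numbers):

```python
import math

def sub(p, q): return tuple(a - b for a, b in zip(p, q))
def add(p, q): return tuple(a + b for a, b in zip(p, q))
def scale(p, s): return tuple(a * s for a in p)
def cross(p, q):
    return (p[1]*q[2] - p[2]*q[1], p[2]*q[0] - p[0]*q[2], p[0]*q[1] - p[1]*q[0])
def norm(p):
    m = math.sqrt(sum(a * a for a in p))
    return tuple(a / m for a in p)

def valid_elbow_positions(shoulder, wrist, upper_len, fore_len, samples=4):
    """Sample the circle of elbow positions that all satisfy the same
    shoulder and wrist placement - the ambiguity IK has to guess away."""
    d = math.dist(shoulder, wrist)
    axis = norm(sub(wrist, shoulder))
    # Law of cosines: distance from the shoulder to the circle's center
    # along the axis, then the circle's radius.
    x = (upper_len**2 - fore_len**2 + d**2) / (2 * d)
    r = math.sqrt(max(0.0, upper_len**2 - x**2))
    # Two unit vectors perpendicular to the axis span the circle's plane.
    helper = (0.0, 1.0, 0.0) if abs(axis[1]) < 0.9 else (1.0, 0.0, 0.0)
    v = norm(cross(axis, helper))
    w = cross(axis, v)
    center = add(shoulder, scale(axis, x))
    return [add(center, add(scale(v, r * math.cos(t)), scale(w, r * math.sin(t))))
            for t in (2 * math.pi * i / samples for i in range(samples))]

# Four equally valid elbow placements for a single hand position:
for elbow in valid_elbow_positions((0, 0, 0), (0.5, 0, 0), 0.30, 0.28):
    print(tuple(round(c, 3) for c in elbow))
```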
Hints given at Meta’s Connect 2022 event, along with the company’s research, suggest legs will be added in the future, though.
In a talk for developers, Body Tracking Product Manager Vibhor Saxena said:
“New improvements to body tracking in the coming years will be available through the same API, so you can be sure that you will continue getting the best body tracking technology from Meta without having to switch to a different interface.
“We are excited to bring these capabilities to you and working hard to make body tracking much better in the years to come.”
During the main keynote, Mark Zuckerberg announced that Meta Avatars are getting legs – with a demonstration that was also misleading. Legs will arrive in Horizon later this year, then in the SDK for other apps next year. Saxena confirmed the Body Tracking API leverages the same underlying technology powering Meta Avatars – which suggests the API will get legs too.
You may be wondering: if the Body Tracking API is just an estimate based on head and hand positions, how could it possibly incorporate legs? Last month Meta showed off research on exactly this, leveraging recent advancements in machine learning. The system shown isn’t fully accurate though, and has 160ms latency – more than 11 frames at 72Hz. That latency is too high, and the output too imperfect, for you to look down and see your own legs where you expect them to be. Comments from Meta’s CTO suggest the company might use tech like this to show legs on other people’s avatars instead:
“Having legs on your own avatar that don’t match your real legs is very disconcerting to people. But of course we can put legs on other people, that you can see, and it doesn’t bother you at all.
“So we are working on legs that look natural to somebody who is a bystander – because they don’t know how your real legs are actually positioned – but probably you when you look at your own legs will continue to see nothing. That’s our current strategy.”
As we noted at the time, though, the solution that ships might not match the quality of this research. Machine learning systems described in papers often run on powerful PC GPUs at relatively low framerates, and the paper didn’t mention the runtime performance of the system described.
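For scale, converting the research system’s latency figure into display frames is simple arithmetic – a quick check of the “more than 11 frames” quoted above:

```python
latency_s = 0.160    # latency of the research system Meta showed
refresh_hz = 72      # Quest's default display refresh rate
print(latency_s * refresh_hz)  # 11.52 frames of lag
```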