I’m not sure that I like where this is headed.

According to UploadVR, the latest version of Meta’s VR framework includes a new element: tracking tongue movement when using a VR headset.

[Image: Meta VR tongue tracking]

As per UploadVR:

“In version 60 of its SDKs for Unity and native code, Meta has released a new version of its face tracking OpenXR extension which now includes how far stuck out your tongue is. The Meta Avatars SDK hasn’t been updated to support this yet, but third-party avatar solutions can do so after updating their SDK version to 60.”
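
For the curious, here’s a rough sketch of what reading that value might look like for a native developer. It uses the names from the public OpenXR registry for the XR_FB_face_tracking2 extension (which exposes tongue blendshapes, including a “tongue out” weight), rather than anything specific to version 60 of Meta’s SDK. The function name is my own, and the surrounding OpenXR instance/session setup, extension enablement, and the face-tracking permission prompt on Quest are all omitted:

```c
/*
 * Minimal sketch (assumes an already-initialized XrInstance and XrSession
 * with the XR_FB_face_tracking2 extension enabled). Returns how far the
 * tongue is stuck out, from 0.0 (fully in) to 1.0 (fully out).
 */
#include <openxr/openxr.h>

float get_tongue_out_weight(XrInstance instance, XrSession session, XrTime time)
{
    /* Extension functions must be loaded at runtime via xrGetInstanceProcAddr. */
    PFN_xrCreateFaceTracker2FB xrCreateFaceTracker2FB = NULL;
    PFN_xrGetFaceExpressionWeights2FB xrGetFaceExpressionWeights2FB = NULL;
    PFN_xrDestroyFaceTracker2FB xrDestroyFaceTracker2FB = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateFaceTracker2FB",
                          (PFN_xrVoidFunction*)&xrCreateFaceTracker2FB);
    xrGetInstanceProcAddr(instance, "xrGetFaceExpressionWeights2FB",
                          (PFN_xrVoidFunction*)&xrGetFaceExpressionWeights2FB);
    xrGetInstanceProcAddr(instance, "xrDestroyFaceTracker2FB",
                          (PFN_xrVoidFunction*)&xrDestroyFaceTracker2FB);
    if (!xrCreateFaceTracker2FB || !xrGetFaceExpressionWeights2FB ||
        !xrDestroyFaceTracker2FB)
        return 0.0f; /* Extension not available. */

    /* Ask for the default expression set, driven by the headset's cameras. */
    XrFaceTrackingDataSource2FB dataSources[] = {
        XR_FACE_TRACKING_DATA_SOURCE2_VISUAL_FB
    };
    XrFaceTrackerCreateInfo2FB createInfo = { XR_TYPE_FACE_TRACKER_CREATE_INFO2_FB };
    createInfo.faceExpressionSet = XR_FACE_EXPRESSION_SET2_DEFAULT_FB;
    createInfo.requestedDataSourceCount = 1;
    createInfo.requestedDataSources = dataSources;

    XrFaceTracker2FB tracker = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateFaceTracker2FB(session, &createInfo, &tracker)))
        return 0.0f;

    /* Sample every blendshape weight for the given predicted display time. */
    float weights[XR_FACE_EXPRESSION2_COUNT_FB] = { 0 };
    float confidences[XR_FACE_CONFIDENCE2_COUNT_FB] = { 0 };
    XrFaceExpressionInfo2FB info = { XR_TYPE_FACE_EXPRESSION_INFO2_FB };
    info.time = time;
    XrFaceExpressionWeights2FB result = { XR_TYPE_FACE_EXPRESSION_WEIGHTS2_FB };
    result.weightCount = XR_FACE_EXPRESSION2_COUNT_FB;
    result.weights = weights;
    result.confidenceCount = XR_FACE_CONFIDENCE2_COUNT_FB;
    result.confidences = confidences;

    float tongueOut = 0.0f;
    if (XR_SUCCEEDED(xrGetFaceExpressionWeights2FB(tracker, &info, &result)) &&
        result.isValid) {
        tongueOut = weights[XR_FACE_EXPRESSION2_TONGUE_OUT_FB];
    }

    xrDestroyFaceTracker2FB(tracker);
    return tongueOut;
}
```

In a real app you’d create the tracker once at startup and poll it each frame, rather than re-creating it per query as this condensed example does.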

As the image above suggests, your VR avatar may soon be able to reflect your tongue movements, providing a more realistic VR experience.

Which is a bit weird, but then again, it’s no weirder than Meta’s experiments to insert computer chips into your brain to read your mind.

It’s also probably not as creepy as you might initially expect.

According to UploadVR, tongue tracking is another element of Meta’s advanced face tracking, designed to simulate more realistic expressions. If tongue movement isn’t factored in, your simulated facial responses can get distorted, while including it also enables more authentic depictions of speech sounds like vowels.

So it’s less about using your tongue in VR than about re-creating facial expressions in a more realistic way. And with Meta also developing its hyper-real Codec Avatars, it’ll inevitably need to include tongue tracking as well, in order to replicate real-world responses.

So it makes sense, but it still seems a little weird. And it’ll also lead to some adverse use cases.

But either way, tongues are coming to the metaverse.

Yeah, that’s a sentence I hadn’t anticipated writing in 2023.