The HP Omnicept Edition VR headset adds face tracking, eye tracking and heart rate monitoring in 2021. (Image: HP)

Imagine you’re training to be a pilot or a doctor, experiencing a simulation in VR with convincing images and tools that feel like the real thing. And not only do you get to know how you did afterwards, but you can see how you focused, what your pulse rate was and maybe even what you were feeling.

VR headsets like Oculus Quest don’t look inwards to measure what we’re experiencing as we try new things, but some business-targeted VR headsets already have eye tracking to measure what a wearer might be looking at or interested in. The HP Omnicept, a new VR training platform announced today, goes further: It measures eye movement, face and lip movement, pupil size and even heart rate. It’s on target for release in 2021 and points to a new wave of work and training tech designed for a more feedback-focused digital world. The eye-tracking tech is from Tobii, which also supplies tech to the HTC Vive Pro Eye and other VR and AR headsets. But the other sensors are new.

According to HP, the combination of pulse, eye tracking and face tracking will open up cognitive and behavioral insights to improve VR training for people like pilots. But it could also allow businesses and researchers to understand how someone is emotionally engaging in VR.

The most interesting (and weird) part of Omnicept is HP’s commitment to an “inference engine” that will be developed using the headset’s sensors, creating algorithms that can study elements of behavior and attention. It will start by measuring cognitive load: how overwhelming a task could be to someone, and how much an individual can process at a time. Pulse, eye movement, pupil dilation and facial expressions will contribute to estimating those results.

The name, HP Omnicept, sounds terrifying. And maybe so does the idea of tracking our faces, eyes and heart rate in VR. But this tech isn’t being aimed at just any regular person. It’s training-focused. It’s also being designed to comply with GDPR privacy standards and it’s being made with the input of Stanford Human Interaction Lab director Jeremy Bailenson, a VR veteran and expert on VR training.

Bailenson has been working with HP for a year on the Omnicept: how it handles privacy and data, and how it builds its tools to measure cognitive load. “The world is full of people who claim they’re going to measure your brainwaves and know your future, and that’s really not what we’re doing,” he says. “We’re saying, ‘what would make people’s lives work better?’ Knowing [cognitive] load would be one of them. And what are the best signals on the planet to get there? I am not a fan of wildly sensing everything, and [I believe in] starting with a problem, solving that problem.”

Omnicept’s first goal is measuring cognitive load. Emotional interaction with avatars could be next. (Image: HP)

There are already plenty of VR training applications and headsets, many of them used to train pilots, test-drive virtual cars or practice medical procedures. The goal of the Omnicept is to add more tools and give researchers even deeper ways to study VR training.

“We’re really starting to see traction in companies in VR training,” Bailenson says. “It’s not ‘should we do VR training?’ it’s whether they’re in pilot phase or larger-scale rollout at this point.” But the future beyond that could involve working in VR and using tools like these to improve productivity and focus. When could that start to really happen? “It’s less than a five-year horizon, and probably closer to two or three,” Bailenson says. He’s already studied Zoom fatigue, and is just as concerned about where future virtual work could get overwhelming. “You’re going to see more work being done there,” he says, and points to measuring cognitive load as a future helper for reducing VR fatigue.

HP plans to use Omnicept to add more emotional context to avatars in VR, too — emotional cues that current VR headsets, unlike face-to-face video chat, miss out on. (Imagine an avatar smiling when you smile, or communicating other emotions over time.) Bailenson feels, however, that emotion is much more nuanced and harder to nail down than cognitive load at the moment.

The first VR headset to work with Omnicept will be the HP Reverb G2 Omnicept Edition, an enterprise-targeted update to the company’s upcoming Reverb G2 PC VR headset. The Omnicept version has the same general specs as the G2, which is coming in the fall (2,160×2,160-pixel resolution per eye, four in-headset cameras for room tracking, Valve-designed spatial audio over-ear headphones), but adds heart rate measurements along with eye-tracking and face-tracking camera sensors.

A side view of the Reverb G2 Omnicept Edition VR headset and its head strap. (Image: HP)

Although the headset has an optical heart-rate sensor, HP isn’t looking at fitness or health applications yet. Pulse on the Reverb G2 Omnicept is only being used to measure things like focus, engagement and emotion. “We decided on a filtered, aggregated version of what the heart is doing to specifically relate to [cognitive] load, because this is one of the 30 things we’re doing for privacy,” Bailenson says. Health monitoring, however, is territory that VR headsets like this could explore in the future. “When the time is right, we will be able to do that in a really good way.”

HP’s Omnicept platform will be available free to anyone who buys the headset and wants to try the sensors. It will also be sold under an enterprise license, and HP will license the tools to interested software developers through revenue sharing. HP is already working with some enterprise partners: Theia Interactive, Devika, PixoVR, Mimbus and public-speaking training program Ovation.

Bailenson is also excited about the open dataset HP is releasing from its research: the eye, face and pulse data of 1,000 participants across four continents, used to design the Omnicept. “You have massive master files about cardiovascular response, about what the eye is doing, and then you have how well they’re doing in a task and how loaded they feel moment to moment,” he says. “When you think about 1,000 people, each person’s in VR for 45 minutes, and you can sample up to 1,000Hz, and you’ve got 17 signals coming in just related to what the heart is doing, you can see how this thing just gets massive.”
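A quick back-of-envelope calculation shows just how massive. The participant count, session length, sampling rate and signal count below come from Bailenson’s quote; the 4-bytes-per-sample figure is an assumption of mine, not an HP spec:

```python
# Rough estimate of the raw heart-signal data in HP's research dataset,
# using the figures Bailenson quotes.
participants = 1_000
session_seconds = 45 * 60      # 45 minutes per person
sample_rate_hz = 1_000         # quoted upper bound
heart_signals = 17             # heart-related channels alone

samples = participants * session_seconds * sample_rate_hz * heart_signals
print(f"{samples:,} samples")  # 45,900,000,000 samples

# Assuming 4 bytes per sample (e.g. float32 -- my assumption):
gigabytes = samples * 4 / 1e9
print(f"~{gigabytes:.0f} GB of raw heart data alone")  # ~184 GB
```

Even counting only the heart-related channels at the maximum sampling rate, the raw data runs to tens of billions of samples.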
