The world of immersive technology can be a daunting place, full of confusing acronyms and tech lingo that only the most up-to-date enthusiast will understand. Last year VRFocus created a little glossary of the most used acronyms such as VR (virtual reality), AR (augmented reality) and many more. Well, it’s come to our attention that an update of sorts is required, detailing some of the tech terms so widely used nowadays without a thought for those who may not know what the heck is going on.
Companies come up with their own descriptions all the time, trying to describe a new technology with some fancy new PR term they’ve created. VRFocus isn’t interested in those; it’s the stuff that’s used almost every day by either us or any other VR/AR technology writer.
Location-based Entertainment (LBE)
Now this should be fairly self-explanatory, but as it does cover a few parameters VRFocus thought it would be good to include. As the name suggests, location-based entertainment is all about playing immersive content away from home. This can refer to something as simple as a VR arcade which houses Oculus Rifts and HTC Vives, to far bigger open-plan arenas like those used by Zero Latency, or experiences such as Star Wars: Secrets of the Empire by The VOID.
LBE can also refer to pop-up locations at cinemas, or when an educational institution like a museum uses VR to transport visitors to amazing new worlds.
Field-of-View (FOV)
When discussing head-mounted displays (HMDs), one of the terms companies talk about most – and people most want to know about – is field-of-view (FOV). The FOV relates to how much a user can see at once, so the wider the FOV the more immersive VR can be, as your vision is filled with more information. For example, the Oculus Rift has a 110-degree FOV, which is good enough to immerse users. There are, however, headsets out there which boast an even larger FOV. VRgineers’ new pro headset, the XTAL, has a 170-degree FOV, while Starbreeze’s StarVR device has 210 degrees.
All these figures refer to the horizontal FOV (the most commonly quoted), but there is the vertical FOV to consider as well. To give you an idea of what these numbers mean when considering your own eyes: on average, the human horizontal FOV using both eyes is about 200 degrees.
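If you’re curious how horizontal and vertical FOV relate, here’s a minimal Python sketch using the standard flat-screen projection formula. The 16:9 aspect ratio is an illustrative assumption – real HMD lenses and curved panels complicate the relationship – so treat the output as a rough estimate only.

```python
import math

def vertical_fov(horizontal_fov_deg, aspect_ratio):
    """Approximate vertical FOV from horizontal FOV for a flat,
    rectilinear display with the given width:height aspect ratio.
    Real HMD optics distort this relationship, so this is only a
    back-of-the-envelope estimate."""
    h = math.radians(horizontal_fov_deg)
    v = 2 * math.atan(math.tan(h / 2) / aspect_ratio)
    return math.degrees(v)

# e.g. a 110-degree horizontal FOV on an assumed 16:9 panel
print(round(vertical_fov(110, 16 / 9), 1))  # roughly 77.6 degrees
```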
Inside-out Tracking
Popping on a VR headset is all well and good, but for true immersion most users will want some sort of movement in VR, and that requires tracking. There are two forms available; the first one VRFocus is going to talk about is inside-out tracking.
This technology is the newer of the two tracking options as it’s proven the most difficult to get right. This is because the headset needs all the tracking tech built in so the software can map and learn the world around you. Current inside-out tracking consists of two cameras mounted on the front of an HMD – as seen on devices like the Lenovo Mirage Solo and HTC Vive Focus. These not only track your movements but controllers as well (Windows Mixed Reality), offering a straightforward solution that requires little to no setup. Additionally, you’re not confined to a particular room or size of play space.
The only downside is that the cameras can only see what’s in front of them, so waving a controller behind your head will do nothing.
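To illustrate that blind spot, here’s a rough Python sketch of the kind of visibility check involved. The single 170-degree forward-facing camera cone is an assumption for illustration, not any particular headset’s spec.

```python
import math

def controller_visible(controller_dir, camera_fov_deg=170):
    """Rough check of whether a controller falls inside the combined
    forward-facing camera cone of an inside-out tracked headset.
    controller_dir is a unit vector from headset to controller in the
    headset's own frame, where (0, 0, 1) points straight ahead.
    The 170-degree combined FOV is an illustrative assumption."""
    x, y, z = controller_dir
    angle = math.degrees(math.acos(z))  # angle off the forward axis
    return angle <= camera_fov_deg / 2

print(controller_visible((0.0, 0.0, 1.0)))   # straight ahead: True
print(controller_visible((0.0, 0.0, -1.0)))  # behind your head: False
```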
Outside-in Tracking
Then there’s outside-in tracking, which is pretty much the ‘go-to’ choice for the main VR headsets like Oculus Rift, HTC Vive and PlayStation VR. This technology involves sensors/cameras being mounted around the user, creating a tracked play space. The benefit of this design is that players are monitored wherever they go within that area, and so are the controllers.
For example, archery videogames are quite popular in VR, and most developers try to make these as realistic as possible, so players have to grab an arrow from a quiver on their back. With inside-out tracking this couldn’t be achieved.
However, even though the tracking is better, there are limitations. Firstly, you need to install sensors around a room, meaning cables and designating one area for VR – you don’t want to keep on swapping. Then there’s the rest of the setup: making sure they’re all angled correctly and marking out the space so you don’t bump into anything.
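For the technically curious, the core idea behind outside-in tracking is triangulation: each sensor reports a direction towards the tracked object, and intersecting those sight lines recovers its position. Here’s a simplified two-sensor, 2D Python sketch of that idea; real systems fuse many measurements in 3D, so this is purely illustrative.

```python
def triangulate_2d(p1, d1, p2, d2):
    """Intersect two rays (sensor position, ray direction) in 2D to
    recover a tracked point's position - the basic idea behind
    outside-in tracking with two sensors. Purely illustrative."""
    (x1, y1), (dx1, dy1) = p1, d1
    (x2, y2), (dx2, dy2) = p2, d2
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2x2 determinant.
    denom = dx1 * dy2 - dy1 * dx2
    if abs(denom) < 1e-9:
        return None  # rays are parallel: no unique intersection
    t = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / denom
    return (x1 + t * dx1, y1 + t * dy1)

# Two sensors in opposite corners both "seeing" the point (1.0, 1.0)
print(triangulate_2d((0, 0), (1, 1), (2, 0), (-1, 1)))  # (1.0, 1.0)
```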
Locomotion
Movement in VR: the bane of every developer, and either loved or loathed by players. Locomotion in VR has had a dicey history as studios not only learn what can and can’t be done but players also learn their own tolerances. When VRFocus talks about locomotion in reviews and previews, it generally boils down to teleportation or direct locomotion.
When developers realised that players didn’t want to just stand still and shoot waves of monsters in VR, but actually wanted to move and explore virtual worlds, teleportation became the de facto choice. This was mainly due to it being comfortable for virtually everyone – no nausea – and easy to implement. The other choice, direct locomotion, was another can of worms entirely. This gave players control just like a standard first-person shooter (FPS) like Battlefield, yet in VR it could cause many to become unwell.
As time has gone on, VR studios have learnt tricks to make direct locomotion much more comfortable. And while some players will still suffer from the odd bit of simulator sickness, nowadays most VR titles have to employ some sort of direct locomotion as it is by far the more immersive of the two.
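To show why teleportation is so easy to implement and so comfortable, here’s a minimal Python sketch of the basic mechanic: cast the controller’s aim ray to the floor and move the player there instantly, with no in-between motion for the inner ear to object to. The flat-floor setup and function names are illustrative assumptions, not taken from any engine.

```python
def teleport(player_pos, aim_origin, aim_dir, floor_y=0.0):
    """Minimal teleport locomotion: cast the controller's aim ray until
    it hits the floor plane, then move the player there instantly.
    The instant jump is what keeps teleportation comfortable - there is
    no visible motion for the inner ear to disagree with. A real
    implementation would also check walls, range and valid surfaces."""
    ox, oy, oz = aim_origin
    dx, dy, dz = aim_dir
    if dy >= 0:
        return player_pos  # aiming at or above the horizon: no target
    t = (floor_y - oy) / dy  # distance along the ray to the floor
    return (ox + t * dx, floor_y, oz + t * dz)

# Aiming down-and-forward from head height lands a few metres ahead
print(teleport((0, 0, 0), (0, 1.7, 0), (0, -0.5, 0.87)))
```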
Foveated Rendering
VRFocus has covered this particular technology on a number of occasions, as its usefulness in VR cannot be overstated. As you may or may not know, running VR software and videogames is very resource intensive for the CPU but mainly the GPU (graphics processing unit). This is because a computer is rendering a separate image for each eye – usually HD – as well as keeping to the 90 FPS needed for a smooth, comfortable experience. This process is especially hard for mobile devices, whether they’re standalone (Oculus Go) or smartphone-based (Gear VR).
One technique to ease the burden on processors – meaning they don’t get as hot, use less power, and work more efficiently – is a process called foveated rendering. This works by only rendering the image at the highest quality where the user is looking, whilst at the same time downgrading the rest of the image in the periphery. As such, the player still sees the best quality visuals but the GPU isn’t maxing out trying to render everything in HD.
Normally this process works hand in hand with eye-tracking technology from companies like Tobii. But recently Oculus showcased a technique called mask-based foveated rendering (MBFR) that uses just head movement.
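As a rough illustration of the idea (not Oculus’ or Tobii’s actual implementation), here’s a Python sketch that picks a render quality per pixel based on how far it sits from the gaze point. The zone radii and quality steps are made-up values for demonstration.

```python
def shading_quality(pixel, gaze, inner_radius=0.15, outer_radius=0.45):
    """Pick a render quality for a pixel based on its distance from the
    user's gaze point (both in normalised 0-1 screen coordinates).
    Full quality in the foveal zone, half in a middle ring, quarter in
    the periphery. The radii and steps are illustrative assumptions;
    real foveated renderers tune these zones per headset and per eye."""
    dx = pixel[0] - gaze[0]
    dy = pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= inner_radius:
        return 1.0   # foveal zone: render at full resolution
    if dist <= outer_radius:
        return 0.5   # mid zone: half resolution
    return 0.25      # periphery: quarter resolution

print(shading_quality((0.5, 0.5), (0.5, 0.5)))   # looking right at it: 1.0
print(shading_quality((0.05, 0.9), (0.5, 0.5)))  # far periphery: 0.25
```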
Frames-per-second (FPS)
Another aspect of VR that’s important for smooth experiences is the frame rate, measured in frames-per-second (FPS). As previously mentioned, VR needs a consistent 90 FPS; any significant drop below this and everything starts to become way too jittery. This then leads to simulator sickness and generally feeling unwell – imagine turning your head and the visuals having to catch up. Minor drops can be tolerated so long as they are brief, otherwise it’s just unbearable.
Sony’s PlayStation VR has a slightly different system called re-projection, meaning that developers only need to hit 60 FPS. The system then ‘re-projects’ the frames so the experience runs at a comfortable 120 FPS.
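To put those frame rates in perspective, here’s a quick Python sketch of the time budget a frame has at each refresh rate – miss it and the headset shows a stale image:

```python
def frame_budget_ms(target_fps):
    """Time available to render a single frame at a given refresh rate."""
    return 1000.0 / target_fps

for fps in (60, 90, 120):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
# 60 FPS -> 16.67 ms, 90 FPS -> 11.11 ms, 120 FPS -> 8.33 ms
```

At 90 FPS, everything – both eyes, physics, tracking – has to fit inside roughly 11 milliseconds, which is exactly why techniques like foveated rendering and re-projection exist.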