Unity 2019.2 Arrives, Improves VR/AR Performance and Capabilities for Developers

There are a number of videogame engines designed to help facilitate virtual reality (VR) and augmented reality (AR) development. While you may not pay much attention to (or care about) the logos at the start of titles, one that crops up a lot is Unity. Today, Unity Technologies has announced the launch of Unity 2019.2, with the latest iteration including a few tasty features for immersive content developers.


As with any release of this kind, Unity 2019.2 features a massive selection of updates and improvements, more than 170 in fact. While many of the general, non-VR/AR-specific enhancements will also help immersive developers, there are several designed just for them. These include VR support for Unity’s High Definition Render Pipeline (HDRP). Although currently limited to Windows 10 and Direct3D 11 devices, this allows developers to improve the visual quality of their projects on high-end hardware.

When it comes to supporting AR developers, Unity previously released a dedicated solution called AR Foundation. It has now received a slew of improvements adding support for face-tracking, 2D image-tracking, 3D object-tracking, environment probes and more across ARKit and ARCore.

AR Foundation Updates:

  • Face-Tracking (ARKit and ARCore): You can access face landmarks, a mesh representation of detected faces, and blend shape information, which can feed into a facial animation rig. The Face Manager takes care of configuring devices for face-tracking and creates GameObjects for each detected face.
  • 2D Image-Tracking (ARKit and ARCore): This feature lets you detect 2D images in the environment. The Tracked Image Manager automatically creates GameObjects that represent all recognized images. You can change an AR experience based on the presence of specific images (a rough code sketch follows this list).
  • 3D Object-Tracking (ARKit): You can import digital representations of real-world objects into your Unity experiences and detect them in the environment. The Tracked Object Manager creates GameObjects for each detected physical object to enable experiences to change based on the presence of specific real-world objects. This functionality can be great for building educational and training experiences, in addition to games.
  • Environment Probes (ARKit): This detects lighting and color information in specific areas of the environment, which helps enable 3D content to blend seamlessly with the surroundings. The Environment Probe Manager uses this information to automatically create cubemaps in Unity.
  • Motion Capture (ARKit): This captures people’s movements. The Human Body Manager detects 2D (screen-space) and 3D (world-space) representations of humans recognized in the camera frame.
  • People Occlusion (ARKit): This enables more realistic AR experiences, blending digital content into the real world. The Human Body Manager uses depth segmentation images to determine if someone is in front of the digital content.
  • Collaborative Session (ARKit): This allows for multiple connected ARKit apps to continuously share their understanding of the environment, enabling multiplayer games and collaborative applications.
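
The managers listed above share a common pattern: each one configures the device for its feature, raises change events, and creates GameObjects (trackables) for whatever it detects. As a rough illustration, here is a minimal C# sketch of listening to the Tracked Image Manager (ARTrackedImageManager in the AR Foundation package). The class below is hypothetical and not from Unity's documentation; the event and type names follow AR Foundation's public API around this release, though exact signatures can vary between package versions.

```csharp
// Minimal sketch: reacting to 2D images detected by AR Foundation's
// ARTrackedImageManager. Assumes the AR Foundation package is installed
// and a manager component is assigned in the Inspector.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ImageTrackingListener : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager trackedImageManager;

    void OnEnable()  => trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        // Newly recognized reference images appear in 'added'; this is where
        // an experience could spawn or reveal content anchored to the image.
        foreach (var trackedImage in args.added)
            Debug.Log($"Detected image: {trackedImage.referenceImage.name}");

        // 'updated' fires as tracking refines each image's pose and state.
        foreach (var trackedImage in args.updated)
            Debug.Log($"Updated: {trackedImage.referenceImage.name} ({trackedImage.trackingState})");
    }
}
```

The other managers (Face Manager, Tracked Object Manager, Environment Probe Manager, Human Body Manager) are used in much the same way, each exposing its own detected trackables.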

There’s plenty to get stuck into, and don’t forget Unity 2019 is free to download. As for Unity 2019.3? That’s expected to arrive in beta later this summer, with a full release scheduled for fall 2019. For those who can’t wait, there’s always the alpha version. For further Unity updates, keep reading VRFocus.


