Oculus has been making strides in enabling users to capture mixed reality (MR) videos that show the virtual reality (VR) user alongside the virtual environment they are inhabiting. Footage of that nature gives viewers a much stronger sense of how a VR experience plays out. Oculus has now expanded MR capture support by adding Unreal Engine integration.
Oculus added MR capture support to the native Rift SDK (software development kit) only a few days ago, and now Unreal Engine developers can learn how to implement the functionality in their own projects. The update comes alongside extended support for Unreal Engine 4 in the form of a new, merged Oculus plugin, and Blueprint support has been overhauled to accommodate the new features.
Oculus has provided a detailed guide with full instructions for setting up Unreal Engine to capture mixed reality video. The guide offers two options for capturing the footage. Direct Composition mode lets an MR capture application stream real-world footage from a camera directly into the scene and display the composited image in the scene itself; this requires the player to stand in front of a green screen and can introduce some latency. The other option, External Composition, requires third-party composition software such as OBS Studio or XSplit. In this mode the output displays two windows, one showing the application's foreground content and the other its background content, which the composition software then combines with the camera footage.
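To make the distinction between the two modes concrete, here is a minimal standalone sketch that models them as a small settings type. Every identifier in it (the enum, struct, and function names) is an illustrative stand-in rather than the Oculus plugin's actual API; the authoritative setup steps and names are in Oculus's own guide.

```cpp
// Illustrative sketch of the two MR capture modes described above.
// All names here are hypothetical stand-ins, not the Oculus plugin's real API.
#include <cstdint>
#include <iostream>

// The two composition paths the guide describes.
enum class CompositionMethod : std::uint8_t
{
    DirectComposition,   // green-screened camera feed composited inside the app itself
    ExternalComposition  // app outputs foreground/background views; OBS Studio or
                         // XSplit layers in the camera footage
};

// Hypothetical per-project capture settings.
struct MixedRealityCaptureSettings
{
    CompositionMethod Method = CompositionMethod::ExternalComposition;
    float ChromaKeyTolerance = 0.1f; // only relevant to DirectComposition (green screen keying)
    int   TrackedCameraIndex = 0;    // which calibrated physical camera to follow
};

// Pick a mode based on the trade-offs the guide outlines: direct composition
// needs a green screen and can add latency; external composition defers the
// compositing work to third-party software.
MixedRealityCaptureSettings ChooseCaptureSettings(bool bHasGreenScreen,
                                                  bool bHasCompositingSoftware)
{
    MixedRealityCaptureSettings Settings;
    if (bHasGreenScreen && !bHasCompositingSoftware)
        Settings.Method = CompositionMethod::DirectComposition;
    else
        Settings.Method = CompositionMethod::ExternalComposition;
    return Settings;
}

int main()
{
    const auto Settings = ChooseCaptureSettings(/*bHasGreenScreen=*/true,
                                                /*bHasCompositingSoftware=*/false);
    std::cout << "Using "
              << (Settings.Method == CompositionMethod::DirectComposition
                      ? "direct" : "external")
              << " composition\n";
    return 0;
}
```

The practical upshot of the split is that external composition keeps the latency-sensitive compositing work out of the application, at the cost of a more involved streaming setup.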
The current setup is still somewhat complicated and is recommended more for developers than for YouTubers or casual livestreamers. However, the functionality could be very useful for debugging or for showing off proof-of-concept ideas.
Perhaps in future a simplified tool will be available for casual users. VRFocus will report on any future developments of this technology.
via Mint VR