WebXR Export and Unity XR SDK — How and Why (2024)

WebXR Export is a set of two Unity packages (WebXR Export and WebXR Interactions) that enable WebXR features when building Unity WebGL (now Unity Web) experiences.

Unity XR SDK is a set of C++ headers that allow better integration of XR features in the Unity Engine.

Unity is not affiliated with WebXR Export.

Non-technical part — How and Why I got to do this

A few years ago, I registered to get the XR SDK, but it took me a while to receive and implement it in WebXR Export, and I assumed that more up-to-date versions of the XR SDK had been released since then.
A few months ago, after ending my work with my main client, I started working on a new WebXR game and needed a button in XR. I could have used ready-made scripts from other projects, but I wanted better integration with Unity's current systems.
It turned out the XR SDK wasn't needed to implement the Input System-related parts required for the Unity XR Interaction Toolkit to work, so I did that.
Just as I was finishing, I attended Unite 2023, where I learned that the only essential system in the XR SDK is the display.
I came back, finished the Input System and XR Interaction Toolkit integration, and took another look at the XR SDK. As it turned out, I already had the latest version.

How the XR SDK works

The XR SDK package allows multiple backends (called “providers”) to implement a single engine feature (called a “subsystem”) in Unity.
It has a number of subsystems, such as the Display (output) subsystem and the Input (tracking) subsystem.

A single subsystem consists of:
- A developer-facing C# interface.
- A native interface that multiple backends (Providers) implement via dynamic libraries (C++ and JavaScript files on the Web platform).
- Common engine code which handles communicating with the C# interface, the native interface, and the rest of the engine.

How it helps WebXR Export

For WebXR Export, the main missing part was the Display output; while I was at it, I also added device tracking input. Controller input can keep being tracked using the Input System, and hand input is handled only through the Input System.

Before the XR SDK integration, WebXR Export handled the display by having a number of Camera components and GameObjects for each XR state.
In states with two views (left eye + right eye, i.e. AR/VR headsets), the scene was rendered using two Camera components.
The XR SDK allows rendering those two views using one Camera component.

I thought it would also allow developers to implement Single-Pass or Multiview rendering, but apparently that's something only Unity can implement, and they haven't done so for WebGL, so WebXR Export still uses Multi-Pass.

Tackling the XR SDK

I knew I had the latest XR SDK, I knew that it at least could render in multi-pass mode from past experiments, and I had a basic working example.
I also wanted to implement it the "correct" way: no overriding of WebGL or Unity-to-WebGL calls, or at least minimizing those as much as possible.

There were lots of things to change and implement:
- Adding XR SDK headers to the project.
- Adding basic Display subsystem.
- Adding basic Tracking Input subsystem.
- Updating the XR Rig to use only one Camera component.
- Actually rendering the views to the device display.
- Making sure old features work.
- Making sure everything works on different Unity versions.

I started by adding all the XR SDK headers and basic Display Subsystem to the project.
At this stage, I just wanted to test how it works in Normal mode (no XR) and VR mode, and I disabled the old two VR Camera components for that.
I’m not familiar with C++ and had to figure out how to share data between JavaScript and the C++ Display Provider code (`WebXRDisplayProvider.cpp`).
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/f116235218229f661f7609457a07e3164c3752a1

I wanted to verify that the view values I get from the WebXR API are translated correctly to the Display subsystem.
Instead of putting on and taking off a headset for every attempt, the Immersive Web Emulator was a great help.
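One concrete part of that translation is coordinate-system handedness: WebXR reports poses in a right-handed coordinate system, while Unity is left-handed, so each view's position and orientation must be mirrored across the XY plane. A minimal sketch of that conversion (the function name is mine for illustration, not the actual WebXR Export code):

```javascript
// Convert a WebXR (right-handed) pose to a Unity-style (left-handed) pose
// by mirroring across the XY plane: negate the position's Z, and negate
// the quaternion's X and Y components (equivalent to negating Z and W).
function webxrPoseToUnity(position, orientation) {
  return {
    position: { x: position.x, y: position.y, z: -position.z },
    orientation: {
      x: -orientation.x,
      y: -orientation.y,
      z: orientation.z,
      w: orientation.w,
    },
  };
}
```

Getting this mapping wrong shows up immediately in the emulator as views that rotate the opposite way from the emulated headset.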

I figured that the rest of the rendering implementation would take time, so I moved on to implementing a basic Tracking Input subsystem.
I also made sure that rendering a single view would be possible, for mobile phone AR.
And I added a component that applies different settings to the Camera component based on the XR state: in Normal mode and VR mode the developer might want a Skybox or color background, but in AR mode there's no need to render a background. It can also ignore specific rendering layers in different modes.
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/6034c727dbd6f08fd516151f1cdb19d5dac1c314

Next was to test it on the minimum Unity version.
Unlike other XR platforms, where you know ahead of time which features are supported, whether it's an AR device, a VR device, or both, and in most cases the experience starts in XR mode, WebXR works differently.
In WebXR, the page needs to ask the browser which XR modes it supports, and only after the user enters XR is more information about the device's capabilities available.
This is why, in WebXR Export, I chose to start the Display and Tracking subsystems only after requesting the browser to enter XR mode.
On newer Unity versions, the developer can create the XR subsystems on load and start/stop them when needed; on older Unity versions, that was not possible.
The solution was to create and start the subsystems when entering XR, and to stop and destroy them on exit. That works on both old and new Unity versions.
Also, older Unity versions tried to use Single-Pass by default, and as there's no Single-Pass implementation in Unity for the web, I had to explicitly disable it.
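The flow above can be sketched roughly like this (names and structure are illustrative, not the actual WebXR Export code; the XR system object is passed in explicitly, where a browser page would use `navigator.xr`):

```javascript
// Ask the browser which WebXR session modes are available. This is all
// the page can know before the user actually enters XR.
async function detectSupportedModes(xr) {
  const modes = { vr: false, ar: false };
  if (xr) {
    modes.vr = await xr.isSessionSupported('immersive-vr');
    modes.ar = await xr.isSessionSupported('immersive-ar');
  }
  return modes;
}

// Subsystems are created and started only once a session is actually
// granted, and stopped/destroyed when it ends. This is the pattern that
// works on both old and new Unity versions.
async function enterXR(xr, mode, startSubsystems, stopSubsystems) {
  const session = await xr.requestSession(mode);
  startSubsystems(session);
  session.addEventListener('end', () => stopSubsystems());
  return session;
}
```

Injecting the XR system also makes the detection logic trivial to test without a headset or emulator.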

Now that it worked on different versions with different demos, I could test more aspects, including performance.
I also noticed that there's a `displayIsTransparent` option in `UnityXRDisplayState`, so it was time to disable the old hacks of overriding WebGL and Unity-to-WebGL calls.
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/730857b4a9731ea66493141c1873a83fdc07c601
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/3ea43e2c93185a09ceed51f16470e62ab90c5b15

At that point, I couldn't use a real device to test the display, as I had removed the code that overrides `bindFramebuffer` to bind to the display framebuffer instead of the HTML Canvas.
Luckily, the Immersive Web Emulator and Spector.JS gave me all the data I needed to see what was drawn and how.
And with the Performance tab in Chrome DevTools, I could see that in some cases, there are significant drops in the frame rate.
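For context, the removed hack was along these lines: wrap the WebGL context's `bindFramebuffer` so that any attempt to bind the default framebuffer (i.e. the HTML Canvas, represented by `null`) gets redirected to the XR session's framebuffer while presenting. This is a generic sketch of the technique, not the exact WebXR Export code:

```javascript
// Monkey-patch gl.bindFramebuffer: while an XR session is presenting,
// binding null (the canvas) is redirected to the XRWebGLLayer framebuffer.
// getXRFramebuffer should return null when no session is active.
function overrideBindFramebuffer(gl, getXRFramebuffer) {
  const originalBind = gl.bindFramebuffer.bind(gl);
  gl.bindFramebuffer = function (target, framebuffer) {
    const xrFramebuffer = getXRFramebuffer();
    if (framebuffer === null && xrFramebuffer) {
      framebuffer = xrFramebuffer; // redirect canvas binds to the XR layer
    }
    originalBind(target, framebuffer);
  };
}
```

The downside is clear from the story above: with the override gone, nothing reaches the device display, and with it in place you are second-guessing the engine's own render stack.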


I suspected that some issues might be because I hadn't fully integrated the Display subsystem yet, like rendering the display texture to the device display.
So I started doing just that.
I needed to draw a texture on the entire display of a device.
I started by creating small vertex and fragment shaders and drawing an empty quad in WebGL. The tutorials at WebGL2Fundamentals have everything needed for that.
But once I wanted to attach the texture, I realized it would be better to implement it in C++, as the Unity-to-WebGL calls do some wrapping to create and fetch texture references.
So I translated the calls from WebGL to C++ and OpenGL.
Thanks to this blog post, I realized that I could just add `#include <GLES3/gl3.h>` to the C++ code and use OpenGL functions.
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/62382c8955ac6064e1f12ab3e0d0b602a230bb5b
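For reference, a minimal full-screen textured quad in WebGL2 looks roughly like this (a generic sketch in the spirit of the WebGL2Fundamentals tutorials, not the C++ code that ended up in the provider):

```javascript
// Vertex shader: positions already in clip space, UVs derived from them.
const vertexShaderSource = `#version 300 es
in vec2 position;
out vec2 uv;
void main() {
  uv = position * 0.5 + 0.5;          // map clip space [-1,1] to UVs [0,1]
  gl_Position = vec4(position, 0.0, 1.0);
}`;

// Fragment shader: sample the display texture.
const fragmentShaderSource = `#version 300 es
precision highp float;
in vec2 uv;
uniform sampler2D displayTexture;
out vec4 outColor;
void main() {
  outColor = texture(displayTexture, uv);
}`;

// Two triangles covering all of clip space (6 vertices, 2 components each).
const quadVertices = new Float32Array([
  -1, -1,   1, -1,   -1, 1,
  -1,  1,   1, -1,    1, 1,
]);

// Assumes program was compiled/linked from the two shaders above and
// buffer was created with gl.createBuffer().
function drawFullScreenQuad(gl, program, buffer) {
  gl.useProgram(program);
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
  gl.bufferData(gl.ARRAY_BUFFER, quadVertices, gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(program, 'position');
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
  gl.drawArrays(gl.TRIANGLES, 0, 6);
}
```

The same draw translates almost one-to-one to GLES3 calls in C++, which is what made the `gl3.h` include so useful.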

First dead end

At this stage I could test the results on devices as well.
Headset/device tracking — worked great!
Performance and display — had issues (bad performance, a display texture that updated only once, touch input position problems, and some WebGL warnings).

All the issues I had seemed to be related to the engine, and not something I could solve.
I got to a dead end.

After some thinking, I tried using URP instead of the Built-in Render Pipeline.
And it worked!
It still had a few artifacts, but the performance was good, and the display texture got updated as it should.

Now the question was: should we switch to supporting only URP?
After asking on the WebXR Discord and other places, I continued with switching to URP.

There were still some issues with the implementation, like setting the correct size and pose of the view in mobile phone AR.
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/d04d1e4322e898e94a42edeaa83fce2ef56ea9f2

And then another dead end.
Some artifacts in AR mode, other artifacts in VR mode.


My guess was that even though I use OpenGL functions in C++ so as not to get Unity's render stack out of order, I was still doing something wrong.
Looking at a frame with Spector.JS, I noticed that the texture I wanted to draw on the display was already drawn on the HTML Canvas.
I removed the OpenGL code and re-enabled the `bindFramebuffer` and transparency overrides.
I figured out how to catch the pre-rendering of a Camera component when using URP.
I broke my head over wrong values in a shader on older Unity versions, and solved that as well by replacing a line in that shader.
And IT WORKS!🎉
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/6c4ecee3832370d3586c1930ca8bc6387159f84a


Later, I had to test for more bugs on more Unity versions.

Unity switched to using `setTimeout` instead of `requestAnimationFrame` in some cases. I had to fix that.
https://github.com/De-Panther/unity-webxr-export/pull/331/commits/71f528e303f0610d985f7320098f042a01d6239b
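The reason this matters: inside an XR session, frames must be driven by the session's own `requestAnimationFrame`, which delivers an `XRFrame` with fresh poses; `setTimeout` (and even `window.requestAnimationFrame`) never does. A hypothetical scheduler illustrating the split (the non-XR fallback is injected for clarity; in a browser it would be `window.requestAnimationFrame`):

```javascript
// Returns a frame scheduler that prefers the active XRSession's
// requestAnimationFrame (callback gets (time, XRFrame)) and falls back
// to a regular animation-frame callback (no XRFrame) outside XR.
function makeFrameScheduler(getActiveSession, fallbackRaf) {
  return function scheduleFrame(callback) {
    const session = getActiveSession();
    if (session) {
      return session.requestAnimationFrame(callback);
    }
    return fallbackRaf((time) => callback(time, null));
  };
}
```

A frame loop driven by `setTimeout` would tick, but every tick would render with stale poses and fight the headset's compositor timing.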

The Render Textures of the Mixed Reality Capture feature needed to support depth and stencil as well.
https://github.com/De-Panther/unity-webxr-export/pull/333/files

And that’s about it.

WebXR Export supports Unity XR SDK.
It's still using some old hacks. Support for versions older than 2020.3.11f1 was removed. It's now URP only. And there's lots of code to improve, add, and remove in future versions.
But it works!🎉


While Unity is not affiliated with WebXR Export, some Unity Web and XR team members were a great help with info and tips.
I hope that now that there’s a working web project that uses the XR SDK, it’ll be easier for Unity to test issues, fix bugs and implement Single-Pass and Multiview rendering for the web platform as well.

You can test the 2020.3.11f1 demo, or the 2022.3.10f1 + XR Interaction Toolkit demo.

And if you want to support this project, you can be a sponsor.

Cheers!
