AR Headset R&D

February – March 2023

R&D project to reverse-engineer software for an undocumented passthrough AR headset.


During my work with artist and writer Ben Fredericks, we were looking for unique ways to integrate access materials into a live performance with user interaction. One of the avenues we considered was passthrough AR. There are some passthrough headsets on the market, such as the Magic Leap or Microsoft HoloLens, but these are far beyond budget when each of 20+ audience members needs one.

Instead we found a cheap passthrough AR headset on eBay that uses a mobile phone as its display. It came with no brand name, documentation, or example app of any kind, meaning we would be reverse engineering it from scratch. The first step was to create a distortion map that counteracted the distortion of the headset lenses. I then moved on to developing a Unity app that would use the phone's accelerometer and gyroscope to place 2D and 3D elements in 3D space.
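A distortion map of this kind is essentially a per-pixel lookup telling the renderer where in the undistorted image each output pixel should sample. A minimal Python sketch, assuming a simple two-coefficient radial model (the `k1`/`k2` values and function names here are hypothetical placeholders, not the coefficients we actually tuned; in a Unity app this would run as a shader rather than a Python loop):

```python
def predistort(x, y, k1=0.22, k2=0.05):
    """Pre-distort a normalized point (origin at the lens centre) so that
    the lens's optical distortion cancels it out. k1 and k2 are radial
    distortion coefficients -- hypothetical values, tuned by eye in practice."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def build_distortion_map(width, height):
    """Build a per-pixel UV lookup: for each output pixel, the UV coordinate
    at which to sample the undistorted render texture."""
    uv_map = []
    for py in range(height):
        row = []
        for px in range(width):
            # map pixel coordinates to [-1, 1] with (0, 0) at the lens centre
            x = 2.0 * px / (width - 1) - 1.0
            y = 2.0 * py / (height - 1) - 1.0
            dx, dy = predistort(x, y)
            # back to [0, 1] UV space; values outside [0, 1] fall off the texture
            row.append(((dx + 1.0) / 2.0, (dy + 1.0) / 2.0))
        uv_map.append(row)
    return uv_map
```

The centre of the lens is left untouched while points further out are pushed away more strongly, which is what cancels the pincushion effect of a simple magnifying lens.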

After developing the stereoscopic rendering with lens distortion, I moved on to placing UI elements, such as subtitles and sign language videos, and 3D elements in the scene, tracking changes in position and rotation from the phone's sensors. Because the phone is mounted in front of the face, facing away from it and tilted at an angle, the coordinate transformation required is more complicated and less reliable. Ultimately the device was only capable of tracking rotation, not movement through space.
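Rotation-only tracking of this kind comes down to integrating gyroscope angular velocity into an orientation, then composing it with a fixed rotation that accounts for how the phone sits in the mount. A minimal Python sketch of that idea (the 35-degree mount tilt and all names are hypothetical illustrations, not values from the actual app):

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion for a rotation of `angle` radians about unit `axis`."""
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    """Hamilton product: rotation b followed by rotation a."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

# Fixed rotation from phone-sensor space into head space: the phone sits in
# front of the face, tilted downward. The 35-degree tilt is a hypothetical value.
MOUNT_CORRECTION = quat_from_axis_angle((1.0, 0.0, 0.0), math.radians(-35.0))

def integrate_gyro(orientation, angular_velocity, dt):
    """Advance the sensor orientation by one gyroscope sample.
    angular_velocity is (wx, wy, wz) in rad/s; dt is the sample interval in s."""
    wx, wy, wz = angular_velocity
    mag = math.sqrt(wx * wx + wy * wy + wz * wz)
    if mag < 1e-12:
        return orientation
    axis = (wx / mag, wy / mag, wz / mag)
    step = quat_from_axis_angle(axis, mag * dt)
    return quat_mul(orientation, step)

def head_orientation(sensor_orientation):
    """Compose the fixed mount correction with the integrated sensor orientation."""
    return quat_mul(MOUNT_CORRECTION, sensor_orientation)
```

Position, by contrast, would require double-integrating the accelerometer, and that drift generally accumulates within seconds on phone-grade sensors, which fits with the device only being usable for rotation.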

After many issues and roadblocks during development, we eventually decided that this headset wasn't suited to the application: it simply wasn't comfortable enough, visually or physically, to wear for a 90-minute show. Despite not using the device in the end, this was still a unique and interesting project, and it was very rewarding to work things out from scratch without being able to fall back on forum posts and documentation.


Ben Fredericks Collaborations -