Lightwerk

nov 2023 - may 2024

An exhibition at Phoenix Cinema and Arts Centre as part of the Bring the Paint graffiti festival in Leicester, UK.

About

As part of Bring the Paint 2024, a city-wide festival celebrating Leicester's graffiti culture, I was asked to contribute to the exhibition held at Phoenix Cinema and Arts Centre. The other half of the exhibition was print-based, so we wanted something digital and playful as a contrast.

I was provided with a collection of scans and images of graffiti in the city from the past few decades to act as inspiration and material. I decided to create two interactive pieces: a retro-styled infinite-runner platform game and a large-scale body-tracking application that lets visitors paint with their bodies.

Links

Bring the Paint: https://www.bringthepaint.co.uk/

Tag n Dash: https://adamstephensun.itch.io/tag-n-dash (link may not be active)



Phoenix Media Wall Sample


july - oct 2023

Sample project and documentation for use by artists commissioned by Phoenix Leicester to create work for their interactive media wall.

About

As part of its extensive refurbishment, which opened in 2023, Phoenix Cinema and Arts Centre had a new interactive media wall installed in the café area. It consists of a 4K display with both a regular webcam and a ZED 2 body-tracking camera. To prepare for a series of commissions for the wall tied to future exhibitions, I was tasked with creating a basic sample project and documentation that artists could use to create their works without advanced technical knowledge.

I created a blank template project, as well as a simple Three.js project using physics and webcam input. The sample project contains a grid of physics-enabled cubes stacked into a box, with the webcam texture divided across their faces. The camera orbits around the cubes, revealing the full webcam texture on the inside of the box (behind where the camera starts). When the camera reaches a set position, the cubes are knocked down and reset, creating a seamless loop.
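
As a rough illustration of the texture-tiling part, here is a minimal Three.js sketch in TypeScript that divides a live webcam feed across a grid of cubes. The grid size is an assumption, and the physics and the knock-down/reset loop of the real sample are omitted.

    import * as THREE from "three";

    const N = 8;     // cubes per side (assumed; not the sample's actual value)
    const SIZE = 1;  // cube edge length in world units

    async function init() {
      const scene = new THREE.Scene();
      const camera = new THREE.PerspectiveCamera(50, innerWidth / innerHeight, 0.1, 100);
      camera.position.z = 12;

      const renderer = new THREE.WebGLRenderer();
      renderer.setSize(innerWidth, innerHeight);
      document.body.appendChild(renderer.domElement);

      // Feed the webcam into a hidden video element.
      const video = document.createElement("video");
      video.muted = true; // required for autoplay in most browsers
      video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
      await video.play();

      const geometry = new THREE.BoxGeometry(SIZE, SIZE, SIZE);
      for (let y = 0; y < N; y++) {
        for (let x = 0; x < N; x++) {
          // Each cube gets its own VideoTexture, scaled and offset so the
          // grid as a whole reassembles the full webcam image.
          const tile = new THREE.VideoTexture(video);
          tile.repeat.set(1 / N, 1 / N);
          tile.offset.set(x / N, y / N);

          const cube = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: tile }));
          cube.position.set((x - (N - 1) / 2) * SIZE, (y - (N - 1) / 2) * SIZE, 0);
          scene.add(cube);
        }
      }

      renderer.setAnimationLoop(() => renderer.render(scene, camera));
    }

    init();

A single shared texture with per-face UV offsets would upload the video to the GPU only once per frame, but separate textures keep the sketch short.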

For the first exhibition to launch with a corresponding media wall artwork, the artist decided to use a custom Unity template instead. This was an unintended use case for the wall, but I worked with the artist and the hardware suppliers to figure out a solution for displaying the work.

A Unity WebGL template was also created by Anthony Woodward of Twin Planets (link below).

Links

Live webapp: media-wall-webcam-sample.vercel.app
GitHub repository: github.com/adamstephensun/media-wall-webcam-sample

Twin Planets: twinplanets.co.uk



Choosing Children Access


july - nov 2023

An app allowing for real-time customisable access materials and user interaction during an immersive theatre show produced by Ben Fredericks Collaborations.

About

Choosing Children Access is a unique approach to delivering accessibility content and allowing for visitor interaction during a live interactive theatre performance. The mobile app that I created was just one part of a larger interconnected system of systems. The stage setup included 24 light orbs which, along with the regular stage lighting, were controlled by a central lighting desk. The set also featured a dynamic video projection running from Unreal Engine into MadMapper, with the lighting desk sending network triggers to play certain video clips. And of course there were the actors, who had to work in time with all of this technology during the live show. It was all brought together by a great multi-disciplinary team led by writer and director Ben Fredericks.

Following on from a useful (but ultimately unsuccessful) sprint of R&D with a questionable AR headset, we decided to use mobile phones to display access content and allow for user interaction in the show. This let us communicate to and from the lighting desk using the OSC networking protocol: the lighting desk told the phones when to trigger content based on the current scene.
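
For a sense of how that link works, here is one plausible way to wire up the phone side in TypeScript using the osc.js library (npm package "osc"). The /scene address, ports, and desk address are illustrative assumptions, not the production values.

    import * as osc from "osc"; // osc.js; no official typings, so treat as any

    // App-specific cue handler, not shown here.
    declare function triggerAccessContent(scene: number): void;

    const port = new osc.UDPPort({
      localAddress: "0.0.0.0",       // listen on all interfaces
      localPort: 57121,              // port the phone listens on (assumed)
      remoteAddress: "192.168.0.10", // lighting desk (assumed address)
      remotePort: 57120,             // port the desk listens on (assumed)
    });

    // The desk announces the current scene over OSC; the app responds by
    // cueing the matching subtitles, BSL video, and audio description.
    port.on("message", (msg: { address: string; args: number[] }) => {
      if (msg.address === "/scene") {
        triggerAccessContent(msg.args[0]);
      }
    });

    port.open();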

Each performance was limited to 20 visitors, each with their own phone. The phones were mounted on adjustable stands in front of the visitors and all started with everything disabled, allowing users to enable certain elements as they saw fit. The options included subtitles, British Sign Language interpretation, audio description, and camera passthrough.
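
In terms of state, each device boils down to a handful of opt-in flags; a hypothetical shape (the field names are mine, not the app's):

    // Per-device access settings; everything starts disabled and the
    // visitor opts in during the show.
    interface AccessSettings {
      subtitles: boolean;
      bslInterpretation: boolean;
      audioDescription: boolean;
      cameraPassthrough: boolean;
    }

    const defaults: AccessSettings = {
      subtitles: false,
      bslInterpretation: false,
      audioDescription: false,
      cameraPassthrough: false,
    };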

There were also ten instances in the show where the characters asked the audience questions and prompted them to answer on their devices. These were simple yes/no questions, and depending on the answer given, one of the orbs on stage linked to that particular device would light up and change colour. There was also a prompt at the beginning of the show for users to “wake up” their orb, turning it on so they could see which one was theirs. Finally, a results page displayed at the end of the show, showing what percentage of the audience gave each answer.
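
Reusing the port from the sketch above, the answer flow might look like this; the /answer address and argument layout are again assumptions for illustration:

    // When a visitor taps yes or no, the phone reports its device ID and
    // choice back to the desk, which lights the orb paired with that device.
    function sendAnswer(deviceId: number, answeredYes: boolean) {
      port.send({
        address: "/answer",
        args: [
          { type: "i", value: deviceId },            // which phone, and so which orb
          { type: "i", value: answeredYes ? 1 : 0 }, // 1 = yes, 0 = no
        ],
      });
    }

    // End-of-show results: the percentage of devices that answered yes.
    function percentYes(answers: boolean[]): number {
      return (100 * answers.filter(Boolean).length) / answers.length;
    }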

Links

Ben Fredericks Collaborations: benfredericks.co.uk/choosing-children

GitHub repository: github.com/adamstephensun/CC_Access



Jess+ Artefact


june - july 2023

An interactive representation of the data collected from recording sessions of Jess+, a musicking robot that interacts in real time with an ensemble of musicians by drawing a digital score.

About

This project is an offshoot of the larger Jess+ project, a collaborative robot that draws abstract music scores. Nearing the end of development, I suggested that we record the data from the sessions with the robot and represent it in 3D, and this was the result.

We captured the XYZ position of the robot arm's head, the outputs of the AI factory, and which AI stream was in control of the robot on each frame. We also recorded the microphone input during the performance.
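
A recorded frame might look something like this in TypeScript; the field names are illustrative, not the project's actual schema:

    interface RecordedFrame {
      time: number;                         // seconds since the session began
      position: [number, number, number];   // XYZ of the robot arm head
      streamValues: Record<string, number>; // outputs of the AI factory's streams
      masterStream: string;                 // stream in control of the robot this frame
    }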

I started development in Unity but quickly moved to Three.js, which was better suited to the simplicity of the project. I represented each position in the file as a sphere, changing the colour and scale of the sphere to match the master AI stream in control at that moment and its value.
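
A minimal sketch of that mapping, assuming the RecordedFrame shape above and hypothetical stream names and colours:

    import * as THREE from "three";

    // Hypothetical stream-to-colour mapping.
    const STREAM_COLOURS: Record<string, number> = {
      eeg: 0xff4060,
      flow: 0x40a0ff,
      baseline: 0x80ff80,
    };

    function buildArtefact(frames: RecordedFrame[], scene: THREE.Scene) {
      const geo = new THREE.SphereGeometry(0.05, 12, 12);
      for (const f of frames) {
        const sphere = new THREE.Mesh(
          geo,
          new THREE.MeshStandardMaterial({
            color: STREAM_COLOURS[f.masterStream] ?? 0xffffff,
          })
        );
        sphere.position.set(...f.position);
        // Scale keyed to the master stream's value, assumed to be in [0, 1].
        sphere.scale.setScalar(0.5 + f.streamValues[f.masterStream]);
        scene.add(sphere);
      }
    }

For long recordings, a THREE.InstancedMesh with per-instance colours would be the more efficient choice than one Mesh per frame.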

Links

Live project: jess-artefact.vercel.app

The Digital Score: digiscore.github.io
GitHub repository: github.com/DigiScore/artefact



Object Memory Access


march - may 2023

Online virtual reconstruction of the Object Memory exhibition at Phoenix Cinema and Arts Centre.

About

Object Memory was the first exhibition after the opening of Phoenix’s extensive refurbishment in January 2023. It consisted of 5 separate pieces, all by Leicestershire-based artists. I was tasked with capturing and recreating the exhibition in virtual space using photogrammetry and 3D scanning. The other goals of the project were to serve as a form of digital archive and to present access materials for each work.

Just before the exhibition was taken down, I scanned the entire space using the iPhone 12 Pro's LiDAR scanner and the Polycam app. I also scanned each individual piece of work with photogrammetry, again using Polycam. In Unity, I overlaid the more detailed photogrammetry scans on top of the larger, less detailed LiDAR scan to replicate the space, adding immersive elements such as video players and dynamic lighting.

I also added access materials, such as audio descriptions recorded by the Arts Manager, Irina Tsokova, and sign language interpretation videos. These were all integrated into the system and play automatically as the user navigates through the space. To make it as accessible as possible, I simplified the navigation to 6 points to travel between and added an optional walkthrough mode that requires no user input, playing through all the content and moving between exhibits automatically.
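
The walkthrough mode's control flow is simple; here is a sketch in TypeScript for consistency with the other sketches on this page (the actual project was built in Unity, and all names here are hypothetical):

    // Visit each of the 6 navigation points in turn, play that exhibit's
    // content, then move on; no user input required.
    interface Exhibit {
      point: [number, number, number]; // camera position for this stop
      play(): Promise<void>;           // resolves when the audio/BSL video finishes
    }

    // Engine-specific camera move, not shown; resolves on arrival.
    declare function moveCameraTo(p: [number, number, number]): Promise<void>;

    async function walkthrough(exhibits: Exhibit[]) {
      for (const exhibit of exhibits) {
        await moveCameraTo(exhibit.point);
        await exhibit.play(); // audio description + sign language video
      }
    }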

Links

Live project: adamstephensun.itch.io/om-access