Speakers

Wednesday 15 April 2026 (8:30am – 5:00pm)

Immersive Soundscapes in XR: Engineering Ambisonics for Virtual Environments

Assoc Prof Dr Bruce Wiggins

Keynote talk overview

Ambisonics, first developed in the 1970s by Michael Gerzon, has found its ‘killer app’ in the head-tracked audio needed for XR applications. This talk will look at the benefits and limitations of Ambisonics decoded for both loudspeakers and headphones, giving example applications of the technology in computer games, VR presentations and music festivals, with particular focus on the award-winning Derby Theatre/Plus One project ‘Odyssey’, which involved a live performance from care-experienced young people to an audience experiencing the show in explorable VR world settings designed by the performers.

Speaker bio

Dr. Bruce Wiggins specialises in spatial audio signal processing and its application within immersive environments and co-leads the Electro-Acoustics Research Lab at the University of Derby. His research focuses on the intersection of engineering and creative practice, specifically the development of accessible tools for higher-order Ambisonics 3D audio reproduction over both loudspeakers and headphones. His work has featured as high-performing Impact Case Studies in both REF2014 and REF2021 (Research Excellence Framework – the UK’s assessment of the quality, impact and environment of research in Higher Education Institutions) and he has extensive experience applying this research to commercial productions, most recently providing the technical R&D and live multi-speaker spatial audio system for Odyssey (Derby Theatre/Plus One). This VR production was awarded Digital Project of the Year at The Stage Awards 2023, evidencing his ability to deliver robust technical solutions for award-winning creative outputs.


Multimodal Accessibility Solutions for Visually Impaired Users of Virtual Worlds

Alex Briggs

Talk overview

As an audio-visual and interactive medium, virtual worlds leverage a significant amount of visual information to communicate much of what is happening in the world that surrounds the player. However, if players experience total blindness, how can we substitute the visual information that we communicate so blind players can still interact and engage with the gameplay experiences that developers create?

In this talk, Alex Briggs addresses this question through the lens of an action-stealth experience, used as an example to showcase possible multimodal accessibility solutions that game creators could adopt to develop games more inclusively for blind audiences through auditory and haptic methods and techniques. The solutions were validated through the simulation of blindness among student playtesters, which helped to form the expectation that testing among blind users would produce similar results.

The techniques used include audio waypoints to navigationally assist players through complex 3D spaces, radar-like pulses that reveal key items within the player's immediate surroundings, pertinently displayed interaction prompts, detection states communicated through crossfaded audio cues, dynamic free-aiming systems based on distance, as well as controller haptic feedback for wall collisions.

These techniques combine to present a possible solution, incorporating spatialised binaural audio approaches throughout, as well as varied haptic intensities to build a robust design language that allows blind players to engage and successfully progress through the gameplay experience.
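One of the techniques above, crossfading audio cues to communicate detection states, can be illustrated with a small sketch. This Python example (entirely hypothetical; the function and names are not from the talk) maps a detection value in the range [0, 1] to gains for a 'safe' and a 'detected' audio layer, using an equal-power crossfade so the perceived loudness stays roughly constant as the state changes:

```python
# Hypothetical sketch: crossfading between "safe" and "detected" audio cues
# driven by an enemy's detection progress, as game logic might supply it.
import math

def crossfade_gains(detection: float) -> tuple[float, float]:
    """Map detection progress in [0, 1] to (safe_gain, detected_gain).

    Equal-power curve: the gains sweep a quarter circle (cos/sin), so
    safe_gain**2 + detected_gain**2 == 1 at every point, avoiding the
    loudness dip of a plain linear crossfade.
    """
    t = max(0.0, min(1.0, detection)) * math.pi / 2  # clamp, then scale
    return math.cos(t), math.sin(t)

if __name__ == "__main__":
    for d in (0.0, 0.5, 1.0):
        safe, detected = crossfade_gains(d)
        print(f"detection={d:.1f} -> safe={safe:.2f}, detected={detected:.2f}")
```

In an engine, the two gains would simply be applied to two looping audio layers each frame, giving blind players a continuous, non-visual read on how close they are to being spotted.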

Alex will also share the accessibility design pipeline and workflows he adopted following research into the topic, how they were contextually integrated into this project, and how iterations were made following playtest observations, so that other developers can learn to develop more inclusively for broader audiences.

Speaker bio

Alex Briggs is a producer and designer passionate about games leadership, user experience (UX), and accessibility. He has been invested in the games industry since 2017 through volunteering for Bethesda Softworks, where he gained first-hand experience of the developer-player feedback loop after participating in focus group testing for Fallout 76 in Dallas, Texas. Following this, he pursued an education in games and began studying at the University of Staffordshire in 2021, where he has produced TIGA-nominated and award-winning projects through the university’s 1UP scheme. Alex recently completed a 13-month placement as a Producer Intern at Criterion Games (Electronic Arts), contributing to Battlefield 6 (the best-selling game of 2025) and Need for Speed Unbound, where he was awarded the studio’s Rising Star of the Year award in 2024. During this time, he developed a strong interest in accessibility after working closely with User Research and UX teams to drive playtesting initiatives. He is now finishing his undergraduate degree in Computer Gameplay Design and Production, having made use of his final major project to further delve into the study of accessibility and inclusive design.


Photogrammetric Documentation and Preservation of Dynamic Urban Art Spaces within Interactive Immersive Environments

Daryl Marsh

Talk overview

Photogrammetric recording and presentation of real-world street art via immersive environments has featured in the digital art community for some years. But use of old technology and a lack of interactivity fail to create a meaningful presentation of the artists and the motivations behind the art they present, resulting in static, unfulfilling experiences. These environments fall short of the dynamic “meta-worlds” envisioned in recent years and often use pre-built drag-and-drop technology that is not appropriate for the needs of this specific challenge. This research considers technological developments, approaches to cultural heritage preservation and discussions around audience flow in immersive virtual environments for greater viewer engagement. In tandem with this, the artists themselves are being interviewed and their recorded responses embedded into the experience, with animation as an interactive feature that moves through the world with the visitor. In this way the interactive environment will be layered, containing the artwork as presented in its real-world context, artist-specific audio guides and insights, and additional close-up explorations of the art and the textures of the environments onto which it is placed. This approach will allow the user to see these real spaces as the artists do: a landscape of textures and forms onto which the artworks are pasted and left to age over several years.

Using a combination of photogrammetry, Gaussian splatting techniques, animated characters, 360-degree images, collaged 2D images and 3D audio immersion, these advanced tools allow for the detailed recreation of sprawling art surfaces while capturing the fine textures and forms that characterise street art.

The Street Art hub: Manchester & beyond... debuted on Spatial.io in staged releases from February to June 2024 and was featured in public demonstrations at HOST Salford. The follow-up site Street Art Realm is the test area for the developments listed above.

Street Art Realm:

https://www.spatial.io/s/Street-Art-Realm-65b394048e2f7d82aabdb2d5?share=2096789160902351543

Street Art & Graffiti showcase hub 2024:

https://www.spatial.io/s/Street-Art-hub-Manchester-and-beyond-6583002288664e9cce2f5934?share=7376650763686425080

Speaker bio

Daryl Marsh is an Animation Professional, Senior Lecturer and Course Leader of undergraduate Animation at the University of Staffordshire. He is engaged in practice-based research of immersive experiences utilising photogrammetric technology to create digital site-specific experiences of analogue art forms; creating virtual artistic experiences using a visual narrative to engage the viewer and give alternative experiences based on choices made while exploring artworks.

He has collaborated with artists within the metaverse to create exhibitions, live events, AR giveaways, shared galleries and event recordings, as well as in real-world street art events and paste-ups.


Designing Disbelief: Virtual Scenography and the Threshold of Reality in The Nether

Dr Owen Brierley

Talk overview

In collaboration with Dr Zachary McKendrick of the University of Waterloo (Canada), Owen designed the virtual worlds for a production of "The Nether," a play by Jennifer Haley that takes place in both the IRL theatre and the VR world simultaneously. Treating both audiences as equal experiencers of the play, the IRL audience viewed the virtual world as a projection, while the VR audience experienced the IRL performance through a real-time video feed into VRChat. Exploring how we bring audiences together regardless of virtual or physical space, Owen and Zach, along with the theatre team at UWaterloo (specifically designers Paul Cegys and Jay Havens), collaborated through opportunities and challenges to stage The Nether as a phygital experience. As the Virtual Scenographer, Owen will walk through using both Unity and Unreal Engine to produce the spaces and experiences for the show.

https://uwaterloo.ca/theatres/events/nether

Speaker bio

Dr Owen Brierley is a computational media designer who approaches his work from a transdisciplinary position. He holds a Ph.D. in Computational Media Design from the University of Calgary and has extensively used game engines for research, exploring the interplay between technology and humanity. Owen's first career was in theatre, where he worked as an actor and director for 15 years. In the mid-1990s, Owen discovered a love for interactive media production, starting with web technologies, then rich media (Flash), and then games. In the early 2010s, Owen bridged these two worlds by using game engines for live theatre productions.

Owen is the Course Leader for Kingston School of Art's Creative Industries department. His current research focus is to improve the suspension of disbelief in the acting behaviours of non-player characters in games which contributes to what he refers to as "digital empathy." Owen's work in theatre continues with a recent role as the Virtual Scenographer for a collaboration with the University of Waterloo (Canada) on a production of Jennifer Haley's 'The Nether'.


When Worlds Collide: Interprofessional XR Design

Jonathan Furmedge

Talk overview

Exploring best practices for co-designing XR projects alongside experts outside of technology fields, something Jon has intimate knowledge of, having worked with a wide range of professions. This short talk focuses on co-design practices rather than design issues of XR in general, but it does shine a light on an issue plaguing the XR industry, one that often results in questionable design decisions.

Speaker bio

Jon Furmedge is a Technical Specialist at the University of Staffordshire, where he helps people from all areas understand, use and develop technology. His area of expertise is immersive tech, specifically using games technology such as game engines and VR outside of games. With a background as a games programmer, then lecturer in games programming, then game designer, Jon has a breadth of knowledge of how to design and develop effective software, and has led projects internally and externally, including with police forces and the NHS.


Comprehending 4D Environments: Creating an Environment to Explore Situational Awareness in Virtual Reality Head-Mounted Displays

Bradley Davis

Talk overview

Visualisation work in virtual reality (VR) head-mounted displays (HMDs) has yet to explore the human factors (HF) implications of real-time use of the technology during high-pressure scenarios. Existing work considers the use of VR HMDs in the context of high-pressure scenarios, but these often reflect a real-world task one-to-one, putting the user in the shoes of the operator and allowing for safe and repeatable practice of a given scenario. Conversely, VR HMDs are also used for the exploration of high-dimensional data due to the immersive stereoscopic visuals they provide. Herein, the possibility of combining these two elements is considered. A study to investigate this area is proposed and validated, with future work aimed at a wider study using this validated method.

Speaker bio

Bradley Davis is a Lecturer in Immersive Technologies in the Staffordshire Games Institute. Bradley's research considers possible use cases for virtual and augmented reality displays and how to enhance the usability of these devices.


Retracing the car park

Assoc Prof Dr Kenneth Feinstein & Dr Racelar Ho

Talk overview

This project is a collaborative work between Ken Feinstein (University of Leeds) and Racelar Ho (IVAS/York University, Canada). It looks at how we create relationships in the world through and with media. We start with amateur photography and explore how the act of photography is an action that creates and documents our engagement with otherness through spatial relationships. Within the act of photography, we create a relationship between the camera operator, the camera and the people/objects being photographed. Through this act we place ourselves into the Heideggerian idea of being-in-the-world. Our physicality to the world is established through the act of taking a photograph. At the same time, we move beyond this into a more ontological relationship to what we are photographing. Through the act of taking a photo of the other we are engaging in what Martin Buber calls the I-You relationship (Buber and Kaufmann 1970). The act of photography itself demands that we take on the other as itself and not thematise it (Toumayan 2004). Even though the resulting image may appear to do this, in reality it is inscribing the moment when this relationship is created. It freezes a spatial relationship of You to I that is created through the use of the camera. This is a relationship that goes beyond signification and puts us in direct encounter with the other that is before the lens. The images made carry no meaning in and of themselves; meaning is found only in the reading of the image at a future date. Within the action that is the moment of taking the picture, only the relationship created through the camera matters. The camera allows us to come face-to-face with the subject as the other. Creating the image means that we take on a responsibility for who or what we have captured.
As we read amateur images, we do not recognise these relationships because a) these images are so highly conventionalised that they simply reinforce what we expect to see in the image, and b) we tend to use such images as a vehicle for storytelling. In order to move beyond the highly conventionalised image, we chose to look at car-park images. In order to remember where one has parked in a large and complex car park, we take photographs on our smartphones as a mnemonic device. These images will contain some form of marker that we can refer to later when we need to find our car. What is interesting about these images is that they are specifically based on creating a spatial relationship between the camera operator, the marker (usually a pillar with markings) and the car itself, even if the car is not in the image. Since these images have not yet fallen into photographic compositional conventions, it is easier for us to explore their spatial aspect.

The project is currently ongoing work in Guangzhou City in southern China, as part of the exhibition Hybrid Chrono-Spatial Inscription. In this work we are asking people visiting the exhibition to upload their car-park images. These images can be from any place they have parked, be it a multi-storey car park or a field. The uploaded images will be used to create an imagined space in VR. This space will refer to real spaces, but the created world will not be used to recreate a specific place. Instead, it will create a new imagined, surreal space that refers back to the real world. Within this space the images supplied by the public will appear as one negotiates the VR space. As our relationship between the real and the imagined merges, the work is intended to emphasise the linking of the spatial to our relationship towards the other.

References

Buber, Martin, and Walter Arnold Kaufmann. 1970. I and Thou. New York: Scribner.

Toumayan, Alain. 2004. Encountering the Other: The Artwork and the Problem of Difference in Blanchot and Levinas. Pittsburgh, PA: Duquesne University Press.

Speaker bios

Kenneth Feinstein is an associate professor at the School of Design in the University of Leeds. He is a media artist, theorist and academic working on issues regarding our ethical relationship to Otherness and how technology plays a role in it. His work investigates how the physicality of a spatial design can help to enhance the relationship of the viewer to the work making it more intimate and personal. It investigates presence and the obligation found in being present with and bearing witness to the Other and the ethical call found in it.

Ken’s prize-winning work has been exhibited at major film festivals and museums. He has authored over 25 articles and chapters on media arts and media theory as well as the monograph, The Image That Doesn’t Want to be Seen.

Racelar Ho is an interdisciplinary scholar-artist and the founder of an international artist collective, Independent Various Art Space (IVAS). She holds a PhD in Computational Arts from York University (Toronto). Her research and artistic practice focus on post-technological experimental art as a creative approach used to examine how emerging media reshape human ontology and spatial relationships. She investigates meta-, post-, and transhumanism alongside post-constructionist spatial rhetorics. Her practice involves structuring mixed-reality worlds and incorporating Shanshui-dialogic world-making to place audiences in direct, physical encounters with post-technological otherness.


Colour vision balancing – it’s beyond a toggle

Billy Gray

Talk overview

Communication of virtual worlds to users naturally requires a consistent and predictable replication of the desired experience. Many elements which can adversely affect the literal replication can be mitigated through hardware requirements and suitable calibration. However, how can developers hope to balance the perceived experience when end users may have varying colour vision capabilities?

This is the focal point of the proposed talk: to discuss the ramifications and highlight the challenges in balancing the perceived virtual world. This will be explored through the lens of gameplay for players with differing colour vision capabilities, evaluating the possible impacts of historically identified advantages, such as improved scotopic vision, and the natural disadvantages of the reduced colour information received by those with specific colour vision deficiency (CVD).

It has been identified as early as 1940 that those with certain CVDs are able to identify camouflaged subjects with greater ease than those with full colour vision. In gameplay that involves identifying specific targets in camouflage-like situations, it is possible that those with CVD are in fact advantaged, not disadvantaged.

On the other hand, a great deal of design language and technique leverages the expected identification of colour-based highlighting for areas of interest; for those without full colour vision, such highlights may blend entirely into the environment.

There is no proposed standardised solution to all of these issues; instead, this talk endeavours to highlight that the current standardised approach of solely including a generalised colour filter does not account for design language or the full spectrum of effects that CVD has. Thus, the current approach may in fact be doing more harm to the societal understanding of CVD and its effects.

Speaker bio

Billy Gray is the Associate Lecturer in Games Technical Art at the University of Staffordshire, and is currently the only lecturer worldwide specifically employed in such a position.

They have received two games-specific degrees from the University of Hertfordshire and Staffordshire University. As a student, their work earned awards including DSSA’s Best Full CG Scene. In previous employment they worked with teams and clients on projects ranging from the CampaignUK award-winning Ford Explorer™ - Virtual Test Drive, to the TIGA award-winning MechHead.


From Data to Virtual Worlds: Immersive Simulation for Environmental and Heritage Research

Linards Bitte

Talk overview

This presentation will provide an overview of the Immersive Visualisation and Simulation Lab and its unique capabilities, including:

  • A 360-degree cylindrical LED screen environment
  • A 32.4‑channel audio system enabling spatialised sound
  • A custom workflow that operates directly through Unreal Engine

Building on this, the talk will explore recent research and development projects undertaken in the lab:

Flooding Visualisation for NERC & FDRI:

  • Visualisation of pluvial flooding in Bristol and fluvial flooding in Carlisle
  • Processes for transforming raw scientific flooding data into immersive, intuitive, and spatially accurate visual simulations
  • Reflections on communicating environmental risk through experiential media

Heritage Site Preservation & Digital Twin Creation:

  • The development of an immersive digital twin of St Benet’s Abbey
  • Approaches to capturing, reconstructing, and preserving vulnerable heritage sites using virtual world technologies

The talk aims to highlight how immersive visualisation can support research, public engagement, and creative industry innovation, aligning closely with the Symposium’s focus on emerging media, virtual worlds, and the transformation of future lifestyles.

Speaker bio

Linards Bitte is a Research Associate specialising in immersive visualisation, simulation, and real‑time 3D workflows. Working within the Institute for Creative Technologies (ICT) at Norwich University of the Arts, he contributes to interdisciplinary research projects spanning environmental communication, heritage preservation, and interactive media.

He is a graduate of Staffordshire University and continues to collaborate with academic and creative partners to explore the potential of virtual worlds as tools for understanding complex data and reshaping user experiences.


Assessing Workplace Fire Safety Knowledge Among UK Business Owners and Responsible Persons through Games-Based Learning

Robert Lambert

Talk overview

The Business Fire Safety Awareness Tool (BFSAT) is an interactive games-based learning resource developed to help small business premises owners and responsible persons understand their legal responsibilities for workplace fire safety. As runner-up for the Innovation Award at the 2022 Regulatory Excellence Awards hosted by the Office for Product Safety and Standards (OPSS), the tool was co-designed with games students and the National Fire Chiefs Council to create an engaging and accessible training experience.

This presentation will explore the development of BFSAT, the challenges of communicating fire safety responsibilities to small businesses through games design, and the results of the research after five years of data collection.

Speaker bio

Robert S. Lambert is a Senior Lecturer at the University of Staffordshire and Visiting Fellow at the University of Northampton, specialising in Game Art. He has over a decade of experience teaching, leading and developing games courses at several further and higher education institutions in the UK and abroad. He is the founder of the indie games company Buildersbrew LTD, and his passion lies with delivering positive social impact through games. Much of his game development and research resides within the games-based learning subject area.


How an art gallery can animate its spaces and collections through gaming

Steven Roper

Talk overview

A presentation on how the Whitworth ignites its collections with new energy by integrating gaming and popular culture elements that foster new engagement and broader appeal. This current research has developed new ways to design interactive experiences, utilising gaming platforms to innovate exhibit design, create resources and champion storytelling.

Speaker bio

For the last 10 years, Steven Roper has embedded digital practice and gaming at the heart of the Whitworth’s work, using creative technology to amplify youth voice. From digital light-painting workshops to the development of Whitworth Minecraft, gaming has become a powerful tool for interpretation, learning and play, enabling young people to build, explore and reimagine the gallery and its park through design and collaborative gameplay. This commitment to gaming and digital creativity also informs exhibition work, including youth-curated projects that challenge traditional interpretation and champion young people’s perspectives.

