If you met a driverless vehicle on the pavement, how would you react?
How would you want it to communicate with you?
How would you like it to behave?
As autonomous vehicle research takes to the roads across the world, the Transport Systems Catapult (TSC) is helping organisations in the UK to investigate how these vehicles might interact with people and operate in pedestrianised areas. One of three projects to have emerged successfully from the UK government’s ‘Introducing Driverless Cars’ competition, UK Autodrive will carry out on-road trials in Milton Keynes and Coventry, using cars provided by project partners Ford, Jaguar Land Rover and Tata Motors European Technical Centre. The programme will also trial a fleet of lightweight, self-driving ‘pods’ for use on pavements and other pedestrianised areas.
The outcome of these projects could revolutionise accessibility and mobility in pedestrianised areas. But whilst autonomous cars follow the rules of the road, behaviour and communication with pedestrians at road crossings and in pedestrianised areas are not well defined, and there are no established methods of communication.
The Transport Systems Catapult’s Simulation and Visualisation Team has been investigating how to help the industry develop and test new communication systems. One of the biggest challenges is provoking genuine reactions to hypothetical situations and testing new human-machine interfaces (HMIs) realistically without building physical prototypes, because building functional prototypes is an expensive, time-consuming and iterative process.
Using virtual reality hardware and the Unity game engine, we have built a system that allows engineers to create a range of scenarios within an accurate simulation of Milton Keynes. Using this unique system, our customers can apply their HMIs to any vehicle design dynamically, while adjusting and replaying scenarios as often as needed to test their effectiveness.
Recent project work has shown high levels of immersion and engagement with the virtual environment and autonomous vehicles, allowing participants to provide verbal feedback that is dynamic and contextual. This will help our clients to refine their HMIs for testing in the real world, and to discount HMIs that received negative feedback. In addition to subjective feedback collected during the trial, we recorded video and virtual camera views, which allowed us to analyse movements and physical reactions to the HMIs during a scenario. We also collected biometric data, including respiration rate, heart rate and electrodermal activity (skin conductance). The next step will be to integrate eye tracking so that we can link verbal and physiological feedback to the participant’s gaze within the virtual environment.
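Biometric streams like these typically arrive at different sample rates, so before analysis they need to be aligned on a common timeline. The snippet below is a minimal sketch of one way to do that alignment (a "hold the most recent sample" lookup), not a description of our production pipeline; the sample rates and values shown are hypothetical.

```python
from bisect import bisect_left

def align_to_timeline(timeline, timestamps, values):
    """For each time in `timeline`, pick the most recent sample at or before it."""
    aligned = []
    for t in timeline:
        i = bisect_left(timestamps, t)
        if i < len(timestamps) and timestamps[i] == t:
            aligned.append(values[i])        # exact match
        elif i == 0:
            aligned.append(values[0])        # before first sample: hold first value
        else:
            aligned.append(values[i - 1])    # hold last seen sample
    return aligned

# Hypothetical streams: heart rate sampled at 1 Hz, skin conductance at 4 Hz.
hr_t  = [0.0, 1.0, 2.0, 3.0]
hr_v  = [72, 74, 73, 75]
eda_t = [0.0, 0.25, 0.5, 0.75, 1.0, 1.25]
eda_v = [0.31, 0.32, 0.34, 0.33, 0.35, 0.36]

timeline = [0.0, 0.5, 1.0]                               # common 2 Hz clock
hr_on_clock  = align_to_timeline(timeline, hr_t, hr_v)   # [72, 72, 74]
eda_on_clock = align_to_timeline(timeline, eda_t, eda_v) # [0.31, 0.34, 0.35]
```

Once every feed shares one clock, it can be plotted or composited frame by frame alongside the video and VR viewpoints.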
We have created a virtual reality model of Midsummer Boulevard extending from our offices to the Milton Keynes train station plaza. This was built using a combination of traditional and experimental methods to ensure the virtual environment matched the real world as closely as possible. Ordnance Survey data, procedural modelling and measurements taken in the real world were used to ensure the environment was built to scale, accurate to within 0.5 m.
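A scale claim like this can be checked by comparing surveyed real-world points against their counterparts in the model. The sketch below illustrates such a check in Python; the coordinates and the function name are hypothetical illustrations, not part of our toolchain or any surveying library.

```python
import math

def max_positional_error(surveyed, modelled):
    """Largest horizontal distance (metres) between matched real/virtual points."""
    return max(math.dist(s, m) for s, m in zip(surveyed, modelled))

# Hypothetical matched check points (easting, northing in metres).
surveyed = [(0.0, 0.0), (120.4, 3.2), (250.1, 6.8)]
modelled = [(0.1, 0.2), (120.3, 3.5), (250.4, 6.6)]

err = max_positional_error(surveyed, modelled)
assert err < 0.5   # model meets the 0.5 m tolerance
```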
We use the HTC Vive as our head-mounted display (HMD) of choice for virtual reality user trials. To provide immersive omnidirectional sound, audio was delivered wirelessly using Logitech G933 7.1 headphones. Audio recordings taken in the real world were injected into the environment to create an immersive experience that replicates real-world Milton Keynes; for example, the clang of flag poles can be heard in the station plaza.
To improve immersion, we used the Omnideck6, an omnidirectional treadmill built by Omnifinity that enables participants to walk indefinitely in any direction within a large-scale virtual environment. This allows users to move through the environment naturally, focusing on the experience and scenarios, and it also mitigates the motion sickness that some people experience in VR. Having had over 350 people use our system, we believe this is the most effective and enjoyable way for people to experience VR.
A custom Unity application was built to composite all the data into one synchronised video. Data feeds included live biometric graphs, observational cameras, and first- and third-person VR viewpoints.
Participants were instructed to walk along a predetermined route through the VR Milton Keynes model. During the experiment, they encountered autonomous pods at predetermined points, where they observed the pods displaying HMI designs. Participants were free to decide whether to continue crossing the road, move out of the way, or ignore the pod and proceed as intended. This freedom of movement, enabled by the virtual environment, is a much closer representation of what would happen in the real world, allowing clients to see how their concepts could be experienced and reacted to much earlier in the development process.
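Predetermined encounter points of this kind behave like proximity triggers: once a participant comes within range of a pod's position, the HMI scenario starts. The engine-agnostic sketch below shows that logic in outline; the class name, positions and 10 m radius are assumptions for illustration, not taken from our Unity implementation.

```python
import math

class PodTrigger:
    """Fire a scenario once, when the participant first comes within `radius` metres."""
    def __init__(self, position, radius=10.0):
        self.position = position
        self.radius = radius
        self.fired = False

    def update(self, participant_pos):
        # Called every frame with the participant's tracked position.
        if not self.fired and math.dist(participant_pos, self.position) <= self.radius:
            self.fired = True
            return True    # start the pod's HMI scenario here
        return False

trigger = PodTrigger(position=(50.0, 0.0))
trigger.update((0.0, 0.0))    # too far away: nothing happens
trigger.update((45.0, 0.0))   # within 10 m: scenario starts
trigger.update((44.0, 0.0))   # already fired: nothing happens
```

In a game engine this would normally be a trigger volume rather than a per-frame distance check, but the behaviour is the same: each pod's HMI sequence begins exactly once, when the participant first approaches.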
The use of new immersive technologies and our user trials methodology has enabled the TSC to demonstrate how immersive technology and simulation software can deliver insight and value to clients working in the autonomous vehicle industry. Traditionally, HMI evaluations would comprise a static illustration of an HMI concept, whereas we are able to place that concept onto a dynamic, interactive vehicle in a contextual simulation. This approach will enable HMI engineers to evaluate concepts much faster, filter the good ideas from a range of options, and gain greater confidence earlier in the development process. Furthermore, HMI concepts and their requirements will be fed into the systems engineering process earlier, reducing the cost of late changes.
If you would like to learn more about VR and how this technology could be applied to your business, please get in touch via: firstname.lastname@example.org