New Tech – The Autonomous Driving Simulator

The Visualisation team (part of M&V) have recently bought a Driving Simulator to complement our Virtual Reality pedestrian simulator (Omnideck6).

We are building a simulation and visualisation capability that will give the TSC the ability to recreate multiple modes of transport and to let users experience them virtually. This will form part of our unique capabilities hosted at the TSC and will be made available to internal colleagues, projects and external clients.

This type of driving simulator is normally used for race car driver training, but we will be modifying the simulator to work with Virtual Reality technologies to increase the level of immersion. The simulator comes with electromechanical actuators on each corner, allowing the simulator/virtual vehicle to rise and fall as it moves over bumps, kerbstones, road textures or inclines in the virtual world. The system is also equipped with 5.1 surround sound, allowing users to hear directional audio, which further enhances the level of immersion.
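To make the idea concrete, here is a minimal sketch of how virtual terrain might be mapped to the four corner actuators. This is purely illustrative: all names, values and functions below are hypothetical, not the simulator's actual (proprietary) control software.

    import math

    # Hypothetical sketch: map the virtual terrain height under each wheel
    # to a clamped extension for the corresponding corner actuator.
    ACTUATOR_TRAVEL_M = 0.15  # assumed peak-to-peak actuator travel
    WHEELS = {
        "front_left":  (-0.8,  1.3),   # wheel offsets in the vehicle frame (m)
        "front_right": ( 0.8,  1.3),
        "rear_left":   (-0.8, -1.3),
        "rear_right":  ( 0.8, -1.3),
    }

    def terrain_height(x: float, y: float) -> float:
        """Stand-in for a height-map query against the virtual world."""
        return 0.02 * math.sin(3 * x) * math.cos(2 * y)  # toy bumpy surface

    def actuator_targets(vehicle_x: float, vehicle_y: float) -> dict:
        """Per-corner actuator extensions for the vehicle's current position."""
        heights = {w: terrain_height(vehicle_x + dx, vehicle_y + dy)
                   for w, (dx, dy) in WHEELS.items()}
        mean = sum(heights.values()) / len(heights)
        half = ACTUATOR_TRAVEL_M / 2
        # Each corner moves by its height relative to the chassis mean,
        # clamped to the actuator's physical travel.
        return {w: max(-half, min(half, h - mean)) for w, h in heights.items()}

    print(actuator_targets(0.0, 0.0))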

M&V are building a collection of assets within our Visualisation Lab which will allow us to conduct Virtual Reality user trials. This will give us the ability to run multiple scenarios and variables with complete consistency and repeatability, at relatively low cost and with short production timescales. Traditional real-world trials can require extensive paperwork, risk assessments, insurance cover, administrators and the hire of an appropriate venue; none of this is needed for simulated user trials. This technology and approach will allow designers and engineers to narrow down the variables under consideration before physical trials, giving them greater confidence at the point of transitioning from the virtual to the real world and saving time and money.

Our Oculus DK2 now includes eye tracking technology, implemented by SMI

We have also acquired SMI’s DK2 eye tracking system. This eye tracking technology sits inside an Oculus Rift and lets us see exactly what subjects participating in user trials are looking at in VR. We are combining eye tracking with physiological monitors that will allow us to collect biometric data during experiments/user trials. Eye tracking also offers an opportunity beyond knowing where a user is looking within a virtual world: using a technique called “Foveated Rendering”, we can increase graphical performance significantly. This could save time and money during the construction of 3D environments.

“Foveated Rendering” works by rendering the area around the user’s focal point at the highest resolution. Areas outside the focal point are rendered at a lower resolution, using less processing power. This technique means we can build more detailed scenes while worrying less about IT performance limitations.
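As a rough illustration of the principle, the toy sketch below keeps full resolution inside a circle around the gaze point and block-averages everything else. Real foveated rendering happens inside the GPU pipeline via the eye tracker's SDK; every name and parameter here is hypothetical.

    import numpy as np

    def foveated_downsample(frame, gaze_xy, fovea_radius=150, block=8):
        """Toy foveation: full resolution near the gaze point, block-averaged
        (i.e. cheaper, blurrier) pixels everywhere else."""
        h, w = frame.shape[:2]
        gx, gy = gaze_xy
        # Low-detail layer: average each block x block tile, then repeat the
        # averages so the layer lines up with the original pixel grid.
        hc, wc = h - h % block, w - w % block
        low = frame[:hc, :wc].reshape(h // block, block, w // block, block, -1)
        low = low.mean(axis=(1, 3)).astype(frame.dtype)
        out = frame.copy()
        out[:hc, :wc] = np.repeat(np.repeat(low, block, axis=0), block, axis=1)
        # Restore full detail inside a circle around the gaze point.
        ys, xs = np.ogrid[:h, :w]
        fovea = (xs - gx) ** 2 + (ys - gy) ** 2 <= fovea_radius ** 2
        out[fovea] = frame[fovea]
        return out

    # Example: a random 720p frame with the gaze slightly left of centre.
    frame = np.random.randint(0, 255, (720, 1280, 3), dtype=np.uint8)
    result = foveated_downsample(frame, gaze_xy=(500, 360))

In a real pipeline the saving comes from shading fewer pixels in the periphery in the first place, rather than downsampling a finished frame, but the effect on the viewer is the same: detail only where the eye is actually pointed.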

Using these new technologies, we hope to be able to undertake the following tasks/user trials:

  • Behavioural/physical responses to trigger events in a virtual environment (how do pod users react to an unexpected event during a journey, e.g. someone changing direction into the pod’s trajectory?)
  • Behavioural responses to user interface concepts (what should an autonomous vehicle display externally to pedestrians and internally to passengers?)
  • Measuring stress levels: what was the participant looking at or doing when their heart rate spiked? (looking at a departure board, walking the two miles to the platform/departure gate, etc.; a data-alignment sketch follows this list)
  • Feedback on autonomous vehicle dynamics (as a pedestrian or a passenger, what did the vehicle’s driving characteristics feel and look like?)
  • What override mechanisms does an autonomous vehicle need? (an emergency stop button, a voice-activated control system, a destination-change interface, and what payment methods and service offerings?)
  • How should the final autonomous vehicle look? (design it in VR and allow designers and potential end users to walk around it and sit in it virtually to give feedback)
  • What will control rooms look like in the future with VR headsets? (do we need large, high-cost, high-complexity rooms any more when you can display multiple screens all around you in VR?)
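For the stress-level trials above, the core data task is aligning the gaze log with the biometric log on a shared clock. Here is a minimal sketch, assuming both streams arrive as timestamped records; all field names, values and thresholds are illustrative, not our actual trial tooling.

    from bisect import bisect_right

    # Hypothetical, simplified logs: (time in seconds, value), sorted by time.
    gaze_log = [(0.00, "departure board"), (0.75, "departure board"),
                (1.50, "platform sign"), (2.25, "crowd ahead")]
    heart_rate_log = [(0.0, 72), (0.5, 74), (1.0, 71), (1.5, 95), (2.0, 97)]

    gaze_times = [t for t, _ in gaze_log]

    def gaze_at(t: float) -> str:
        """Most recent gaze target at or before time t."""
        i = bisect_right(gaze_times, t) - 1
        return gaze_log[max(i, 0)][1]

    SPIKE_BPM = 90  # illustrative threshold; real trials would use a per-participant baseline
    for t, bpm in heart_rate_log:
        if bpm >= SPIKE_BPM:
            print(f"t={t:.1f}s: {bpm} bpm while looking at '{gaze_at(t)}'")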

In addition to road-based simulation, the same rig can be used to simulate driving a train or flying an aircraft: using VR, we simply recreate the interior of the vehicle we’re interested in and move the virtual vehicle through a virtual environment.

Other applications are being considered on a daily basis. Please let us know your thoughts!

SMI Eye tracking: www.youtube.com/watch?v=cx-8Xp1fxgA

Foveated Rendering: www.roadtovr.com/hands-on-smi-proves-that-foveated-rendering-is-here-and-it-really-works/

For any further details, please contact the Visualisation team.
