At the Transport Systems Catapult we have built our simulation and visualisation capability to allow us to recreate multiple modes of transportation in virtual reality. To this end, we recently acquired a driving simulator to aid the development of driverless cars and other forms of transportation.
The Driving Simulator is available to internal colleagues, projects, and external clients and forms part of our unique capabilities hosted at the TSC.
The simulator is usually used for race car driver training; however, we have modified it to work with Virtual Reality technologies to create an immersive world.
The simulator is equipped with electromechanical actuators, allowing it to move freely as the virtual vehicle navigates different road conditions in the virtual world. The simulator is also equipped with surround sound, so users hear directional audio that strengthens the immersive world.
We have built a collection of assets within our Visualisation Lab which enables us to conduct Virtual Reality user trials. This gives us the ability to run multiple scenarios and variables with complete consistency and repeatability, at relatively low cost and on short production timescales.
Reducing real world costs of trials and testing
Traditional real world trials can require lots of paperwork, risk assessments, insurance cover, administrators, and an appropriate venue; none of this is needed for simulated user trials.
This technology and approach allows designers and engineers to narrow the variables under consideration to a much smaller number before physical trials. This gives designers and engineers greater confidence at the point of transitioning from the virtual to the real world, saving time and money.
Using this new technology, we can currently undertake the following tasks/user trials:
In addition to road-based simulation, the same system can be used to simulate driving a train or flying an aircraft.
Using VR, we can recreate the interior of the vehicle we’re interested in and move the virtual vehicle through a virtual environment.
Next Steps
We have now acquired SMI's eye tracking system for the Oculus Rift DK2. This eye tracking technology sits inside an Oculus Rift headset and tells us exactly what subjects participating in user trials are looking at in VR.
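To illustrate the idea, the following is a minimal sketch in Python, not the SMI or Oculus APIs, of how a tracked gaze direction can be linked to objects in a virtual scene by casting a ray from the eye and testing it against simple bounding spheres. The scene, object names, and measurements are illustrative assumptions rather than part of our actual trial software.

```python
# Minimal sketch: link a gaze ray to the nearest object it intersects.
import math

def gaze_hit(eye_pos, gaze_dir, objects):
    """Return the name of the nearest object whose bounding sphere the
    gaze ray intersects, or None if nothing is being looked at."""
    # Normalise the gaze direction so distances along the ray are in metres.
    norm = math.sqrt(sum(c * c for c in gaze_dir))
    d = [c / norm for c in gaze_dir]

    closest, closest_t = None, float("inf")
    for name, (centre, radius) in objects.items():
        # Vector from the eye to the sphere centre.
        oc = [centre[i] - eye_pos[i] for i in range(3)]
        t = sum(oc[i] * d[i] for i in range(3))  # projection onto the ray
        if t < 0:
            continue  # object is behind the viewer
        # Squared perpendicular distance from the sphere centre to the ray.
        dist_sq = sum(oc[i] * oc[i] for i in range(3)) - t * t
        if dist_sq <= radius * radius and t < closest_t:
            closest, closest_t = name, t
    return closest

# Hypothetical cockpit scene: name -> (centre, bounding-sphere radius), in metres.
scene = {
    "speedometer": ((0.0, -0.2, 0.6), 0.08),
    "wing_mirror": ((-0.5, 0.1, 0.7), 0.10),
}
print(gaze_hit((0.0, 0.0, 0.0), (0.0, -0.3, 1.0), scene))  # -> "speedometer"
```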
We are combining eye tracking with physiological monitors, allowing us to collect biometric data during experiments and user trials; a sketch of how the two data streams might be aligned is given below.
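As a rough illustration (a sketch under our own assumptions, not our actual pipeline), the snippet below pairs eye tracking samples with biometric samples such as heart rate by nearest timestamp, so both streams can be analysed together. All field names, sample rates, and thresholds are hypothetical.

```python
# Minimal sketch: align gaze samples with biometric samples by nearest timestamp.
from bisect import bisect_left

def align_streams(gaze_samples, bio_samples, max_gap=0.05):
    """Pair each gaze sample with the closest-in-time biometric sample.

    gaze_samples: list of (timestamp_s, gaze_target)
    bio_samples:  list of (timestamp_s, heart_rate_bpm), sorted by timestamp
    max_gap:      discard pairs further apart than this many seconds
    """
    bio_times = [t for t, _ in bio_samples]
    paired = []
    for t, target in gaze_samples:
        i = bisect_left(bio_times, t)
        # Candidate biometric samples just before and after the gaze timestamp.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(bio_samples)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(bio_times[k] - t))
        if abs(bio_times[j] - t) <= max_gap:
            paired.append((t, target, bio_samples[j][1]))
    return paired

gaze = [(0.00, "road"), (0.02, "road"), (0.04, "wing_mirror")]
bio = [(0.00, 72), (0.03, 74)]
print(align_streams(gaze, bio))
# [(0.0, 'road', 72), (0.02, 'road', 74), (0.04, 'wing_mirror', 74)]
```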
Eye tracking also offers opportunities beyond simply knowing where a subject is looking within a virtual world. Using a technique called "Foveated Rendering" we can increase graphical performance significantly, which could save time and money during the construction of 3D environments. Foveated Rendering works by rendering the user's focal point at the highest resolution; areas outside the focal point are rendered at a lower resolution, therefore using less processing power.
This technique means that we can build more detailed scenes while worrying less about the performance limitations of our hardware.
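The sketch below illustrates the principle rather than a real renderer: pixels within a circle around the tracked gaze point are shaded individually, while peripheral pixels are shaded once per block and reused, cutting the number of expensive shading calls. The frame size, foveal radius, block size, and shading function are all illustrative assumptions.

```python
# Minimal sketch of foveated rendering: full resolution near the gaze point,
# coarse block shading in the periphery.

def render_foveated(width, height, gaze, fovea_radius=100, block=4):
    """Shade per-pixel inside the foveal circle and once per block outside it."""
    def shade(x, y):
        # Stand-in for an expensive per-pixel shading calculation.
        return (x * 31 + y * 17) % 256

    image = [[0] * width for _ in range(height)]
    shade_calls = 0
    gx, gy = gaze
    for by in range(0, height, block):
        for bx in range(0, width, block):
            # Is the centre of this block inside the foveal circle?
            cx, cy = bx + block // 2, by + block // 2
            in_fovea = (cx - gx) ** 2 + (cy - gy) ** 2 <= fovea_radius ** 2
            if in_fovea:
                # Full resolution: one shading call per pixel.
                for y in range(by, min(by + block, height)):
                    for x in range(bx, min(bx + block, width)):
                        image[y][x] = shade(x, y)
                        shade_calls += 1
            else:
                # Periphery: one shading call reused for the whole block.
                value = shade(cx, cy)
                shade_calls += 1
                for y in range(by, min(by + block, height)):
                    for x in range(bx, min(bx + block, width)):
                        image[y][x] = value
    return image, shade_calls

# Hypothetical 640x480 frame with the gaze near the centre: far fewer shading
# calls than shading every pixel at full resolution.
_, calls = render_foveated(640, 480, gaze=(320, 240))
print(f"shading calls: {calls} of {640 * 480} full-resolution pixels")
```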