The Visualisation team (part of M&V) have recently bought a Driving Simulator to complement our Virtual Reality pedestrian simulator (Omnideck6).
We are building a simulation and visualisation capability that will give the TSC the ability to recreate multiple modes of transport and let users experience them virtually. This will form part of our unique capabilities hosted at the TSC and will be made available to internal colleagues, projects and external clients.
This type of driving simulator is normally used for race car driver training, but we will be modifying it to work with Virtual Reality technologies to increase the level of immersion. The simulator comes with electromechanical actuators on each corner, allowing the simulator/virtual vehicle to rise and fall as it moves over bumps, kerbstones, road textures or inclines in the virtual world. The system is also equipped with 5.1 surround sound, allowing users to hear directional sound, which further enhances the user's level of immersion.
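To illustrate the principle behind the motion platform (this is a rough sketch, not the simulator's own control software), the example below shows how the road height under each corner of the virtual vehicle could be sampled and turned into a target extension for that corner's actuator. The terrain query, corner offsets and travel limit are all assumptions made for the illustration.

```python
# Illustrative sketch only, not the simulator's control software: sample the
# virtual road height under each corner of the vehicle and convert the
# difference into a clamped target extension for that corner's actuator.

CORNER_OFFSETS = {              # corner positions relative to the vehicle centre (metres)
    "front_left":  (+1.2, +0.8),
    "front_right": (+1.2, -0.8),
    "rear_left":   (-1.2, +0.8),
    "rear_right":  (-1.2, -0.8),
}

ACTUATOR_TRAVEL_M = 0.10        # assumed +/- travel of each electromechanical actuator


def terrain_height_at(x, y):
    """Placeholder: query the virtual world for the road surface height at (x, y)."""
    return 0.0


def corner_targets(vehicle_x, vehicle_y):
    """Return a clamped target extension (metres) for each corner actuator."""
    base = terrain_height_at(vehicle_x, vehicle_y)
    targets = {}
    for corner, (dx, dy) in CORNER_OFFSETS.items():
        # height of the road under this corner relative to the vehicle centre
        delta = terrain_height_at(vehicle_x + dx, vehicle_y + dy) - base
        targets[corner] = max(-ACTUATOR_TRAVEL_M, min(ACTUATOR_TRAVEL_M, delta))
    return targets
```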
M&V are building a collection of assets within our Visualisation Lab which will allow us to conduct Virtual Reality user trials. This will give us the ability to run multiple scenarios and variables with complete consistency and repeatability, at relatively low cost and with short production timescales. Traditional real-world trials can require extensive paperwork, risk assessments, insurance cover, administrators and the hire of an appropriate venue; none of this is needed for simulated user trials. This technology and approach will allow designers and engineers to narrow down the variables under consideration before physical trials, giving them greater confidence at the point of transitioning from the virtual to the real world and saving time and money.
We have also acquired SMI's DK2 eye tracking system. This eye tracking technology sits inside an Oculus Rift headset and tells us exactly what subjects participating in user trials are looking at in VR. We are combining eye tracking with physiological monitors that will allow us to collect biometric data during experiments/user trials.
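As a simplified, hypothetical illustration of how gaze and biometric data could be logged together during a trial, the sketch below records which virtual object a participant is looking at alongside a heart-rate sample. The eye tracker, scene ray-cast and sensor calls are placeholders, not the real SMI or physiological monitor APIs.

```python
import csv
import time

# Hypothetical stand-ins for the eye tracker, scene ray-cast and biometric
# sensor; these are not the real SMI DK2 or physiological monitor interfaces.

def get_gaze_ray():
    """Placeholder: return the participant's current gaze origin and direction."""
    return (0.0, 1.6, 0.0), (0.0, 0.0, 1.0)


def raycast_scene(origin, direction):
    """Placeholder: return the name of the first virtual object the gaze ray hits."""
    return "pedestrian_crossing_sign"


def read_heart_rate():
    """Placeholder: return the latest heart-rate sample (beats per minute)."""
    return 72


def log_trial(duration_s=5.0, sample_hz=10, path="trial_log.csv"):
    """Record gaze target and heart rate at a fixed rate for one trial."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "gaze_target", "heart_rate_bpm"])
        end = time.time() + duration_s
        while time.time() < end:
            origin, direction = get_gaze_ray()
            writer.writerow([time.time(), raycast_scene(origin, direction), read_heart_rate()])
            time.sleep(1.0 / sample_hz)
```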
Eye tracking also offers an opportunity beyond knowing where a user is looking within a virtual world. Using a technique called "Foveated Rendering", we can increase graphical performance significantly, which could save time and money during the construction of 3D environments. Foveated Rendering works by rendering the area around the user's focal point at the highest resolution, while areas outside the focal point are rendered at a lower resolution and therefore use less processing power. This technique means that we can build more detailed scenes and worry less about IT performance limitations.
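To make the idea concrete, here is a minimal sketch of the principle using the Pillow imaging library rather than a real-time renderer: a full-resolution patch around the gaze point is composited over a periphery produced at a quarter of the resolution. The test pattern, frame size and gaze point are illustrative assumptions.

```python
from PIL import Image, ImageDraw

# Minimal illustration of the foveated rendering idea. In a real renderer the
# periphery would be rendered natively at reduced resolution to save GPU work;
# here the downscale/upscale merely stands in for that low-resolution pass.

WIDTH, HEIGHT = 1280, 720
FOVEA_RADIUS = 150            # pixels of full detail kept around the gaze point


def render_test_frame():
    """Placeholder for the renderer: draw a simple striped test pattern."""
    frame = Image.new("RGB", (WIDTH, HEIGHT), "skyblue")
    draw = ImageDraw.Draw(frame)
    for x in range(0, WIDTH, 40):                 # vertical stripes act as "detail"
        draw.line([(x, 0), (x, HEIGHT)], fill="grey")
    return frame


def foveate(frame, gaze_x, gaze_y):
    """Composite a full-resolution foveal patch over a low-resolution periphery."""
    periphery = frame.resize((WIDTH // 4, HEIGHT // 4)).resize((WIDTH, HEIGHT))
    box = (gaze_x - FOVEA_RADIUS, gaze_y - FOVEA_RADIUS,
           gaze_x + FOVEA_RADIUS, gaze_y + FOVEA_RADIUS)
    periphery.paste(frame.crop(box), box)
    return periphery


if __name__ == "__main__":
    foveate(render_test_frame(), WIDTH // 2, HEIGHT // 2).save("foveated_frame.png")
```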
Using these new technologies, we hope to be able to undertake the following tasks/user trials:
Other applications are being considered on a daily basis. Please let us know your thoughts!
In addition to the road-based simulation, the same rig can be used to simulate driving a train or flying an aircraft. Using VR, we simply recreate the interior of the vehicle we're interested in and move the virtual vehicle through a virtual environment.
SMI Eye tracking: www.youtube.com/watch?v=cx-8Xp1fxgA
Foveated Rendering: www.roadtovr.com/hands-on-smi-proves-that-foveated-rendering-is-here-and-it-really-works/
For any further details, please contact the Visualisation team.