
How NVIDIA is changing the game with their new DRIVE Sim

NVIDIA DRIVE Sim

One of the toughest challenges in autonomous vehicle simulation is generating a world with enough detail and realism that an AI driver perceives it as real. To solve this problem, NVIDIA has developed computer vision and deep learning techniques that generate highly detailed 3D environments and produce visually realistic images end to end, without human intervention. These techniques make it possible not only to simulate different scenarios, but also to test how well existing driving policies handle them, using data sets gathered from earlier real-world drives to train and refine the models over time.

NVIDIA has created a new AI-based tool that helps developers build simulations directly from real-world data. CEO Jensen Huang previewed the breakthrough during his GTC keynote and said it would change the way the physical world is simulated and studied.

Neural Reconstruction Engine

The Neural Reconstruction Engine uses AI to turn recorded video data into simulation. The new pipeline automatically extracts the key components of a drive, including the environment, 3D assets and scenarios. The results have the realism of hand-crafted models, yet remain fully reactive, with parameters that can be adjusted and manipulated in real time.
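To make that pipeline concrete, here is a minimal Python sketch of how a video-to-simulation step might be organised. The class names and the reconstruct_from_drive function are illustrative placeholders, not NVIDIA's actual API; the point is only that a single recorded drive yields an environment, its assets, and an editable scenario.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical data containers; the engine's real outputs are not public here.
@dataclass
class Asset:
    name: str
    mesh_path: str

@dataclass
class Environment:
    name: str
    assets: List[Asset] = field(default_factory=list)

@dataclass
class Scenario:
    environment: Environment
    # actor id -> list of (time_s, pose) samples recovered from the recording
    actor_tracks: Dict[str, List[Tuple[float, Tuple[float, float, float]]]]

def reconstruct_from_drive(video_path: str, sensor_log_path: str) -> Scenario:
    """Illustrative stand-in for the video-to-simulation pipeline:
    reconstruct the static environment, extract dynamic assets, and
    capture each actor's trajectory as an editable scenario."""
    env = Environment(name="reconstructed_scene")
    env.assets.append(Asset(name="parked_car", mesh_path="assets/parked_car.usd"))
    tracks = {"vehicle_0": [(0.0, (0.0, 0.0, 0.0)), (1.0, (5.0, 0.0, 0.0))]}
    return Scenario(environment=env, actor_tracks=tracks)

if __name__ == "__main__":
    scenario = reconstruct_from_drive("drive.mp4", "sensors.json")
    print(scenario.environment.name, len(scenario.actor_tracks), "actors")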

Environments and Assets

The AI-powered pipeline behind the new driving experience in DRIVE Sim produces a world that is more immersive, natural and trustworthy than before. The system uses machine learning to convert 2D video from real-world drives into dynamic 3D digital twins, which can then be loaded into a simulation for any vehicle or used as standalone assets, so developers no longer need access to the physical locations and objects that were recorded.
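As a rough illustration of that reuse idea, the following sketch loads one reconstructed digital twin into simulated sessions with different ego vehicles. DigitalTwin and SimSession are invented names for this example and do not correspond to DRIVE Sim's real interfaces.

from dataclasses import dataclass

# Hypothetical classes; DRIVE Sim's actual asset and session APIs differ.
@dataclass
class DigitalTwin:
    usd_path: str  # reconstructed environment exported as a USD-style file

class SimSession:
    def __init__(self, twin: DigitalTwin, ego_vehicle: str):
        self.twin = twin
        self.ego_vehicle = ego_vehicle

    def run(self, seconds: float) -> None:
        print(f"Simulating {self.ego_vehicle} for {seconds}s in {self.twin.usd_path}")

# The same reconstructed twin can be reused with different ego vehicles,
# without access to the physical location the recording came from.
twin = DigitalTwin(usd_path="twins/highway_segment.usd")
for vehicle in ("sedan_platform", "delivery_van"):
    SimSession(twin, vehicle).run(seconds=30.0)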

Scenarios

Scenarios are the events that take place during a simulation, combining an environment with its assets. The Neural Reconstruction Engine assigns AI-based behaviors to the actors in the scene, so when the original drive is replayed they behave exactly as they did in real life. Because those behaviors are learned models rather than fixed playback, developers can change an actor's timing or location, or introduce new synthetic elements, and the rest of the scene reacts immediately and realistically. A small sketch of what those adjustable parameters might look like follows below.
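The Scenario, Actor, and shift_timing names in this Python sketch are hypothetical, not part of any published DRIVE Sim interface. The example simply takes a scenario recovered from a recorded drive, shifts one actor's timing, and adds a synthetic actor that was never in the original footage.

import copy
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical scenario representation for illustration only.
@dataclass
class Actor:
    behavior: str                          # e.g. a learned "follow-lane" policy
    waypoints: List[Tuple[float, float]]   # (time_s, distance_m along lane)

@dataclass
class Scenario:
    actors: Dict[str, Actor] = field(default_factory=dict)

def shift_timing(actor: Actor, delta_s: float) -> Actor:
    """Replay the same behavior, but earlier or later in the scenario."""
    shifted = copy.deepcopy(actor)
    shifted.waypoints = [(t + delta_s, x) for t, x in shifted.waypoints]
    return shifted

# Start from the reconstructed drive ...
base = Scenario(actors={
    "recorded_truck": Actor("follow-lane", [(0.0, 0.0), (2.0, 30.0)]),
})

# ... then create a variation: delay the truck by one second and add a
# purely synthetic pedestrian that was never in the recording.
variant = copy.deepcopy(base)
variant.actors["recorded_truck"] = shift_timing(variant.actors["recorded_truck"], 1.0)
variant.actors["synthetic_pedestrian"] = Actor("cross-street", [(1.5, 12.0)])

print(sorted(variant.actors))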

Integration Into DRIVE Sim

DRIVE Sim creates ideal conditions for training perception systems by re-creating real-world events. The engine's tools let developers adjust dynamic and static objects, the vehicle's path, and the location and orientation of its sensors. This narrows the gap between simulation data and reality, and produces pre-labeled synthetic examples that can teach AI systems to perceive the world the way humans do.
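The following sketch illustrates that workflow in spirit: sweep a camera mounting parameter and collect synthetic frames that already carry ground-truth labels, since the simulator knows the scene contents exactly. SensorRig, LabeledFrame, and render_labeled_frames are placeholder names, not the DRIVE Sim API.

from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical types; the real sensor and labeling interfaces are not shown here.
@dataclass
class SensorRig:
    camera_position: Tuple[float, float, float]   # metres, vehicle frame
    camera_yaw_deg: float

@dataclass
class LabeledFrame:
    image_path: str
    boxes: List[Tuple[str, Tuple[int, int, int, int]]]  # (class, xyxy pixels)

def render_labeled_frames(rig: SensorRig, n_frames: int) -> List[LabeledFrame]:
    """Stand-in for the simulator's renderer: every synthetic frame comes with
    ground-truth labels for free, because the scene is fully known."""
    return [
        LabeledFrame(
            image_path=f"out/frame_{i:04d}.png",
            boxes=[("vehicle", (100 + i, 200, 180 + i, 260))],
        )
        for i in range(n_frames)
    ]

# Sweep a slightly different camera mounting to probe the sim-to-real gap.
for yaw in (-5.0, 0.0, 5.0):
    rig = SensorRig(camera_position=(1.8, 0.0, 1.4), camera_yaw_deg=yaw)
    frames = render_labeled_frames(rig, n_frames=3)
    print(yaw, len(frames), "labeled frames")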
