Microsoft expands VR-based training environment for drones to include autonomous cars


Virtual reality is not just good for training humans. Thanks to the increasing fidelity of gaming environments, simulations are now good enough to train robots too.

We wrote in February about AirSim, a new platform from Microsoft Research designed to help drone developers easily build autonomous and robotic systems. The Aerial Informatics and Robotics platform provides realistic simulation and tools that let designers and developers seamlessly generate the large amounts of training data they need. It draws on advances in computation and graphics, incorporating physics and perception, to create accurate, real-world simulations.

Now Microsoft has expanded the tool to also help train autonomous, self-driving cars.

The new version of AirSim includes car simulations, new environments, APIs to ease programming and ready-to-run scripts to jump-start your research. Simulating a GTA-like environment reduces the need to build expensive hardware platforms, delivers large amounts of data and the ability to quickly test and benchmark results, and opens up the research to a wider range of developers and researchers with fewer resources.

AirSim comes with a detailed 3D urban environment that includes a diverse set of conditions, including traffic lights, parks, lakes and construction sites. Users can test their systems in several types of neighbourhoods, including downtown, semi-urban, vegetation and industrial environments. The simulation contains more than 12 kilometres of drivable roads spanning more than 20 city blocks.

AirSim has been developed as a plugin for Unreal Engine, a popular tool for game development. This means that the car simulation is decoupled from the environment it runs in. You can create an environment for your specific needs, such as a city or rural road, or choose from a variety of environments available online, and then simply drop in the AirSim plugin to test your self-driving algorithms in that environment. AirSim's extensibility also allows researchers and developers to incorporate new sensors or vehicles, or even use different physics engines.

AirSim provides APIs that can be used in a wide variety of languages, including C++ and Python. This makes it easy to use AirSim with various machine learning toolchains. For example, you can use Microsoft Cognitive Toolkit (CNTK) with AirSim to do deep reinforcement learning.
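For instance, the kind of observation data you would feed into a toolkit like CNTK can be pulled straight from the simulator. The sketch below is only illustrative: it assumes the airsim Python package and a running car simulation, and the exact client and method names can vary between AirSim releases. It grabs a front-camera image and the current vehicle state:

```python
# Minimal sketch: pulling training data out of AirSim from Python.
# Assumes the `airsim` package and a running car simulation; names may
# differ between AirSim releases.
import airsim

client = airsim.CarClient()
client.confirmConnection()

# Request a front-camera scene image plus the current vehicle state --
# the kind of observation you might feed into a learning pipeline.
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)
])
state = client.getCarState()
print("got %d image(s), speed=%.2f m/s" % (len(responses), state.speed))
```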

The latest version is available now on GitHub as an open-source, cross-platform offering. Microsoft has also made AirSim available as a compiled binary release, which means you can download it and start calling its Python APIs to control the vehicle in just minutes.
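As a rough illustration of what those first few minutes might look like, here is a minimal Python sketch that drives the simulated car. It assumes the airsim package that ships alongside the binary release; check the documentation for the exact API of the version you download.

```python
# Minimal sketch: driving the simulated car from Python.
# Assumes the `airsim` package; API names are hedged and may vary by release.
import airsim
import time

client = airsim.CarClient()
client.confirmConnection()
client.enableApiControl(True)   # hand control of the car to this script

controls = airsim.CarControls()
controls.throttle = 0.5         # gentle acceleration
controls.steering = 0.0         # drive straight
client.setCarControls(controls)

time.sleep(3)                   # let the car move for a few seconds

controls.throttle = 0.0
controls.brake = 1.0            # stop the car
client.setCarControls(controls)
client.enableApiControl(False)  # return control to the user
```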

The updated version of AirSim also includes many other features and enhancements, including additional tools for testing airborne vehicles. Microsoft made it easier for people to simulate flying drones by adding a built-in flight controller, called simple_flight, that simplifies the setup process. This allows rapid experimentation with control and state estimation algorithms without requiring expensive debugging and development in the embedded world.
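For drones, the same Python API approach applies. A minimal sketch, assuming the airsim package in multirotor mode with the built-in simple_flight controller (method names may differ between releases), might look like this:

```python
# Minimal sketch: commanding a simulated drone through the Python API.
# Assumes the `airsim` package in multirotor mode; names may vary by release.
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

client.takeoffAsync().join()                      # climb to a hover
client.moveToPositionAsync(10, 0, -5, 3).join()   # fly 10 m ahead at 3 m/s (NED: -5 means 5 m up)
client.landAsync().join()

client.armDisarm(False)
client.enableApiControl(False)
```

Because the controller runs inside the simulator, this kind of experiment needs no flight hardware at all, which is exactly the setup cost that simple_flight is meant to remove.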

In future releases, Microsoft hopes to add new sensors, better vehicle physics, weather modelling and even more detailed realistic environments.

Read more about the project on GitHub here.
