GET ACCESS TO MCITY 2.0 CAPABILITIES

Progress in developing remote access capabilities for the Mcity Test Facility has been measured by milestones marking the completion of several sample use cases. Each use case highlights a different set of goals and features.

How can Mcity 2.0 capabilities be used?

Sample use cases described below include:

  • Remote Testing of AV Motion Planning Algorithms
  • Multi-Agent Distributed Remote AV Testing
  • Teleoperation of AVs
  • Remote Testing of AV Perception Systems
  • Remote Testing of Joint Control of AVs and Traffic Signals


Remote Testing of AV Motion Planning Algorithms

Use case 1 was the original proof of concept for Mcity 2.0. For this first use case, the Connected Automated and Resilient Transportation (CART) Lab, led by Dr. Yiheng Feng (Assistant Professor of Civil Engineering at Purdue University), served as the guest remote research team. Our goals were to establish that this kind of remote collaboration is possible, and that it benefits all participants. The CART Lab has no test facility of its own, no advanced infrastructure, and cannot run tests with background/challenge vehicles (because of the safety concerns of testing on public roads). The Mcity 2.0 platform successfully overcame all of these limitations.
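
At a high level, this kind of remote test is a closed loop over the network: the guest lab's planner receives the test vehicle's state from the facility and sends back a short trajectory to execute. The sketch below illustrates only that loop; the endpoint, message fields, and toy planner are hypothetical assumptions for illustration, not Mcity's or the CART Lab's actual interfaces.

```python
import time
import requests  # polling over HTTP is an assumption made for this sketch

# Hypothetical bridge endpoint exposing the test vehicle's state; not Mcity's real API.
BRIDGE = "https://mcity-bridge.example.edu"

def plan(state):
    """Toy stand-in for a motion planner: slow down near a challenge vehicle, else cruise."""
    speed = 5.0 if state["nearest_challenge_vehicle_m"] < 20.0 else 10.0
    # Emit a few waypoints straight ahead at the chosen speed (0.5 s apart).
    return [{"t": i * 0.5, "x": state["x"] + i * 0.5 * speed, "y": state["y"], "v": speed}
            for i in range(1, 6)]

while True:
    state = requests.get(f"{BRIDGE}/vehicle/state", timeout=1.0).json()   # pose + nearby vehicles
    requests.post(f"{BRIDGE}/vehicle/trajectory", json=plan(state), timeout=1.0)
    time.sleep(0.1)  # ~10 Hz planning loop running at the remote lab
```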

More on Remote Testing of AV Motion Planning Algorithms

Screengrab of Mcity's remote testing platform

Multi-Agent Distributed Remote AV Testing

For use case 2, Mcity integrated its advanced simulation and mixed reality system with the VOICES platform. VOICES, an initiative led by the U.S. Department of Transportation, is a distributed virtual platform designed to facilitate collaboration among diverse stakeholders, including state and local governments, the private sector, and academic institutions. This integration aligns with the National Science Foundation's objective of fostering collaboration with national laboratories.

More on Multi-Agent Distributed Remote AV Testing


Map of the Remote AV Testing Teams in the United States

Teleoperation of AVs

Mcity 2.0 capabilities offer a safe way to experiment with potentially risky scenarios, such as employing a remote teleoperator to help an AV get unstuck. This demonstration uses a state-of-the-art driving simulator with a sit-in cockpit and projection screen, supplied by VI-grade. It allows the user to safely navigate a real AV around a virtual obstacle in the Mcity Test Facility.
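
In data-flow terms, the demonstration relays cockpit inputs from the simulator to the real vehicle's drive-by-wire interface while the virtual obstacle exists only in the operator's rendered scene. The sketch below is a minimal illustration of such a relay under assumed, hypothetical endpoints and message formats; it is not the VI-grade or Mcity interface.

```python
import json
import socket
import time

# Hypothetical endpoints, for illustration only.
SIM_ADDR = ("simulator.local", 9000)         # cockpit inputs from the driving simulator
AV_ADDR = ("av-gateway.mcity.local", 9001)   # drive-by-wire gateway on the real AV

with socket.create_connection(SIM_ADDR) as sim, socket.create_connection(AV_ADDR) as av:
    sim_lines = sim.makefile("r")            # simulator sends newline-delimited JSON samples
    while True:
        cmd = json.loads(sim_lines.readline())       # e.g. {"steer": -0.1, "throttle": 0.3, "brake": 0.0}
        cmd["throttle"] = min(cmd["throttle"], 0.4)   # clamp commands before they reach the real vehicle
        av.sendall((json.dumps(cmd) + "\n").encode())
        time.sleep(0.02)                     # ~50 Hz command relay
```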


Photo of the VI-grade simulator

Image courtesy of VI-grade.

Remote Testing of AV Perception Systems

This use of Mcity 2.0 capabilities illustrates the benefits of having a digital twin of the Mcity Test Facility and of the simulations that research teams can run with it. The demonstration, presented by Dr. Gaurav Pandey, associate professor of Engineering Technology & Industrial Distribution at Texas A&M University, uses a full simulation of the Mcity Test Facility environment, including virtual background vehicles.

The simulation renders a synthetic front-facing camera feed from a real autonomous vehicle at the Mcity Test Facility and streams it to Texas, where Dr. Pandey's team performs real-time depth perception on the monocular feed. Mcity's TeraSim, a traffic simulator trained on Ann Arbor driving behaviors, controls the vehicle while the synthetic feed is generated. Camera-based depth perception with neural networks is currently very challenging, which makes it an ideal candidate for improvement through digital twin simulations. Once the algorithm is refined, testing can be conducted in a mixed reality environment, using both real and synthetic data feeds.
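
As a rough illustration of the receiving end of that pipeline, the sketch below pulls frames from a video stream and runs an off-the-shelf monocular depth network (MiDaS, loaded from PyTorch Hub). The stream URL is hypothetical, and MiDaS is only a generic stand-in; it is not necessarily the network Dr. Pandey's team uses.

```python
import cv2
import torch

# Hypothetical stream of the synthetic front-facing camera feed; the real transport may differ.
STREAM_URL = "rtsp://mcity.example.edu/synthetic-front-camera"

# MiDaS small model from PyTorch Hub as a generic monocular depth estimator (stand-in only).
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small").eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame_bgr = cap.read()
    if not ok:
        break
    frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        depth = midas(transform(frame_rgb))           # inverse-depth prediction at reduced resolution
        depth = torch.nn.functional.interpolate(      # resize back to the original frame size
            depth.unsqueeze(1), size=frame_rgb.shape[:2], mode="bicubic", align_corners=False
        ).squeeze()
    # depth can now be compared against the simulator's ground-truth depth for evaluation.
```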

Animated image of an AV driving through a street

Remote Testing of Joint Control of AVs and Traffic Signals

Dr. Jeff Ban, professor of Civil and Environmental Engineering at the University of Washington, and his team in the intelligent Urban Transportation Systems (iUTS) Lab have been working on an algorithm that controls traffic signals more intelligently based on data from connected vehicles. This scenario is difficult, if not dangerous, to test in the real world, and the team needed a safe way to experiment with it. Collaborating with Mcity and using the Mcity 2.0 platform, the iUTS Lab tested scenarios at 25%, 50%, 75%, and 100% connected and automated vehicle (CAV) penetration, both in pure simulation and in a mixed reality environment that combined a physical vehicle with simulated background vehicles.
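
Structurally, such an experiment is a sweep over penetration rates in which each background vehicle is randomly designated as connected, and the signal controller only sees data from the connected ones. The toy sketch below shows that structure only; the controller, telemetry, and traffic step are invented placeholders, not the iUTS Lab's algorithm or Mcity's simulators.

```python
import random

random.seed(0)  # reproducible designation of connected vehicles across runs

# --- Toy placeholders (the real study uses the iUTS controller and Mcity's simulation stack) ---
def observed_states(connected_ids):
    """Stand-in telemetry: the controller only receives data from connected vehicles."""
    return [{"id": vid, "speed": random.uniform(0.0, 15.0)} for vid in connected_ids]

def signal_plan(reports):
    """Stand-in controller: extend the green phase when many reporting vehicles look queued."""
    queued = sum(1 for r in reports if r["speed"] < 2.0)
    return {"green_extension_s": min(10, queued)}

def traffic_step(plan, n_vehicles):
    """Stand-in traffic model: pretend per-step delay shrinks as the green extension grows."""
    return n_vehicles * (1.0 - 0.05 * plan["green_extension_s"])

# --- Sweep the CAV penetration rates used in the study ---
N_VEHICLES = 40
for penetration in [0.25, 0.50, 0.75, 1.00]:
    connected = [vid for vid in range(N_VEHICLES) if random.random() < penetration]
    delay = sum(traffic_step(signal_plan(observed_states(connected)), N_VEHICLES)
                for _ in range(600))
    print(f"CAV penetration {penetration:.0%}: total delay proxy {delay:,.0f}")
```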

More on Remote Testing of Joint Control of AVs and Traffic Signals

Animated picture of a connected intersection