GET REMOTE ACCESS


Remote Access Funded by NSF

With a $5.1 million Mcity 2.0 grant from the U.S. National Science Foundation, Mcity has enhanced the Test Facility by developing digital infrastructure that overlays the physical test facility and creates a cloud-based, augmented-reality CAV testbed available to academic and industry researchers nationwide. This gives researchers, many of whom lack their own testing resources, remote access to the facility and helps create a more equitable playing field in mobility research. Remote access is now operational and can be used with the physical test facility and Mcity research vehicles.

Apply for free remote access – the proposal deadline is March 14, 2025.

Enhancing Autonomous Vehicle Testing

The Mcity remote access platform is designed to help users remotely test and refine autonomous vehicle (AV) motion planning algorithms. It allows users to conduct tests in both simulated and mixed reality environments without needing a complete AV system or physical testing facility. The Mcity platform integrates digital infrastructure with physical facilities to provide real-time visualization of AV status, safety metrics, and testing data, which are archived for ongoing analysis and improvement.

Goals and Objectives

  1. Create realistic traffic environments for AV testing.
  2. Enable remote access and control of AV testing facilities.
  3. Offer comprehensive evaluation metrics and sensor data to enhance AV motion planning algorithms.
Photo of the VI-grade simulator

Key Features

  • TeraSim on the Cloud: A fast, realistic traffic simulation generating critical events like collisions. It uses models calibrated with real-world data to simulate both normal and safety-critical driving scenarios.
  • AWS Cloud Hosting: TeraSim is hosted on AWS, allowing global remote access for seamless integration with user algorithms via the TeraSim API.
  • Evaluation Metrics: Includes minimum distance, time-to-collision, and speed difference to assess AV safety and performance (a rough sketch of these metrics appears below).
  • Remote Control Communication: Uses Redis for data exchange and a low-latency communication pipeline to manage real-time data and control commands.
With Support From Amazon Web Services
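
As a rough illustration of how these evaluation metrics can be computed from synchronized trajectory data, the Python sketch below defines minimum distance, time-to-collision, and speed difference. The function names and exact formulations are illustrative assumptions, not the platform's actual implementation.

```python
import math

def min_distance(ego_traj, other_traj):
    """Smallest gap between two time-synchronized (x, y) trajectories, in meters."""
    return min(math.hypot(ex - ox, ey - oy)
               for (ex, ey), (ox, oy) in zip(ego_traj, other_traj))

def time_to_collision(gap_m, follower_speed_mps, leader_speed_mps):
    """TTC for a following vehicle closing on a leader; infinite if not closing."""
    closing_speed = follower_speed_mps - leader_speed_mps
    return gap_m / closing_speed if closing_speed > 0 else float("inf")

def speed_difference(ego_speed_mps, other_speed_mps):
    """Absolute speed difference between the ego vehicle and another vehicle."""
    return abs(ego_speed_mps - other_speed_mps)
```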

Testing Capabilities

  • Simulation-Based Testing: AVs and background traffic are visualized in real time, with performance metrics plotted. Data is archived for post-analysis.
  • Mixed-Reality Testing: Combines digital simulations with physical AVs. Remote testers can view real-time AV status and control inputs via Mcity’s HD map.

Testing Procedure

  • Simulation Testing: Remote visualization and data recording of AV and virtual traffic.
  • Mixed Reality Testing: Real-time visualization of the physical AV’s control inputs and performance on Mcity’s map, with remote access to onboard and roadside views.

Outcomes

  • Simulation Testing: Identified potential safety issues and improved longitudinal control to reduce conflicts.
  • Mixed Reality Testing: Enhanced passenger comfort by smoothing deceleration and reducing harsh braking.

By leveraging Mcity’s advanced simulation and mixed-reality capabilities through remote access, users can effectively test, validate, and enhance AV motion planning algorithms, driving progress in autonomous vehicle safety and performance.

How to get remote access to Mcity?

  1. Review use case examples below
  2. Experiment using Mcity open-source tools
  3. Contact us to discuss your project
  4. Or apply to the Request for Proposals for free remote access

USE CASE 1: Remote Testing of AV Motion Planning Algorithms

This was the original proof of concept for Mcity’s Test Facility remote access capabilities. For this first use case, the Connected Automated and Resilient Transportation (CART) Lab, led by Dr. Yiheng Feng, Assistant Professor of Civil Engineering at Purdue University, served as the guest remote research team. Our goals were to establish that this kind of remote collaboration is possible and that it benefits all participants. The CART Lab has no access to a test facility or advanced infrastructure, and it cannot run tests with background/challenge vehicles because of the safety risks of testing on public roads. The Mcity remote access platform successfully overcame these limitations.

Goals and Objectives: First, determine if the current technology architecture at Mcity was adequate to support mixed-reality remote collaboration. Second, measure how valuable such a remote collaboration could be.

Both goals were met. While the measured latencies were on the edge of acceptable, they were still good enough to provide a valuable experience. The remote team at Purdue was able to visualize all aspects of the test in real time.

Integration and Collaboration: Working with the Mcity remote access platform gave the CART Lab access to a physical vehicle in a mixed-reality setting. Motion planning ran at Purdue University, some 250 miles from the Mcity Test Facility. The planning results were sent to the real vehicle on the Mcity track, where they were translated into drive-by-wire actuation. The real vehicle’s position was then shared back with Purdue and with TeraSim, which generated background vehicles to interact with Purdue’s vehicle under test.
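
The loop described above is essentially a message exchange: planned trajectories flow from Purdue to the vehicle on the track, and the vehicle’s position flows back to Purdue and TeraSim. Since the platform uses Redis for data exchange, a minimal sketch of what such an exchange could look like is shown below; the host, channel names, and message format are hypothetical.

```python
import json
import redis

# Hypothetical endpoint and channel names, for illustration only.
r = redis.Redis(host="remote-access.example.org", port=6379)
PLAN_CHANNEL = "av/planned_trajectory"   # remote planner -> test vehicle
STATE_CHANNEL = "av/vehicle_state"       # test vehicle -> remote planner / TeraSim

def publish_plan(waypoints):
    """Send a planned trajectory (list of [x, y, speed] points) to the track."""
    r.publish(PLAN_CHANNEL, json.dumps({"waypoints": waypoints}))

def stream_vehicle_state():
    """Yield the real vehicle's state messages as they are shared back."""
    pubsub = r.pubsub()
    pubsub.subscribe(STATE_CHANNEL)
    for message in pubsub.listen():
        if message["type"] == "message":
            yield json.loads(message["data"])  # e.g. {"x": ..., "y": ..., "speed": ...}
```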

Diagram of the TeraSim API architecture.

Features and Capabilities: The data generated by both the real vehicle (position, dash cam feed, and video feeds from the infrastructure) and the simulation (the positions and trajectories of all of the background vehicles) was visualized in real time by remote participants and in-person attendees alike.

Using the data collected, Purdue was able to refine its algorithm, improving time-to-collision (TTC) in almost all scenarios. This was made possible by the feedback gained from including TeraSim’s background vehicles in the test. The refinements also improved passenger comfort, as deceleration became smoother. The proof of concept demonstrated just how useful this kind of remote collaboration can be.

USE CASE 2: Multi-Agent Distributed Remote AV Testing

This example use case aimed to integrate Mcity’s advanced simulation and mixed-reality system with the VOICES platform. VOICES, an initiative led by the U.S. Department of Transportation, is a distributed virtual platform designed to facilitate collaborative efforts among diverse stakeholders including state and local governments, the private sector, and academic institutions. This integration aligns with the National Science Foundation’s objectives of fostering collaboration with national laboratories.

Goals and Objectives: To analyze the econometrics of connected/autonomous vehicles within a collaborative environment and compare these metrics against solo-drive baselines. Participants managed their vehicles within individual simulations, with key econometric data collected for subsequent analysis by Argonne National Laboratory.

Integration and Collaboration: At the heart of this use case is the seamless integration of Mcity’s simulation capabilities and its remote-access technology into the VOICES platform. This demonstration showcased Mcity’s adaptability and interoperability. Key participants in this event included Argonne National Laboratory, Oak Ridge National Laboratory, Econolite, the University of California Los Angeles, and the Federal Highway Administration.

Multi-Agent Distributed Remote AV Testing graphic

Features and Capabilities: This use case emphasized the Mcity remote access platform’s flexibility. Instead of hosting the simulation entirely in the cloud-based Mcity OS, Mcity shared the simulation map and world simulation with participants. Each participant set up its own simulation and linked it with every other participant’s simulation, a technique known as distributed simulation. Mcity’s participation showed that the platform can support a wide range of deployment and collaboration models.

Test plan diagram.

Simulation and Real-World Testing: Mcity contributed a high-definition map and a dynamic CARLA environment of its test facility. This formed the basis for each participant’s hosted simulation. Real-time vehicle data was shared among all participants. Unique to this project, both Argonne National Laboratory and Mcity linked their simulations to real vehicles. In Argonne’s case, an electric vehicle was mounted on a dynamometer, while Mcity employed a real-world vehicle at its Ann Arbor, Michigan test track. Data from both simulated and real-world environments was gathered for comprehensive analysis.
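
For a participant hosting its own copy of the environment, the CARLA Python API supplies the basic building blocks: connect to a CARLA server, spawn a vehicle, and read its pose so it can be shared with the other simulations. The sketch below uses a default map and blueprint as stand-ins; loading Mcity’s HD map and environment depends on the assets Mcity distributes, which are not reproduced here.

```python
import carla

# Connect to a locally running CARLA server (host and port are assumptions).
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Spawn a vehicle at one of the map's predefined spawn points.
blueprint = world.get_blueprint_library().filter("vehicle.*")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(blueprint, spawn_point)

# Read the vehicle's pose, e.g. to broadcast it to the other participants.
transform = vehicle.get_transform()
print(transform.location.x, transform.location.y, transform.rotation.yaw)

vehicle.destroy()
```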

Additional Contributions and Future Applications: Mcity also introduced background vehicles into the distributed simulation. These vehicles were controlled by the TeraSim system, which is trained on extensive traffic data from Ann Arbor intersections. For this use case, the background vehicles were configured to drive very safely and avoid interfering with the test vehicles, so that the econometric data being gathered remained unaltered. This unobtrusive implementation served as a proof of concept of TeraSim’s interoperability with distributed simulation systems like VOICES. Future test runs could configure the background vehicles to be more adversarial toward the vehicles under test.

Capabilities Delivered to the VOICES project:

  • Ability to run physical test vehicles on a physical test track that is integrated into the VOICES distributed simulation environment.
  • Ability to create virtual background traffic in TeraSim with the potential to create challenging test scenarios for the virtual and physical vehicles that are being tested.
  • Provision of a high-definition map and a dynamic CARLA environment representing the Mcity Test Facility.

USE CASE 3: Teleoperation of AVs

Mcity remote-access capabilities offer a safe way to experiment with potentially risky scenarios, such as employing a remote teleoperator to help an AV get unstuck. This demonstration uses a state-of-the-art driving simulator with a sit-in cockpit and projection screen, supplied by VI-grade. It allows the user to attempt to safely navigate a real AV around a virtual obstacle in the Mcity Test Facility.

VI-grade driving simulator. Image courtesy of VI-grade.

USE CASE 4: Remote Testing of AV Perception Systems

This use of Mcity remote-access capabilities illustrates the benefits of having a digital twin of the Mcity Test Facility and the simulations research teams can run with it. The demonstration, presented by Dr. Gaurav Pandey, associate professor of Engineering Technology & Industrial Distribution at Texas A&M University, uses a full simulation of the Mcity Test Facility environment, including virtual background vehicles.

The simulation renders a synthetic front-facing camera feed from a real autonomous vehicle inside the Mcity Test Facility and streams it to Texas, where Dr. Pandey’s team carries out real-time depth perception on the mono-camera feed. Mcity’s TeraSim, a traffic simulator trained on Ann Arbor driving behaviors, controls the vehicle as the synthetic feed is generated. Camera-based depth perception using neural networks is currently very challenging, making it an ideal candidate for improvement through digital twin simulations. Once the algorithm is refined, testing can be conducted in a mixed-reality environment, using both real and synthetic data feeds.
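
As a loose illustration of the kind of mono-camera depth estimation involved, the sketch below runs a publicly available pretrained model (MiDaS, loaded via torch.hub) on a single frame. MiDaS is used here purely as a stand-in; it is not the model or pipeline used by Dr. Pandey’s team.

```python
import numpy as np
import torch
from PIL import Image

# Load a small pretrained monocular depth model and its input transform.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
transforms = torch.hub.load("intel-isl/MiDaS", "transforms")

# "frame.png" stands in for one synthetic front-facing camera frame.
img = np.array(Image.open("frame.png").convert("RGB"))
input_batch = transforms.small_transform(img)

with torch.no_grad():
    prediction = midas(input_batch)
    # Resize the prediction back to the original image resolution.
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1), size=img.shape[:2],
        mode="bicubic", align_corners=False,
    ).squeeze().cpu().numpy()

print(depth.shape)  # per-pixel relative depth map
```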

USE CASE 5: Remote Testing of Joint Control of AVs and Traffic Signals

Dr. Jeff Ban, professor of Civil and Environmental Engineering at the University of Washington, and his team in the intelligent Urban Transportation Systems (iUTS) Lab have been working on an algorithm to control traffic signals more intelligently based on data from connected vehicles. Such a scenario is difficult, if not dangerous, to test in the real world, and the team needed a safe way to experiment with it. Collaborating with Mcity and using the Mcity remote access platform, the iUTS Lab tested scenarios of 25%, 50%, 75%, and 100% connected and automated vehicle (CAV) penetration, both in pure simulation and in a mixed-reality environment that combined a physical vehicle with simulated background vehicles.

Goals and Objectives: The iUTS Lab wanted to test the performance of its signal control framework, Multiscale Signal-Vehicle Coupled Control (SVCC), in real-world and mixed traffic environments. The lab also wanted to explore how the framework might be extended to a large urban environment and to different penetration levels.

The iUTS Lab faces constraints similar to those encountered by the Purdue team featured in Use Case 1: the University of Washington has no test facility, no access to advanced infrastructure, and no safe way to test with background vehicles on public roads.

These constraints were easily overcome, as Mcity had already vetted the needed capabilities in Use Case 1. The iUTS Lab was able to test all of its planned scenarios and hone its algorithm, demonstrating that it performs well at almost any penetration level. The performance metrics evaluated were fuel consumption, waiting time, time loss, queue length, and the number of vehicles passing through the intersection.
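
A sketch of how such aggregate metrics could be pulled from simulation logs is shown below. It assumes SUMO-style tripinfo output; whether the platform exposes this exact format is an assumption, and fuel consumption and queue length would come from separate outputs not shown here.

```python
import xml.etree.ElementTree as ET

def summarize_tripinfo(path):
    """Aggregate per-vehicle statistics from a SUMO-style tripinfo.xml file."""
    trips = ET.parse(path).getroot().findall("tripinfo")
    if not trips:
        return {"vehicles": 0}
    n = len(trips)
    total_waiting = sum(float(t.get("waitingTime")) for t in trips)
    total_time_loss = sum(float(t.get("timeLoss")) for t in trips)
    return {
        "vehicles": n,                          # throughput proxy
        "mean_waiting_time_s": total_waiting / n,
        "mean_time_loss_s": total_time_loss / n,
    }

print(summarize_tripinfo("tripinfo.xml"))
```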

Data flow architecture.

Integration and Collaboration: The team from the University of Washington was able to participate in real time by viewing the various feeds made accessible by the Mcity remote access platform. They were also able to control how many background vehicles were “connected,” and therefore visible to their algorithm, thus simulating different levels of penetration.
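
Varying the penetration level can be as simple as flagging a chosen fraction of background vehicles as “connected” and exposing only those to the control algorithm. The short sketch below shows the idea; the function and identifiers are hypothetical.

```python
import random

def assign_connectivity(vehicle_ids, penetration_rate, seed=0):
    """Mark a fraction of background vehicles as connected (visible to the
    signal-control algorithm); the rest remain invisible to it."""
    rng = random.Random(seed)
    k = round(len(vehicle_ids) * penetration_rate)
    connected = set(rng.sample(vehicle_ids, k))
    return {vid: (vid in connected) for vid in vehicle_ids}

# Example: 50% CAV penetration across ten background vehicles.
visibility = assign_connectivity([f"veh_{i}" for i in range(10)], 0.50)
```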

Clockwise from left: vehicles, signal status, and ego vehicle’s performance metrics; chase camera view; onboard camera view.

Features and Capabilities: A prototype was created to allow a researcher to immediately change the states of traffic signals via an API. The prototype is a Raspberry Pi connected to the Mcity OS network and wired directly to the traffic signals via typical A, B, and C connectors. The local Malfunction Monitoring Unit (MMU) was configured to allow all states, so that an immediate change from the research team would not trigger a fault. A simple REST API was written and installed on the Raspberry Pi so that the research team need only connect to the Mcity OS network to control the signal state.
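
A minimal sketch of such a REST API is shown below, using Flask and in-memory state as stand-ins. The endpoint name, phases, and port are illustrative assumptions; the actual prototype’s interface and its wiring to the signal hardware are not reproduced here.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in; the real prototype drives the signal heads through the
# cabinet's A/B/C connectors.
signal_state = {"phase": "red"}

@app.route("/signal", methods=["GET"])
def get_state():
    return jsonify(signal_state)

@app.route("/signal", methods=["PUT"])
def set_state():
    phase = request.get_json().get("phase")
    if phase not in ("red", "yellow", "green"):
        return jsonify({"error": "invalid phase"}), 400
    signal_state["phase"] = phase
    # Here the real device would actuate the corresponding signal output.
    return jsonify(signal_state)

if __name__ == "__main__":
    # Listen on the Mcity OS network interface (port is illustrative).
    app.run(host="0.0.0.0", port=8080)
```

A researcher on the Mcity OS network could then change the state with a single request, for example: curl -X PUT -H "Content-Type: application/json" -d '{"phase": "green"}' http://<raspberry-pi>:8080/signal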

Traffic controller prototype.