The Mcity AV Challenge pits researchers against each other to design an AV decision-making module that can safely, efficiently and comfortably navigate routes in Mcity’s first-of-its-kind, city-scale virtual proving ground. The challenge is supported by the Mcity 2.0 Project funded by the National Science Foundation.
About the Challenge
Autonomous Vehicles (AVs) hold the promise of substantially enhancing the safety and efficiency of our transportation systems, offering potential reductions in traffic accidents and improvements in traffic flow. Despite these considerable advantages, however, widespread deployment of AVs remains limited. The reasons are manifold, but above all, the safety performance of AVs still falls short of that of human drivers.
The main objective of the Mcity AV Challenge is to rigorously assess the driving intelligence of AVs in realistic traffic environments. An interactive simulator is provided, featuring a high-fidelity simulation environment modeled on the Mcity Test Facility, a dedicated AV testing facility that incorporates a diverse range of driving scenarios, including highways, intersections, and roundabouts.
Participants design an AV decision-making module that successfully completes a variety of routes while prioritizing safety, efficiency, and comfort.
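The actual simulator interface is documented in the User Manual linked at the bottom of this page. Purely as an illustration of what such a module does, the sketch below (all class and field names are hypothetical, not the challenge API) maps an observation of the ego vehicle and its lead vehicle to a longitudinal command, trading off the same three criteria the challenge scores: safety, efficiency, and comfort.

```python
# Illustrative sketch only -- Observation, Command, and DecisionModule are
# hypothetical names and do NOT reflect the actual Mcity AV Challenge API.
# See the challenge User Manual for the real simulator interface.
from dataclasses import dataclass


@dataclass
class Observation:
    ego_speed: float      # ego vehicle speed, m/s
    speed_limit: float    # posted speed limit, m/s
    gap_to_lead: float    # distance to the leading vehicle, m


@dataclass
class Command:
    acceleration: float   # m/s^2, negative values mean braking
    steering: float       # rad, 0.0 = keep the current lane


class DecisionModule:
    """Toy car-following policy: keep a safe time gap, obey the speed limit."""

    def __init__(self, desired_time_gap: float = 2.0,
                 max_accel: float = 2.0, max_brake: float = -4.0):
        self.desired_time_gap = desired_time_gap
        self.max_accel = max_accel
        self.max_brake = max_brake

    def act(self, obs: Observation) -> Command:
        # Safety: brake if the gap falls below the desired time gap.
        safe_gap = self.desired_time_gap * obs.ego_speed
        if obs.gap_to_lead < safe_gap:
            accel = self.max_brake
        # Efficiency: otherwise accelerate toward the speed limit.
        elif obs.ego_speed < obs.speed_limit:
            accel = self.max_accel
        # Comfort: coast at the limit instead of oscillating between inputs.
        else:
            accel = 0.0
        return Command(acceleration=accel, steering=0.0)
```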
The Mcity AV Challenge Leaderboard below lists team standings as of July 31, when the official competition ended. The top team is eligible to test its algorithms in Mcity’s mixed-reality testing environment and will have an opportunity to present at the ITSC Workshop on Safety Testing and Validation of Connected and Automated Vehicles in September.
But the Mcity AV Challenge doesn’t end there! Current teams are encouraged to continue their work, and new teams can still sign up. We’ll update the Leaderboard as more results come in.
Mcity AV Challenge Leaderboard
| Rank | Team Name | Safety Score | Rule Compliance | Efficiency | Comfort | Trajectory Completion | Total Score |
|---|---|---|---|---|---|---|---|
| 1 | THU_AD_Lab | 99.94 | 99.55 | 37.56 | 82.02 | 95.62 | 82.94 |
| 2 | Spartans | 99.93 | 96.22 | 28.33 | 82.37 | 69.00 | 75.17 |
| 3 | Slow Car | 47.66 | 90.01 | 44.80 | 84.47 | 88.66 | 71.12 |
| 4 | Automated CART | 43.46 | 100.00 | 38.38 | 82.32 | 82.04 | 69.24 |
| 5 | Shreya | 26.83 | 99.39 | 35.55 | 82.92 | 93.33 | 67.60 |
| 6 | CATLAB | 25.80 | 100.00 | 76.21 | 82.76 | 7.96 | 58.54 |
| 7 | Meh-optimal | 0.00 | 96.89 | 76.84 | 78.64 | 25.62 | 55.60 |
| 8 | JLU_AI | 23.76 | 97.91 | 50.61 | 84.83 | 14.88 | 54.40 |
| 9 | Subcontinent | 24.68 | 100.00 | 50.54 | 85.52 | 4.44 | 53.03 |
| 10 | Smart Mobility Lab | 0.00 | 99.97 | 47.21 | 86.55 | 2.80 | 47.31 |
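Across the published rows, the Total Score is consistent (to within rounding) with the unweighted mean of the five component scores, each of which appears to be on a 0–100 scale. This is an inference from the numbers above, not an official scoring formula. For example, using the THU_AD_Lab row:

```python
# Sanity check: Total Score looks like the plain average of the five components.
# Inferred from the published leaderboard, not from an official formula.
components = {
    "Safety Score": 99.94,
    "Rule Compliance": 99.55,
    "Efficiency": 37.56,
    "Comfort": 82.02,
    "Trajectory Completion": 95.62,
}

total = sum(components.values()) / len(components)
print(round(total, 2))  # 82.94, matching THU_AD_Lab's published Total Score
```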
Accept the Challenge.
Gather everything you need to participate.
- Introduction: Get familiar with the challenge background
- Registration: Sign up for the challenge
- Challenge Setup: Get familiar with the task and challenge procedure
- User Manual: Get more detailed development information