Mcity software engineer applies human brainpower to AI

June 29, 2024
Man wearing a dark blue shirt with white lettering that says, "The Michigander EV & Mobility Scholars" stands in front of a cabinet that stores traffic control equipment at the Mcity Test Facility.

Artificial intelligence is popping up everywhere, from real estate to retail and food services, but also in less obvious sectors like construction, education, and agriculture. And, of course, the world of autonomous vehicle testing is deploying artificial intelligence, too.

But even more important is real intelligence – the kind of brainpower the human species depends on to create artificial intelligence in the first place. And what does a university-based research center such as Mcity do to attract such genius? 

In the case of Raj Patnaik, nothing.                                                                                           

“We were living in India, and my wife got the opportunity to bring her brain cancer research to Michigan Medicine,” said Raj, 39. “Her research involves figuring out treatment for a deadly type of brain cancer, so we had to come here. She is working toward saving lives, which was more important than what I was doing, and so we decided to move to Ann Arbor. This major life decision was made easy by Mcity, which hired me.”

Arriving in Ann Arbor as a “trailing spouse” in December 2021, Raj was quickly recruited by Mcity, the public-private mobility research partnership led by the University of Michigan. Raj became the lead software engineer for the team building Mcity 2.0, the advanced platform that will give researchers around the world easy remote access to the resources of the Mcity Test Facility through the cloud. The goal is to create a more equitable and accessible playing field in the development of connected autonomous vehicles.

Mcity senior software engineer Raj Patnaik sets up screens for visualization of remote vehicle testing using the Mcity 2.0 platform. Photo credit: Calvin Tuttle, University of Michigan.

A native of Bangalore, India, Raj earned a master’s degree in physics before deciding to turn his main focus to computer engineering. “I realized that a lot of problems I wanted to work on need a lot of math, and for that you need computers,” he said. 

While Raj was working at JPMorgan Chase as a software engineer, his wife, Sravya Palavalasa, 35, snagged a prestigious invitation to work at Michigan Medicine. The couple promptly packed for Ann Arbor with their son, Reyaansh, now a seven-year-old aspiring musician in second grade.

Without Mcity 2.0, a large part of CAV research would require in-person work with the vehicles, roadways, intersections, road sensors and more at Mcity’s 16-acre, $10 million test track. Once the updated, expanded software goes into use later this year, remote users will have full access to all the features and components of the physical test track. 

“Researchers working anywhere in the United States can access the Mcity test facility and run their autonomous algorithms from wherever they are, and the vehicle will respond while moving on the test track here in Ann Arbor,” Raj said. “They’ll have the opportunity to do all the stuff that people used to do in person here.”

Raj and the development team tested the software in October, working with a research team from Purdue University in West Lafayette, Indiana, 260 miles away from Ann Arbor. “The researchers were running their autonomous algorithm there and the test vehicle was running on the Mcity track, while receiving simultaneous visuals of whatever the car was seeing,” Raj said.

Even as the two-year development process for Mcity 2.0 is drawing to a close, Raj and Mcity’s team of five engineers and two interns are continuing to add new features that can provide extreme real-world challenges for the test vehicles, such as how to handle a dangerously malfunctioning traffic signal.

“We tested one of the best new features yesterday, and it’s crazy stuff, but this kind of research is required,” Raj said.

Normally, a traffic signal is programmed to conform to local and state laws so that it won’t show the same signal on all four sides. But the Mcity engineering team wanted to test how autonomous vehicles will react to a malfunction in which all four sides turn green or red at the same time. That extremely rare situation had to be generated deliberately in order to present the AV with a circumstance where a rogue driver jumps the light and drives as if it were green on more than one side.

“You definitely need to test how the AV will behave in that situation, and what exactly the AV algorithm should do in that case,” Raj said. “If you try to make that happen with a regular traffic controller at an intersection, it’ll start telling you, ‘There’s no way you can do this.’ You’d need to install completely different hardware and circuit boards. We’ve made it so that the Mcity 2.0 software can easily present all sorts of configurations for the traffic lights. They’ll do whatever you want as a researcher.”
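The refusal Raj describes can be illustrated with a minimal sketch. This is not Mcity’s actual software — the class and method names below are hypothetical — but it shows the core idea: a software-defined controller simply omits the conflict interlock that certified hardware controllers enforce, so a researcher can command any indication, including green on all four approaches at once.

```python
# Hypothetical sketch (not Mcity's real API): a software-only signal
# controller with no conflict interlock, so it can reproduce the
# "all four sides green" malfunction a hardware controller forbids.

APPROACHES = ("north", "south", "east", "west")
COLORS = ("red", "yellow", "green")

class VirtualSignalController:
    """Software-defined controller for fault-injection scenarios."""

    def __init__(self):
        # Start every approach on red, the safe default.
        self.state = {approach: "red" for approach in APPROACHES}

    def set_indication(self, approach, color):
        if approach not in self.state:
            raise ValueError(f"unknown approach: {approach}")
        if color not in COLORS:
            raise ValueError(f"unknown color: {color}")
        # Deliberately no check that opposing approaches conflict --
        # the whole point is to present an illegal configuration.
        self.state[approach] = color

    def all_green(self):
        # The fault scenario from the article: green on all four sides.
        for approach in APPROACHES:
            self.set_indication(approach, "green")
        return dict(self.state)

controller = VirtualSignalController()
fault = controller.all_green()
print(fault)  # every approach shows "green"
```

A real roadside controller would reject the `all_green` command at the hardware level; moving that decision into software is what lets researchers stage the malfunction on demand.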

Artificial intelligence factors into Mcity 2.0 as a way of processing years of real-world traffic data the test facility has collected from 21 intersections in Ann Arbor, including data from roadside sensors that record the movements of vehicles, pedestrians, cyclists and others, as well as motorcyclists and persons with disabilities or reduced mobility and orientation. 

“What makes Mcity 2.0 pretty special is that everybody can access it. Mcity is not just Ann Arbor or Michigan now. It’s national.”

Raj Patnaik, Senior Software Engineer, Mcity

The Michigan Traffic Lab, led by Mcity Director Henry Liu in his role as a professor of Civil and Environmental Engineering, often works with Mcity. The MTL team created its own AI programs to extrapolate traffic situations from the roadside sensor data, then uses them to insert virtual “mixed reality vehicles” into the sensors of a test vehicle, using Mcity’s Digital Twin virtual reality technology. The resulting simulations challenge vehicle programming in perfectly repeatable conditions and scenarios to produce the mountain of test data engineers need to make autonomous vehicles a reality.

“You need to have some virtual traffic with other vehicles moving around when you are doing your test to simulate a real world situation,” Raj said. “Virtual vehicles can cut you off and do all sorts of stuff to test your car’s behavior. We can inject aggressive traffic behavior, or regular or low traffic. You also have road elevation, different objects and shadows, as well as digital wind, simulated rain or snow conditions, and simulate daytime or nighttime.”
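The kind of scenario configuration Raj describes — aggressive, regular, or low traffic, plus weather and time of day — might be sketched as follows. All of the names here are assumptions for illustration, not Mcity 2.0’s real interface: a behavior parameter on each virtual vehicle controls how often it attempts a cut-off maneuver against the test car.

```python
# Hypothetical sketch (names are assumptions, not Mcity's real interface):
# spawning virtual background traffic whose aggressiveness controls
# maneuvers like cut-offs, alongside simple environment settings.

import random
from dataclasses import dataclass

@dataclass
class VirtualVehicle:
    vehicle_id: int
    aggressiveness: float  # 0.0 = passive, 1.0 = very aggressive

    def choose_maneuver(self, rng):
        # Aggressive vehicles cut off the test car more often.
        return "cut_off" if rng.random() < self.aggressiveness else "cruise"

@dataclass
class ScenarioConfig:
    traffic_level: str = "regular"  # "low" | "regular" | "aggressive"
    weather: str = "clear"          # "clear" | "rain" | "snow"
    time_of_day: str = "day"        # "day" | "night"

def spawn_traffic(config, count=5, seed=0):
    # Seeding the RNG makes the scenario perfectly repeatable,
    # which is what mixed-reality testing depends on.
    rng = random.Random(seed)
    aggression = {"low": 0.05, "regular": 0.2, "aggressive": 0.8}[config.traffic_level]
    vehicles = [VirtualVehicle(i, aggression) for i in range(count)]
    maneuvers = [v.choose_maneuver(rng) for v in vehicles]
    return vehicles, maneuvers

cfg = ScenarioConfig(traffic_level="aggressive", weather="rain", time_of_day="night")
vehicles, maneuvers = spawn_traffic(cfg)
print(maneuvers)
```

Because the random seed is fixed, rerunning the same configuration replays the identical traffic pattern — the “perfectly repeatable conditions” that let engineers compare algorithm versions against one another.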

The result is software that can be used anywhere in the United States to accurately develop, test and retest AV algorithms under all kinds of conditions and all kinds of situations. 

“Right now, if you’re at UCLA and want to test something with Mcity, you have to take a six-hour flight,” Raj said. “If you have your own software algorithm, you have to bring it, install it and execute it. If you have some bugs to fix, it’ll take a couple of days. Then you’ll go back and fix and bring it again.”

But by the end of 2024, all those hurdles to vehicle testing will simply disappear. 

“What makes Mcity 2.0 pretty special is that everybody can access it,” Raj said. “Mcity is not just Ann Arbor or Michigan now. It’s national.”

This story was written by Brian J. O’Connor, a Michigan-based freelance writer.