MIT study finds computational load for widespread autonomous driving could be a huge driver of global carbon emissions

In the future, the energy needed to run the powerful computers on board a global fleet of autonomous vehicles could generate as many greenhouse gas emissions as all the data centers in the world today, according to a new study from MIT researchers that explored the potential energy consumption and related carbon emissions if autonomous vehicles are widely adopted.

The research appears in the January-February issue of IEEE Micro.

The data centers that house the physical computing infrastructure used for running applications are widely known for their large carbon footprint; they currently account for about 0.3% of global greenhouse gas emissions, or about as much carbon as the country of Argentina produces annually, according to the International Energy Agency.

Realizing that less attention has been paid to the potential footprint of autonomous vehicles, the MIT researchers built a statistical model to study the problem. They determined that 1 billion autonomous vehicles, each driving for one hour per day with a computer consuming 840 watts, would consume enough energy to generate about the same amount of emissions as data centers currently do.


Figure: Emissions from computing onboard AVs driving 1 hour per day. With one billion AVs, an average computer power of 0.84 kW yields emissions equal to the emissions of all data centers today. Source: Sudhakar et al.


The researchers also found that in more than 90% of modeled scenarios, to keep autonomous vehicle emissions from zooming past current data center emissions, each vehicle must use less than 1.2 kilowatts of power for computing. This would require more efficient hardware.

In one scenario, in which 95% of the global vehicle fleet is autonomous in 2050, computational workloads double every three years, and the world continues to decarbonize at the current rate, the researchers found that hardware efficiency would need to double faster than every 1.1 years to keep emissions under those levels.
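To see how the two growth rates in that scenario interact, here is a minimal sketch (in Python) of the onboard power trajectory for a single vehicle. It ignores fleet growth and grid decarbonization, and the function name, the 10-year horizon, and the alternative doubling time are illustrative assumptions, not figures from the study.

```python
# Sketch: onboard computing power over time if the workload doubles every
# 3 years while hardware efficiency doubles every `eff_doubling_yrs`.
def power_per_vehicle_kw(years, base_kw=0.84, workload_doubling_yrs=3.0,
                         eff_doubling_yrs=1.1):
    workload_growth = 2 ** (years / workload_doubling_yrs)
    efficiency_growth = 2 ** (years / eff_doubling_yrs)
    return base_kw * workload_growth / efficiency_growth

print(power_per_vehicle_kw(10, eff_doubling_yrs=1.1))  # ~0.02 kW: efficiency outpaces the workload
print(power_per_vehicle_kw(10, eff_doubling_yrs=4.0))  # ~1.5 kW: the workload outpaces efficiency
```

The point of the sketch is only the direction of the effect: if efficiency doubles more slowly than roughly every 1.1 years, power per vehicle (and with it emissions) keeps climbing.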

If we just keep the business-as-usual trends in decarbonization and the current rate of hardware efficiency improvements, it doesn’t seem like it is going to be enough to constrain the emissions from computing onboard autonomous vehicles. This has the potential to become an enormous problem. But if we get ahead of it, we could design more efficient autonomous vehicles that have a smaller carbon footprint from the start.

—first author Soumya Sudhakar

The researchers built a framework to explore the operational emissions from computers on board a global fleet of electric vehicles that are fully autonomous. The model is a function of the number of vehicles in the global fleet, the power of each computer on each vehicle, the hours driven by each vehicle, and the carbon intensity of the electricity powering each computer.

On its own, that looks like a deceptively simple equation. But each of those variables contains a lot of uncertainty because we are considering an emerging application that is not here yet.

—Soumya Sudhakar
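As a rough illustration of that "deceptively simple equation," the product can be written out in a few lines of Python. The function below and the grid carbon intensity are illustrative assumptions, not values from the paper, which treats each input as an uncertain distribution rather than a point value.

```python
def annual_av_computing_emissions_kg(n_vehicles, computer_power_kw,
                                     hours_per_day, grid_kgco2_per_kwh):
    """Rough annual CO2 from onboard computing:
    fleet size x computer power x hours driven x carbon intensity of electricity."""
    energy_kwh_per_year = n_vehicles * computer_power_kw * hours_per_day * 365
    return energy_kwh_per_year * grid_kgco2_per_kwh

# Headline scenario: 1 billion AVs, 0.84 kW computers, 1 hour/day,
# with an assumed grid intensity of ~0.5 kg CO2/kWh (illustrative).
print(f"{annual_av_computing_emissions_kg(1e9, 0.84, 1.0, 0.5):.2e} kg CO2 per year")
```

With these particular inputs the sketch lands on the order of 150 megatonnes of CO2 per year; the study's actual comparison comes from sampling distributions over all four inputs rather than single numbers.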

For example, some research suggests that the amount of time driven in autonomous vehicles might increase because people can multitask while driving and the young and the elderly could drive more. But other research suggests that time spent driving might decrease because algorithms could find optimal routes that get people to their destinations faster. In addition to considering these uncertainties, the researchers also needed to model advanced computing hardware and software that doesn’t exist yet.

To accomplish that, they modeled the workload of a popular algorithm for autonomous vehicles, known as a multitask deep neural network because it can perform many tasks at once. They explored how much energy this deep neural network would consume if it were processing many high-resolution inputs from many cameras with high frame rates, simultaneously.

When they used the probabilistic model to explore different scenarios, Sudhakar was surprised by how quickly the algorithms’ workload added up.

For example, if an autonomous vehicle has 10 deep neural networks processing images from 10 cameras, and that vehicle drives for one hour a day, it will make 21.6 million inferences each day. One billion vehicles would make 21.6 quadrillion inferences. To put that into perspective, all of Facebook’s data centers worldwide make a few trillion inferences each day (1 quadrillion is 1,000 trillion).
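The article does not state the assumed per-camera frame rate, but the quoted figure is reproduced if each of the 10 networks processes each of the 10 camera streams at roughly 60 frames per second, as in this back-of-the-envelope check (assumed numbers, not from the paper):

```python
dnns = 10        # deep neural networks per vehicle
cameras = 10     # camera streams fed to each network
fps = 60         # assumed frames per second per camera
seconds = 3600   # one hour of driving per day

per_vehicle_per_day = dnns * cameras * fps * seconds
print(per_vehicle_per_day)        # 21600000, i.e. 21.6 million inferences per vehicle per day
print(per_vehicle_per_day * 1e9)  # 2.16e+16, i.e. 21.6 quadrillion for 1 billion vehicles
```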

After seeing the results, this makes a lot of sense, but it is not something that is on a lot of people’s radar. These vehicles could actually be using a ton of computer power. They have a 360-degree view of the world, so while we have two eyes, they may have 20 eyes, looking all over the place and trying to understand all the things that are happening at the same time.

—Sertac Karaman, co-author

Autonomous vehicles would be used for moving goods, as well as people, so there could be a massive amount of computing power distributed along global supply chains, Karaman says. And their model only considers computing—it doesn’t take into account the energy consumed by vehicle sensors or the emissions generated during manufacturing.

To keep emissions from spiraling out of control, the researchers found that each autonomous vehicle needs to use less than 1.2 kilowatts of power for computing. For that to be possible, computing hardware must become more efficient at a significantly faster pace, doubling in efficiency about every 1.1 years.

One way to boost that efficiency could be to use more specialized hardware, which is designed to run specific driving algorithms. Because researchers know the navigation and perception tasks required for autonomous driving, it could be easier to design specialized hardware for those tasks, Sudhakar says. But vehicles tend to have 10- or 20-year lifespans, so one challenge in developing specialized hardware would be to “future-proof” it so it can run new algorithms.

In the future, researchers could also make the algorithms more efficient, so they would need less computing power. However, this is also challenging because trading off some accuracy for more efficiency could hamper vehicle safety.

Now that they have demonstrated this framework, the researchers want to continue exploring hardware efficiency and algorithm improvements. In addition, they say their model can be enhanced by characterizing embodied carbon from autonomous vehicles—the carbon emissions generated when a car is manufactured—and emissions from a vehicle’s sensors.

This research was funded, in part, by the National Science Foundation and the MIT-Accenture Fellowship.

Resources

Soumya Sudhakar, Vivienne Sze and Sertac Karaman (2023) "Data Centers on Wheels: Emissions From Computing Onboard Autonomous Vehicles," IEEE Micro

Comments

Bob Niland

So if the on-board IT energy burden flows directly to the direct lifecycle cost/km (as it does today), and the off-board burden flows to subscription service expense (as it presumably does today), is this study then implying that autonomous vehicle economy is illusory (or will become so)?

PARunner

You have to factor in that autonomous vehicles won't be getting into accidents or killing pedestrians. The reduced number of car crashes, and of replacement vehicles that would need to be bought, would also offset the in-car burn rate.

Even today we're seeing these cars avoid accidents caused by human drivers making mistakes on highways. Inter-vehicle communication, too, should at some point make these cars exponentially safer, at least among themselves.
