Light is the key to long-range, fully autonomous EVs




Nick Harris is a scientist, engineer, and the founder and CEO of Lightmatter, which manufactures photonic processors.






Advanced driver assistance systems (ADAS) hold immense promise. At times, the headlines about the autonomous vehicle (AV) industry seem ominous, focusing on accidents, regulation or company valuations that some find undeserved. None of this is unreasonable, but it obscures the remarkable possibilities of a world of AVs.

One of the universally accepted upsides of AVs is the potential positive impact on the environment, as most AVs will also be electric vehicles (EVs).

Industry analyst reports project that by 2023, 7.3 million vehicles (7% of the total market) will have autonomous driving capabilities, requiring $1.5 billion worth of autonomous-driving-dedicated processors. That market is expected to grow to $14 billion by 2030, when upward of 50% of all vehicles sold will be classified as SAE Level 3 or higher, per the levels of driving automation defined by SAE International and adopted by the National Highway Traffic Safety Administration (NHTSA).




Because photonic chips are faster and more energy efficient, fewer of them will be needed to reach SAE Level 3. We can also expect this increased compute performance to accelerate the development and availability of fully autonomous SAE Level 5 vehicles. In that case, the market for autonomous driving photonic processors will likely far surpass the $14 billion projection for 2030.

When you consider all of the broad-based potential uses of autonomous electric vehicles (AEVs), including taxis and service vehicles in major cities, or the clean transport of goods on our highways, it becomes clear how quickly this technology could benefit the environment by helping bring clean air to some of the most populated and polluted cities.

The trouble is that AEVs currently have a sustainability problem.

To operate efficiently and safely, AEVs must leverage a dizzying array of sensors: cameras, lidar, radar and ultrasonic sensors, to name just a few. These work together, gathering data to detect, react and predict in real time, essentially becoming the “eyes” for the vehicle.

While there’s some debate about the specific number of sensors required for effective and safe autonomous driving, one thing is unanimously agreed upon: These cars will generate massive amounts of data.

Reacting to the data generated by these sensors, even in a simplistic way, requires tremendous computational power — not to mention the battery power required to operate the sensors themselves. Processing and analyzing the data involves deep learning algorithms, a branch of AI notorious for its outsized carbon footprint.

To be a viable alternative in both energy efficiency and economics, AEVs need to come close to matching gas-powered vehicles in range. However, the more sensors and algorithms an AEV runs over the course of a journey, the lower the battery range, and thus the driving range, of the vehicle.

Today, EVs are barely capable of reaching 300 miles before they need to be recharged, while a traditional combustion engine averages 412 miles on a single tank of gas, according to the U.S. Department of Energy. Adding autonomous driving into the mix widens this gap even further and potentially accelerates battery degradation.

Recent work published in the journal Nature Energy claims that the range of an automated electric vehicle is reduced by 10%-15% during city driving.

At the 2019 Tesla Autonomy Day event, it was revealed that driving range could be reduced by up to 25% when Tesla’s driver-assist system is enabled during city driving. That cuts a typical 300-mile EV range to 225 miles, dropping below a perceived attractiveness threshold for consumers.

A first-principles analysis takes this a step further. NVIDIA’s AI compute solution for robotaxis, DRIVE, has a power consumption of 800 watts, while a Tesla Model 3 has an energy consumption rate of about 11.9 kWh/100 km. At the typical city speed limit of 50 km/h (about 30 mph), the Model 3 draws approximately 6 kW for driving, meaning the power dedicated solely to AI compute amounts to roughly 13% of the battery power intended for driving.
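For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch using only the figures quoted above; the variable names are illustrative and not part of any vehicle or NVIDIA interface.

```python
# Back-of-envelope check of the figures quoted above (illustrative only).
AI_COMPUTE_POWER_W = 800             # NVIDIA DRIVE robotaxi compute, watts
EV_CONSUMPTION_KWH_PER_100KM = 11.9  # Tesla Model 3 energy consumption
CITY_SPEED_KMH = 50                  # typical city speed limit (~30 mph)

# Power drawn for driving at a steady city speed: (kWh per km) x (km per hour) = kW.
drive_power_kw = EV_CONSUMPTION_KWH_PER_100KM / 100 * CITY_SPEED_KMH  # ~5.95 kW

# Fraction of driving power that the AI compute alone consumes.
ai_share = (AI_COMPUTE_POWER_W / 1000) / drive_power_kw

print(f"Drive power: {drive_power_kw:.2f} kW, AI compute share: {ai_share:.0%}")
# Expected output: Drive power: 5.95 kW, AI compute share: 13%
```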

This illustrates how the power-hungry compute engines used for automated EVs pose a significant problem for battery life, vehicle range and consumer adoption.

This problem is further compounded by the power overhead of cooling the current generation of power-hungry chips used for advanced AI algorithms. When processing heavy AI workloads, these semiconductor chip architectures generate massive amounts of heat.

As chip temperature rises, performance declines. More energy is then spent on heat sinks, fans and other cooling methods to dissipate the heat, further reducing battery power and, ultimately, EV range. As the AV industry continues to evolve, new solutions to this AI compute heat problem are urgently needed.

The chip architecture problem

For decades, we have relied on Moore’s law, and its lesser-known cousin Dennard scaling, to deliver more compute power per footprint year after year. Today, it’s well known that electronic computers are no longer significantly improving in performance per watt, resulting in overheating data centers all over the world.

The largest gains to be had in computing are at the chip architecture level, specifically in custom chips designed for specific applications. However, architectural breakthroughs are a one-off trick; they can be made only at singular points in computing history.

Currently, the compute power required to train artificial intelligence algorithms and perform inference with the resulting models is growing exponentially — five times faster than the rate of progress under Moore’s law. One consequence of that is a huge gap between the amount of computing needed to deliver on the massive economic promise of autonomous vehicles and the current state of computing.

Autonomous EVs find themselves in a tug of war between maintaining battery range and the real-time compute power required to deliver autonomy.

Photonic computers give AEVs a more sustainable future

Fundamental innovation in computing and battery technology may be required to fully deliver on the promise of AEVs with the range, safety and performance demanded by consumers. While quantum computers are an unlikely short- or even medium-term solution to this AEV conundrum, there’s another, more available solution making a breakthrough right now: photonic computing.

Photonic computers use laser light, instead of electrical signals, to compute and transport data. This results in a dramatic reduction in power consumption and an improvement in critical, performance-related processor parameters, including clock speed and latency.


Photonic computers also enable inputs from a multitude of sensors to run inference tasks concurrently on a single processor core (each input encoded in a unique color), while a traditional processor can only accommodate one job at a time.

The advantage that hybrid photonic semiconductors have over conventional architectures lies in the special properties of light itself. Each data input is encoded in a different wavelength, i.e., color, while all of them run through the same neural network model. This means that photonic processors not only deliver higher throughput than their electronic counterparts, but are also significantly more energy efficient.
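As a rough software analogy only, the sketch below uses plain NumPy (not any photonic API) to illustrate the idea of several sensor inputs sharing one set of model weights in a single pass rather than one after another; the sensor count and layer sizes are made up for illustration.

```python
# Conceptual sketch: several sensor inputs, each imagined as riding on its own
# wavelength, pass through the same weight matrix in a single step.
import numpy as np

rng = np.random.default_rng(0)

n_sensors = 4     # e.g., four camera/lidar feeds, one "color" each
n_features = 128  # size of each sensor's feature vector
n_outputs = 16    # outputs of the shared neural-network layer

# One shared model (here, a single layer's weights) used by every channel.
shared_weights = rng.standard_normal((n_features, n_outputs))

# One row per wavelength channel; a conventional core would process these rows
# sequentially, while a wavelength-multiplexed core handles them concurrently.
sensor_inputs = rng.standard_normal((n_sensors, n_features))

# All channels share the same model: a single matrix product covers them all.
outputs = sensor_inputs @ shared_weights
print(outputs.shape)  # (4, 16): one result per sensor, computed in one pass
```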

Photonic computers excel in applications that require extreme throughput with low latency and relatively low power consumption — applications like cloud computing and, potentially, autonomous driving, where the real-time processing of vast amounts of data is required.

Photonic computing technology is on the brink of becoming commercially available and has the potential to supercharge the current roadmap of autonomous driving while also reducing its carbon footprint. It’s clear that interest in the benefits of self-driving vehicles is increasing and consumer demand is imminent.

So it is crucial that we not only consider the industries this technology will transform and the safety it can bring to our roads, but also ensure that its impact on our planet is sustainable. In other words, it’s time to shine a little light on autonomous EVs.

Source: TechCrunch
