
Autonomous Is the New Mobile: Linley on Cars

Cars are everywhere these days. It doesn’t really matter what sort of event you attend in the semiconductor ecosystem, you will hear a lot about cars. In fact, even at the movies: La La Land, the best-picture-Oscar-winner-for-two-minutes, is a love story, but it famously opens with…cars. Pixar’s next movie is…Cars 3. I think it is the perfect theme for semiconductors in 2017. Wherever you turn, it will be cars.
I recently went to IRPS, the International Reliability Physics Symposium. It has been going since 1962, when the sophisticated automotive electronics of the day were pretty much a coil, to generate a high enough voltage for a spark, and a mechanically driven distributor to sequence the cylinders. This year’s conference covered a lot of topics, but one of the tutorial tracks was automotive, and several of the papers were concerned with automotive reliability. I left IRPS in Monterey early in the morning to drive up to Santa Clara for the Linley Autonomous Hardware Conference. This used to be focused on mobile, but from a merchant semiconductor point of view that is now boring: the big guys design their own chips, and Qualcomm and Mediatek mop up most of the rest. Now it is all about cars: LED-based lidar, networks-on-chip for automotive reliability, vision processing, deep learning.

Automotive Semiconductors Are the Next Big Thing
One of the reasons for this is that it is the Next Big Thing in semiconductors. But it is also a segment of the industry in transition. Until recently, “interesting automotive semiconductor” was an oxymoron like “jumbo shrimp.” It was a segment that consisted of low-complexity devices designed in extremely mature and well-characterized processes. Competition was mainly on price and reliability. No chips were in leading-edge processes, since there wasn’t a decade of data to characterize them, and the performance wasn’t required. Then along came advanced driver assistance systems (ADAS) and autonomous driving. Suddenly higher-performance networks (Ethernet) and higher-performance processors were required to process camera data.
It was a new class of requirements. The advanced semiconductor ecosystem didn’t understand the reliability requirements in detail, since they weren’t required for mobile. The automotive semiconductor leaders didn’t understand advanced processes or high-performance multicore processors, since they had no experience with them. However, it wasn’t just a case of taking existing vision algorithms and characterizing them for automotive reliability requirements. Another transformation was under way that added a new ingredient to the mix: vision processing, such as recognizing street signs or pedestrians, moved from algorithmic approaches to deep learning approaches in the space of a few years. If you have read the book Blue Ocean Strategy, this was a blue ocean that created itself. In some sense, there are no market leaders. The historic market leaders in automotive semiconductors have the relationships with the OEMs and Tier-1s. The foundries and IDMs have the advanced processes. Various small companies have bits and pieces of the sensor fusion and vision processing. Another wrinkle was that ISO 26262 appeared in 2011 and gradually worked its way into everyone’s consciousness. But there were no incumbents to be displaced; it was the start of a new era. What would be important going forward was not the same as what had been important in the past.

Qualcomm, NXP, Intel, Mobileye, NVIDIA
It is early but there is jockeying for position. Qualcomm is acquiring NXP, which was the market leader in automotive semiconductors and is itself fresh from digesting Freescale. On paper, that makes Qualcomm the market leader, but only in old-style automotive. Of course they have more experience with communication interfaces, especially 5G, than pretty much anyone. Intel is acquiring Mobileye for $15B. They are a manufacturing powerhouse at the leading edge, they have cellular modem technology, and now they will have the current leader in vision processing for automotive. Intel publicly admits that it “missed mobile” and it is clear that it doesn’t want to make the same mistake in automotive. Another leader is NVIDIA with their DRIVE platform. When it was clear that they were not going to be among the winners in mobile—Qualcomm and Mediatek had all the prizes—NVIDIA redirected all those resources to automotive. Of course, they already have a lot of experience with deep learning since their GPUs power a lot of the training phase.
Being the early leader isn’t always decisive (it’s the second mouse that gets the cheese), but it is hard to enter a market once the players have been established. We seem to be at that phase right now, and it may already be too late for a new semiconductor player to enter the market with any chance of success against Intel/Mobileye, Qualcomm/NXP/Freescale, NVIDIA, Renesas, and others who are already in the race.

Linley Autonomous Hardware Conference
To me, the story of the Linley Autonomous Hardware Conference was not some earth-shattering announcement, but instead that it was really a bit boring. It was boring because everyone said the same thing. That in itself is a story. The future is going to be cars with lots of sensors (lots of cameras, because they are cheap; some radar; some lidar) and high-performance chips that perform the sensor fusion, do the vision recognition, and handle the policy aspects of driving.
A decade ago, every presentation in EDA opened with a graph illustrating the design gap, before going on to show how whatever product was being presented would close it. Today, every automotive presentation opens with a picture showing the complexity of future automotive electronics. Here is a selection from the day:
Linley’s opening keynote gave a good high-level overview of the space. He started off talking about how autonomous technology drives many markets, such as planes and drones. But really it is all about cars (and trucks, but they are mostly just big cars). He covered a lot of the basics, such as the SAE autonomous driving levels, that I have covered in numerous posts here already. Since the Linley Group talks to a lot more people than I do, it is interesting to see what he considers the timescales for introduction:
Level 3 vehicles to debut this year in high-end ($50K+) vehicles and in trucks
Level 3 BOM cost will drop below $5K by 2022, and market may be lubricated by reduction in insurance cost
Level 4 vehicles in 2022 in high-end brands and commercial vehicles (taxis/Uber)
True level 5 may take 10 years to develop
I think that everything may happen faster than this, since progress is being made so fast. It is not much more than a decade ago that autonomous vehicles couldn’t go ten miles and carried so much electronics that they required extra air conditioners on the roof. Deep learning has only become dominant in the last five years, perhaps fewer. Fully autonomous trucks have been in use in open-cast mining for some years. Planes can land themselves, although the automotive people all claim that there are several orders of magnitude more code in a high-end car than in a plane. That may be true, but there is also probably a reason we let 15-year-olds behind the wheel of a car but not a 777.
Cadence presented a lot of information about the OpenVX standard, and how it has complete support on the Tensilica Vision P5 and P6 cores. See my earlier post See Further by Standing on the Shoulders of…OpenVX.

Linley also had some information on specific automotive processors:
Mobileye EyeQ3 dominates ADAS vision processors today. EyeQ4 is rated at 2 trillion operations per second (TOPS) at just 3W. EyeQ5 is expected to sample in 2018 with production in 2020, delivering 12 TOPS at 3W. One interesting wrinkle, which Linley didn’t mention, is that the EyeQx designs are MIPS-based (I don’t think Intel was a MIPS licensee, and the future of MIPS is unclear with Apple moving away from Imagination GPUs).
NVIDIA is developing a single-chip version of their DRIVE PX2 called Xavier that combines 8 custom CPUs, a 512-shader Volta GPU delivering >3 TFLOPS, and a new integer-only 30 TOPS vision processor, within a 30W power budget (sampling late this year and could be in 2020 cars).
NXP has a reference design called BlueBox with a vision-processing chip and an 8-core Cortex-A57, with a 40W power budget. Qualcomm is expected to boost R&D in this area. I covered BlueBox in passing in the DVCon Europe keynote.
Renesas has a new automotive platform called Autonomy, although Linley didn’t mention it. That’s because it was announced between the conference and my writing this post; that’s how fast things are moving.
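The raw TOPS figures above are easier to compare on a per-watt basis. A quick back-of-the-envelope calculation from the numbers quoted (these are vendor claims rather than measured benchmarks, and the parts target different process nodes and ship dates):

```python
# Back-of-the-envelope efficiency comparison (TOPS per watt) using the
# vendor figures quoted above. BlueBox is omitted because no TOPS
# figure was given, only its 40W power budget.
chips = {
    "Mobileye EyeQ4": {"tops": 2, "watts": 3},    # rated 2 TOPS at 3W
    "Mobileye EyeQ5": {"tops": 12, "watts": 3},   # samples 2018, production 2020
    "NVIDIA Xavier":  {"tops": 30, "watts": 30},  # vision processor, 30W budget
}

for name, spec in chips.items():
    efficiency = spec["tops"] / spec["watts"]
    print(f"{name}: {efficiency:.2f} TOPS/W")
```

On these numbers, EyeQ5 would be roughly six times more power-efficient than EyeQ4 and four times more efficient than Xavier, though Xavier's 30W budget covers the CPUs and GPU as well, so the comparison flatters Mobileye.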

Lexus Lane Valet
It’s way past April 1, so a bit late for a prank video, but Lexus came up with a new feature for advanced driver automation: its Lane Valet.
