
It’s all about user experience

HARMAN and Daimler Bring the First AR-Capable Infotainment System to Market with the Mercedes-Benz A-Class. A look from the inside

There’s a new demographic of car buyer gaining prominence. These buyers are increasingly familiar with the latest technology – diverse connoisseurs with premium tastes but mindful wallets. That leaves automakers like Daimler with a unique problem: bringing luxurious, forward-looking features to entry-level cars while still maintaining value. On top of that, these consumers are already experiencing the benefits of semi-autonomous driving and will soon expect even more advancement from their vehicles. One way to meet this challenge is through scalable, future-proof technology that enhances the user experience and prepares drivers for their transition into passengers.


Picture 1: The car demonstrates a high-resolution widescreen cockpit with touchscreen operation of the media display. Credit: Daimler AG

As a long-standing strategic development partner and tier-one supplier for Daimler, HARMAN is helping the automaker accomplish this by redefining the modern compact car class with luxury driving experiences – chiefly the first-ever use of augmented reality navigation. Debuting in the next-generation Mercedes-Benz A-Class as part of the Mercedes-Benz User Experience (MBUX), the system combines Daimler and HARMAN technology to provide turn-by-turn directions with AR visuals, bringing the navigation user interface into the digital age.

“The new MBUX adds a new layer to navigation which will make life easier for drivers,” said Ohad Akiva, Director of Computer Vision at HARMAN, who led the Israeli team that developed the application. “Daimler’s focus was always on making the user’s life easier, so we only add the augmented reality parts when the context calls for them.”

The name MBUX – Mercedes-Benz User Experience – signifies that with the new infotainment system the user experience (UX) has first priority. A unique feature of this system is its ability to learn, thanks to artificial intelligence. MBUX can be individualised and adapts to suit the user, creating an emotional connection between the vehicle and driver. Moreover, new content can be delivered “over the air” as updates.

“Our new MBUX is incredibly intelligent and tailored to your needs,” said Daimler AG Chief Design Officer Gorden Wagener at CES.


Picture 2: The new Mercedes-Benz A-Class. Credit: Daimler AG

Its further strengths include the high-resolution widescreen cockpit with touchscreen operation of the media display, the navigation display with augmented reality technology (optional), plus intelligent voice control with natural speech recognition, which is activated with the words “Hey Mercedes”.

Its highlight is the comprehensive touch operation concept – the combination of a touchscreen, a touchpad on the centre console and touch control buttons on the steering wheel. In addition to the intuitive operating experience, reduced driver distraction is another advantage. MBUX is a revolution in the in-car user experience. Emotionally appealing presentation features underline the clarity of the control structure and impress with brilliant, maximum-resolution 3D graphics that are rendered – calculated and displayed – in real time.

A completely new feature is the map display supplemented by augmented reality: a video image of the surroundings, taken with the aid of the front camera, is augmented with helpful navigation information. For example, arrows or house numbers are automatically superimposed directly onto the touchscreen of the media display. This makes it easier for the driver to find a certain house number, or the correct side road to turn into.
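
To make the overlay idea concrete, here is a minimal sketch in Python of how an annotation with a known 3D position relative to the front camera might be projected onto the video image. It is not HARMAN’s implementation; the pinhole-camera parameters, resolution and function name are assumptions chosen only for illustration.

# Assumed intrinsics of the front camera (focal lengths and principal point, in pixels).
# A production system would read these from the camera calibration rather than hard-code them.
FX, FY = 1200.0, 1200.0
CX, CY = 960.0, 540.0          # image centre for an assumed 1920x1080 sensor

def project_annotation(point_cam, label):
    """Project a 3D point given in the camera frame (x right, y down, z forward)
    to pixel coordinates; return None if it is behind the camera or off-screen."""
    x, y, z = point_cam
    if z <= 0.5:                                # behind or almost at the lens: don't draw
        return None
    u = FX * x / z + CX
    v = FY * y / z + CY
    if not (0 <= u < 1920 and 0 <= v < 1080):   # outside the visible frame
        return None
    return {"label": label, "pixel": (round(u), round(v)), "distance_m": round(z, 1)}

# Example: a house-number marker 2 m to the right, 1 m below the camera, 25 m ahead.
print(project_annotation((2.0, 1.0, 25.0), "No. 17"))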


Picture 3: CES 2018 was the perfect arena for demonstrating new and exciting auto-tech innovations. Credit: Daimler AG

As an example, imagine you are driving with your navigation system and following its instructions. Just before you reach a turn, you usually glance at the map to see the 2D view, and then, looking out of the windshield, try to work out which turn to take.

In some cases this task is trivial – when there is only one option, or the turn is a simple one – but in other cases there may be more than one road you could turn into. This is where augmented reality comes in handy: it shows a highlighted street sign, oriented towards the exit you should take, with the street name on it. The task of mapping the 2D map data onto the view in front of you becomes easy and clear.

Another feature I really like is house numbers – especially when driving to places I don’t know. Once you get close to the destination, the camera view pops up next to the map and shows the view ahead overlaid with house numbers for each building. The house you are looking for is highlighted, so you know exactly where you are.

“The task we gave the team was to find a way to show only the relevant contextual data at the right moment – to anticipate what the driver needs and show it to them,” says Akiva.
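
In spirit, the contextual logic Akiva describes could be sketched like this; the thresholds, function name and overlay names are hypothetical and only illustrate the “show it only when it helps” idea.

def choose_overlay(dist_to_maneuver_m, candidate_roads, dist_to_destination_m):
    """Decide which AR overlay (if any) to show, mimicking the 'only when needed' idea.
    All thresholds are illustrative, not production values."""
    if dist_to_destination_m < 150:
        # Close to the destination: switch to the house-number overlay.
        return "house_numbers"
    if dist_to_maneuver_m < 80 and candidate_roads > 1:
        # Several possible roads ahead: highlight the correct one with a street sign.
        return "turn_sign"
    # Single obvious turn or nothing imminent: keep the plain map, no AR clutter.
    return None

print(choose_overlay(60, 3, 900))   # -> 'turn_sign'
print(choose_overlay(60, 1, 900))   # -> None (trivial turn, no overlay needed)
print(choose_overlay(500, 1, 120))  # -> 'house_numbers'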

AR and the autonomous car

The new system also helps OEMs prepare for Level 3, 4 and 5 autonomous vehicles. It’s an important first step on the journey toward increased safety and driver-assistance features, both by providing a unique visualization for navigation and by communicating vehicle autonomy to occupants. It will also help people who are used to driving themselves ease into their transition to passengers when they get behind the wheel of an autonomous car.

“The work on Level 3, 4 and 5 autonomy has already started, but we still have quite a lot of technical challenges ahead of us,” says Akiva. “The augmented reality work we did helped us in many ways to improve our development and to learn a lot about different areas on the road to autonomy, such as HD maps, sensor behaviour in different scenarios, synchronization issues, algorithm and perception issues, and more.”

Being able to see issues visually and know that something is wrong is much easier than reading through hundreds of log lines and trying to reconstruct what happened in the system.

When a user can see what the ADAS sensors see and what the lidar sees, combined with the output of the computer vision and deep learning algorithms, they can quickly detect where and when something went wrong, allowing the developers to narrow their search for the issue to very specific areas.
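
As a rough illustration of that kind of visual debugging, the following sketch groups per-sensor detections so a visualizer could draw them side by side and flag frames where the sensors disagree; the data structures, labels and the 2-metre threshold are invented for the example.

from dataclasses import dataclass

@dataclass
class Detection:
    source: str      # 'camera' or 'lidar'
    label: str       # e.g. 'car', 'pedestrian'
    distance_m: float

def build_debug_frame(detections):
    """Group per-sensor detections by label so a visualizer can draw them side by side.
    The point is to surface disagreement visually instead of hiding it in logs."""
    frame = {}
    for det in detections:
        frame.setdefault(det.label, []).append((det.source, det.distance_m))
    # Flag labels where the sensors disagree on range by more than 2 m (illustrative threshold).
    flags = []
    for label, readings in frame.items():
        distances = [d for _, d in readings]
        if max(distances) - min(distances) > 2.0:
            flags.append(label)
    return frame, flags

dets = [Detection("camera", "pedestrian", 14.0),
        Detection("lidar", "pedestrian", 19.5),
        Detection("camera", "car", 32.0),
        Detection("lidar", "car", 31.4)]
frame, flags = build_debug_frame(dets)
print(flags)   # -> ['pedestrian']: this frame is worth opening in the visualizer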

In a self-driving situation, the feature eliminates the abstract map altogether to show occupants what the car is about to do next: for example, make a turn, accelerate, brake, switch lanes, or merge. This intuitive and holistic navigation experience is shown in real-life context and gives passengers a feeling of control despite the vehicle’s autonomous capabilities. By communicating these actions to occupants through intuitive visualizations, an entire generation of drivers can more comfortably transform into passengers through a better understanding of surrounding traffic and environmental conditions. These features aren’t exclusive to fully autonomous technology either – the augmented reality technology can help drivers experience contextual mapping and additional functionality enabled through advanced driver assistance systems (ADAS) such as lane-departure warning, blind-spot monitoring and automated cruise control.
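
One way such a passenger-facing view could work is sketched below: planned maneuvers are mapped to visual cues and only surfaced shortly before they happen. The maneuver names, cue descriptions and look-ahead window are assumptions, not details of the MBUX or DRVLINE systems.

# Planned maneuvers a planner might emit, mapped to the cue a passenger-facing
# AR view could draw. Both the maneuver names and cue texts are made up for illustration.
MANEUVER_CUES = {
    "turn_left":   "Highlighted left arrow over the intersection",
    "turn_right":  "Highlighted right arrow over the intersection",
    "lane_change": "Target lane shaded in the camera view",
    "merge":       "Merging gap outlined ahead",
    "brake":       "Slow-down band drawn on the road surface",
    "accelerate":  "Speed ribbon extending forward",
}

def cue_for(maneuver, seconds_until):
    """Return the visual cue to show, but only shortly before the action so the
    passenger isn't flooded with upcoming-plan information."""
    if maneuver not in MANEUVER_CUES:
        return None
    if seconds_until > 8:          # illustrative look-ahead window
        return None
    return f"{MANEUVER_CUES[maneuver]} (in {seconds_until}s)"

print(cue_for("lane_change", 5))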

At the same CES, Samsung and HARMAN announced their DRVLINE platform, an open-source and modular platform for Level 3, 4 and 5 self-driving cars. Ahead of more research on the autonomy front, the companies will bring to market a forward-facing camera capable of enabling lane-departure warning, adaptive cruise control, forward collision warning and pedestrian warning. The new unit will launch this year.

“As we worked on this AR application we had to develop and adapt a lot of tools to enable debugging and testing. We needed to see how our application actually sees the world based on the input sensors and the data it gets. It’s a unique experience to drive along a road and see data such as house numbers or street names that previously existed only in the database as 2D information and is now visual. You are literally driving through the database and can find errors or edge cases you weren’t aware existed, because they weren’t visible in the 2D maps navigation traditionally uses,” says Akiva.

“We are now working to make these tools available for the autonomous driving development being done at Samsung and HARMAN – in those cases it’s critical to show the developers what the car is seeing and understanding, and what it’s planning to do next. As this autonomous driving development is still in the making, the tools let the developers react faster if the car does something unexpected, and they can save the data for later analysis or simulation so we can learn and improve quickly from actual scenarios on the road.”
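
A very simple way to capture such road scenarios for later analysis is to log one compact record per event, as in the sketch below; the file format and field names are invented for illustration and say nothing about how HARMAN actually records its data.

import json, time

def record_snapshot(path, camera_frame_id, planned_maneuver, anomaly_note):
    """Append a compact record of what the car saw and planned, so an unexpected
    behaviour on the road can be replayed and analysed later. Field names are invented."""
    record = {
        "timestamp": time.time(),
        "camera_frame_id": camera_frame_id,
        "planned_maneuver": planned_maneuver,
        "anomaly_note": anomaly_note,
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")   # one JSON line per event, easy to replay

record_snapshot("road_log.jsonl", 48211, "lane_change", "hesitated longer than expected")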

Digital cockpits


Picture 4: The Mercedes-Benz A-Class interior

Tomorrow’s cars aren’t going to have just one main screen showing the driver navigation or the radio, as they do today. Tomorrow’s cars are going to have digital cockpits, based on curved screens built with OLED (organic light-emitting diode) technology – and many more screens.

HARMAN and Samsung used the most recent Consumer Electronics Show (CES) to unveil a digital cockpit platform that combines 5G technology and an IoT platform across a suite of OLED and QLED (quantum dot light-emitting diode) screens. The digital cockpit comprises three customisable displays and knobs. The 12.3-inch OLED driver display provides information such as speed and RPM. Infotainment is handled by a 28-inch QLED central information display. Positioned below this central screen is another curved OLED display that enables control of other features such as the air-con. The three knobs may be set to the functions the driver uses most frequently, such as clock, temperature and volume.

There will be a screen for the driver and screens for passengers in the front and the back. All of them will be managed by the car’s infotainment system and will show data and applications that fit each user. The driver’s content will not be the same as a passenger’s: for the driver we want to show only the data relevant to the current driving scenario, while for passengers we want to enable more data and functionality, allowing them not only to see information from the augmented reality view but also to interact with it.
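
A toy sketch of that kind of role-based content routing might look as follows; the roles, widget names and scenarios are made up purely to illustrate the idea of giving the driver a minimal, context-bound view and passengers a richer, interactive one.

def content_for(role, driving_scenario):
    """Return the widgets a screen should show. The driver's view stays minimal and
    tied to the current driving scenario; passenger views add interactive extras.
    Roles, widget names and scenarios are illustrative only."""
    if role == "driver":
        base = ["speed", "next_maneuver"]
        if driving_scenario == "approaching_destination":
            base.append("house_number_overlay")
        return base
    # Front and rear passengers get the richer, interactive AR view.
    return ["ar_scene_view", "poi_browser", "media", "climate"]

print(content_for("driver", "approaching_destination"))
print(content_for("rear_passenger", "highway_cruise"))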

“Our current challenge is to build an augmented reality platform – one that will enable different carmakers to develop their own augmented reality applications to fit the user experience they envision. We want to enable them, once the platform is integrated into their car, to easily change the logic and the design of the application, so carmakers can quickly and easily try different types of applications and interactions and rapidly improve and update them. Our team will be responsible for the heavy lifting on the technology side – computer vision, calibration and position estimation with complex algorithms – while the developers can easily leverage those and build their applications for the car on top of them.”
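
The split Akiva describes – the platform does the computer vision and positioning while the carmaker’s app supplies only logic and design – could be sketched as a small interface like the one below. The class, method and field names are hypothetical and not part of any HARMAN API.

from abc import ABC, abstractmethod

class ARNavigationApp(ABC):
    """What a carmaker would implement: only the logic and the look of the overlay.
    The platform beneath it handles computer vision, calibration and position
    estimation, and hands the app an already-anchored view of the scene."""

    @abstractmethod
    def render(self, scene):
        """Return a list of overlay elements for the current frame."""

class MinimalTurnApp(ARNavigationApp):
    def render(self, scene):
        # 'scene' is assumed to carry pre-computed anchors, e.g. the pixel position
        # of the upcoming junction; the app only decides what to draw there.
        if scene.get("junction_pixel") is not None:
            return [{"type": "street_sign", "at": scene["junction_pixel"],
                     "text": scene.get("street_name", "")}]
        return []

# The platform side would call the app once per frame with the anchored scene data:
app = MinimalTurnApp()
print(app.render({"junction_pixel": (812, 430), "street_name": "Hauptstrasse"}))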

