
A smart car that can read brain signals

EPFL and Nissan researchers can read a driver’s brain signals and relay them to a smart vehicle, allowing it to anticipate the driver’s moves and make driving easier. Nissan recently unveiled this brain-to-vehicle (B2V) technology.

Future cars will offer both self-driving and manual modes. “We wanted to harness technology to enhance drivers’ skills without interfering with the enjoyment of being behind the wheel,” explains José del R. Millán, who holds the Defitech Foundation Chair in Brain-Machine Interface (CNBI). As part of a joint project with Nissan researchers based at the CNBI, the team managed to read the brain signals that indicate a driver is about to do something – such as accelerate, brake or change lanes – and send that information to the vehicle. With this advance notice of a few hundred milliseconds, the smart vehicle can anticipate the driver’s movements and make the ensuing maneuver easier. The car is also equipped with sensors to monitor its environment, which means it can help the driver when traffic conditions are difficult.
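As a rough illustration only – not EPFL’s or Nissan’s actual decoder – the intent-detection step could look like a standard EEG classification loop. The intent labels, features and classifier choice below are all assumptions:

```python
# Hypothetical sketch of intent decoding from short EEG windows; the labels,
# features and classifier are illustrative assumptions, not the B2V implementation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

INTENTS = ["accelerate", "brake", "change_lane", "none"]  # assumed label set

def band_power_features(window: np.ndarray) -> np.ndarray:
    """Crude log-power feature per channel from an (n_channels, n_samples) window."""
    return np.log(np.mean(window ** 2, axis=1) + 1e-12)

def train_intent_model(windows, labels):
    """Fit a simple linear classifier on labelled EEG windows."""
    X = np.stack([band_power_features(w) for w in windows])
    return LinearDiscriminantAnalysis().fit(X, labels)

def predict_intent(model, window) -> str:
    """Return the predicted upcoming action for the latest EEG window."""
    return INTENTS[int(model.predict(band_power_features(window)[None, :])[0])]
```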

An easier and more personalized driving experience

The signals produced in the driver’s frontal motor cortex are detected using a sensor-equipped EEG (electroencephalography) headset. They are then sent to the smart vehicle for processing. By combining that data with the information detected by its own sensors, the car can react to the situation at hand. “If you’re coming to a red light and getting ready to brake, the car will assist you by starting to brake 200–500 milliseconds before you do. But if you approach a red light and your brain shows no intention of slowing the car down, the car will warn you that the light is red to make sure you’ve seen it,” says Millán.
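A minimal sketch of how that fusion might work, assuming the vehicle receives a braking-intention flag from the brain-signal decoder and a red-light detection from its own sensors; the names and the 300 ms lead time are illustrative assumptions, not Nissan’s implementation:

```python
# Hypothetical decision logic combining an EEG-derived braking intention with the
# car's own perception; names and thresholds are assumptions for illustration.
from dataclasses import dataclass

ASSIST_LEAD_TIME_S = 0.3  # within the 200-500 ms range quoted in the article

@dataclass
class VehicleState:
    red_light_ahead: bool   # from the car's own sensors
    braking_intent: bool    # from the brain-signal decoder

def assist_decision(state: VehicleState) -> str:
    """Pre-brake when intent and sensing agree, warn when they disagree."""
    if state.red_light_ahead and state.braking_intent:
        return f"pre_brake(lead_time={ASSIST_LEAD_TIME_S}s)"
    if state.red_light_ahead and not state.braking_intent:
        return "warn_driver(red_light)"
    return "no_action"

# Example: driver shows no intention of slowing down at a red light.
print(assist_decision(VehicleState(red_light_ahead=True, braking_intent=False)))
```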

We all generate different patterns of brain signals, so the vehicle learns from each driver and customizes its software. It stores each driver’s regular routes, as well as their driving habits and style, and uses this information to more accurately anticipate what each driver might do at any given moment. The brain-machine interface not only makes driving easier, it also creates a more personalized experience, since the car is always in sync with the driver. Even the car’s settings can be transparently adapted to the driver’s preferences. For instance, if the driver has adopted a more relaxed driving style, the interface will detect that the selected sports mode is not appropriate and switch the car to a more comfortable setting.
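A minimal sketch of what such per-driver adaptation could look like, assuming the car keeps a simple running estimate of driving style; the profile fields, mode names and threshold are assumptions, not Nissan’s software:

```python
# Hypothetical per-driver profile and mode-selection rule; fields, threshold and
# mode names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class DriverProfile:
    regular_routes: list = field(default_factory=list)
    avg_accel: float = 0.0   # running mean of observed longitudinal acceleration
    samples: int = 0

    def update_style(self, accel: float) -> None:
        """Fold a new acceleration observation into the running style estimate."""
        self.samples += 1
        self.avg_accel += (accel - self.avg_accel) / self.samples

def choose_drive_mode(profile: DriverProfile, selected_mode: str) -> str:
    """Quietly override a sporty selection if the observed style is relaxed."""
    RELAXED_THRESHOLD = 1.5  # m/s^2, an assumed cut-off for a 'relaxed' style
    if selected_mode == "sport" and profile.avg_accel < RELAXED_THRESHOLD:
        return "comfort"
    return selected_mode
```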

Tech transfer

In 2014, after four years of research, scientists from the CNBI delivered the brain-machine interface to their industry partner Nissan. The carmaker then continued the research through a senior innovation research program with support from the CNBI. The resulting technology was integrated into a prototype, creating the interface that enables the vehicle to communicate with the driver. This prototype was unveiled at the Consumer Electronics Show in Las Vegas in January.

Since 2015, Nissan and the CNBI researchers have been working on further developing this brain-to-vehicle technology, mainly by adding an eye-tracking function. “Our eyes are always moving and observing what’s going on around us,” says Millán. “But not everything we see is relevant or important. So we are studying ways to detect brain signals that indicate that a certain object or situation has caught our attention and needs to be factored in by the vehicle.”
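One way such gaze-plus-attention filtering might be sketched, assuming the eye tracker reports what the driver looked at and an EEG-derived score estimates how strongly it caught their attention; the interfaces and threshold below are assumptions:

```python
# Hypothetical fusion of eye-tracker targets with an EEG-derived attention score;
# the data layout and threshold are assumptions for illustration.
from dataclasses import dataclass

ATTENTION_THRESHOLD = 0.7  # assumed cut-off on a 0-1 attention score

@dataclass
class GazeEvent:
    object_id: str    # what the eye tracker says the driver looked at
    attention: float  # EEG-based estimate that it genuinely caught attention

def relevant_objects(events: list[GazeEvent]) -> list[str]:
    """Keep only gazed objects whose attention score crosses the threshold."""
    return [e.object_id for e in events if e.attention >= ATTENTION_THRESHOLD]

# Example: a pedestrian that drew attention is factored in; a billboard is not.
print(relevant_objects([GazeEvent("pedestrian", 0.9), GazeEvent("billboard", 0.2)]))
```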

For several years now, Millán’s team has been building expertise in detecting brain signals and using them to control objects in our environment. They have been focusing on driver-assist technology with Nissan since 2011. In parallel, they have also been exploring other applications for their expertise, such as helping people with motor disabilities.
