
Cars that see: when your car “gets” you

  • Posted by admin on November 13, 2017

“What’s on your mind?” is the question we ask our colleagues and friends when we notice that they’re deep in thought. We humans naturally read each other’s thoughts using body language and facial expressions. That allows us to help each other by sharing common concerns and working together to solve problems. What if your car could read your facial expression? Could it help you and your fellow drivers be safer?

The IBM Ireland Innovation Exchange is working with partners, including Vicomtech-IK4 and Honda Research Institute Europe, on the VI-DAS project. With funding from the European Union through the Horizon 2020 programme, they’re exploring how computer vision, in-vehicle edge computing and vehicle-to-vehicle communications can all help improve the understanding of risky road conditions by both drivers and automated vehicles.

How your car “sees”

VI-DAS aims to bring a 720-degree view to support the handover of the driving task between a human driver and an automated driving system. The 720-degree concept combines a 360-degree computer-vision view of the world outside the car (sensing pedestrians, bicyclists, other cars and road signs, for example) with a 360-degree view of the car interior, focusing mainly on the driver and potential distractions.

Computer vision technology has reached a stage where detailed driving behavior can be extracted from images that capture your facial expressions and body language as you drive. These can be analyzed in real time using machine learning, allowing your car to understand things like where on the road you’re looking or whether you’re checking your side mirrors. It can even detect improper phone usage and then evaluate immediate risks in the driving situation. When this technology is combined with vehicle-to-vehicle communications, the system can alert you and your car when drivers around you are distracted or not paying adequate attention to driving conditions.
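As a rough illustration of the in-cabin side, the sketch below flags possible distraction when no frontal face is detected for a stretch of frames. It’s a minimal sketch only, not the VI-DAS pipeline: the camera index, the frame threshold and the use of OpenCV’s bundled Haar cascade are all assumptions, and a production system would use trained gaze and head-pose models instead.

```python
# pip install opencv-python -- a minimal sketch, not the VI-DAS pipeline
import cv2

# Haar cascade bundled with OpenCV; a real system would use a trained
# gaze/head-pose model rather than simple frontal-face detection.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)      # in-cabin camera (index assumed)
frames_without_face = 0
DISTRACTION_THRESHOLD = 30     # roughly 1 second at 30 fps (assumed)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3,
                                          minNeighbors=5)
    # No frontal face usually means the driver's head is turned away.
    frames_without_face = 0 if len(faces) else frames_without_face + 1
    if frames_without_face > DISTRACTION_THRESHOLD:
        print("ALERT: driver may not be looking at the road")

cap.release()
```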

Understanding driver behavior

Knowing a driver’s cognitive awareness in real time is a requirement for the safe transfer of control between human drivers and higher-level automated driving systems. When transferring control back to the driver, the vehicle assistant will need to verify that the driver’s hands are on the steering wheel. It must also confirm that the driver is looking at the road ahead and is aware of the immediate road risks.
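One way to picture that verification step is as a simple gate over the monitored signals. The sketch below is illustrative only; the signal names, the fused awareness score and the 0.8 threshold are assumptions, not the project’s actual handover logic.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    hands_on_wheel: bool     # e.g. from steering-wheel torque/touch sensors
    eyes_on_road: bool       # from the in-cabin gaze tracker
    awareness_score: float   # 0.0-1.0 fused estimate (assumed signal)

def can_hand_over(state: DriverState, min_awareness: float = 0.8) -> bool:
    """Return True only if every readiness condition holds.

    The 0.8 threshold is an illustrative assumption; a real system would
    calibrate it against the current road risk level.
    """
    return (state.hands_on_wheel
            and state.eyes_on_road
            and state.awareness_score >= min_awareness)

# The vehicle keeps control until the driver is demonstrably ready.
if can_hand_over(DriverState(True, True, 0.9)):
    print("Transferring control to the driver")
else:
    print("Automated system retains control; prompting the driver")
```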

Vicomtech-IK4 is applying computer vision to capture cues like the driver’s eye movements, gaze direction and emotional state. These can then be fused with vehicle driving performance to construct a comprehensive profile of a driver’s behavior. Honda Research Institute Europe is analyzing driving situations from the driver’s perspective to estimate situational risk factors and evaluate the best behavior options available to both driver and vehicle for alleviating risks in specific situations.

IBM is exploring how all the information gathered about driver behavior and risk awareness can be correlated with exterior environmental information, such as road conditions and similar data gathered from other cars on the road, to provide drivers with a detailed history of their behavior. This information can be combined and analyzed through cloud services, and the resulting analysis will help uncover insights for drivers. Ultimately, monitoring the operation of vehicles requires a scalable solution like IBM IoT for Automotive, which also offers pre-packaged services to track and score driver behavior, as well as contextual mapping that provides real-time road conditions.
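As a hedged sketch of the plumbing involved, driver-behavior events are typically pushed to such cloud services over MQTT. The broker address, client ID, topic and payload fields below are illustrative placeholders (the topic shape follows the Watson IoT device-event convention), not the actual IBM IoT for Automotive interface.

```python
# pip install paho-mqtt (1.x client API shown); broker, client ID, topic
# and payload fields are illustrative placeholders, not IBM's interface.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="d:example:car:vehicle-001")  # hypothetical ID
client.connect("broker.example.com", 1883)                   # placeholder broker

event = {
    "timestamp": time.time(),
    "gaze": "left_mirror",       # from the in-cabin gaze tracker
    "distraction_score": 0.12,   # fused behavior estimate (assumed field)
    "lat": 53.3498, "lon": -6.2603,
}
# Watson IoT-style device events use topics of the form
# iot-2/evt/<event>/fmt/json; shown here as an assumption.
client.publish("iot-2/evt/driver_behavior/fmt/json", json.dumps(event))
client.disconnect()
```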

What this means for tomorrow’s drivers

All this data can change the driving experience. For example, a system based on eye-gaze detection could make you aware of advanced driver-assistance system (ADAS) alerts, provided via various human-machine interfaces (HMIs), that you may not be fully noticing while driving. Gathering such insights from a larger population allows OEMs to improve the HMIs by understanding which drivers aren’t benefiting from them. And facial expression detection could provide important insights about specific locations that have proved difficult for a driver to maneuver, such as a particular highway merge or lane change that produces high stress levels and emotional changes. Once the system learns a driver’s routines, it can offer insights into how other, less stressed drivers have managed that same situation.
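To make the HMI insight concrete, one simple way to surface it is to tally alerts that the driver’s gaze never acknowledged. The event format and field names below are assumptions for illustration only.

```python
from collections import Counter

# Each event pairs an ADAS alert with whether the driver's gaze reached the
# HMI within some window; the field names are assumed for illustration.
events = [
    {"alert": "lane_departure", "gaze_acknowledged": False},
    {"alert": "lane_departure", "gaze_acknowledged": True},
    {"alert": "forward_collision", "gaze_acknowledged": False},
]

missed = Counter(e["alert"] for e in events if not e["gaze_acknowledged"])
for alert, count in missed.most_common():
    print(f"{alert}: missed {count} time(s) -- candidate for HMI redesign")
```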

Creating a complete driving picture

There is also an opportunity to develop a collective driving assistant that combines all individual driver safety insights, gathered from you and others, with contextual information such as weather or accidents. This data can then be correlated with a detailed road network map to identify patterns of anomalous driver behavior experienced by many drivers, or to highlight issues with the road layout itself, such as misplaced traffic signs.
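A minimal sketch of that correlation step might aggregate anomaly reports per road segment and flag outliers for review. The segment identifiers and the flag threshold below are illustrative assumptions, not real project data.

```python
from collections import defaultdict

# Anomalous-behavior reports keyed by road segment; identifiers and the
# tiny demo threshold are illustrative assumptions.
reports = [
    ("M50-junction-7", "harsh_braking"),
    ("M50-junction-7", "late_lane_change"),
    ("N11-exit-12", "harsh_braking"),
    # ... many more, aggregated across the fleet
]

by_segment = defaultdict(list)
for segment, behavior in reports:
    by_segment[segment].append(behavior)

FLAG_THRESHOLD = 2  # demo value; a fleet-scale system would use a larger one
for segment, behaviors in by_segment.items():
    if len(behaviors) >= FLAG_THRESHOLD:
        print(f"{segment}: {len(behaviors)} anomalies -> review signage/layout")
```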

This would allow individual drivers to learn how their behavior compares to that of others, meaning you could improve your cognitive awareness when similar road scenarios occur. And not just drivers: safety authorities reading anonymized samples of such driver behavior insights will learn where and why critical situations or accidents occur. They could learn whether these are caused by poorly designed roads, or even whether a new billboard distracted drivers to the point that they didn’t notice a specific traffic sign.

IBM is exploring how this collaboratively gathered information can be transferred from vehicles to the cloud using Watson IoT. There, it can be visualized to show actual driving scenarios in which all drivers’ gazes are plotted on a map. Presenting this information from multiple cars can help determine appropriate speed limits and inform road planning, and it also helps drivers share and close gaps in their own contextual awareness. (See this video clip.)

The research and activities described in this article include work that IBM Ireland and partners are carrying out as part of the VI-DAS project. More on the VI-DAS project and the activities of project partners can be found at http://vi-das.eu.

This project aligns with IBM’s broader initiative demonstrating that monitoring driver behavior and understanding drivers can lead to innovative safety solutions. Find out more at our “Cars that Care” site.

For more details on IBM IoT for Automotive, visit our IBM Marketplace site.

About the authors:

Dr. Cristian Olariu is a Research Engineer at IBM’s Innovation Exchange in Dublin, Ireland. There, his main research focus is on wireless access technologies for time-critical applications, with an emphasis on automotive scenarios. He has a proven track record in wireless networking, cellular network architectures, software-defined networks and service provisioning for time-critical applications.


cristian.olariu@ie.ibm.com

Gary Thompson is a Solution Architect and Technical Manager on IBM’s Innovation Exchange team in Dublin, Ireland. He is currently working on automotive projects that focus on V2X communications and cloud infrastructures to support ADAS development. Gary has a technical interest in data management and information modelling methods that enable better understanding, application and governance across all stakeholders in the connected car ecosystem.


gary.n.thompson@ie.ibm.com
