Empathic AI: The Next Generation of Vehicles Will Understand Your Emotions


Transportation will never be the same once our machines know how we feel.

We are entering a wholesale, global disruption of the way people move from place to place. Of all the changes in this sector, the rise of autonomous (i.e., self-driving) vehicles is likely to have the most significant impact on human society.

The transportation industry, historically dominated by a handful of large vehicle-manufacturing brands, is evolving into an ecosystem of ‘mobility services,’ underpinned by artificial intelligence (AI). A major step in the development of AI is to give it ‘empathy,’ allowing our physiological and emotional states to be observed and understood. This will mostly be achieved through wearable or remote sensors, much as fitness bands already monitor our physical state.

By feeding this sensor data into AI systems, we can train them to know how we feel and how to respond appropriately. This kind of empathy can also be enhanced by giving AI its own artificial emotions, imbuing it with simulations of feelings.
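As a rough illustration of what "training an AI on sensor data" can mean at its simplest, here is a toy nearest-centroid classifier that labels arousal from two invented wearable features (heart rate and skin conductance). The features, labels, and training data are all hypothetical; real systems use far richer signals and models.

```python
# Toy sketch: learn per-label centroids from labeled sensor readings,
# then classify a new reading by its nearest centroid.

def train_centroids(samples):
    """samples: list of ((heart_rate, skin_conductance), label).
    Returns a dict mapping each label to its mean feature vector."""
    sums = {}
    for (hr, sc), label in samples:
        total_hr, total_sc, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (total_hr + hr, total_sc + sc, n + 1)
    return {lab: (h / n, s / n) for lab, (h, s, n) in sums.items()}

def classify(centroids, hr, sc):
    """Return the label whose centroid is closest to (hr, sc)."""
    return min(centroids,
               key=lambda lab: (centroids[lab][0] - hr) ** 2 +
                               (centroids[lab][1] - sc) ** 2)
```

A production system would normalize the features (heart rate dominates the distance here) and use many more of them, but the loop is the same: labeled sensor data in, an emotional-state estimate out.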

Empathic technology will have no small effect on the mobility sector. What might an empathic vehicle look like?

Safe Travels

There is already a growing body of research from top-tier auto companies into what kind of empathic interactions will protect drivers, passengers and everyone around them from harm. To investigate this, biometric sensors, cameras, and microphones are being used to detect:

  • Fatigue & drowsiness: e.g., monitoring head or eye movements, posture or heart/breathing rate.
  • Distraction: e.g., gaze detection to ensure the driver is watching the road.
  • Intoxication: e.g., using infrared optics or analyzing voice or breath.
  • Medical incidents: e.g., detecting a potential cardiac event from a wearable heart-rate sensor.
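To make the fatigue item concrete, one common heuristic in driver-monitoring research is the eye-aspect ratio (EAR): the eye looks "flat" in camera landmarks when closed, so a sustained low EAR suggests drowsiness. The landmark layout and thresholds below are illustrative, not any manufacturer's actual system.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered as in the
    common 68-point face model. A low ratio means the eye is closed."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def is_drowsy(ear_per_frame, threshold=0.2, min_frames=48):
    """Flag drowsiness if EAR stays below threshold for min_frames
    consecutive video frames (~1.6 s at 30 fps) -- i.e., the eyes
    have been closed too long for a normal blink."""
    run = 0
    for ear in ear_per_frame:
        run = run + 1 if ear < threshold else 0
        if run >= min_frames:
            return True
    return False
```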

A Comfortable Journey

After ensuring the safety of the humans in the system, empathic tech can be employed to optimize the ride experience. There is a universe of auto suppliers you’ve probably never heard of that build the components and systems found in the well-known vehicle brands. They are leading the way to a more empathic ride, with innovations such as:

  • Environmental controls: e.g., lighting, heating, AC, sound and olfactory output, customized to suit your current mood.
  • Physical controls: seat position, engine configuration, etc.
  • Humanizing AI feedback: virtual assistants like Alexa and Siri, already invading our homes and phones, are also reaching into our vehicles. With empathic AI, we can tailor their feedback to suit our preferred style of interaction.
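At its core, the environmental-controls idea is a mapping from an inferred passenger state to cabin settings. The mood labels and setting values below are invented for illustration; a real system would learn and tune these per user.

```python
# Hypothetical mood -> cabin-settings presets.
CABIN_PRESETS = {
    "stressed": {"lighting": "soft amber", "temp_c": 21, "audio": "ambient"},
    "tired":    {"lighting": "cool blue",  "temp_c": 19, "audio": "upbeat"},
    "happy":    {"lighting": "warm white", "temp_c": 22, "audio": "favorites"},
}

def cabin_settings(mood, default_mood="happy"):
    """Return the preset for a detected mood, falling back to a
    neutral default when the mood is unrecognized."""
    return CABIN_PRESETS.get(mood, CABIN_PRESETS[default_mood])
```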

An Entertaining Ride

Now that our customer is safe and comfortable, they can benefit from AI that knows how to push the right emotional buttons at the right moment. This is particularly likely to apply to the onboard music and infotainment systems. Other subtle ways in which a vehicle could be designed to optimize the thrill of the ride include offering to increase the engine’s power output when the driver is feeling confident and happy.

The New Norms of Autonomous Society

An autonomous vehicle doesn’t exist in a bubble. Much of its intelligence is based on sensing its environment and making rapid judgments about how to act. Each vehicle will also be integrated with a global network of systems, able to share information ranging from weather forecasts to road obstructions. By connecting each vehicle to its neighbors and the wider world, we will see the emergence of a new type of ‘social’ structure with its own norms of behavior.

This AI-driven ‘society’ will involve interactions not just between the vehicles and their drivers or passengers, but also with onboard devices, nearby pedestrians, other vehicles, and their occupants, as well as surrounding infrastructure. The etiquette and rules of what the market calls ‘vehicle-to-everything’ (V2X) communications will establish themselves as we gradually let go of the wheel and hand our mobility needs over to ‘the machines.’
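To ground the V2X idea: real stacks follow standards such as SAE J2735's Basic Safety Message, in which each vehicle periodically broadcasts its position, speed, and hazards to its neighbors. The message below is a simplified, hypothetical sketch of that shape, not the standard's actual encoding.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class BasicStatusMessage:
    """Minimal V2X-style status broadcast (illustrative fields only)."""
    vehicle_id: str
    lat: float
    lon: float
    speed_mps: float
    hazard: Optional[str] = None   # e.g. "road_obstruction"
    timestamp: float = 0.0

def encode(msg: BasicStatusMessage) -> bytes:
    """Serialize a message for broadcast to nearby vehicles."""
    return json.dumps(asdict(msg)).encode("utf-8")

def decode(payload: bytes) -> BasicStatusMessage:
    """Rebuild a message from a received payload."""
    return BasicStatusMessage(**json.loads(payload.decode("utf-8")))
```

Production V2X uses compact binary encodings and signed messages rather than JSON, but the round trip is the point: every vehicle both emits and consumes this shared stream.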

This mobility ecosystem is also likely to share data and processes with the rest of the AI in our lives, such as in our smartphones and home-automation systems. If coordinated correctly, this unified data architecture would allow empathic vehicles to know us much better, behaving ever more like a trusted friend.

This is not just a technological problem; it’s a monumental user-experience challenge too. Gradually increasing the empathic capability of the system will support the evolution of the transport experience towards one that is not only safe and comfortable but also delightful.

The future of mobility is emotional.

Editor Note & Disclaimer: The author is a member of the Sensum team, which is an alumnus of our ReadWrite Labs accelerator program. 

The post Empathic AI: The Next Generation of Vehicles Will Understand Your Emotions appeared first on ReadWrite.


How IoT could let you better understand your workout data


With the increasing popularity of wearables like Fitbit trackers and the Apple Watch, people are starting to understand their fitness levels much better. By wearing an Apple Watch, for instance, we can track our average heart rate while exercising; by wearing a smart sports jacket, we can check whether our movements are correct. Frequent use of wearables results in a rich personal database.

See also: Walk this way towards better wearables security

But when it comes to analyzing data from wearables, the question is not how much data is collected but what connections can be drawn between different data sets; for example, how do the data from your smart watch and the data from your smart jacket relate? Most of the big names in the wearables industry either lack robust data-analysis platforms or are incompatible with other companies’ databases, which makes useful and accurate analysis very difficult.

Consumers now have a solution to this problem. Early this year, Lumo BodyTech, a fitness-focused smart-device company, launched its fitness-data analysis platform, the Lumo Motion Science Platform. Built on advanced algorithms, the platform not only supports the company’s own hardware but can also work with data sets collected by third-party devices. By offering APIs and an SDK, Lumo BodyTech makes it easy for others to join the platform, helping fitness enthusiasts understand and regulate their own fitness data and thus avoid unnecessary injuries during exercise.
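The platform's actual API is not documented here, but the core of any cross-device analysis is aligning samples from different wearables in time. The sketch below is a hypothetical illustration of that step, assuming each device exports sorted (timestamp, value) samples; device data shapes are invented.

```python
def align_streams(stream_a, stream_b, max_gap_s=0.5):
    """Pair each (timestamp, value) sample in stream_a with the
    closest-in-time sample in stream_b, dropping pairs further apart
    than max_gap_s. Both streams must be sorted by timestamp."""
    pairs, j = [], 0
    for t_a, v_a in stream_a:
        # Advance j while the next b sample is at least as close to t_a.
        while (j + 1 < len(stream_b) and
               abs(stream_b[j + 1][0] - t_a) <= abs(stream_b[j][0] - t_a)):
            j += 1
        t_b, v_b = stream_b[j]
        if abs(t_b - t_a) <= max_gap_s:
            pairs.append((t_a, v_a, v_b))
    return pairs
```

Once two devices' streams are aligned like this, correlations between them (posture vs. heart rate, say) become straightforward to compute.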

Fitness data needs a better platform

According to a survey by Gartner, Inc., the abandonment rate of wearables is 30 percent. Immediately after buying a new wearable device, we are excited to collect fitness data so we can learn to exercise better, but that data is essentially useless without a platform to analyze it.

The Lumo Motion Science Platform is that platform. According to the company’s official website, the platform offers access to unique algorithm models that can track a myriad of human movements and provide a wide spectrum of biomechanical insights relevant to various industrial applications and products. The platform also offers a learning engine, account management, and big-data management services.

See also: Can we use wearables to predict the future?

Using the platform’s algorithms, we can combine data sets to produce more complete and accurate results and advice. For example, by comparing data from Lumo Back, a posture sensor that gently vibrates when you slouch from your lower back, with data from Lumo Lift, a tracker that monitors posture, steps, and calories burned, we can learn the best lower-back posture for heavy lifting.

Lumo BodyTech is also focusing on growing their hardware sector. Besides Lumo Back and Lumo Lift, Lumo BodyTech has also developed the wearable Lumo Run, which can help collect running data more easily. All three products are extremely light; customers barely feel them while wearing them, reflecting Lumo BodyTech CEO Monish Perkash’s belief that wearables will eventually become “invisible and seamlessly embedded in our everyday lives.”

By developing hardware and software in tandem, Lumo BodyTech believes it can greatly improve consumers’ overall experience with wearables.

How IoT and wearables will cooperate

The connection between IoT and wearable devices is much like the connection between VR/AR and IoT.

Be My Eyes, a free mobile app, is designed to bring sight to blind and visually impaired people: volunteers use the app’s live video connection to act as “eyes” for users. AR technology is embedded in the app so volunteers can better understand the conditions around blind and visually impaired users; in this app, AR and IoT are combined for a humanizing experience.

Even though IoT’s influence on the wearable market is not yet obvious, and the Lumo Motion Science Platform is still in an early stage of development, a growing number of products joining the platform would mean more useful data. That, in turn, could inspire more startups and tech giants to share and analyze data from a wider range of sources, giving us the opportunity to enjoy more user-friendly and useful wearables.

The post How IoT could let you better understand your workout data appeared first on ReadWrite.


Tobii Pro combines eye tracking with VR to understand human behaviour


Stockholm-based Tobii Pro is a world leader in eye-tracking technology, with its products and services used by businesses and academic institutions around the world. Now, it is combining eye tracking solutions with virtual reality. 

Eye-tracking technology is a widespread method employed by organizations and institutions keen to understand human behavior better. The movement of the eyes offers information about much more than what we are looking at. Eye tracking is also a doorway into what draws our attention and for how long it keeps it. It’s a simple, objective way to observe the conscious and unconscious mind at work.

There are plenty of parties interested in applying eye-tracking technology, from advertisers conducting market research to psychologists observing phobias.

In this regard, Tobii Pro has notched up a real track record. It currently provides eye-tracking research products and services to every one of the world’s top 50 universities, four of the top five global market research organizations and 18 of the world’s top 20 advertising spenders.

Read more: Competition – Charities challenged to take advantage of AR & VR technologies

Eye tracking meets VR

Tobii Pro has now announced new research solutions that combine eye tracking with virtual reality (VR). This will allow the company’s partners to conduct eye-tracking research within virtual environments, supporting potentially endless new experiments.

The new eye-tracking solution has been embedded into HTC’s Vive headset and comes with Tobii Pro’s software development kit. Researchers will now be able to conduct experiments in virtual environments that would otherwise be too costly, dangerous or difficult to create in real life.

Modernizing the toolkit

Tobii Pro’s new VR eye-tracking solution promises to open doors for researchers of human behavior. Most notably, scientists eager to better understand anxieties, phobias and disorders such as PTSD (post-traumatic stress disorder) can now carefully control stimuli, regulate scenarios and study without putting participants at risk.

This is because VR can duplicate the real world while allowing stricter control of variables than behavioural studies usually support.

The technology is also useful for testing professionals in disciplines where on-the-job training might put lives at risk. Tobii Pro highlights surgeons and crane operators as examples in which the need to ensure professional skills are constantly assessed and sharpened cannot be met in the real world.

Recreating these high-risk environments virtually and applying eye-tracking technology will provide objective insights into situational awareness and form an ideal training tool.

“Combining eye tracking with VR is growing as a research methodology and our customers have started to demand this technology to be part of their toolkit for behavioral studies,” said Tobii Pro president Tom Englund.

“The Tobii Pro VR Integration is our first step in making eye tracking in immersive VR a reliable and effective research tool for a range of fields. It marks our first major expansion of VR-based research tools.”

Read more: Lloyds is banking on Virtual Reality to attract top grads

The combination of eye tracking and VR could help researchers tackle phobias.

Retrofitting HTC hardware

Tobii Pro’s new VR solution is a retrofit of the HTC Vive business edition headset. It can track all eye types and collects binocular eye-tracking data at 120 Hz.
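A binocular tracker at 120 Hz delivers a left-eye and right-eye gaze sample roughly every 8.3 ms, and either eye can momentarily fail to track (blinks, occlusion). A typical first processing step is fusing the two eyes into one gaze point; the sketch below is a hypothetical illustration with invented sample formats, not Tobii Pro's SDK.

```python
def combined_gaze(left, right):
    """Average the valid left/right gaze points (normalized 0-1
    screen coordinates). Falls back to whichever eye is valid;
    returns None if neither eye tracked this sample."""
    valid = [p for p in (left, right) if p is not None]
    if not valid:
        return None
    x = sum(p[0] for p in valid) / len(valid)
    y = sum(p[1] for p in valid) / len(valid)
    return (x, y)
```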

The headset can be used in conjunction with handheld controllers. It’s been designed not to compromise the user experience or the output of eye tracking data.

The post Tobii Pro combines eye tracking with VR to understand human behaviour appeared first on Internet of Business.
