
RESEARCH + UX/UI DESIGN (Distinction-Awarded Academic Dissertation)

Emotion Detection in the Automotive Environment for Smart In-Car Systems

This is an academic research and design project; all work presented was carried out solely by me.

This project explored the integration of emotion recognition technologies into the automotive environment, focusing on enhancing both driver safety and comfort. The system aims to monitor and interpret the driver’s emotional states—such as stress, frustration, and fatigue—in real time, using advanced algorithms that detect physical cues, including facial expressions, body posture, and gestures. By providing timely and adaptive feedback, the system seeks to mitigate the negative impact of these emotions on driving behaviour, thereby reducing the likelihood of accidents and improving the overall driving experience.

The Problem

Driving is a complex and often emotionally charged activity. Emotions like stress, anger, and fatigue can significantly impair a driver’s judgement, decision-making, and reaction times, increasing the risk of accidents. While many existing in-car systems focus on detecting physical distractions or cognitive lapses, they often neglect the emotional dimension of driving. Current solutions also struggle with the accuracy and real-time processing required to respond effectively to emotional cues. Moreover, cultural differences in emotional expression pose an additional challenge for these systems to accurately detect and respond to emotions across diverse driver populations.

Proposed Solution

The solution developed in this project is a comprehensive emotion recognition system designed specifically for the automotive environment. This system monitors the driver’s emotional state in real time by analysing non-verbal cues, including gestures, facial expressions, and body movements. When an emotional state is detected—such as stress, anger, or fatigue—the system provides adaptive interventions, such as calming suggestions, break reminders, or environmental adjustments (e.g., altering the music or climate control). The overall objective is to ensure that drivers remain emotionally balanced, reducing stress and improving decision-making during critical moments on the road.
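The intervention logic described above can be sketched as a simple mapping from detected emotional states to adaptive responses. This is an illustrative sketch only: the emotion labels, confidence threshold, and intervention names are assumptions for demonstration, not the project's actual implementation.

```python
# Hypothetical mapping from detected emotional states to adaptive
# interventions. All labels and values here are illustrative assumptions.
INTERVENTIONS = {
    "stress": ["play calming music", "suggest breathing exercise"],
    "fatigue": ["remind driver to take a break", "lower cabin temperature"],
    "anger": ["soften dashboard lighting", "suggest a calmer route"],
}

def select_interventions(emotion: str, confidence: float, threshold: float = 0.7):
    """Return adaptive interventions only when detection confidence is high,
    so low-confidence readings never trigger distracting feedback."""
    if confidence < threshold:
        return []
    return INTERVENTIONS.get(emotion, [])

print(select_interventions("stress", 0.85))
# prints ['play calming music', 'suggest breathing exercise']
```

Gating on a confidence threshold reflects the non-intrusive design goal: an uncertain detection produces no feedback rather than a potentially distracting false alarm.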

Research Requirements
  1. Secondary research (literature review and market analysis)

  2. Primary research with a diverse group of drivers to understand the emotional states experienced during driving and gauge user receptiveness to emotion-monitoring systems

  3. Iterative research with prototypes

Successful Output

A high-fidelity prototype, tested with 8 participants, outlines the solution and demonstrates 2-3 key user flows related to emotion detection and adaptive feedback systems.

Considerations
  • Data Privacy and User Consent - Emotion detection systems inherently rely on sensitive biometric data. A major consideration was ensuring that the collection, storage, and usage of such data adhered to strict privacy guidelines. 

  • Transparency and Control - Giving users control over the system's behaviour was critical to building trust. Users could adjust sensitivity settings and choose the types of feedback they preferred, ensuring that the system did not feel invasive or overly controlling.

  • Non-Intrusive Interventions - The system needed to balance safety and usability, ensuring that interventions did not become distracting or overwhelming for the driver. Subtle alerts were designed to minimise cognitive load and prevent the system from causing additional stress, especially in high-risk driving scenarios.

  • Intuitive Interface Design - The system had to ensure that drivers could quickly and easily interpret and respond to alerts or suggestions without taking their focus off the road. To this end, the interface was kept clean and simple, with minimal text and prominent, clear visuals.

  • Accessibility - The design followed WCAG standards to ensure that users with different abilities, including those with visual or cognitive impairments, could use the system. This included colour contrast tests and font size adjustments.

  • Cultural Sensitivity in Emotion Recognition - Emotions are not universally expressed the same way across cultures. The system had to be adaptive enough to account for cultural differences in emotional expression and body language to avoid misinterpreting emotions.

  • Scalability - The system was designed with scalability in mind, ensuring that it could be adapted for different types of vehicles, including commercial fleets and personal cars. Future updates could incorporate more advanced machine learning models to improve accuracy and introduce features like AI-driven predictive safety interventions.

Design Process

The design followed a structured, iterative approach, utilising Lean UX methodology and incorporating research-driven insights. The process was divided into multiple phases, each focused on addressing key design challenges and refining the system through continuous feedback loops. 

Research and Discovery (Think Phase)

The project began with a comprehensive research phase aimed at understanding the context of emotion detection in driving environments. This phase involved:

Literature Review

The project's initial phase involved conducting an extensive literature review, which laid the foundation for understanding the key theoretical frameworks, technologies, and challenges associated with emotion recognition in automotive environments.

Basic Emotion Theory (BET):

This framework, pioneered by Paul Ekman, was central to the project. It categorises universal facial expressions that correspond to core emotions like happiness, sadness, fear, and anger. This theory provided a solid foundation for detecting these emotions in a driving context through facial recognition technology.

Non-Verbal Communications:

The literature showed that emotions are expressed not only through facial expressions but also through body language and gestures. Studies highlighted the importance of interpreting hand gestures, posture, and even eye movements as key indicators of emotional states. These insights directly informed the choice of technologies for the project.

Real-Time Emotion Recognition in Automotive Systems:

A gap was identified in existing research regarding the real-time applicability of emotion recognition technologies in vehicles. While some systems could detect basic emotions like fatigue or drowsiness, few addressed the complexity of emotions like stress or frustration in real-world driving scenarios. This informed the goal of creating a more comprehensive and adaptive emotion recognition system.

User Research
Survey

A survey was conducted with 44 participants, ranging from ages 25 to 64, to understand common emotional experiences while driving. This research yielded several key findings:

  • Common Emotions - Frustration, boredom, and fatigue were the most frequently experienced emotions, especially during long drives or in stressful traffic situations. These emotions were closely linked to unsafe driving behaviours, such as aggressive driving or reduced focus.

  • Privacy Concerns - Participants expressed concerns about the potential privacy implications of a system that continuously monitors their emotional state. This highlighted the need for transparent data handling and robust privacy features in the system design.

  • Interest in Adaptive Features - Many participants expressed interest in features that could adjust the in-car environment based on their emotions, such as mood-based music, calming interventions, or break reminders. This was a crucial finding that shaped the system’s adaptive feedback mechanism.

Market Analysis

The market analysis identified direct competitors; however, these systems were primarily focused on detecting fatigue or distraction, with limited scope for recognising complex emotions like stress or anger.

Industry Gaps

The research revealed that most existing systems were focused on safety but overlooked the driver’s emotional comfort and engagement. This gap in the market underscored the importance of creating a system that not only enhances safety but also improves the overall driving experience by reducing stress and fatigue.

Experimental Design

The experiment involved 11 participants who were asked to respond to specific driving scenarios designed to provoke emotional responses. During the experiment, their facial expressions, gestures, and body posture were monitored. 
The collected data was then analysed to explore the correlations between particular emotional states and their physical expressions. 


Tools Used 

  • PoseNet: Monitors body posture and gestures

  • Face API: Identifies facial expressions and emotional cues

  • Holistic API: Offers integrated tracking of both body movements and facial expressions

  • Microsoft Teams: Utilised for recording sessions and conducting experiments

  • DVSA Theory Test App: Supplied hazard perception clips for the study 


Key Value of the Experiment

  • Emotion Mapping: The experiment made it possible to map physical cues (e.g., head movements, hand gestures, facial expressions) to emotional states like stress, fatigue, and frustration.

  • Real-Time Detection: It validated the real-time detection of emotions through combined body posture and facial expression tracking, providing a holistic understanding of the participant’s emotional state.
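The cue-to-emotion mapping explored in the experiment can be sketched as weighted evidence accumulation across facial and postural signals. The feature names and weights below are hypothetical assumptions chosen for illustration; the study's actual mapping was derived from the experimental data.

```python
# Hypothetical weights linking observed physical cues to emotional states.
# These values are illustrative assumptions, not experimental findings.
CUE_WEIGHTS = {
    "furrowed_brow": {"stress": 0.6, "frustration": 0.4},
    "slumped_posture": {"fatigue": 0.8},
    "gripping_wheel": {"stress": 0.5, "frustration": 0.5},
    "slow_blinks": {"fatigue": 0.7},
}

def estimate_emotion(observed_cues):
    """Sum weighted evidence for each emotion across all observed cues
    and return the strongest candidate, or 'neutral' if no cues match."""
    scores = {}
    for cue in observed_cues:
        for emotion, weight in CUE_WEIGHTS.get(cue, {}).items():
            scores[emotion] = scores.get(emotion, 0.0) + weight
    if not scores:
        return "neutral"
    return max(scores, key=scores.get)

print(estimate_emotion(["slumped_posture", "slow_blinks"]))  # prints fatigue
```

Combining evidence from both posture and facial cues mirrors the holistic tracking approach: a single cue alone may be ambiguous, but agreement across channels strengthens the estimate.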


Participant Interviews

After the experiment, in-depth interviews were conducted with participants to gather qualitative feedback on their experiences. Key insights included:

  • Emotional Triggers: Most participants identified stress as the dominant emotion during difficult driving scenarios, particularly when navigating heavy traffic or unexpected hazards.

  • Preferences for Interventions: Participants expressed interest in adaptive features such as music adjustments or break reminders, which could help them manage stress in real-time without being intrusive.

  • Concerns about Privacy: Participants were concerned about how their emotional data would be stored and used, highlighting the need for transparent data governance and user control over data access.

Focus Group

A follow-up focus group involving 3 participants explored deeper emotional challenges in driving. Participants suggested several additional features, including:

  • Real-Time Stress Management: Participants expressed a strong desire for real-time feedback that could help reduce stress while driving.

  • Customisable Features: Drivers wanted the ability to customise the system’s sensitivity to emotional triggers and control how often feedback was provided.

Expert Interview with General Motors

An expert interview provided industry-level insights and emphasised the potential for emotion detection systems to be integrated into autonomous vehicle platforms, improving safety through enhanced driver emotional awareness.

Personas

Based on the insights gained from research and experimentation, two personas were developed to guide the design:


Natalie O’Brien

  • A busy working mother who experiences high levels of stress while driving. The design for Natalie focused on stress-reducing features like calming music, route adjustments, and visual cues to de-stress her driving experience.


James Booth

  • A fatigued delivery driver who spends long hours on the road. The system for James prioritised break reminders, energising playlists, and relaxation techniques to help him stay alert.

How Might We (HMW) Questions

The ideation process was driven by a series of How Might We questions designed to tackle the key challenges identified during the research phase:

  • HMW detect driver stress early enough to prevent unsafe behaviours?

  • HMW provide adaptive interventions without further distracting the driver?

  • HMW create a responsive in-car environment that adjusts to the driver’s emotional state in real time?

These questions led to brainstorming sessions where several potential solutions were proposed, ranging from simple calming prompts to more sophisticated systems that adjust the in-car environment (e.g., lighting, music, temperature) based on detected emotional states.


Several creative solutions emerged from the ideation sessions, which were then refined using the MoSCoW method to prioritise features:

  • Must-Have: Real-time emotion detection, adaptive feedback mechanisms, and safety-focused interventions.

  • Should-Have: Voice command integration to allow drivers to interact with the system without taking their hands off the wheel.

  • Could-Have: Smartphone integration and long-term emotional analytics to help drivers track their emotional states over time.


Concept Development (Make Phase)

Building on the insights gained, the Make Phase focused on translating the findings into practical solutions. The next step was to conceptualise and prototype a system that could demonstrate the detection of emotional states and provide adaptive feedback in a way that enhances the driving experience without causing distractions.

Prototype

Low-Fidelity Prototypes (Lo-Fi)

The Lo-Fi prototypes were created using pen-and-paper sketches, which allowed ideas to be visualised quickly and different design directions to be explored.
These prototypes were used to test basic concepts and ensure the system's flow and functionality aligned with the intended user experience before moving to more refined versions.


Mid-Fi Wireframes

The mid-fidelity wireframes included more detailed layouts and interactions, focusing on ease of navigation and minimalism. The user interface was designed with simplicity in mind to ensure that drivers could quickly understand their emotional state and respond to feedback without being distracted.


Styleguide 

The Moodboard and Style Guide played a central role in establishing the aesthetic and functional design of the system. The primary objective was to create a visually cohesive, user-friendly interface that would be intuitive and reduce distractions.
 

The system was designed to evoke a sense of calm, clarity, and focus, aligning with the system’s goal of helping drivers manage their emotions effectively. The following themes and visual inspirations were considered:

  1. Calming Visuals: Inspired by natural elements, such as serene landscapes and smooth textures, the design emphasised visuals that would create a soothing experience. Calm blues, deep greens, and soft greys were chosen to minimise stress and anxiety in the driver.

  2. Automotive Design Elements: Drawing inspiration from modern vehicle dashboards, the design adopted sleek, minimalistic lines that focused on functionality and accessibility. 

  3. Dark Mode: Automotive environments are often dimly lit, especially during night driving. Therefore, the design leaned heavily on a dark mode interface to reduce glare and eye strain. Dark backgrounds with high-contrast elements (text, icons) were favoured, ensuring legibility without overwhelming the driver.

  4. Technology-Driven Aesthetics: To align with the advanced technology behind emotion detection, the system incorporated elements of modern UI trends, such as flat icons, fluid animations, and clear typography. The goal was to make the interface feel innovative but not distracting.

  5. Accessibility-Centric Design: Ensuring that the design was usable for all drivers, including those with visual impairments, was a key consideration. High contrast ratios and large, clear fonts were featured throughout the design to meet WCAG 2.1 accessibility standards.
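The contrast checks behind the accessibility-centric design can be sketched using the WCAG 2.1 definitions of relative luminance and contrast ratio. The formula follows the published WCAG 2.1 specification; the colour values are illustrative, not the project's actual palette.

```python
def relative_luminance(rgb):
    """Relative luminance per WCAG 2.1, from an sRGB colour (0-255 channels)."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours; WCAG AA requires at least
    4.5:1 for normal-size text."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Example: white text on a near-black dark-mode dashboard background
print(round(contrast_ratio((255, 255, 255), (18, 18, 18)), 2))
```

A dark-mode palette like the one described above comfortably exceeds the 4.5:1 AA threshold, which is why high-contrast text on dark backgrounds suits dimly lit automotive environments.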


Interaction Design

  • Feedback Animations: Smooth transitions and subtle animations were included to enhance the user experience. For example, when stress levels were detected, the dashboard smoothly transitioned to display the relevant alert, followed by suggestions for calming actions.

  • Minimalist Approach: Interaction points were kept to a minimum to avoid overwhelming the driver. Rather than multiple menus, the design favoured simple swipes and taps, ensuring that drivers could interact with the system without diverting their focus from the road.

High-Fidelity Prototypes (Hi-Fi)

Hi-Fi prototypes were created in Figma, enabling fully interactive mockups that could simulate real-world usage scenarios.
These prototypes included the full colour palette, typography, and icons, which allowed the look and feel of the system to be tested while ensuring it adhered to design standards.
Key features of the Hi-Fi prototypes:

  • Emotion Detection Interface: An emotion-tracking dashboard allowing users to view their emotional state in real time. The interface displayed emotions such as stress, fatigue, and calmness using easy-to-read visuals.

  • Real-Time Interventions: Demonstration of the system’s ability to provide real-time feedback (e.g., calming prompts, adjusting music), allowing users to experience how the system responds to various emotional states during simulated driving sessions.

  • Customisation Settings: Users could interact with the customisation options, adjusting the system's sensitivity and their preferred feedback types.


Prototype Link

User Testing

After developing the final high-fidelity prototype, user testing was conducted with 8 participants, each representing the target demographic. The goal was to evaluate the system's usability and effectiveness in detecting emotions during driving scenarios and providing real-time feedback.
Testing Process:

  1. Scenario-based testing: Participants interacted with the system in a driving simulation environment where various emotional states were triggered, such as stress, frustration, and fatigue.

  2. Tasks: Users were asked to navigate through a series of driving situations, with the system detecting emotions through facial recognition and gestures. The system provided real-time feedback to help them manage their emotional state.

Key Insights:

  • High accuracy in emotion detection: 85% of participants reported that the system accurately identified their emotional states, with particular success in detecting stress and fatigue.

  • Positive feedback on intervention: Participants appreciated the system’s real-time interventions, such as calming prompts and environmental adjustments (e.g., music changes).

  • Ease of use: 90% of participants found the system easy to navigate, especially when adjusting settings like intervention frequency and feedback types.

  • Room for improvement: Some participants suggested more personalisation options, such as additional control over when and how interventions are triggered.

Iteration & Final Adjustments

Based on user feedback, several key adjustments were made:

  1. Enhanced Personalisation: Added more customisation options for users to control how and when interventions are triggered, ensuring the system is tailored to individual preferences.

  2. Refined UI: The dashboard was refined for clarity, making it easier for users to track emotional states and adjust settings on the go.

Results & Success Metrics

  • Task Completion Rate: 93% of participants successfully completed all tasks without needing additional guidance.

  • System Usability Scale (SUS): The system received a SUS score of 84, indicating a high level of user satisfaction.

  • User Feedback: Participants highlighted that the system helped them feel more in control of their emotional state, which positively impacted their driving experience.
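For context on the SUS score of 84 reported above, the standard System Usability Scale scoring formula can be sketched as follows. The sample responses in the example are illustrative, not the study's actual data.

```python
def sus_score(responses):
    """Compute a SUS score from ten 1-5 Likert responses.

    Per the standard SUS formula: odd-numbered items contribute
    (response - 1), even-numbered items contribute (5 - response);
    the sum is scaled by 2.5 to give a score from 0 to 100.
    """
    assert len(responses) == 10, "SUS uses exactly ten items"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: a fairly positive (hypothetical) respondent
print(sus_score([5, 1, 4, 2, 5, 1, 4, 2, 5, 1]))  # prints 90.0
```

A score of 84 sits well above the commonly cited average of 68, consistent with the high satisfaction reported by participants.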


Final Deliverables
