This repository has been archived by the owner on May 3, 2024. It is now read-only.


Emotional Navigation

University project for the "Industrial Applications" course (MSc Computer Engineering @ University of Pisa).

Overview

This project proposes an innovation in the field of in-car driving assistants such as Google Maps or Waze. The idea is to introduce human emotions as an input to path generation: a path from a starting point to an end point is evaluated not only by traditional characteristics, such as length in km and estimated travel time, but also by the emotions that path might arouse in the user. To this end, an emotional history was designed to collect, for each user, the emotions felt while driving. These emotions are then used to compute an emotional path score, that is, a measure of how much the path might be enjoyed.
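As an illustration of this idea, here is a minimal Python sketch of an emotional history and a derived path score. The names (`EmotionalHistory`, `path_score`) and the numeric emotion mapping are hypothetical, not the repository's actual API.

```python
from collections import defaultdict

# Hypothetical numeric mapping for the three emotion classes used in the tests.
EMOTION_VALUES = {"negative": -1.0, "neutral": 0.0, "positive": 1.0}

class EmotionalHistory:
    """Stores (emotion, road, timestamp) tuples for one user, grouped by road."""

    def __init__(self):
        self.records = defaultdict(list)  # road -> list of (timestamp, value)

    def add(self, emotion, road, timestamp):
        self.records[road].append((timestamp, EMOTION_VALUES[emotion]))

    def road_score(self, road):
        # Average emotion value for one stretch of road; 0.0 if no data.
        values = [v for _, v in self.records[road]]
        return sum(values) / len(values) if values else 0.0

def path_score(history, roads):
    """Average per-road emotional score along a path (list of road IDs)."""
    return sum(history.road_score(r) for r in roads) / len(roads)
```

A path that crosses mostly roads with positive recorded emotions gets a score close to 1, while unknown roads are treated as neutral.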

The requirements and functionalities of the application are as follows.

  • Must be able to geolocate the user on the road and provide directions when a destination has been set
  • The route is automatically recalculated whenever the user deviates from the established route
  • The client module is activated after a face is detected by a face detection service
  • The client module identifies users through a face recognition service; if the user is unknown, a new user profile is created
  • Each user has a profile consisting of a username, a face image for authentication, and an emotional history
  • The system periodically detects the user's emotions and associates them with the stretch of road being traveled; these tuples (emotion, road, timestamp) make up the user's emotional history
  • Whenever a path has to be constructed from the current location to a destination, it is evaluated against the emotional history; if the route is rated negatively (because it includes roads the user dislikes), a better-rated path that does not add too much delay is chosen instead
  • The client module must interact with the user vocally
  • The system starts speech recognition when the user presses a dedicated button
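The route-selection rule described above (prefer a better-rated path as long as it does not add too much delay) could be sketched as follows. The function name `choose_route`, the candidate dictionary format, and the delay threshold are illustrative assumptions, not the project's actual implementation.

```python
def choose_route(candidates, score_fn, max_extra_minutes=5.0):
    """Among candidate routes, pick the one with the best emotional score
    whose extra travel time over the fastest route stays within the budget.

    candidates: list of dicts like {"name": str, "minutes": float, "roads": [str]}
    score_fn:   callable mapping a list of road IDs to an emotional score
    """
    fastest = min(r["minutes"] for r in candidates)
    acceptable = [r for r in candidates
                  if r["minutes"] - fastest <= max_extra_minutes]
    return max(acceptable, key=lambda r: score_fn(r["roads"]))
```

With a 5-minute budget, a scenic route that takes 3 minutes longer but scores higher emotionally would win over the fastest route, while a much slower route would be excluded outright.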

Related Works

The following open-source projects were used to build the prototype:

Architecture Overview

The GUI

Prototype Description

Dell Latitude 5480 (Server)

  • OS: Windows 10
  • CPU: Intel Core i5-6200U (2 cores, 4 threads) 2.80 GHz
  • RAM: 16GB DDR4
  • GPU: Intel HD Graphics 520

Raspberry Pi 3 Model B+ (Client)

  • OS: Raspberry Pi OS
  • CPU: quad-core ARM Cortex-A53 64-bit 1.4GHz
  • RAM: 1GB LPDDR2
  • GPU: VideoCore IV

Devices Connected to Client

  • WEMISS CM-A1 Camera

  • SONY SRS-XB13 Speaker

  • GT-U7 GPS Module

The Control Device

This device was used to physically interact with the system and was made from the following components:

  • Arduino Nano RP2040 microcontroller
  • button
  • blue led, green led
  • 2 resistors (220 Ohm)

The operation of this device is very simple: the blue LED is turned on if the microcontroller receives the string 'ON' over serial, and turned off if it receives 'OFF'. When the button is pressed, the green LED is turned on and the microcontroller sends the string 'PRESSED' over serial. The client uses this system to turn on the blue LED while the microphone is listening, and to start the speech recognition procedure when it receives the button signal (the same result can be obtained with the RIGHT SHIFT key on the keyboard).
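The message exchange above can be modeled in a few lines of Python. This stub only mirrors the described protocol ('ON', 'OFF', 'PRESSED'); the real client would talk to the Arduino Nano RP2040 over an actual serial port instead.

```python
class ControlDevice:
    """Stand-in for the Arduino control device's serial protocol."""

    def __init__(self):
        self.blue_led = False   # lit while the microphone is listening
        self.green_led = False  # lit after a button press

    def receive(self, message):
        # Microcontroller side: 'ON'/'OFF' toggle the blue LED.
        if message == "ON":
            self.blue_led = True
        elif message == "OFF":
            self.blue_led = False

    def press_button(self):
        # Button press lights the green LED and emits 'PRESSED' over serial.
        self.green_led = True
        return "PRESSED"
```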

Prototype Road Test Result

The map above shows the road path followed during the test. The indicators on the path have the following meanings:

  • red circle: collected negative emotion
  • yellow circle: collected neutral emotion
  • green circle: collected positive emotion
  • violet inverted triangle: path recalculation, caused either by an error in mapping the GPS coordinates onto the path or by a wrong turn by the driver

A screenshot taken during the test

Getting Started

Install all libraries

pip install -r requirements.txt

Initialize the server

cd build
make
python3 initialization.py

Run the Server

python3 -m Server.server

Run the Client

python3 -m Client.client

Note:

  • Modify the config.json file in Client/Resource to prepare your client
  • Install ffmpeg
  • Install flac on Linux
  • Use 'pip install -r requirements.txt --no-cache-dir' if the process crashes during the installation

Project Architecture

Emotional Navigation
├── arduino_module.ino
├── build
├── requirements.txt
├── Client
│   ├── Dashboard
│   ├── InOutModules
│   ├── Monitor
│   ├── Resources
│   ├── client.py
│   ├── communication_manager.py
│   └── state_manager.py
└── Server
    ├── Core
    ├── Persistence
    ├── listener.py
    └── server.py

Author

Gianluca Gemini ([email protected])
