Hand and Object Tracking using the Intel RealSense Camera

Abstract

Blind and visually impaired people, especially from a young age, have difficulty perceiving and conceptualizing their surrounding environment due to the lack of visual feedback. Research in this field suggests that unsighted individuals tend to compensate for the lack of vision through their other senses, such as hearing, touch, taste and smell. Recently, a study conducted by Wilson and Brewster, which experimented with dynamic feedback sounds emitted by objects in peri-personal space, showed promising results in improving the accuracy of reaching.

However, the system used to perform the experiment was technologically complex, unreliable and difficult to extend. The purpose of this report is to present an alternative system based on the Intel RealSense F200 camera, which improves on both the hardware and the software and acts as a solid base for further additions. The project followed a hybrid methodology, and evaluation has shown that the system is ready to be used in practical experiments. Finally, the system meets the software engineering quality attributes expected of a well-built product.

Aims

The aims of this project are to design and implement a system capable of supporting further research, gathering data more efficiently and providing a more seamless experience to participants. Another major motivation for the project was the replacement of the Kinect with the Intel RealSense F200 camera, making the experimental setup technologically simpler and more compact. The differences between the two cameras are discussed in section 3.5. Note that the F200 camera has since been superseded by the SR300 model; a comparison between the cameras can be found on the Intel website.

Conclusions

The purpose of this project was to improve the software of a system that assists researchers in studying how dynamic sounds might improve the accuracy of reaching in blind and visually impaired people. The new system replaced the Kinect sensor with the Intel RealSense F200 camera, which made the setup more compact, and the software was reworked to fit the hardware change and to comply with software engineering practices so that further additions could be implemented if required. The application tracks the movement of the hand and updates the audio feedback coming from one of several speakers placed in front of the participant.
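To make that loop concrete, the following minimal sketch shows one plausible shape of the tracking-and-feedback cycle in C# (the language typically used for RealSense F200 applications). The `IHandTracker` and `ISpeaker` interfaces and the distance-to-volume mapping are illustrative assumptions, not the repository's actual API.

```csharp
// Minimal sketch of the tracking/feedback loop (hypothetical types, not the repo's actual API).
using System;
using System.Collections.Generic;
using System.Numerics;
using System.Threading;

public interface IHandTracker
{
    // Returns the current palm position in world coordinates, or null if no hand is visible.
    Vector3? QueryHandPosition();
}

public interface ISpeaker
{
    Vector3 Position { get; }
    void SetVolume(float volume); // 0.0 (silent) .. 1.0 (full)
}

public sealed class FeedbackLoop
{
    private readonly IHandTracker tracker;
    private readonly IReadOnlyList<ISpeaker> speakers;

    public FeedbackLoop(IHandTracker tracker, IReadOnlyList<ISpeaker> speakers)
    {
        this.tracker = tracker;
        this.speakers = speakers;
    }

    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            Vector3? hand = tracker.QueryHandPosition();
            if (hand.HasValue)
            {
                // Drive each speaker's loudness by the hand's proximity to it,
                // so the feedback grows as the participant reaches the target.
                foreach (ISpeaker speaker in speakers)
                {
                    float distance = Vector3.Distance(hand.Value, speaker.Position);
                    float volume = Math.Clamp(1.0f - distance / 0.5f, 0.0f, 1.0f); // 0.5 m falloff, arbitrary
                    speaker.SetVolume(volume);
                }
            }
            Thread.Sleep(33); // roughly one update per camera frame (~30 fps)
        }
    }
}
```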

In contrast with the previous system, the application exposes a user interface where experiments can be configured and controlled seamlessly. Evaluation of the system design has shown that the system can be extended to support more audio designs, add new tracking features, present more settings at the UI level and implement a new experiment with a different workflow, if required.
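As an illustration of how a new audio design might be plugged in, the sketch below assumes a small `IAudioDesign` interface; both the interface and the `PitchFeedbackDesign` example are hypothetical and do not reflect the repository's actual class names.

```csharp
using System;

// Illustrative extension point for audio designs (hypothetical interface, not the repo's actual code).
public interface IAudioDesign
{
    void Start();                        // begin playback of the feedback sound
    void Stop();                         // stop playback
    void Update(float distanceToTarget); // called each frame with the hand-to-target distance (metres)
}

// Adding a new design only requires implementing the interface, e.g. pitch-based feedback:
public sealed class PitchFeedbackDesign : IAudioDesign
{
    public float CurrentPitchHz { get; private set; } = 220f;

    public void Start() { /* start a continuous tone at CurrentPitchHz */ }
    public void Stop()  { /* stop the tone */ }

    public void Update(float distanceToTarget)
    {
        // Closer hand -> higher pitch, within an arbitrary 220-880 Hz range.
        float proximity = 1f - Math.Clamp(distanceToTarget / 0.5f, 0f, 1f);
        CurrentPitchHz = 220f + proximity * 660f;
        // A real implementation would push CurrentPitchHz to the audio engine here.
    }
}
```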

About

The project is a research application developed for the University of Glasgow Human Computer Interaction (GIST) Group to investigate the benefits of dynamic audio designs for reaching in blind and visually impaired people.
