
Mobile video-to-audio transducer and motion detection for sensory substitution

Abstract: Visuo-auditory sensory substitution systems are augmented reality devices that translate a video stream into an audio stream in order to help the blind in daily tasks requiring visuo-spatial information. In this work, we present both a new mobile device and a transcoding method specifically designed to sonify moving objects. Frame differencing is used to extract spatial features from the video stream, and two-dimensional spatial information is converted into audio cues using pitch, interaural time difference, and interaural level difference. Using numerical methods, we attempt to reconstruct visuo-spatial information based on audio signals generated from various video stimuli. We show that despite a contrasted visual background and a highly lossy encoding method, the information in the audio signal is sufficient to allow object localization, object trajectory evaluation, object approach detection, and spatial separation of multiple objects. We also show that this type of audio signal can be interpreted by human users by asking 10 subjects to discriminate trajectories based on generated audio signals.
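The transcoding method summarized in the abstract (frame differencing to detect motion, then mapping 2D position to pitch, interaural time difference, and interaural level difference) can be sketched as below. The threshold, frequency range, and cue scalings are illustrative assumptions for this sketch, not the parameters used in the paper.

```python
import numpy as np

def frame_difference(prev_frame, curr_frame, threshold=30):
    """Binary motion mask via absolute frame differencing.

    The threshold value is a hypothetical choice for illustration.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def position_to_audio_cues(x, y, width, height,
                           f_min=200.0, f_max=2000.0,
                           max_itd=0.0007, max_ild_db=10.0):
    """Map a 2D object position to (pitch, ITD, ILD) audio cues.

    Vertical position -> pitch (log-spaced, higher in the image = higher pitch);
    horizontal position -> interaural time difference (seconds) and
    interaural level difference (dB). All ranges are assumptions.
    """
    # Vertical axis: top of image (y = 0) maps to f_max, bottom to f_min.
    pitch = f_min * (f_max / f_min) ** (1.0 - y / height)
    # Horizontal axis: pan in [-1, +1], -1 = far left, +1 = far right.
    pan = 2.0 * x / width - 1.0
    itd = pan * max_itd        # positive ITD = sound leads in one ear
    ild_db = pan * max_ild_db  # positive ILD = louder in the same ear
    return pitch, itd, ild_db
```

For an object at the image center, both interaural cues are zero and the pitch falls mid-range, so the object is rendered as a centered, mid-frequency tone.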
Document type: Journal article

Cited literature: 34 references

https://hal-univ-bourgogne.archives-ouvertes.fr/hal-01220310
Contributor: Yannick Benezeth
Submitted on: Saturday, October 31, 2015 - 14:04:47
Last modified on: Friday, July 17, 2020 - 14:54:06


Citation

Maxime Ambard, Yannick Benezeth, Philippe Pfister. Mobile video-to-audio transducer and motion detection for sensory substitution. Frontiers in ICT, Frontiers Media S.A., 2015, 2, ⟨10.3389/fict.2015.00020⟩. ⟨hal-01220310⟩
