Improving Speech Intelligibility by Hearing Aid Eye-Gaze Steering: Conditions With Head Fixated in a Multitalker Environment
Eriksholm Research Centre, Denmark; Technical University of Denmark, Denmark.
Eriksholm Research Centre, Denmark.
Eriksholm Research Centre, Denmark.
Technical University of Denmark, Denmark.
2018 (English). In: Trends in Hearing, ISSN 2331-2165, Vol. 22, article id 2331216518814388. Article in journal (Refereed). Published.
Abstract [en]

The behavior of a person during a conversation typically involves both auditory and visual attention. Visual attention implies that the person directs his or her eye gaze toward the sound target of interest, and hence, detection of the gaze may provide a steering signal for future hearing aids. The steering could utilize a beamformer or the selection of a specific audio stream from a set of remote microphones. Previous studies have shown that eye gaze can be measured through electrooculography (EOG). To explore the precision and real-time feasibility of the methodology, seven hearing-impaired persons were tested, seated with their head fixed in front of three targets positioned at -30 degrees, 0 degrees, and +30 degrees azimuth. Each target presented speech from the Danish DAT material, which was available for direct input to the hearing aid using head-related transfer functions. Speech intelligibility was measured in three conditions: a reference condition without any steering, a condition where eye gaze was estimated from EOG measures to select the desired audio stream, and an ideal condition with steering based on an eye-tracking camera. The EOG-steering condition improved the sentence-correct score compared with the no-steering condition, although performance remained significantly lower than in the ideal condition with the eye-tracking camera. In conclusion, eye-gaze steering increases speech intelligibility, although real-time EOG-steering still requires improvements to the signal processing before it is feasible to implement in a hearing aid.
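The stream-selection step in the abstract can be illustrated with a minimal sketch: an estimated gaze azimuth (as would be derived from EOG or an eye-tracking camera) is snapped to the nearest of the three target positions, and the matching remote-microphone stream is selected. This is a hypothetical illustration under assumed names and a smoothing-free decision rule, not the authors' implementation.

```python
# Hypothetical sketch of gaze-steered stream selection.
# TARGET_AZIMUTHS matches the experimental setup (-30, 0, +30 degrees);
# everything else (function name, nearest-target rule) is an assumption.

TARGET_AZIMUTHS = [-30.0, 0.0, 30.0]  # degrees azimuth, as in the study

def select_stream(gaze_azimuth_deg: float) -> int:
    """Return the index of the target nearest the estimated gaze angle."""
    return min(range(len(TARGET_AZIMUTHS)),
               key=lambda i: abs(TARGET_AZIMUTHS[i] - gaze_azimuth_deg))

# A noisy estimate of -24 degrees still resolves to the -30 degree target.
```

A real-time EOG implementation would additionally need drift correction and temporal smoothing of the gaze estimate, which the abstract identifies as the remaining signal-processing challenge.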

Place, publisher, year, edition, pages
SAGE Publications Inc., 2018. Vol. 22, article id 2331216518814388
Keywords [en]
eye tracking; electrooculography; hearing device; sound perception
National Category
Otorhinolaryngology
Identifiers
URN: urn:nbn:se:liu:diva-153675
DOI: 10.1177/2331216518814388
ISI: 000452994500001
OAI: oai:DiVA.org:liu-153675
DiVA id: diva2:1276235
Note

Funding Agencies|EU Horizon 2020 Grant [644732]

Available from: 2019-01-07. Created: 2019-01-07. Last updated: 2019-03-26.

Open Access in DiVA

Full text (645 kB), 64 downloads
File information
File name: FULLTEXT01.pdf
File size: 645 kB
Checksum (SHA-512): 7766ab073c8be4b01a6a7a514edb718cdab91bdf1cdb4c092d44445a307018309ef890235916912cb4a6bfd8fe79eed35761f44e89b3cbf6c338320a9689e546
Type: fulltext. Mimetype: application/pdf

Other links

Publisher's full text

Search in DiVA

By author/editor
Lunner, Thomas
By organisation
Automatic Control; Faculty of Science & Engineering; The Swedish Institute for Disability Research
Otorhinolaryngology

Total: 64 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 110 hits