Detecting gaze direction using robot-mounted and mobile-device cameras

2019
journal article
dc.abstract.en
Two common channels through which humans communicate are speech and gaze. Eye gaze is an important mode of communication: it allows people to better understand each other's intentions, desires, interests, and so on. The goal of this research is to develop a framework for gaze-triggered events that can be executed on a robot and on mobile devices and that allows experiments to be performed. We experimentally evaluate the framework and the techniques for extracting gaze direction, based on a robot-mounted camera or a mobile-device camera, that are implemented in the framework. We investigate the impact of light on the accuracy of gaze estimation, and also how the overall accuracy depends on the user's eye and head movements. Our research shows that light intensity is important and that the placement of the light source is crucial. All of the robot-mounted gaze-detection modules we tested were found to be similar with regard to their accuracy. The framework we developed was tested in a human-robot interaction experiment involving a job-interview scenario. The flexible structure of this scenario allowed us to test different components of the framework in varied real-world scenarios, which was very useful for progressing towards our long-term research goal of designing intuitive gaze-based interfaces for human-robot communication.
dc.affiliation
Wydział Filozoficzny : Instytut Filozofii
dc.contributor.author
Jarosz, Mateusz
dc.contributor.author
Nawrocki, Piotr
dc.contributor.author
Płaczkiewicz, Leszek
dc.contributor.author
Śnieżyński, Bartłomiej
dc.contributor.author
Zieliński, Marcin
dc.contributor.author
Indurkhya, Bipin - 227976
dc.date.accessioned
2020-05-12T08:08:49Z
dc.date.available
2020-05-12T08:08:49Z
dc.date.issued
2019
dc.date.openaccess
0
dc.description.accesstime
at the time of publication
dc.description.number
4
dc.description.physical
453-474
dc.description.publication
1,2
dc.description.version
final publisher version
dc.description.volume
20
dc.identifier.doi
10.7494/csci.2019.20.4.3435
dc.identifier.eissn
2300-7036
dc.identifier.issn
1508-2806
dc.identifier.project
ROD UJ / OP
dc.identifier.project
POLTUR2/5/2018
dc.identifier.uri
https://ruj.uj.edu.pl/xmlui/handle/item/155743
dc.language
eng
dc.language.container
eng
dc.rights
License granted: Attribution 4.0 International
dc.rights.licence
CC-BY
dc.rights.uri
http://creativecommons.org/licenses/by/4.0/legalcode.pl
dc.share.type
open journal
dc.subject.en
gaze-direction detection
dc.subject.en
eye-tracking
dc.subject.en
face detection
dc.subject.en
robot
dc.subject.en
mobile device
dc.subtype
Article
dc.title
Detecting gaze direction using robot-mounted and mobile-device cameras
dc.title.journal
Computer Science
dc.type
JournalArticle
dspace.entity.type
Publication