It is an honor for us that this COGAIN Symposium is co-located with ACM ETRA 2018 in Warsaw, Poland. This gives us a great opportunity to share and exchange ideas with researchers from different areas of eye tracking. In particular, we appreciate the co-location with EMIP, PETMEI, and ETVIS, with which we share a long history of interaction and inspiring discussions.
Proceeding Downloads
Advantages of eye-gaze over head-gaze-based selection in virtual and augmented reality under varying field of views
The current best practice for hands-free selection using Virtual and Augmented Reality (VR/AR) head-mounted displays is to use head-gaze for aiming and dwell-time or clicking for triggering the selection. There is an observable trend for new VR and AR ...
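As a rough illustration of the dwell-time triggering described in this abstract (a generic sketch, not the authors' implementation; the 500 ms threshold and the DwellSelector/update names are illustrative assumptions):

    import time

    DWELL_THRESHOLD_S = 0.5  # assumed dwell time; papers in this volume use e.g. 300 ms

    class DwellSelector:
        """Triggers a selection once gaze stays on the same target long enough."""
        def __init__(self, threshold_s=DWELL_THRESHOLD_S):
            self.threshold_s = threshold_s
            self._current = None     # target currently under the gaze/head ray
            self._enter_time = None  # when the gaze first landed on it

        def update(self, hit_target):
            """Call once per frame with the target hit by the gaze/head ray (or None)."""
            now = time.monotonic()
            if hit_target is not self._current:
                # Gaze moved to a new target (or to empty space): restart the dwell timer.
                self._current = hit_target
                self._enter_time = now if hit_target is not None else None
                return None
            if hit_target is not None and now - self._enter_time >= self.threshold_s:
                self._enter_time = now  # re-arm to avoid repeated triggers
                return hit_target       # selection event
            return None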
Eye movements and viewer's impressions in response to HMD-evoked head movements
The relationships between eye and head movements during the viewing of various visual stimuli on a head-mounted display (HMD) and on a large flat display were compared. The visual sizes of the displayed images were adjusted virtually, using an image ...
3D gaze estimation in the scene volume with a head-mounted eye tracker
Most applications involving gaze-based interaction are supported by estimation techniques that find a mapping between gaze data and corresponding targets on a 2D surface. However, in Virtual and Augmented Reality (VR/AR) environments, interaction occurs ...
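The 2D mapping mentioned here is commonly realized as a polynomial regression from eye-image coordinates to screen coordinates, fitted on a handful of calibration points. A minimal generic sketch of that baseline (not the paper's 3D volumetric method; NumPy and the quadratic feature set are assumptions):

    import numpy as np

    def _features(eye_xy):
        # Quadratic feature expansion of raw eye coordinates: [1, x, y, xy, x^2, y^2].
        x, y = eye_xy[:, 0], eye_xy[:, 1]
        return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

    def fit_gaze_mapping(eye_xy, screen_xy):
        """Least-squares fit from (N, 2) eye coords to (N, 2) screen coords.
        Returns a (6, 2) coefficient matrix."""
        coeffs, *_ = np.linalg.lstsq(_features(eye_xy), screen_xy, rcond=None)
        return coeffs

    def apply_gaze_mapping(coeffs, eye_xy):
        """Map new eye samples to estimated on-screen gaze points."""
        return _features(eye_xy) @ coeffs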
MovEye: gaze control of video playback
Jacek Matulewski, Bibianna Bałaj, Ewelina Marek, Łukasz Piasecki, Dawid Gruszczyński, Mateusz Kuchta, Włodzisław Duch
Several methods of gaze control of video playback were implemented in the MovEye application. Two versions of MovEye are almost ready: one for watching online movies from the YouTube service and one for watching movies stored on local drives. We have ...
Playing music with the eyes through an isomorphic interface
Playing music with the eyes is a challenging task. In this paper, we propose a virtual digital musical instrument, usable by both motor-impaired and able-bodied people, controlled through an eye tracker and a "switch". Musically speaking, the layout of ...
Context switching eye typing using dynamic expanding targets
Text entry by gazing at a virtual keyboard (also known as eye typing) is an important component of any gaze communication system. One of the main challenges for efficient communication is how to avoid unintended key selections due to the Midas touch ...
A Fitts' law study of click and dwell interaction by gaze, head and mouse with a head-mounted display
Gaze and head tracking, or pointing, in head-mounted displays enables new input modalities for point-select tasks. We conducted a Fitts' law experiment with 41 subjects comparing head pointing and gaze pointing using a 300 ms dwell (n = 22) or click (n =...
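For reference, such studies build on Fitts' law, which relates movement time to target distance and width; in the commonly used Shannon formulation (the exact computation in this paper may differ):

    ID = \log_2\!\left(\frac{D}{W} + 1\right)   % index of difficulty, in bits
    MT = a + b \cdot ID                          % movement time; a, b fitted by regression
    TP = \frac{ID_e}{MT}                         % throughput in bits/s, using the effective ID

where D is the distance to the target, W its width, and the "effective" index of difficulty is recomputed from the observed scatter of selection endpoints.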
A Fitts' law evaluation of gaze input on large displays compared to touch and mouse inputs
Gaze-assisted interaction has commonly been used in standard desktop settings. When interacting with large displays, and as new scenarios such as situationally induced impairments emerge, gaze-based multi-modal input becomes more convenient than ...
Content-based image retrieval based on eye-tracking
To improve the performance of an image retrieval system, this paper proposes a novel content-based image retrieval (CBIR) framework that uses eye-tracking data as an implicit relevance feedback mechanism. Our proposed framework consists of three ...
Beyond gaze cursor: exploring information-based gaze sharing in chat
Several studies have found gaze sharing to be beneficial for computer-mediated collaboration. Usually, a coordinate-based visualization such as a gaze cursor is used, which helps to reduce or replace deictic expressions and thus makes gaze sharing a useful ...