(1)

Eye Tracking and Gaze-based Interaction

Introduction and Current Research Trends

Human-Computer Interaction 2 - 10.01.2018

WS 2017/2018

Picture source: http://www.black.ninja

(2)

New Interaction Methods are Desirable

The amount of interaction with computing devices keeps increasing.

Consequently, we look for interaction methods that:

● are quicker

● do not require training

● demand little physical and mental effort

Eye tracking is a promising technology as the eyes are quick and we use them with ease.


(3)

Things We Can Do with Eye Tracking

Text 2.0

A project from DFKI (German Research Center for Artificial Intelligence)

(4)

A Smart Computer Needs Awareness for Gaze

Eye gaze is very important for human-human interaction ... and human-computer interaction should be like human-human interaction.

Taken from Milekic: The More You Look the More You Get: Intention-based Interface using Gaze-tracking

(5)

What the Eyes Can Tell

• Eyes indicate visual interest

• Gaze precedes actions

• Eyes reflect cognitive processes

• Eyes reflect physical and mental condition

In HCI Gaze Can Be

● A source of information about the user

● An input method

Eye trackers keep advancing and getting cheaper!

(6)

Expectations for Eye Gaze Interaction

Hopes

● Ease of use

● Speed up interaction

● Maintenance free

● Hygienic interface

● Remote control

● Safer interaction

● Smarter interaction

Fears

● Ability to control the eyes: as eye movements are part of our social protocol, we are able to control our eyes

● Conflict of input and vision: normally we look at the interaction element for input anyway

● Fatigue of eye muscles: our eyes move constantly, even while we sleep

(7)

Eye Tracking Applications

(8)

Eye Tracking Applications

● Gaze Monitoring

● Implicit gaze-based Interaction

● Gaze-supported (Multimodal) Interaction

● Explicit gaze-based Interaction

Päivi Majaranta and Andreas Bulling, “Eye Tracking and Eye-Based Human–Computer Interaction,” in Advances in Physiological Computing, Human–Computer Interaction Series (Springer, London, 2014), 39–65, https://doi.org/10.1007/978-1-4471-6392-3_3.

(9)

Gaze Monitoring

● Collect data about the user:

○ Cognitive load

○ User’s attention

○ Visual preference

(10)

Gaze Monitoring

Studying visual attention

Nicholas S. et al. (CHI 2015)

(11)

Gaze Monitoring

Akkil et al. (CHI 2016)

Support in collaborative environments

(12)

Gaze Monitoring

Optimizing Interfaces

(13)

Implicit gaze-based Interaction

Detecting intention/need for assistance

Karolus et al. (CHI 2017)

Walber et al. (CHI 2014)

(14)

Implicit gaze-based Interaction

Santella et al. (CHI 2006)

Jain et al. (ACM Trans. Graph. 34, 2015)

Gaze-assisted Photo/Video editing

(15)

Gaze-supported (Multimodal) Interaction

● Combine gaze with other input modalities, such as:

○ Touch

○ Mid-air gestures

○ Pen input, phone input, etc.

Chatterjee et al. (ICMI 2015)

Khamis et al. (CHI 2016)

(16)

Gaze-supported (Multimodal) Interaction

Pfeuffer et al. (UIST 2014)

(17)

Gaze-supported (Multimodal) Interaction

Stellmach et al. (CHI 2013)

(18)

Explicit gaze-based interaction

● Using eyes for input/control

○ Moving the mouse

○ Selection

○ Scrolling

○ Eye typing

○ Authentication


Gaze-based authentication

EyePassShapes - De Luca et al. (SOUPS 2009)

Microsoft Patent 2014

Source: http://www.winbeta.org/news/microsoft-patents-eye-tracking-keyboard-could-use-technology-future-devices

Eye-based interaction with displays: Zhang et al. (UbiComp 2013)

Gaze input for the disabled

(19)

Explicit gaze-based interaction

Esteves et al. (UIST 2015)

(20)

Explicit gaze-based interaction

Spontaneous Interaction with Displays

Zhang et al. (CHI 2013)

(21)

Explicit gaze-based interaction

Manual And Gaze Input Cascaded (MAGIC) pointing

1. The user gazes at the target

2. The cursor is warped to the estimated gaze position (see the sketch below)

3. The remaining distance to the target is significantly shorter, despite the tracker’s accuracy limitations

Zhai et al. (CHI 1999)
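A minimal sketch of the MAGIC warping step, assuming gaze and cursor positions arrive in screen pixels; `set_cursor` is a hypothetical stand-in for the platform's cursor API, and the threshold is illustrative:

```python
import math

GAZE_JUMP_THRESHOLD_PX = 120  # only warp when gaze lands far from the cursor

def magic_pointing_step(cursor_xy, gaze_xy, set_cursor):
    """Warp the cursor to the gaze estimate when the user looks at a new target.

    cursor_xy:  current cursor position (x, y) in pixels
    gaze_xy:    latest gaze estimate (x, y) in pixels (inaccuracy is fine)
    set_cursor: hypothetical callback that moves the system cursor
    """
    distance = math.hypot(gaze_xy[0] - cursor_xy[0], gaze_xy[1] - cursor_xy[1])
    if distance > GAZE_JUMP_THRESHOLD_PX:
        set_cursor(gaze_xy)  # the user now covers only the small residual distance
        return True          # warped
    return False             # fine positioning is left to the mouse
```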

(22)

How Does Eye Tracking Work?


(23)

Eye Anatomy

(24)

The Human Eye - Movement Control

Images from:

https://droualb.faculty.mjc.edu/Lecture%20Notes/Unit%203/muscles%20with%20figures.htm and https://de.wikipedia.org/wiki/Augenmuskeln#Entwicklungsgeschichte

(25)

The Human Eye - Vision

The eye works similarly to a camera. However, there are some differences:

● To adjust focus, the lens changes its shape (not its position)

● The sensor surface (the retina) is curved (not planar)

From Duchowski, A. T.: Eye Tracking Methodology: Theory and Practice

(26)

Types of Eye Movement


(27)

The Human Eye - “Movement Types”

• Fixations

– (mostly) stable eye position

• Saccades, Microsaccades

– fast "ballistic" eye movements

• (Smooth) Pursuit

– following a moving object with your eyes

– cannot be performed voluntarily without a moving target

• Vestibulo-ocular Reflex

– compensates for head movements

• Optokinetic Nystagmus

– combination of pursuit and saccades

• Vergence Movements

– cooperation of both eyes to focus a single object

• Pupil Dilation
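Fixations and saccades can already be separated from raw samples by a simple velocity threshold (the classic I-VT idea). A minimal sketch, assuming gaze positions in degrees of visual angle at a fixed sampling rate; the 30°/s threshold is a common but tunable choice:

```python
def classify_ivt(samples, rate_hz, threshold_deg_s=30.0):
    """Label each gaze sample as 'fixation' or 'saccade' by angular velocity.

    samples: list of (x, y) gaze positions in degrees of visual angle
    rate_hz: sampling rate of the eye tracker
    """
    labels = ["fixation"]  # the first sample has no predecessor; default label
    dt = 1.0 / rate_hz
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append("saccade" if velocity > threshold_deg_s else "fixation")
    return labels
```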

(28)

Fixations

● A pause, usually between 200 and 600 ms

● Humans fixate to sharply focus on a narrow area, long enough for the brain to perceive it

● Visualization as heat map

http://www.prweb.com/releases/2005/03/prweb213516.htm
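A heat map like this can be built by accumulating fixation durations into a coarse screen grid and rendering the grid as an image. A minimal sketch with NumPy, assuming fixations come as (x, y, duration) tuples in screen pixels:

```python
import numpy as np

def fixation_heatmap(fixations, width, height, cell=20):
    """Accumulate fixation durations into a coarse grid for a heat map.

    fixations: iterable of (x, y, duration_ms) tuples in screen pixels
    """
    grid = np.zeros((height // cell + 1, width // cell + 1))
    for x, y, duration_ms in fixations:
        grid[int(y) // cell, int(x) // cell] += duration_ms
    return grid  # visualize e.g. with matplotlib's imshow after smoothing
```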

(29)

Fixations - Interaction example

● One possible interaction technique using fixations is the dwell-time method (sketched below)

Typical application: eye typing for people with disabilities (since the 1980s)

Problem: Midas Touch effect, slow

● Another interaction technique using fixations is multimodal interaction: fixate and press a key

Problem: requires a second modality, not contact-free

https://www.youtube.com/watch?v=oXBKXRqxVnU
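A minimal sketch of the dwell-time logic, assuming a separate hit test already reports which on-screen element the current gaze sample falls on; the 800 ms threshold is an illustrative compromise between Midas Touch errors and speed:

```python
DWELL_TIME_MS = 800  # long enough to separate looking from selecting

class DwellSelector:
    """Trigger a selection when gaze rests on one element long enough."""

    def __init__(self, dwell_ms=DWELL_TIME_MS):
        self.dwell_ms = dwell_ms
        self.current = None   # element currently looked at
        self.entered_at = 0.0

    def update(self, element, timestamp_ms):
        """Feed the element under the gaze point; returns it once selected."""
        if element != self.current:
            self.current = element          # gaze moved to a different element
            self.entered_at = timestamp_ms
            return None
        if element is not None and timestamp_ms - self.entered_at >= self.dwell_ms:
            self.entered_at = timestamp_ms  # rearm, so holding gaze repeats
            return element                  # dwell completed: select it
        return None
```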

(30)

Saccades

● Fast jumps from one fixation point to another

http://www.web-solution-way.be/3-marketing-internet/19-eye-tracking- google.html

https://en.wikipedia.org/wiki/Eye_tracking


(31)

Saccades - Interaction Examples

● Gaze gestures

Looking at the corners of the screen in a certain order

● Reading detection, activity recognition

A series of forward saccades and a long backward saccade indicate reading activity.
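A minimal sketch of that reading heuristic, assuming saccades arrive as signed horizontal amplitudes in degrees (positive = forward for a left-to-right script); all thresholds are illustrative:

```python
def looks_like_reading(saccade_dx, min_forward=4, sweep_deg=-8.0):
    """Heuristic reading detector over a window of horizontal saccade amplitudes.

    saccade_dx: signed horizontal amplitudes in degrees (positive = forward)
    Returns True if several small forward saccades are followed by a long
    backward sweep, the signature of reading a line of text.
    """
    forward_run = 0
    for dx in saccade_dx:
        if 0 < dx < 4.0:            # small forward saccade (word to word)
            forward_run += 1
        elif dx <= sweep_deg and forward_run >= min_forward:
            return True             # long return sweep after a forward run
        else:
            forward_run = 0         # anything else breaks the pattern
    return False
```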

(32)

Smooth Pursuits

● Smoothly follow a moving target

https://www.andreas-bulling.de/fileadmin/docs/vidal13_ubicomp.pdf


(33)

Interaction using Smooth Pursuits
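Pursuits-style selection (Vidal et al.) correlates the raw gaze trajectory with the trajectory of every moving on-screen target and picks the best match, which is why no calibration is needed. A minimal sketch with NumPy, assuming gaze and targets were sampled over the same time window; the 0.8 threshold is illustrative:

```python
import numpy as np

def pursuit_match(gaze_x, gaze_y, targets, threshold=0.8):
    """Return the index of the moving target the user is following, or None.

    gaze_x, gaze_y: 1D arrays of raw gaze coordinates over a time window
    targets: list of (target_x, target_y) array pairs over the same window
    """
    best, best_corr = None, threshold
    for i, (tx, ty) in enumerate(targets):
        # Correlate per axis; even uncalibrated gaze co-varies with the target.
        cx = np.corrcoef(gaze_x, tx)[0, 1]
        cy = np.corrcoef(gaze_y, ty)[0, 1]
        corr = min(cx, cy)  # require both axes to correlate
        if corr > best_corr:
            best, best_corr = i, corr
    return best
```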

(34)

Vestibulo-ocular Reflex

● Compensation of head movements

Interaction method using the vestibulo-ocular reflex: gaze-based head gestures

https://www.youtube.com/watch?v=j_R0LcPnZ_w


(35)

Optokinetic Nystagmus

● Combination of smooth pursuit and corrective saccades, triggered by a series of objects moving across the field of view

Jalaliniya and Mardanbegi. CHI 2016. EyeGrip: Detecting Targets in a Series of Uni-directional Moving Objects Using Optokinetic Nystagmus Eye Movements.

(36)

Vergence Movements

● Cooperation of both eyes to focus a single object

Florian Alt, Stefan Schneegass, Jonas Auda, Rufat Rzayev, and Nora Broy. 2014. Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays. IUI '14

(37)

Pupil Dilation

● Change in diameter to adapt to lighting conditions

○ circular muscles around the pupil

● (Smaller) change in diameter when exerting mental effort

○ radial muscles around the pupil

○ used to estimate cognitive/mental workload
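A common way to turn this into a rough workload estimate is a baseline-corrected dilation index. A minimal sketch, assuming a pupil-diameter stream in millimeters recorded under constant lighting (otherwise the light reflex dominates); the baseline window length is illustrative:

```python
import statistics

def workload_index(pupil_mm, baseline_samples=120):
    """Relative pupil dilation over a resting baseline (a rough workload proxy).

    pupil_mm: sequence of pupil diameters in mm; the first `baseline_samples`
    are assumed to come from a resting phase under constant lighting.
    """
    baseline = statistics.mean(pupil_mm[:baseline_samples])
    task = statistics.mean(pupil_mm[baseline_samples:])
    return (task - baseline) / baseline  # positive values suggest higher load
```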

(38)

Eye Tracking Technologies


(39)

Eye Tracking Technologies

● Remote Eye Tracking

● Head-Mounted Eye Tracking

(40)

Eye Tracking Technologies

Remote Eye Trackers vs. Head-Mounted Eye Trackers

Tracks gaze: on displays | in natural settings

Building responsive systems requires: developing software using an SDK | labeling the surroundings (e.g. adding QR codes to products)

Setup: easy to set up | cumbersome to wear

Flexibility: allows very limited movement | allows free head/body movements

Can users see the gaze data? yes | no, unless you provide the user with another display

http://www.research-results.de/

http://www.useeye.de/

(41)

Eye Tracking Producers

● Commercial producers

● Open source: openEyes project, http://thirtysixthspan.com/openEyes/

(42)

Eye Tracking Techniques


(43)

Eye Tracking Techniques

● Video-based Tracking

● Infrared Pupil-Corneal Reflection (IR-PCR) Tracking

● Electrooculography-Based (EOG) Tracking

(Pictured: Tobii Glasses, a head-mounted IR-PCR eye tracker, and a remote Tobii Eye Tracker.)

(44)

Eye Tracking techniques - Video-based

● Can be remote or head-mounted

● Relies on image processing

○ Pupil detection (SET, Starburst algorithms)

● Accuracy/quality is influenced by:

○ camera parameters (distance, resolution, …)

○ lighting conditions

○ reflections of glasses/lenses, obstacles (eyelids)

● Li, D., & Parkhurst, D. J. (2005). Starburst: A robust algorithm for video-based eye tracking.

● Javadi, A.-H., Hakimi, Z., Barati, M., Walsh, V., & Tcheang, L. (2015). SET: a pupil detection method using sinusoidal approximation. Frontiers in Neuroengineering, 8, 1–10. https://doi.org/10.3389/fneng.2015.00004
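The core of dark-pupil detection can be sketched with OpenCV: threshold the dark pupil blob and fit an ellipse to the largest contour. This is a bare-bones illustration rather than the Starburst or SET algorithms; the fixed threshold is an assumption that real trackers adapt to the lighting:

```python
import cv2

def detect_pupil(eye_gray, dark_threshold=40):
    """Very rough dark-pupil detection on a grayscale eye image.

    Returns the fitted pupil ellipse ((cx, cy), (w, h), angle) or None.
    """
    # Under IR illumination the pupil is the darkest large blob in the image.
    _, binary = cv2.threshold(eye_gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:  # cv2.fitEllipse needs at least 5 points
        return None
    return cv2.fitEllipse(largest)
```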

(45)

Eye Tracking techniques - IR-PCR

● Can be remote or head-mounted

● “Corneal reflection” serves as static reference point

○ Allows slight head movements

○ Yields more accurate gaze data

● Does not work outdoors (sunlight interferes with the infrared illumination)
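The feature such trackers map to screen coordinates is typically the pupil-glint vector: pupil and corneal reflection shift together under small head translations, so their difference tolerates slight head movement. A minimal sketch, assuming both centers come from an image-processing step like the one above:

```python
def pupil_glint_vector(pupil_center, glint_center):
    """Head-movement-tolerant gaze feature used by IR-PCR trackers.

    Both arguments are (x, y) image coordinates; taking the difference
    cancels small translations of the whole eye image.
    """
    return (pupil_center[0] - glint_center[0],
            pupil_center[1] - glint_center[1])
```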

(46)

Eye Tracking techniques - EOG

● Uses electrodes attached to the skin

● Can work in completely dark settings (e.g. with closed eyes)

● Signal processing is computationally lightweight

● Can be affected by signal noise (e.g. power-line interference)

● Less accurate, even with medical-grade equipment

Ear-pad-based EOG [Manabe et al. 2013] (only detects looks to the left/right)

(47)

Challenges

(48)

Interaction

● Midas touch (perception vs. interaction)

● Involuntary eye movements

Data Interpretation

● Gaze fixation point != visual attention

● Experiment bias

● Eye fatigue

Technical

● Accuracy

● Calibration

● Environmental artefacts

● Sampling rates

(49)

Midas Touch

Should the class end earlier?

Accidentally choosing “No” while reading (Midas Touch)

(50)

Involuntary eye movements


In other words:

What is relevant data and what is not?

(51)

Data Interpretation

Gaze fixation point != Visual attention

Humans are not necessarily paying attention to what they look at.

(52)

Data Interpretation

Eye fatigue

Kosch et al. (CHI 2018)


(53)

Accuracy

http://www.cns.nyu.edu/~david/courses/perception/lecturenotes/eye/eye.html

(54)

Environmental Artefacts

Makeup

Interference (e.g. electromagnetic)

(55)

Calibration

(56)


Calibration

(57)

Calibration

+ Results in more accurate data.

- Difficult and tedious.

- Might break if the user moves.

- Almost impossible to determine the exact pixel a user is gazing at.
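Conceptually, calibration fits a regression from eye features (e.g. pupil-glint vectors) to the known screen positions of the calibration targets. A minimal least-squares sketch with NumPy, assuming a second-order polynomial model; commercial trackers use richer, often 3D, eye models:

```python
import numpy as np

def fit_calibration(features, screen_points):
    """Fit a second-order polynomial mapping from eye features to the screen.

    features: (n, 2) array of pupil-glint vectors at the calibration points
    screen_points: (n, 2) array of the known on-screen target positions
    Returns a function mapping a feature (x, y) to screen coordinates.
    """
    x, y = np.asarray(features, dtype=float).T
    # Design matrix of a 2nd-order polynomial in the two feature components.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float),
                                 rcond=None)

    def to_screen(fx, fy):
        row = np.array([1.0, fx, fy, fx * fy, fx ** 2, fy ** 2])
        return row @ coeffs  # (screen_x, screen_y)

    return to_screen
```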

(58)

Calibration-free Eye Tracking


(59)

Calibration-free eye tracking

SideWays (Zhang et al., CHI 2013)

Smooth Pursuits (Vidal et al., UbiComp 2013)

(60)

Calibration-free eye tracking

Nagamatsu et al. (PerDis 2014)

Drewes et al. 2007 (can also be done without calibration)

(61)

Take-home Messages

Gaze is a promising modality

○ for understanding the user

○ for interacting with computers and smart environments

○ when combined with other modalities

Gaze technologies still have some limitations

○ require calibration (low usability)

○ track one user at a time

○ can be confused with perception (Midas touch)

○ not flexible for dynamic environments (e.g. public displays)

(62)

Our Research Interests


(63)

Gaze-based Interaction on Public Displays

(64)

Challenges of Gaze-based Interaction with Large Public Displays

1. Position

2. Movement

3. Calibration

(65)

EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays

(66)

GazeDrone: Using Drones as Mobile Remote Eye Trackers for Public Displays

Project by Anna Kienle

(67)

Text-based Calibration of Eye Trackers

(68)

Text-based Calibration of Eye Trackers

Seamlessly integrating eye tracker calibration into public display applications

(69)

Our Research Interests

Proficiency Awareness through Gaze Features
