Possibilities of contactless control of web map applications by sight

This paper assesses the possibilities of a new approach to controlling map applications on the screen without the locomotive system. The Department of Geoinformatics at Palacky University runs a project on the usability of eye-tracking systems in the fields of geoinformatics and cartography. An eye-tracking system is a device for measuring eye/gaze positions and movements ("where we are looking"). A number of methods and outputs exist, but the most common are "heat maps" of intensity and/or time. This method was used in the first part of the project, which analysed a number of common web map portals, especially the distribution of their tools and functions on the screen. The aim of the research was to localize, by means of heat maps, the best distribution of the control tools for moving the map (the "pan" function), and to analyse how sensitively people perceive control tools on different web pages and platforms. Comparing accurate survey data with personal interpretation and knowledge proved very instructive. The next step, based on these results, is the design of control tools commanded by the eye-tracking device. Rectangular areas (AOIs, areas of interest) with a predefined time delay were placed on the edges of the map. When the user fixates one of these areas, the map automatically moves towards the edge on which the area is located; the time delay prevents accidental movement. The technology for recording eye movements on the screen makes this possible: once the layout and the control functions of the map are properly defined, the two systems only need to be connected. At the moment there is a technical constraint. The movement control relies on real-time data transmission between the eye-tracking device output and a converter, and real-time transfer is not supported by every SMI (SensoMotoric Instruments) device.
More precisely, it is a question of money, because the eye-tracking device and every upgrade are very expensive. These constraints and their solutions are also discussed in the paper. The main aim of the project is to design, both economically and technologically, an optimal way to record and convert eye movements in a program with sophisticated movement control.


Introduction
Tracking a person's eye movements has been used in many fields over the last decade. There are also many approaches to researching user perception and evaluating the usability and effectiveness of cartographic and geoinformatic products. Eye-tracking systems offer a great opportunity for objective analysis and evaluation, but there are few ideas about non-contact control of map applications by sight. The mouse and keyboard are inherent parts of today's computers for locomotive control, and control by sight has the same potential for the future. Locomotive control, however, requires physical processes; control by sight eliminates all the constraints associated with it. It can be used by disabled users, in specialized fields such as the army or aviation, as well as by a wide range of home users. The following paper is based on eye-tracking research made on the first eye-tracking device in the Czech Republic used in the field of cartography and geoinformatics. It first describes eye-tracking technology in general, followed by a description of the device and software used; finally, two approaches to non-contact control of map applications by sight are discussed. The simpler approach is fully dependent on the SMI Experiment Suite 360° software and based on the AOI (Area of Interest) feature. The second approach is software-independent in concept but technically more difficult.

Human-computer interface
The human-computer interface (HCI) is an interdisciplinary discipline that examines issues of interaction and communication between the human (user) and the computer (computer systems) [13]. It integrates a range of fields such as computer science, design, psychology, sociology, physiology, artificial intelligence, etc. It focuses on the design, creation and usability testing of information systems and their interfaces, with the aim of making them simple and intuitive for a defined group of users. In addition, HCI examines the perceptions, behaviour and information needs of end users. Interaction between users and computers occurs at some interface, and eye-tracking technology is widely used to study this transmission. According to [7], eye-tracking is the methodology of measuring and recording eye movements relative to the head position of an observer, or of capturing the gaze on some visual scene ("where we are looking"). The meaning of "visual scene" is broad: it may refer to an analog product (e.g. an image or a poster) or to a digital product (a web page, a digital map, etc.) depicted on a computer screen or projected onto a flat surface by an appropriate device. According to [8], evaluation by eye-tracking can be considered objective, because it is not influenced by the opinion of the monitored person. The eye trajectory is recorded and saved for subsequent work.

Fixations and saccades
Two important terms related to the human-computer interface are keystones of eye-tracking research and will often be mentioned in this paper. Eye movement is not smooth; between quick eye shifts there are short delays. Saccades are rapid eye movements used to reposition the fovea to a new location in the visual environment. Saccadic movements are both voluntary and reflexive. According to [6], saccades range in duration from 10 ms to 100 ms, which is short enough to render the executor effectively blind during the transition. Saccades are characterized by their different lengths, orientations and directions. During a fixation, on the other hand, the eyes are fixed on one point of the perceived image. According to a number of studies (e.g. [1], [2], [6]), a new fixation is observed when the dwell is longer than 100 ms. In addition to fixations and saccades, there are very short, high-frequency eye movements (tremors, drifts, microsaccades), but people are not aware of these small movements [1].
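As an illustration, the fixation/saccade distinction above can be sketched as a simple dispersion-based classifier. The 100 ms minimum-fixation threshold follows the text; the 30 px dispersion limit, the sample format and the function itself are illustrative assumptions, not the device's actual algorithm:

```python
def detect_fixations(samples, max_dispersion=30.0, min_duration_ms=100.0):
    """Classify raw gaze samples into fixations (dispersion-threshold sketch).

    samples: list of (timestamp_ms, x, y) tuples.
    A fixation is reported when samples stay within `max_dispersion`
    pixels for at least `min_duration_ms` (the ~100 ms threshold cited
    above); everything between fixations is treated as saccadic movement.
    """
    fixations = []
    window = []
    for t, x, y in samples:
        window.append((t, x, y))
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion:
            # window broke apart -> check whether the samples before the
            # new one qualified as a fixation
            if len(window) > 1 and window[-2][0] - window[0][0] >= min_duration_ms:
                part = window[:-1]
                fixations.append({
                    "start": part[0][0],
                    "duration": part[-1][0] - part[0][0],
                    "x": sum(p[1] for p in part) / len(part),
                    "y": sum(p[2] for p in part) / len(part),
                })
            window = [(t, x, y)]
    # flush the trailing window at the end of the recording
    if len(window) > 1 and window[-1][0] - window[0][0] >= min_duration_ms:
        fixations.append({
            "start": window[0][0],
            "duration": window[-1][0] - window[0][0],
            "x": sum(p[1] for p in window) / len(window),
            "y": sum(p[2] for p in window) / len(window),
        })
    return fixations
```

Feeding this sketch gaze samples recorded at 60 Hz (one sample per ~16 ms, as with the device described below) yields a list of fixations with their start times, durations and centroids.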

Technology
This research was made on a static iView 250 RED device mounted under the screen (see Figure 1), developed by the German company SensoMotoric Instruments GmbH (SMI). It is placed in the eye-tracking lab at the Department of Geoinformatics, Palacký University in Olomouc, and it is the first eye-tracking device in the Czech Republic used for academic research in the field of natural science. The user's eye position and gaze direction are detected from the infrared light reflected by the cornea: the reflected light is recorded by an optical sensor, and eye movements are calculated from the changes in this reflection. The device works at 60 Hz temporal resolution with a latency of < 25 ms. It is mounted under the user's monitor, where the stimulus is presented, and connected to two operator's laptops. The first one (on the right in Figure 1) is needed only for setup and calibration; the second one is connected to the internet and contains specialized software for preparing and evaluating analyses. Calibration must be performed for every new user; it is managed and observed by the operator sitting next to the tested user. Calibration is necessary because users' discrimination abilities differ, and it takes only about 10-20 seconds.

Participants
Despite the fact that the system was tested many times by the author during development, it was necessary to verify its functionality with independent respondents. A two-stage verification approach was chosen: both the beta and the final version were tested by 12 respondents. Following [6] and [13], the tested subjects formed a normally distributed sample including both men and women.

Geoinformatics FCE CTU 2011

Conditions and limitations
The location of the eye-tracking device in the room and the surrounding conditions are not strictly defined and depend on the concrete kind of device. Generally, a separate enclosed room without windows is recommended. According to [6], an adjustable chair and/or table is necessary to achieve a direct horizontal position between the user's head and the monitor; the operating distance is about 60-80 cm. Illumination should be uniform, with no halogen spotlights, and it is important that the level of illumination stays the same during the whole test, which is why laboratories without windows are preferred. The surrounding walls should be a neutral colour. Dark colours are unsuitable, but sharp white burdens the eyes; colours such as light yellow, light pink or very light blue are appropriate. The use of glasses by participants is an often discussed topic in the literature. Common devices are designed also for people with glasses or contact lenses. Soft contact lenses and glasses up to about 2-3 dioptres are allowed and do not cause any distortion; only really thick or dirty glasses are prohibited, and both lenses and glasses must be clean. One constraint can arise when the pupil has a colour very similar to the iris, e.g. in some ethnic groups such as Asians: because eye-tracking devices are mostly based on pupil monitoring, the device then fails to identify the border between pupil and iris. Another failure can occur if a woman wears really heavy mascara or if certain eye diseases are diagnosed (glaucoma, iritis, Horner syndrome). Eye-tracking devices are absolutely harmless.
No warnings concerning long-term usage are mentioned, but interrupted usage is assumed, due to loss of concentration and physiological effects such as squinting, blinking or crying after a long time [6].

Stimulus
At the beginning it was necessary to prepare a simple map application as a stimulus. Generally, there are two groups of controls: simple and complex objects. Unfortunately, more sophisticated controls such as an arrow cross or a zoom slider are inappropriate for connection with eye-tracking. The main reason is the very small area available for eye fixation and the possible overlap of more than one control at one point [4], [8]; this was confirmed by the respondents during testing as well. According to oral discussion after the tests, bigger separated symbols are preferred, and areas distinguishable by arrows, symbols or descriptions are adequate for this kind of study. The active area assigned to a concrete function covers the whole polygon around the symbol, not just the symbol itself. This solution is much easier and faster for users, because simply focusing on the polygon launches the assigned function; the symbols serve for better interpretation. A map application based on Google Maps API v3 was chosen as a functional example. It is a basic standard map that can be controlled by both mouse and keyboard; enabling control by keys plays the dominant role in this approach. The map field is supplemented by four rectangular areas located near the edges of the map. Each of these rectangles contains an arrow pointing towards the edge on which the area is located, i.e. in the direction of movement it simulates (Figure 2). For example, the area on the right edge contains a right arrow for movement to the right. Only four areas/arrows in the four main directions were used in this research, but eight areas/arrows for eight directions would also be possible. The design of the application for non-contact control was modified several times during the step-by-step development, so it was not tested only once. Both the beta and the final version were tested by 12 respondents, as well as many times by the author. The aim was critical evaluation and functional verification by a sample of typical users.

System description
In most cases, the common control of a digital map application is enabled only by two functions, zoom and pan, and their combination. For non-contact control with an eye-tracking device, it is necessary to transform the "contact form" of movement by keyboard and mouse, driven by the locomotive apparatus, into a "non-contact form" driven by sight. The map movement discussed in this paper is based on emulating specific keys.

Zoom control
The "zoom" function is applied for vertical movement over the map; it allows zooming in and out among different map scales. There are several possibilities for emulating the "zoom" function; generally, some predefined action is matched with a zoom trigger. This action can be an eye blink, an eye focus on certain areas (e.g. zoom-in and zoom-out icons), or interconnection with a so-called "frequency keyboard". The on-screen frequency keyboard is used by seriously disabled users, and due to its wide range of specific functions it is not the right choice for this kind of control. In fact, the approach where the "zoom" function is based on some predefined area (Area of Interest) is basically identical to the key-arrow emulation discussed in the next paragraphs. This concept is used in this research: the zoom-in function is emulated by the "+" key and zoom-out by the "-" key.

Pan control
The "pan" function is applied for horizontal movement over the map; it allows sideways movement. During the testing with SMI software, the idea arose to develop two different approaches to controlling the map application by sight. The first, simpler one is dependent on the SMI Experiment Suite 360° software; in the second concept the map interface is independent of SMI software, but on the other hand it is technologically more difficult.

Approach dependent on SMI software
The first model discussed in this paper is fully dependent on the SMI Experiment Suite 360° software. This means that every part of the non-contact control and every map must be implemented in this software, and whenever the user wants to control the application by sight, this software must be running. The SMI Experiment Suite 360° actually runs on the operator's laptop; the user does not see the experimental interface, and only the map application is projected onto the user's screen. The big advantage of the whole process is its simplicity: thanks to the internal functions of this software, it is only necessary to predefine so-called Areas of Interest (AOIs) and add a specific key function to each of them. An outline of the process is shown in Figure 3.

Area of Interest
In the field of cartography, AOI analysis can be used to advantage. AOI analyses are based on the evaluation of concrete parts of the map (legend, scale, title, specific phenomena in the map, etc.) and provide a much better option for finding out the exact behaviour of test subjects [3]. SMI Experiment Suite 360° allows defining one or more AOIs on the background of the observed object; both static and dynamic AOIs are available. In practice, the area of each AOI, its attributes and its trigger (when required) must be specified before the eye-tracking process itself starts. If a trigger is required, there are two significant attributes: dwell time and the trigger itself. The dwell time specifies how long (in milliseconds) the user must stay focused on the AOI before the trigger is activated; it prevents accidental short focuses caused by eye blinks or loss of concentration. The trigger specifies what happens when the AOI is focused on, for example switching to another image/question, emulating a key, etc. Just this function is very important for this research and considerably simplifies the whole process.
Figure 2 illustrates the design of the AOIs. Six AOI polygons were made, corresponding to the six areas defined in the map application; the extent of each was specified by drawing a rectangle over the background. In the next step, the dwell time and trigger must be specified for each polygon. Based on previous testing on the eye-tracking device, the dwell time was set to about 400-450 ms. This time delay prevents accidental movement; testing verified it as sufficient for this kind of experiment, because a period shorter than 400 ms does not guarantee the user's ability to focus purposefully, while a period longer than 0.5 s unnecessarily slows down work with the application. A specific key was chosen as the AOI trigger event, emulating a standard key press by the locomotive apparatus (Table 1). For example, the AOI near the right edge of the map field has the right arrow key assigned as its trigger. It is necessary to load and confirm a "live" map application as the stimulus; a "static" stimulus is not allowed, because only a live map application can be controlled by the keyboard and thus react and respond to the trigger. The experiment was finally saved with all its settings on the operator's laptop and from that moment was ready to launch. When another map field is used as the background, new AOIs and values must be specified again.
1) Experiment preparation. The first step is managed by the operator: the map application is loaded into the SMI Experiment Center software as a stimulus, then the AOIs and their variables are defined. Once the experiment and its interface are ready and saved, there are no restrictions on repeated use; a prepared experiment does not have to be set up again. In fact, both the stimulus and the AOIs were prepared before the first test was made (described in the chapter "Conditions and limitations"). The SMI software allows saving a complete copy of the project with all predefined AOIs, their values and settings. The whole project is saved on the operator's laptop and can be loaded at any time.
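The AOI/dwell-time/trigger mechanism described above can be sketched in a few lines. The 400 ms dwell time and the key assignments follow the text and Table 1; the AOI rectangles, their pixel coordinates and the `press_key` callback are illustrative assumptions, standing in for whatever keyboard emulation the real setup uses:

```python
class DwellTrigger:
    """Sketch of the AOI + dwell-time trigger described above."""

    # AOI name -> (x0, y0, x1, y1) in screen pixels (illustrative values)
    AOIS = {
        "right":    (1200, 200, 1280, 600),
        "left":     (0,    200, 80,   600),
        "up":       (200,  0,   1080, 60),
        "down":     (200,  740, 1080, 800),
        "zoom_in":  (1100, 650, 1180, 730),
        "zoom_out": (100,  650, 180,  730),
    }
    # emulated key per AOI, mirroring Table 1
    KEYS = {"right": "Right", "left": "Left", "up": "Up",
            "down": "Down", "zoom_in": "+", "zoom_out": "-"}

    def __init__(self, press_key, dwell_ms=400):
        self.press_key = press_key   # callback emulating a key press
        self.dwell_ms = dwell_ms
        self._current = None         # AOI currently gazed at
        self._since = None           # timestamp when gaze entered it

    def _hit(self, x, y):
        for name, (x0, y0, x1, y1) in self.AOIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    def feed(self, t_ms, x, y):
        """Feed one gaze sample; fire the AOI's key once the gaze has
        stayed inside it for dwell_ms (prevents accidental movement
        from blinks or stray fixations)."""
        aoi = self._hit(x, y)
        if aoi != self._current:
            self._current, self._since = aoi, t_ms
        elif aoi is not None and t_ms - self._since >= self.dwell_ms:
            self.press_key(self.KEYS[aoi])
            self._since = t_ms       # re-arm so a held gaze keeps panning
```

Gazing at the right-edge AOI for at least 400 ms makes the sketch emit one "Right" key press, exactly as the trigger in the SMI software emulates a physical arrow-key press.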
2) Calibration. The next part is the calibration of the user's eye sensitivity, which is also started by the operator and takes only about 10-20 seconds. Calibration must be done for every new user each time, because users' discrimination abilities differ, and it is crucial for the accuracy of the results. It is an automatic process managed by the SMI software, available in modes with different numbers of circular points. When more points are selected, the calibration is more accurate but takes more time. Generally, the 5-point mode is sufficient; the 9- or 13-point modes are selected only for users with special needs. The calibration process is divided into several parts:
• The operator launches the calibration mode and defines the number of points (Figure 4).
• The user is asked by the operator whether he or she is ready for calibration.
• The operator starts the calibration by clicking the button on the operator's laptop.
• Calibration points are shown at random positions on the user's monitor, and the user must look at each of them.
• When all points have been shown, the calibration process is done.
• If the tracking is evaluated as successful, any experiment can be started; if not, the calibration must be repeated.
3) Non-contact control. The last step is the real control by sight; the operator is not required for this part. The user can look freely at the map. When the user fixates one of the predefined direction areas (highlighted with an arrow), the map automatically moves towards the edge on which the area is located; when the user fixates a zoom area (highlighted with an icon), the map automatically zooms to another scale. Both procedures can be combined and repeated indefinitely.

Data analysis
When a user works with the eye-tracking device, the whole eye trajectory is recorded and saved into external file(s). This provides a great opportunity for data analysis; reconstruction of the eye movement is possible as well. By default, SMI devices save all experiments in

C:\Program Files (x86)\SMI\Experiment Suite 360\Experiment Center 2\Results\
For every new experiment a new folder is automatically created, and each stimulus is saved into a new file. Eye-tracking data collected with an SMI device are saved as an *.idf file (iView Data File). This is a proprietary SMI file format, which currently must be opened in the BeGaze analysis software; once loaded in BeGaze, however, the raw data can be exported to more common file formats if desired. The raw data generated by an eye-tracking device are stored in a file divided into two parts: a header and coordinate data (Figures 5 and 6). The header contains all the metadata about the experiment. First there is general information, such as the name and path of the original data file, the date, and the participant's name and identifier. The sample rate gives the temporal resolution in Hz, and the number of samples gives the cumulative number of captured coordinates. The second part lists the number and coordinates of the calibration points. The device settings of the concrete experiment are located in the last part; they are based on the calibration and are very individual, because they depend strictly on the user and the surrounding conditions. Very important is the "Saccade Length" value, which sets the minimal length of a saccade (in pixels); shorter movements are not recorded in this file.
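As a sketch, a parser for such a text export might look as follows. The "##" header prefix, the "key: value" header layout and the tab-separated columns are assumptions modelled on the structure described above; the real BeGaze export may differ in detail:

```python
def parse_smi_export(text):
    """Parse a simplified SMI-style text export into header metadata
    and coordinate rows.

    Assumed layout: header lines prefixed with '##' ("## Key: value"),
    followed by one tab-separated line naming the columns (Time, pupil
    X/Y coordinates, ...), followed by tab-separated data rows.
    """
    header, columns, rows = {}, [], []
    for line in text.splitlines():
        if not line.strip():
            continue
        if line.startswith("##"):
            # metadata line, e.g. sample rate or number of samples
            key, _, value = line[2:].partition(":")
            header[key.strip()] = value.strip()
        elif not columns:
            # the first non-header line names the data columns
            columns = line.split("\t")
        else:
            rows.append(dict(zip(columns, line.split("\t"))))
    return header, rows
```

The parsed rows can then be fed into any further analysis, e.g. fixation detection or the AOI statistics discussed below.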
Most of the file is occupied by the coordinate data in columns (Figure 6). There are a number of values, according to the export specification, which are defined in the last row of the header. The number of rows corresponds to the number of samples counted in the header. It is important to note whether the trajectory coordinates are computed from the left or the right eye, because all the other columns are given for the same eye. The first column, "Time", indicates the time-stamp of each recorded sample; the fourth and fifth columns show the X and Y pupil coordinates. If a graphical interface is used for evaluation, the background can be overlaid by saccades and fixations (see Figure 10). A pilot experiment was made for better comparison and interpretation in this chapter. One simple task was given: "move the map once to the right and then zoom in once". Figure 2 shows the design of the AOIs when a new experiment is being created, and Figure 7 shows the same AOIs after the experiment, with some statistical attributes for analysis. The most important are the "dwell time" value, which shows how much time the user spent on the AOI, and the "fixation count", which shows how many fixations were captured on the AOI. Most time was spent on the right AOI (42%, more than 5 seconds of the total time, in 6 fixations) and the zoom-in AOI (24% in two fixations), which is in accordance with the task. Three AOIs are empty, i.e. the user did not look at them; only one short fixation of 0.5 seconds was localized on the left AOI. The remaining time (30% of the total) was spent on the map field outside the AOIs and is calculated as "White Space" in the bottom left corner. The exact eye movement is visualized by a GazePlot: the trajectory of saccades connecting fixations on the background of the analyzed image. A GazePlot shows fixations as circles of different sizes (the radius is proportional to the length of the fixation) and saccades as lines connecting these circles [14]. This method has some limitations when large amounts of data are displayed, because visual identification becomes impossible due to overlapping. For such cases GazeReplay is used, which displays information about fixations and saccades dynamically in time.
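The per-AOI metrics read off in Figure 7 (dwell time, fixation count, share of the total time, "White Space") can be reproduced from raw fixation data with a short sketch; the data structures and the rectangle format are illustrative assumptions:

```python
def aoi_statistics(fixations, aois):
    """Compute per-AOI dwell time, fixation count and share of total
    viewing time; fixations outside every AOI are pooled as the
    'White Space' category shown in Figure 7.

    fixations: list of {"x", "y", "duration"} dicts (duration in ms).
    aois: {name: (x0, y0, x1, y1)} rectangles in screen pixels.
    """
    stats = {name: {"dwell_ms": 0, "fixations": 0} for name in aois}
    stats["White Space"] = {"dwell_ms": 0, "fixations": 0}
    for f in fixations:
        target = "White Space"
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= f["x"] <= x1 and y0 <= f["y"] <= y1:
                target = name
                break
        stats[target]["dwell_ms"] += f["duration"]
        stats[target]["fixations"] += 1
    total = sum(s["dwell_ms"] for s in stats.values()) or 1
    for s in stats.values():
        s["share_pct"] = round(100.0 * s["dwell_ms"] / total, 1)
    return stats
```

Applied to the fixations detected from a recording, this yields the same kind of table as the SMI software: dwell share per AOI plus the residual "White Space" percentage.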

Thanks to a slider, the trajectory can be shown for every moment of the experiment. According to [3], GazePlot/GazeReplay is the most accurate method for eye-tracking data analysis, because there is no interpolation or generalization; GazeReplay shows the position in time exactly. In Figure 9 the data are visualized as a GazePlot with two dominating circles, which indicate two dominating fixations on the right button and the zoom-in button. Just this method reveals why the user spent so much total time on the right button: there are six fixations, but five of them are too short to be recognized on the HeatMap. Due to the dwell-time specification (chapter "Conditions and limitations"), these five fixations are not detected as movement commands and are considered accidental. That is real confirmation that GazePlot gives better results than the previously discussed methods. Figure 10 illustrates what happens when the GazePlot trajectories of multiple participants are visualized together: the stimulus background can be fully overlapped, but on the other hand the verification is much easier. According to Figure 10, it is clear that the participants mostly focused on the right and zoom-in buttons.
Figure 11 shows a complete time analysis, with both graphical and alphanumerical records of the saccades and fixations. Deviations in the graph indicate fixations, in this case periods when the user was focused on an AOI; a higher deviation means a higher fixation intensity. The two main deviations indicate the two fixations detected as movement commands.
The sequence chart in Figure 12 shows the strategies of several tested subjects (on the Y axis) for the same stimulus (with the same task as before). Each colour corresponds to a different AOI, and the fixations in the AOIs are visualized over time (on the X axis). It is the only way to compare the strategies of multiple users in time.

Concept of independent approach
The previous approach is fully dependent on the SMI Experiment Suite 360° software, which must be used at every moment of non-contact control; this can be quite complicated for some requirements. For this reason there is a concept of another approach, which takes advantage of program independence. First of all, it must be said that SMI software is used in this concept as well, but only for the user's calibration at the beginning of the process, managed from the operator's laptop. The reason is technical: an SMI eye-tracking device can only be commanded by SMI software (if a device from another company, e.g. Tobii, were used, the appropriate Tobii software would have to be used as well). The software can then run in the background; neither the user nor the operator needs it any more. In fact, the beginning (the user's eye movement over the map and focus on a "link area") and the end (the map movement) of the whole process are quite identical to the first approach. Because of the software-independent usage, there are some technological differences and constraints, and this concept is still under development.
Figure 13: Concept of the approach independent of SMI software

Technically, it is based on converting the tracked movement, in digital form, into a keyboard emulator. As in the previous approach, the user looks freely at the map application; the same test map application was used for this example. The eye position is tracked and recorded by SMI Experiment Suite 360°, which runs in the background. Saving and exporting into an external text file is a standard feature of common eye-tracking devices (Figure 13). The precision depends on the temporal resolution: at 60 Hz the data are recorded 60 times per second, i.e. approximately every 16 ms. Because the 60 Hz device frequency is higher than the discrimination frequency of the human eye, the process can be considered real-time. The core of the described solution is a converter that transforms the text output into a keyboard emulator. A very important note: the solution is based on real-time data transmission between the eye-tracking device output and the converter. If real-time transmission is not supported, the whole process of control by sight does not work correctly! Unfortunately, real-time transfer is not supported by default on some eye-tracking devices. Technically, this constraint can be solved by implementing a plug-in or a software upgrade; in practice it is a question of financial cost, because every non-maintenance update is really expensive, as is the eye-tracking device itself. Compared to the first approach, this one is more complicated.
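A minimal sketch of such a converter, assuming a simplified "t&lt;TAB&gt;x&lt;TAB&gt;y" stream format for the exported gaze samples: instead of calling a real keyboard-emulation API, it yields the key names, which keeps the sketch self-contained and testable; a real converter would forward each yielded key to the operating system via a keyboard-emulation library:

```python
def gaze_to_keys(lines, aois, dwell_ms=400):
    """Consume a (real-time) stream of gaze samples and yield emulated
    key presses.

    lines: iterable of "t<TAB>x<TAB>y" strings (assumed export format,
           one sample per line, t in ms, x/y in screen pixels).
    aois:  {key_name: (x0, y0, x1, y1)} mapping each emulated key to
           the AOI rectangle that triggers it.
    A key is yielded once the gaze has stayed inside its AOI for
    dwell_ms; the timer then re-arms, so a held gaze keeps panning.
    """
    current, since = None, None
    for line in lines:
        t, x, y = (float(v) for v in line.split("\t"))
        hit = None
        for key, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                hit = key
                break
        if hit != current:
            current, since = hit, t          # gaze entered a new area
        elif hit is not None and t - since >= dwell_ms:
            yield hit                        # emulate e.g. the right-arrow key
            since = t
```

Because the generator processes one sample at a time, it works equally well on a file written incrementally by the tracking software, which is exactly the real-time coupling the text identifies as the critical constraint.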

Discussion
The present study investigates non-contact control of map applications by sight only. In every case, non-contact control is strictly dependent on an eye-tracking device; there is no other way to control map applications by sight without a tracking device. That can bring limitations in some cases, e.g. for handicapped users. On the other hand, two basic kinds of device are available for research nowadays. The first is a static device mounted under the screen, which was used in this research as well. It is fixed, a connection with a computer is required all the time, and the stimulus is limited to the user's monitor. This method is sufficient for indoor usage and for stimuli shown on a screen; this paper presents typical research suitable for a fixed device, and all similar research uses fixed devices because of their easy control and availability. The other kind is a mobile device mounted on the user's head, a so-called headset. Head- or helmet-mounted devices bring some inconvenience for users, but their independence from the computer is a great benefit; the most sophisticated devices look like ordinary glasses. They can be used both outdoors and indoors for fully mobile eye-tracking studies, though nowadays they are used only for mobile studies such as market or driving research. When map applications are projected on media other than a computer screen (cell phone, tablet, wall, interactive table, combinations of multiple monitors, hologram, etc.), this kind of eye-tracking device is suitable.
The selected device seems to be the biggest limitation of non-contact control. The approach was developed on, and is dependent on, an SMI device (chapter "Fixations and saccades"), so the same or a similar SMI device must be used in every case; moreover, SMI software is required. The reason is that a device from one manufacturer can only be commanded by software from the same manufacturer; e.g. when a Tobii device is used, the appropriate Tobii software has to be used as well. No more than five companies develop commercial eye-tracking devices all over the world; generally only SMI and Tobii devices are used. In fact, the software is necessary only for launching the experiment; afterwards it can run in the background. The most important parameter for real-time transmission is the temporal resolution. SMI devices are available at sampling rates of 60 Hz or 120 Hz. At 60 Hz the data are recorded every 16 ms, which is faster than the discrimination frequency of the human eye; when a 60 Hz device is used, the sampling rate therefore does not play any important role in the accuracy of the experiment. The dwell-time value significantly influences the responsiveness of the control and must be set individually for different age groups of users. According to the testing, a value of 400 ms was set as the time delay for users between 20 and 30 years old. Testing verified it as sufficient, because a period shorter than 400 ms does not guarantee the user's ability to focus purposefully, while a period longer than 500 ms slows down work with the application. Higher values should of course be set for children, seniors or visually disabled users. Approximately 10 users tested on the eye-tracker are enough to determine a new dwell-time value.
Both approaches are based on "key-enabled" applications, i.e. the application has to allow control by keyboard. Fortunately, this is a standard feature of common map applications; in addition, the blind-friendly web specification requires it for all web pages. Compared to the simpler approach, the concept of SMI-independent control brings a better solution with extended possibilities of control by sight only, but the best solution would be to implement all the functions of common map portals (e.g. zoom by slider, layer switcher) in one service controlled by sight. Solutions other than keyboard emulation should be possible, but it is clear that this requires further study.

Conclusion
This paper assesses the possibility of contactless control of map applications by sight. An eye-tracking system provides a great opportunity to design, analyze and evaluate both digital and analog maps. In addition, it can be used to control a digital application without the locomotive apparatus, which is what this paper focuses on. There has been some previous research, but none of it was strictly focused on maps. The research used an SMI iView 250 RED device with SMI Experimental Suite 360; in practice this means that the same or a similar SMI device is always required. The device is placed in the Eye-tracking lab at the Department of Geoinformatics, Palacký University in Olomouc, the first eye-tracking lab in the Czech Republic for the field of cartography and geoinformatics. The eye position and direction of view are detected by infrared light reflected from the cornea at 60 Hz temporal resolution. The system latency at 60 Hz is < 25 ms, which is quite sufficient for real-time transmission.

Control of the map application discussed in this paper is based on emulating specific keys, in two different approaches. The first model is fully dependent on the SMI Experimental Suite 360° software. Thanks to its internal AOI function, it is only necessary to predefine so-called Areas of Interest (AOI) and assign a specific key function (e.g. the arrow keys) to each of these areas. A simpler interface and easy operation are great advantages; on the other side, the disadvantage is the limitation to SMI software only. For development, a test map application was used with four rectangular areas located on the edges of the map, indicating the four main directions, and two small areas indicating the zoom-in and zoom-out functions. All six areas are matched with a specific key. Because this approach is dependent on SMI software, another concept was devised which takes advantage of program independence. Compared to the first approach, the second one is of course more complicated, because of the converter, which transforms the textual output from the tracking device into input for a keyboard emulator.

For common users' requirements the first solution is fully sufficient; it was tested twice and evaluated by 12 users in the Eye-tracking lab. The aim was a critical evaluation and functional verification by a sample of typical users. The second solution is still a work in progress, because the author hopes that contactless control of map applications can become widespread in the next few years.
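The AOI mechanism with a time delay against accidental movement can be sketched as follows. This is a minimal illustration only: the screen size, the rectangle coordinates and the 0.8 s dwell threshold are assumptions, not values taken from the paper.

```python
# Sketch of dwell-time-protected AOI control. Screen size, rectangle
# positions and the 0.8 s threshold are illustrative assumptions.
AOIS = {
    "pan_left":  (0,    0,   100,  1024),   # x, y, width, height
    "pan_right": (1180, 0,   100,  1024),
    "pan_up":    (0,    0,   1280, 100),
    "pan_down":  (0,    924, 1280, 100),
    # zoom-in / zoom-out rectangles would be defined analogously
}

def hit_aoi(x, y):
    """Return the name of the first AOI containing the gaze point, or None."""
    for name, (ax, ay, w, h) in AOIS.items():
        if ax <= x < ax + w and ay <= y < ay + h:
            return name
    return None

class DwellTracker:
    """Fire an AOI action only after the gaze has stayed inside the area
    long enough -- the time delay that prevents accidental map movement."""
    def __init__(self, threshold_s):
        self.threshold_s = threshold_s
        self.current = None
        self.entered = 0.0

    def update(self, aoi, now_s):
        """Feed one gaze sample; return the AOI name when it triggers."""
        if aoi != self.current:            # gaze moved to a different area
            self.current, self.entered = aoi, now_s
            return None
        if aoi is not None and now_s - self.entered >= self.threshold_s:
            self.entered = now_s           # re-arm so the action can repeat
            return aoi
        return None
```

In a real setup the gaze samples would arrive from the tracker at 60 Hz; here the timestamps are passed in explicitly so the dwell logic can be tested in isolation.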

Figure 1: The SMI iView 250 RED device under the user's monitor (on the left) is connected to the operator's laptops

Figure 2: Design of AOIs in SMI Experimental Suite 360°

Figure 3: Approach dependent on SMI software

Figure 4: During the calibration process, calibration points are shown at random positions and the user must look at each of them.

Figure 7: Analysis of AOI attributes: "dwell time" shows how much time the user spent on the AOI, and "fixation count" shows how many fixations were captured on the AOI. The task was: move the map once to the right, then zoom in once.

Figure 8: A HeatMap matrix with time values (on the left) and a standard HeatMap (on the right) show the intensity and position of fixations. The task was: move the map once to the right, then zoom in once.

Figure 9: A GazePlot visualizes the trajectory of saccades and the length of fixations; a larger circle indicates a longer fixation. The task was: move the map once to the right, then zoom in once.

Figure 10: GazePlot of the same task as the previous one; visualizing multiple participants can overlap and obscure the background.

Figure 12: A sequence chart indicates the users' strategy over time. Blue indicates focus on the right AOI; green indicates focus on the zoom-in AOI.

Table 1: Emulated keys depending on the predefined AOI function

Practically, using this approach is a very simple way to accomplish the assigned problem: the user can fully control a digital map application by sight alone. It is based on localizing specific areas with a predefined function (AOI) on the screen. The schedule of the whole process is divided into three parts.

Netek R.: Possibilities of contactless control of web map applications by sight, Geoinformatics FCE CTU 2011

Practical experiments are not limited by time, by the number of fixations, or by particular tasks. Furthermore, more than one user can pass through a single experiment, because neither the number of users nor the time is limited. In that case there is a huge amount of data when the experiment is analyzed.

Figure 5: Header of the raw eye-tracking data file (*.idf) recorded during an experiment. It is divided into three parts: metadata about the experiment; the number and coordinates of the calibration points; and the device settings for the concrete experiment, based on the calibration.

TYPE — indicates whether the row describes a sample (SMP) or a message (MSG)
TIME — time-stamp: the number of seconds/milliseconds since 1st January 1970 00:00
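A minimal parser for such a textual export might look as follows. Only the TYPE and TIME columns are taken from the description above; the position of the gaze X/Y columns is an assumption, since a real *.idf export contains many more columns.

```python
# Minimal parser for a tab-separated eye-tracking data row.
# TYPE and TIME follow the column description in the text; the gaze
# X/Y column positions are an assumption for illustration only.
def parse_idf_line(line):
    fields = line.rstrip("\n").split("\t")
    if fields[0] != "SMP":        # keep only samples, skip MSG rows
        return None
    return {
        "time": int(fields[1]),   # time-stamp column
        "x": float(fields[2]),    # assumed gaze X column
        "y": float(fields[3]),    # assumed gaze Y column
    }
```

A converter built on such a parser would read the stream line by line and hand the XY coordinates on to the AOI logic.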

Table 2: Columns in the eye-tracking data file (*.idf)

1) Calibration. The first part is calibration of the user's eye sensitivity, which is started by the operator and takes only about 20 seconds. The calibration must be done anew for every user every time, due to users' different distinguishing abilities. The SMI software is used only for the calibration; afterwards it can run in the background.
2) Non-contact control. Real control by sight. The operator is not required anymore. The user can look anywhere on the map and focus on a "link area" (highlighted with arrows or icons).
3) Direction/position calculation. The eye movement is tracked, calculated, recorded and also generated as textual output by SMI Experimental Suite 360°.
4) Converter. Sequences of XY coordinates from the output are transformed in real time into direction/zoom commands for the keyboard emulator.
5) Keyboard emulator. Specific keys (the four arrow keys, key +, key -) are assigned to each command (four directions, zoom-in, zoom-out).
6) Response. The map immediately moves and/or zooms to another scale. Both procedures can be combined and repeated infinitely. Due to the < 25 ms system latency at 60 Hz, this can be described as real-time control (in fact there is still a minimal delay, but it is insignificant for the user's perception).
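Steps 4 and 5 above can be sketched as a simple mapping from triggered AOI names to emulated keys. The AOI names and the injected send_key callback are hypothetical; only the key assignment (four arrow keys, '+', '-') comes from the text, and a real implementation would inject the key events into the operating system (e.g. via SendInput on Windows or xdotool on X11).

```python
# Sketch of the converter (steps 4-5): map triggered AOI names to the
# emulated keys from Table 1. AOI names and the send_key callback are
# hypothetical stand-ins for a real key-injection backend.
AOI_TO_KEY = {
    "pan_left": "Left", "pan_right": "Right",
    "pan_up": "Up",     "pan_down": "Down",
    "zoom_in": "+",     "zoom_out": "-",
}

def convert(aoi_events, send_key):
    """Turn a stream of triggered AOI names into emulated key presses."""
    for aoi in aoi_events:
        key = AOI_TO_KEY.get(aoi)
        if key is not None:       # ignore gaze samples outside any AOI
            send_key(key)
```

Keeping the mapping in a plain dictionary makes the converter independent of the SMI software, which is the point of the second, program-independent approach.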