<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.wiki.cogain.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Admin</id>
	<title>COGAIN: Communication by Gaze Interaction (hosted by the COGAIN Association) - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.wiki.cogain.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Admin"/>
	<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php/Special:Contributions/Admin"/>
	<updated>2026-04-20T06:02:03Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.33.4</generator>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=File:Test_video.mp4&amp;diff=2896</id>
		<title>File:Test video.mp4</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=File:Test_video.mp4&amp;diff=2896"/>
		<updated>2019-09-30T12:59:36Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
Video import test&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2895</id>
		<title>COGAIN2007 Video Isokoski</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2895"/>
		<updated>2019-09-29T18:53:50Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Videos]]&lt;br /&gt;
== Gamepad and Eye Tracker Input in FPS Games: Data for the First 50 Minutes ==&lt;br /&gt;
&lt;br /&gt;
Poika Isokoski&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&amp;lt;HTML5video width=&amp;quot;384&amp;quot; height=&amp;quot;288&amp;quot; autoplay=&amp;quot;false&amp;quot;&amp;gt;cogain2007/cogain2007-isokoski&amp;lt;/HTML5video&amp;gt;&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;[[Media:cogain2007-isokoski.mp4]]&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This video, when viewed, will transfer approx. &amp;lt;b&amp;gt;53&amp;amp;nbsp;MB&amp;lt;/b&amp;gt; of data to your device.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;span style=&amp;quot;font-size: 8pt;&amp;quot;&amp;gt;The video is displayed using current techniques (the HTML5 video tag), which are supported by all current web browsers.&lt;br /&gt;
If you can't watch the video, please consider upgrading to a current web browser like [http://www.getfirefox.org Firefox 15+], [http://windows.microsoft.com/en-US/internet-explorer/downloads/ie Internet Explorer 9+], [http://www.opera.com/ Opera 12+], [http://www.apple.com/safari/ Safari 5+], [https://www.google.com/intl/en/chrome/browser/ Chrome 19+], [http://www.konqueror.org/download/ Konqueror], or any other up-to-date browser for your system. (Version numbers as of late 2012.)&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* [http://www.cogain.org/cogain2007 Back to COGAIN 2007 webpage]&lt;br /&gt;
* [[COGAIN2007 Videos|Back to COGAIN 2007 Videos list]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=File:Bunny.mp4&amp;diff=2894</id>
		<title>File:Bunny.mp4</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=File:Bunny.mp4&amp;diff=2894"/>
		<updated>2019-09-29T18:21:55Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2893</id>
		<title>COGAIN2007 Video Isokoski</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2893"/>
		<updated>2019-09-29T18:07:56Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Videos]]&lt;br /&gt;
== Gamepad and Eye Tracker Input in FPS Games: Data for the First 50 Minutes ==&lt;br /&gt;
&lt;br /&gt;
Poika Isokoski&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&amp;lt;HTML5video width=&amp;quot;384&amp;quot; height=&amp;quot;288&amp;quot; autoplay=&amp;quot;false&amp;quot;&amp;gt;cogain2007/cogain2007-isokoski&amp;lt;/HTML5video&amp;gt;&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;[[xxx:cogain2007/cogain2007-isokoski.mp4]]&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;[[Media:videos/bunny.mp4]]&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This video, when viewed, will transfer approx. &amp;lt;b&amp;gt;53&amp;amp;nbsp;MB&amp;lt;/b&amp;gt; of data to your device.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;span style=&amp;quot;font-size: 8pt;&amp;quot;&amp;gt;The video is displayed using current techniques (the HTML5 video tag), which are supported by all current web browsers.&lt;br /&gt;
If you can't watch the video, please consider upgrading to a current web browser like [http://www.getfirefox.org Firefox 15+], [http://windows.microsoft.com/en-US/internet-explorer/downloads/ie Internet Explorer 9+], [http://www.opera.com/ Opera 12+], [http://www.apple.com/safari/ Safari 5+], [https://www.google.com/intl/en/chrome/browser/ Chrome 19+], [http://www.konqueror.org/download/ Konqueror], or any other up-to-date browser for your system. (Version numbers as of late 2012.)&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* [http://www.cogain.org/cogain2007 Back to COGAIN 2007 webpage]&lt;br /&gt;
* [[COGAIN2007 Videos|Back to COGAIN 2007 Videos list]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2892</id>
		<title>COGAIN2007 Video Isokoski</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2892"/>
		<updated>2019-09-27T14:30:12Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Videos]]&lt;br /&gt;
== Gamepad and Eye Tracker Input in FPS Games: Data for the First 50 Minutes ==&lt;br /&gt;
&lt;br /&gt;
Poika Isokoski&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&amp;lt;HTML5video width=&amp;quot;384&amp;quot; height=&amp;quot;288&amp;quot; autoplay=&amp;quot;false&amp;quot;&amp;gt;cogain2007/cogain2007-isokoski&amp;lt;/HTML5video&amp;gt;&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;[[Media:videos/cogain2007/cogain2007-isokoski.mp4]]&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This video, when viewed, will transfer approx. &amp;lt;b&amp;gt;53&amp;amp;nbsp;MB&amp;lt;/b&amp;gt; of data to your device.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;span style=&amp;quot;font-size: 8pt;&amp;quot;&amp;gt;The video is displayed using current techniques (the HTML5 video tag), which are supported by all current web browsers.&lt;br /&gt;
If you can't watch the video, please consider upgrading to a current web browser like [http://www.getfirefox.org Firefox 15+], [http://windows.microsoft.com/en-US/internet-explorer/downloads/ie Internet Explorer 9+], [http://www.opera.com/ Opera 12+], [http://www.apple.com/safari/ Safari 5+], [https://www.google.com/intl/en/chrome/browser/ Chrome 19+], [http://www.konqueror.org/download/ Konqueror], or any other up-to-date browser for your system. (Version numbers as of late 2012.)&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* [http://www.cogain.org/cogain2007 Back to COGAIN 2007 webpage]&lt;br /&gt;
* [[COGAIN2007 Videos|Back to COGAIN 2007 Videos list]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2891</id>
		<title>COGAIN2007 Video Isokoski</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2891"/>
		<updated>2019-09-27T14:19:29Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Videos]]&lt;br /&gt;
== Gamepad and Eye Tracker Input in FPS Games: Data for the First 50 Minutes ==&lt;br /&gt;
&lt;br /&gt;
Poika Isokoski&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&amp;lt;HTML5video width=&amp;quot;384&amp;quot; height=&amp;quot;288&amp;quot; autoplay=&amp;quot;false&amp;quot;&amp;gt;cogain2007/cogain2007-isokoski&amp;lt;/HTML5video&amp;gt;&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;[[File:videos/cogain2007/cogain2007-isokoski.mp4]]&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This video, when viewed, will transfer approx. &amp;lt;b&amp;gt;53&amp;amp;nbsp;MB&amp;lt;/b&amp;gt; of data to your device.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;span style=&amp;quot;font-size: 8pt;&amp;quot;&amp;gt;The video is displayed using current techniques (the HTML5 video tag), which are supported by all current web browsers.&lt;br /&gt;
If you can't watch the video, please consider upgrading to a current web browser like [http://www.getfirefox.org Firefox 15+], [http://windows.microsoft.com/en-US/internet-explorer/downloads/ie Internet Explorer 9+], [http://www.opera.com/ Opera 12+], [http://www.apple.com/safari/ Safari 5+], [https://www.google.com/intl/en/chrome/browser/ Chrome 19+], [http://www.konqueror.org/download/ Konqueror], or any other up-to-date browser for your system. (Version numbers as of late 2012.)&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* [http://www.cogain.org/cogain2007 Back to COGAIN 2007 webpage]&lt;br /&gt;
* [[COGAIN2007 Videos|Back to COGAIN 2007 Videos list]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2890</id>
		<title>COGAIN2007 Video Isokoski</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2890"/>
		<updated>2019-09-27T14:18:29Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Videos]]&lt;br /&gt;
== Gamepad and Eye Tracker Input in FPS Games: Data for the First 50 Minutes ==&lt;br /&gt;
&lt;br /&gt;
Poika Isokoski&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&amp;lt;HTML5video width=&amp;quot;384&amp;quot; height=&amp;quot;288&amp;quot; autoplay=&amp;quot;false&amp;quot;&amp;gt;cogain2007/cogain2007-isokoski&amp;lt;/HTML5video&amp;gt;&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;[[File:videos/cogain2007/cogain2007-isokoski.mp4|384x288px|autoplay=false]]&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This video, when viewed, will transfer approx. &amp;lt;b&amp;gt;53&amp;amp;nbsp;MB&amp;lt;/b&amp;gt; of data to your device.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;span style=&amp;quot;font-size: 8pt;&amp;quot;&amp;gt;The video is displayed using current techniques (the HTML5 video tag), which are supported by all current web browsers.&lt;br /&gt;
If you can't watch the video, please consider upgrading to a current web browser like [http://www.getfirefox.org Firefox 15+], [http://windows.microsoft.com/en-US/internet-explorer/downloads/ie Internet Explorer 9+], [http://www.opera.com/ Opera 12+], [http://www.apple.com/safari/ Safari 5+], [https://www.google.com/intl/en/chrome/browser/ Chrome 19+], [http://www.konqueror.org/download/ Konqueror], or any other up-to-date browser for your system. (Version numbers as of late 2012.)&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* [http://www.cogain.org/cogain2007 Back to COGAIN 2007 webpage]&lt;br /&gt;
* [[COGAIN2007 Videos|Back to COGAIN 2007 Videos list]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2885</id>
		<title>COGAIN2007 Video Isokoski</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=COGAIN2007_Video_Isokoski&amp;diff=2885"/>
		<updated>2019-09-26T14:18:15Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Videos]]&lt;br /&gt;
== Gamepad and Eye Tracker Input in FPS Games: Data for the First 50 Minutes ==&lt;br /&gt;
&lt;br /&gt;
Poika Isokoski&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;&amp;lt;HTML5video width=&amp;quot;384&amp;quot; height=&amp;quot;288&amp;quot; autoplay=&amp;quot;false&amp;quot;&amp;gt;cogain2007/cogain2007-isokoski&amp;lt;/HTML5video&amp;gt;&amp;lt;/p&amp;gt;&lt;br /&gt;
&amp;lt;p&amp;gt;[[Media:cogain2007/cogain2007-isokoski.mp4|384x288px|autoplay=false]]&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This video, when viewed, will transfer approx. &amp;lt;b&amp;gt;53&amp;amp;nbsp;MB&amp;lt;/b&amp;gt; of data to your device.&amp;lt;br/&amp;gt;&amp;lt;br/&amp;gt;&lt;br /&gt;
&amp;lt;span style=&amp;quot;font-size: 8pt;&amp;quot;&amp;gt;The video is displayed using current techniques (the HTML5 video tag), which are supported by all current web browsers.&lt;br /&gt;
If you can't watch the video, please consider upgrading to a current web browser like [http://www.getfirefox.org Firefox 15+], [http://windows.microsoft.com/en-US/internet-explorer/downloads/ie Internet Explorer 9+], [http://www.opera.com/ Opera 12+], [http://www.apple.com/safari/ Safari 5+], [https://www.google.com/intl/en/chrome/browser/ Chrome 19+], [http://www.konqueror.org/download/ Konqueror], or any other up-to-date browser for your system. (Version numbers as of late 2012.)&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
* [http://www.cogain.org/cogain2007 Back to COGAIN 2007 webpage]&lt;br /&gt;
* [[COGAIN2007 Videos|Back to COGAIN 2007 Videos list]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Trackers&amp;diff=2729</id>
		<title>Eye Trackers</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Trackers&amp;diff=2729"/>
		<updated>2012-03-21T06:01:42Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Open source gaze tracking and freeware eye tracking */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
'''A catalogue of currently available eye trackers, categorized into systems for assistive technology, research purposes, etc.'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Eye Trackers for Assistive Technology and AAC ==&lt;br /&gt;
&lt;br /&gt;
Commercial eye tracking systems used by people with disabilities for controlling a computer or as communication aids.&lt;br /&gt;
&lt;br /&gt;
* [[Eye Tracker Intelligaze|Alea Technologies Gmbh: Intelligaze IG-30]]&lt;br /&gt;
* [[Eye Tracker Eyemax|DynaVox Technologies: EyeMax System]]&lt;br /&gt;
* [[Eye Tracker Erica|Eye Response Technologies: ERICA]]&lt;br /&gt;
* [[Eye Tracker Eyetech|EyeTech Digital Systems: EyeTech TM3, TM4, VT1 and VT2]]&lt;br /&gt;
* [[Eye Tracker Eyecan|H.K. EyeCan: VisionKey (5+, 6V/H, 7)]]&lt;br /&gt;
* [http://eyecomcorp.com/eyecom-technology/assistive-communication/ Eye-Com] &lt;br /&gt;
* [[Eye Tracker Seetech|HumanElektronik GmbH: SeeTech]]&lt;br /&gt;
* [[Eye Tracker LCTechnologies|LC Technologies: The Eyegaze Communication System, Eyegaze Edge and Eyegaze Edge Tablet]]&lt;br /&gt;
* [[Eye Tracker Metrovision|Metrovision: VISIOBOARD]]&lt;br /&gt;
* [[Eye Tracker EagleEyes|Opportunity Foundation of America: EagleEyes]]&lt;br /&gt;
* [[Eye Tracker Ecopoint|PRC (Prentke Romich Company): ECOpoint]]&lt;br /&gt;
* [[Eye Tracker Technoworks|TechnoWorks CO.,LTD.: TE-9100 Nursing System for Enhancing Patients' Self-support]]&lt;br /&gt;
* [[Eye Tracker Tobii|Tobii Technology: Tobii C8, C12, CEye, MyTobii P10, D10]]&lt;br /&gt;
* [http://www.utechzone.com.tw/spring Utechzone: Spring]&lt;br /&gt;
&lt;br /&gt;
== Eye trackers for eye movement research, analysis and evaluation ==&lt;br /&gt;
&lt;br /&gt;
* [http://www.amtech.de/ AmTech GmbH], Compact Integrated Pupillograph (CIP), Pupillographic Sleepiness Test (PST), table-mounted, monocular, video-based systems&lt;br /&gt;
* [http://www.a-s-l.com/ Applied Science Laboratories], ASL, eye tracking and pupillometry systems, both IROG (limbus tracker) and VOG (video) based, both head-mounted and remote tracking, and mobile tracking&lt;br /&gt;
* [http://arringtonresearch.com/ Arrington Research], ViewPoint EyeTracker, both remote and head mounted, video based&lt;br /&gt;
* [http://www.crsltd.com/ Cambridge Research Systems Ltd.], MR-Eyetracker, a low-cost, contact-free eyetracker for fMRI &amp;amp; MEG&lt;br /&gt;
* [http://www.chronos-vision.de/ Chronos Vision] eye tracking devices are used in e.g. neuroscience, ophthalmology, refractive surgery or clinical research. The classic Chronos Eye Tracker was deployed on the International Space Station (ISS) in early 2004 and is in continuous use for the study of eye and head coordination during long-term stays in the weightlessness of spaceflight.&lt;br /&gt;
* [http://www.clsprofakt.com/ CLS ProFakt Ltd], offers eye tracking services, analysis software, and an integrated virtual-shopping eye-tracking tool for FMCG manufacturers&lt;br /&gt;
* [http://designinteractive.net/?p=517 easyGaze(R)], a low-cost, high-fidelity eye-tracker for research and training enhancement&lt;br /&gt;
* [http://www.interlog.com/~elmarinc/ EL-MAR Inc.], VISION 2000, portable head mounted video based eye-tracking systems&lt;br /&gt;
* [http://www.ergoneers.de/en/products/dlab-dikablis/overview.html Ergoneers Dikablis], software and hardware suite D-Lab &amp;amp; Dikablis for planning, performing, and analyzing eye-tracking and behavioral experiments; fully automated gaze-data analysis in any environment, with no restrictions on head and body movement within a 500 m range using the Dikablis Wireless eye-tracking system.&lt;br /&gt;
* [http://eyecomcorp.com/ Eye-Com] wearable eye tracking and head tracking for clinical and human factors research. &lt;br /&gt;
* [http://www.eyetechds.com/ EyeTech Digital Systems], EyeTech TM3 Eye Tracker Add-on, Research Package, and MegaTracker with free API with full access to raw gaze data and eye metrics&lt;br /&gt;
* [http://www.eyetracking.com/ EyeTracking, Inc.], technology developed by [http://www.sci.sdsu.edu/cerf/content/Eyestudies.html Marshall &amp;amp; CERF, San Diego State University]&lt;br /&gt;
* [http://www.fourward.com/ Fourward Technologies, Inc.], Dual-Purkinje-Image (DPI) Eyetracker, mainly for research purposes&lt;br /&gt;
* [http://www.brain.northwestern.edu/ilab/ ILAB], eye movement analysis software, works with a number of common eye trackers by ASL, ISCAN, and SMI, reads also CORTEX files&lt;br /&gt;
* [http://www.interactive-minds.com/ Interactive Minds], Eye tracking software and tools &lt;br /&gt;
* [http://www.is.cs.cmu.edu/mie/ Interactive Systems Labs], Model-based face and gaze tracking (from video image), Carnegie Mellon University&lt;br /&gt;
* [http://www.iota.se/ Iota AB], EyeTrace Systems, head mounted, binocular, video and IR based eye trace systems&lt;br /&gt;
* [http://www.iscaninc.com/ ISCAN], Eye &amp;amp; Target Tracking Instrumentation, head-mounted and remote eye tracking systems, single and multiple target video tracking systems&lt;br /&gt;
* [http://www.lctinc.com LC Technologies Inc.], a remote video based eyegaze development system for human factors research&lt;br /&gt;
* [http://www.mangold-international.com/ Mangold International], MangoldVision for lightweight, portable eye tracking, solutions for both remote and head-mounted eye tracking. Software for data recording and analysis.&lt;br /&gt;
* [http://www.metrovision.fr/ Metrovision], MonEOG: Electro-oculography (EOG) potential measurement based gaze tracking, MonVOG1&amp;amp;2: video-oculography (VOG) based gaze tracking&lt;br /&gt;
* [http://www.mirametrix.com/ Mirametrix], Portable, remote, USB based eye tracking for academic and market research with the S1 eye tracker and easy to use open standard API &lt;br /&gt;
* [http://www.nacinc.com/ NAC Image Technology], NAC EMR-8 eye path tracking (IROG based)&lt;br /&gt;
* [http://www.ober-consulting.com/9/lang/1/ Ober Consulting Poland: JAZZ-novo], portable multisensor system with IR based eye-tracker (1 kHz temporal resolution), head rotation and tilt measurement, blood pulse monitoring, voice recording and optional video context recording, designed to study human interaction with the environment.&lt;br /&gt;
* [http://www.ober-consulting.com/11/lang/1/ Ober Consulting Poland: Saccadometer], portable eye movement laboratory for study on saccadic reactions using multiple diagnostic experiments, integrated stimulation and eye movement measurement and recording system, head mounted, IR based (1 kHz temporal resolution).&lt;br /&gt;
* [http://www.optom.de/ Optomotor Laboratories], Express-Eye, a stand-alone eye tracker with saccade analysis, and FixTrain, a small hand held device for daily training of saccadic eye movement control&lt;br /&gt;
* [http://www.primelec.ch/ Primelec, D. Florin], Angle-Meter NT, a digitally controlled scleral search coil system for the linear detection of 3D angular eye and head movements&lt;br /&gt;
* [http://www.seeingmachines.com/ Seeing Machines], faceLAB, a 3D head position and eye-gaze direction tracking system (VOG based)&lt;br /&gt;
* [http://www.smivision.com/en/eye-gaze-tracking-systems/home.html SensoMotoric Instruments GmbH], Remote (RED), head mounted (HED) and Hi-Speed eye and gaze tracking for research and applied science, open programming interface and comprehensive stimulus/analysis software.&lt;br /&gt;
* [http://www.skalar.nl/ Skalar Medical BV], head mounted Chronos and IRIS eye trackers, Scleral Search Coil Systems&lt;br /&gt;
* [http://www.smarteye.se/ Smart Eye AB], eye tracking analysis based on any standard camera(s), analog or digital&lt;br /&gt;
* [http://www.eyelinkinfo.com/ SR Research Ltd], EyeLink II, video based, head mounted eye tracking system&lt;br /&gt;
* [http://www.synthenv.com/eyetalk.htm Synthetic Environments, Inc.], EyeTalk integrates voice recognition and eye-tracking&lt;br /&gt;
* [http://www.testusability.com/ TestUsability], EyeCatcher system, a helmet fitted with cameras, optics, and a microphone, which measures eye scanning and mouse clicking&lt;br /&gt;
* [http://www.thomasrecording.de/ Thomas RECORDING GmbH], Eye-Tracking-System (ET-49), constructed for neuroscientific purposes; it enables a laboratory to record the monkey's eye position&lt;br /&gt;
* [http://www.tobii.com/ Tobii Technology], Tobii T60 and T120 Eye Trackers - both integrated into a 17&amp;quot; TFT monitor, and Tobii X120 Eye Tracker - a standalone eye tracking unit designed for eye tracking studies relative to any surface.&lt;br /&gt;
&lt;br /&gt;
== Open source gaze tracking and freeware eye tracking ==&lt;br /&gt;
&lt;br /&gt;
This list contains low-cost, free, and open source eye tracking systems and research prototypes, plus information that should help in building your own eye tracker. Some are targeted at people with disabilities (eye-control systems), others at more general eye tracking and research.&lt;br /&gt;
&lt;br /&gt;
* [http://www.asterics.eu AsTeRICS], Assistive Technology Rapid Integration &amp;amp; Construction Set (see downloads)&lt;br /&gt;
* [http://www.eyewriter.org/ EyeWriter], low-cost eye-tracking apparatus &amp;amp; custom open-source software that allows graffiti writers and artists with paralysis to draw using only their eyes.&lt;br /&gt;
* [http://www.gazegroup.org/downloads/23-gazetracker/ ITU Gaze Tracker], works with a webcam or video camera with night vision and infrared illumination.&lt;br /&gt;
* [http://thirtysixthspan.com/openEyes/ openEyes], open-source open-hardware toolkit for low-cost real-time eye tracking&amp;lt;br /&amp;gt; See also the [http://joelclemens.colinr.ca/eyetrack/ Windows version] by Joel Clemens.&lt;br /&gt;
* [http://www.inference.phy.cam.ac.uk/opengazer/ Opengazer], open-source gaze tracker for ordinary webcams.&lt;br /&gt;
* [http://www.codeproject.com/KB/cpp/TrackEye.aspx TrackEye]&amp;lt;nowiki&amp;gt;: Real-Time Tracking Of Human Eyes Using a Webcam. Implemented in C++ using the &amp;lt;/nowiki&amp;gt;[http://sourceforge.net/project/showfiles.php?group_id=22870&amp;amp;package_id=16937 OpenCV library].&lt;br /&gt;
* [http://myeye.jimdo.com/ myEye], eye-tracking software to allow people with severe motor disabilities to use gaze as an input device for interacting with a computer. Beta version of the prototype software available for download.&lt;br /&gt;
&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/software.html Software for the automated classification of fixations and saccades], contains the implementation of five popular eye movement classification algorithms by O. Komogortsev et al. at Texas State University.&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/2010/08/how-to-build-low-cost-eye-tracking.html How to build low cost eye tracker] - instructions.&lt;br /&gt;
&lt;br /&gt;
== Low-cost eye tracking ==&lt;br /&gt;
&lt;br /&gt;
* [http://www.ober-consulting.com/13/lang/1/ Bink-IT], a communication and environmental control system based on eye blinks by Ober Consulting.&lt;br /&gt;
* [http://www.youtube.com/user/dmardanbeigi Dias Eye Tracker], low-cost eye tracker developed by Diako Mardanbeigi at the Iran University of Science &amp;amp; Technology.&lt;br /&gt;
* [http://cyber.felk.cvut.cz/i4c/en_system.html I4Control®], low-cost eye control system (under development)&lt;br /&gt;
* [http://www.magickey.ipg.pt/ Magic Key], low-cost eye control system (under development)&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyetrackingtest.com/ Eye Tracking Test], affordable eye tracking and usability services&lt;br /&gt;
&lt;br /&gt;
== Open source and freeware eye movement analysis tools ==&lt;br /&gt;
&lt;br /&gt;
* [http://thediemproject.wordpress.com/software/ CARPE] &amp;quot;Computational and Algorithmic Representation and Processing of Eye-movements&amp;quot; visualisation and analysis tool by the [http://thediemproject.wordpress.com/ DIEM] (Dynamic Images and Eye-Movements) Project&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/etud.html COGAIN ETU Driver], Eye-Tracking Universal (Standard) Driver, which helps the developer to '''build tracker-independent applications''' and test them offline with a gaze-data simulator.&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/icomp.html iComponent], tracker-independent analysis and visualization tool by Oleg Spakov.&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/gwm.html GWM: Gaze-to-Word Mapping Tool], a collection of gaze-to-word mapping engines, text mask creators and translation, and more...&lt;br /&gt;
&lt;br /&gt;
* [http://didaktik.physik.fu-berlin.de/projekte/ogama/ OGAMA (OpenGazeAndMouseAnalyzer)], open-source software designed to analyze eye and mouse movements in slideshow study designs&lt;br /&gt;
* [http://sourceforge.net/projects/ritcode/ RITCode], an analysis tool for captured eye tracker video files, created by the Rochester Institute of Technology Visual Perception Lab.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''See also:''' [[Eye Gaze Communication Board]], low-tech eye pointing, a cheap (self-made) gaze communication board, a &amp;quot;first aid&amp;quot; solution for acute communication needs&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
Please email any corrections or additions to [mailto:office@cogain.org office@cogain.org].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2727</id>
		<title>Bibliography Eye Tracking Books</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2727"/>
		<updated>2012-02-02T11:10:48Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Applied Eye Tracking, Gaze Interaction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Reference]][[Category:Bibliography]]&lt;br /&gt;
&lt;br /&gt;
Books related to eye tracking, eye movement research, and gaze interaction&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Theory and Practice ===&lt;br /&gt;
&lt;br /&gt;
* Duchowski, A. T. ''Eye Tracking Methodology: Theory and Practice'', Springer, 2003. [[http://andrewd.ces.clemson.edu/book/ More information ...]]&lt;br /&gt;
* Hammoud, R. I. (Ed.) ''Passive Eye Monitoring'', Springer, 2008. [[http://www.springerlink.com/content/x15261 SpringerLink]]&lt;br /&gt;
* Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., and Räihä, K-J. (Eds.) ''Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies'', IGI Global, 2011 [[http://www.igi-global.com/book/gaze-interaction-applications-eye-tracking/51941#table-of-contents contents]]&lt;br /&gt;
&lt;br /&gt;
=== Applied Eye Tracking, Gaze Interaction ===&lt;br /&gt;
&lt;br /&gt;
* Adam, P. S., Edmonds, R., Quinn, S., ''Eyetracking the News'', The Poynter Institute, 2007. [[http://www.poynter.org/shop/product_view.asp?id=1193 Poynter]]&lt;br /&gt;
* DiMattia, P. A., Curran, F. X., Gips, J. ''An Eye Control Teaching Device for Students Without Language Expressive Capacity: Eagleeyes'', Edwin Mellen Press, 2001. [[http://www.mellenpress.com/mellenpress.cfm?bookid=512&amp;amp;pc=9 Mellen Press]] [[http://www.amazon.com/Teaching-Students-Language-Expressive-Capacity/dp/0773476393/ref=sr_1_3?ie=UTF8&amp;amp;s=books&amp;amp;qid=1217493159&amp;amp;sr=1-3 Amazon]]&lt;br /&gt;
* Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., and Räihä, K-J. (Eds.) ''Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies'', IGI Global, 2011 [[http://www.igi-global.com/book/gaze-interaction-applications-eye-tracking/51941#table-of-contents contents]]&lt;br /&gt;
&lt;br /&gt;
=== Eye Movement Research, Vision Psychology, Perception ===&lt;br /&gt;
&lt;br /&gt;
* Findlay, J. M., Walker, R., and Kentridge, R. W. ''Eye Movement Research. Mechanisms, Processes and Applications'', Elsevier Science, 1995. [[http://www.elsevier.com/wps/find/bookdescription.cws_home/524669/description Elsevier]]&lt;br /&gt;
* Gregory, R. L. ''Eye and Brain, The Psychology of Seeing'', Fifth Edition. Princeton University Press, 1997. [[http://press.princeton.edu/titles/6016.html More information ...]]&lt;br /&gt;
* Hyönä, J., Radach, R., and Deubel, H. ''The mind's eye: Cognitive and applied aspects of eye movement research''. Amsterdam, The Netherlands: North-Holland, 2003. [[http://www.elsevier.com/locate/isbn/0444510206 Elsevier]]&lt;br /&gt;
* Underwood, G., ''Eye Guidance in Reading and Scene Perception'', Elsevier, 1998. [[http://www.elsevier.com/locate/inca/600997 Elsevier]]&lt;br /&gt;
* Wade, N. J. ''A Natural History of Vision'', Bradford Books (MIT), 1998. [[http://mitpress.mit.edu/catalog/author/default.asp?aid=369 MIT Press]]&lt;br /&gt;
* Yarbus, A. L. ''Eye Movements and Vision'', Plenum Press, New York, 1967.&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking related Conference Proceedings ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Conference|COGAIN]] Proceedings: [[Media:COGAIN2005-proceedings.pdf|2005 (pdf)]], [[Media:COGAIN2006_Proceedings.pdf|2006 (pdf)]], [[Media:COGAIN2007Proceedings.pdf|2007 (pdf)]]&lt;br /&gt;
* [http://www.e-t-r-a.org ETRA] Proceedings in the [http://www.acm.org/dl ACM Digital Library]:  [http://portal.acm.org/toc.cfm?id=355017&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2000], [http://portal.acm.org/toc.cfm?id=507072&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2002], [http://portal.acm.org/toc.cfm?id=968363&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2004], [http://portal.acm.org/toc.cfm?id=1117309&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2006], [http://portal.acm.org/toc.cfm?id=1344471&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2008]&lt;br /&gt;
* [http://www.eyemovement.org/conference ECEM] Books of Abstracts: [http://www.phys.uu.nl/~ecem10/ECEM10.pdf 1999 (pdf)], [http://www.jemr.org/online/1/s2 2005], [http://www.jemr.org/online/1/s1 2007]&lt;br /&gt;
&lt;br /&gt;
=== Monographs, PhD and Master's Theses ===&lt;br /&gt;
&lt;br /&gt;
* Bates, R. (2006) ''Enhancing the Performance of Eye and Head Mice: A Validated Assessment Method and an Investigation into the Performance of Eye and Head Based Assistive Technology Pointing Devices''. PhD Thesis, De Montfort University, May 2006. [[http://www.cse.dmu.ac.uk/~rbates/Richard%20Thesis.pdf PDF]]&lt;br /&gt;
* Drewes, H. (2010) ''Eye Gaze Tracking for Human Computer Interaction''. PhD Thesis, der Ludwig-Maximilians-Universität, München. [[http://edoc.ub.uni-muenchen.de/11591/1/Drewes_Heiko.pdf PDF]]&lt;br /&gt;
* Glenstrup, A.J. and Engell-Nielsen, T. (1995) ''Eye Controlled Media: Present and Future State''. University of Copenhagen. [[http://www.diku.dk/~panic/eyegaze/article.html Available online]]&lt;br /&gt;
* Hemmert, F. (2007) ''Those who want to see must close their eyes''. Master's Thesis. Potsdam University of Applied Sciences. [[http://www.eyesclosed.org/cover.html Available online]]&lt;br /&gt;
* Holsanova, J., ''Picture Viewing and Picture Description: Two Windows on the Mind''. Doctoral dissertation. ISSN 1101-8453. ISBN 91-856843-76. [[http://en.scientificcommons.org/7602363 More information ...]]&lt;br /&gt;
* Hyrskykari, A. (2006) ''Eyes in Attentive Interfaces: Experiences from Creating iDict, a Gaze-Aware Reading Aid''. Dissertations in Interactive Technology, Number 4, University of Tampere, 192 pages. [[http://acta.uta.fi/english/teos.phtml?10850 Available online]]&lt;br /&gt;
* Jönsson, E. (2005) ''If Looks Could Kill - an evaluation of Eye Tracking in Computer Games''. Master's Thesis, Royal Institute of Technology, Stockholm, Sweden. [[http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2005/rapporter05/jonsson_erika_05125.pdf PDF]]&lt;br /&gt;
* Kumar, M. (2007) ''Gaze-enhanced User Interface Design''. Dissertation submitted to Stanford University for the degree of Doctor of Philosophy, May 2007. [[http://hci.stanford.edu/research/GUIDe/publications/Manu%20Kumar%20Dissertation%20-%20Gaze-enhanced%20User%20Interface%20Design.pdf PDF]]&lt;br /&gt;
* Li, D. (2006) ''Low-cost eye-tracking for human computer interaction''. Master's Thesis, Iowa State University Human Computer Interaction Program. [[http://thirtysixthspan.com/openEyes/MS-Dongheng-Li-2006.pdf PDF]]&lt;br /&gt;
* Lukander, K., ''Mobile Usability - Measuring Gaze Point on Handheld Devices''. Master's Thesis, Helsinki University of Technology, 2003. [[http://www.soberit.hut.fi/T-121/shared/thesis/Thesis_Lukander.pdf PDF]]&lt;br /&gt;
* Majaranta, P. (2009) ''Text Entry by Eye Gaze''. Dissertations in Interactive Technology, number 11, University of Tampere. Also available in Acta Electronica Universitatis Tamperensis; 869. [[http://acta.uta.fi/english/teos.php?id=11211 Available online]]&lt;br /&gt;
* Qvarfordt, P. (2004) ''Eyes on Multimodal Interaction''. Ph.D. thesis, Department of Computer and Information Science, Linköping University, Sweden, November 2004. [[http://www.ida.liu.se/~perqv/paper/thesis.pdf PDF]]&lt;br /&gt;
&lt;br /&gt;
=== Books written by the eyes ===&lt;br /&gt;
&lt;br /&gt;
* Vigand, Philippe and Stéphane, ''Only the Eyes Say Yes, A Love Story'', Arcade, 1997. [[http://www.amazon.com/Only-Eyes-Say-Philippe-Vigand/dp/1559705086 Amazon]] [[http://www.ebookmall.com/ebooks/only-the-eyes-say-yes-a-love-story-vigand-vigand-ebooks.htm eBookMall]]&lt;br /&gt;
* Martin, J. and Yockey, R., ''On Any Given Day.'' Blair, 2000. [[http://www.amazon.com/Any-Given-Day-Joe-Martin/dp/0895872331 Amazon]]&lt;br /&gt;
* Martin, J., ''Fire in the Rock'', Ballantine Books, 2003. [[http://www.amazon.com/Fire-Rock-Ballantine-Readers-Circle/dp/0345456912 Amazon]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&amp;lt;br /&amp;gt; Know books that are missing from the list? Please send any additions or corrections to [mailto:office@cogain.org office (at) cogain.org]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2726</id>
		<title>Bibliography Eye Tracking Books</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2726"/>
		<updated>2012-02-02T11:09:55Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Theory and Practice */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Reference]][[Category:Bibliography]]&lt;br /&gt;
&lt;br /&gt;
Books related to eye tracking, eye movement research, gaze interaction&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Theory and Practice ===&lt;br /&gt;
&lt;br /&gt;
* Duchowski, A. T. ''Eye Tracking Methodology, Theory and Practice'', Springer, 2003. [[http://andrewd.ces.clemson.edu/book/ More information ...]]&lt;br /&gt;
* Hammoud, R. I. (Ed.) ''Passive Eye Monitoring'', Springer, 2008. [[http://www.springerlink.com/content/x15261 SpringerLink]]&lt;br /&gt;
* Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., and Räihä, K-J. (Eds.) ''Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies'', IGI Global, 2011 [[http://www.igi-global.com/book/gaze-interaction-applications-eye-tracking/51941#table-of-contents contents]]&lt;br /&gt;
&lt;br /&gt;
=== Applied Eye Tracking, Gaze Interaction ===&lt;br /&gt;
&lt;br /&gt;
* Adam, P. S., Edmonds, R., Quinn, S., ''Eyetracking the News'', The Poynter Institute, 2007. [[http://www.poynter.org/shop/product_view.asp?id=1193 Poynter]]&lt;br /&gt;
* DiMattia, P. A., Curran, F. X., Gips, J. ''An Eye Control Teaching Device for Students Without Language Expressive Capacity: Eagleeyes'', Edwin Mellen Press, 2001. [[http://www.mellenpress.com/mellenpress.cfm?bookid=512&amp;amp;pc=9 Mellen Press]] [[http://www.amazon.com/Teaching-Students-Language-Expressive-Capacity/dp/0773476393/ref=sr_1_3?ie=UTF8&amp;amp;s=books&amp;amp;qid=1217493159&amp;amp;sr=1-3 Amazon]]&lt;br /&gt;
&lt;br /&gt;
=== Eye Movement Research, Vision Psychology, Perception ===&lt;br /&gt;
&lt;br /&gt;
* Findlay, J. M., Walker, R., and Kentridge, R. W. ''Eye Movement Research. Mechanisms, Processes and Applications'', Elsevier Science, 1995. [[http://www.elsevier.com/wps/find/bookdescription.cws_home/524669/description Elsevier]]&lt;br /&gt;
* Gregory, R. L. ''Eye and Brain, The Psychology of Seeing'', Fifth Edition. Princeton University Press, 1997. [[http://press.princeton.edu/titles/6016.html More information ...]]&lt;br /&gt;
* Hyönä, J., Radach, R., and Deubel, H. ''The mind's eye: Cognitive and applied aspects of eye movement research''. Amsterdam, The Netherlands: North-Holland, 2003. [[http://www.elsevier.com/locate/isbn/0444510206 Elsevier]]&lt;br /&gt;
* Underwood, G., ''Eye Guidance in Reading and Scene Perception'', Elsevier, 1998. [[http://www.elsevier.com/locate/inca/600997 Elsevier]]&lt;br /&gt;
* Wade, N. J. ''A Natural History of Vision'', Bradford Books (MIT), 1998. [[http://mitpress.mit.edu/catalog/author/default.asp?aid=369 MIT Press]]&lt;br /&gt;
* Alfred L. Yarbus (1967) ''Eye Movements and Vision''. Plenum Press: New York.&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking related Conference Proceedings ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Conference|COGAIN]] Proceedings: [[Media:COGAIN2005-proceedings.pdf|2005 (pdf)]], [[Media:COGAIN2006_Proceedings.pdf|2006 (pdf)]], [[Media:COGAIN2007Proceedings.pdf|2007 (pdf)]]&lt;br /&gt;
* [http://www.e-t-r-a.org ETRA] Proceedings in the [http://www.acm.org/dl ACM Digital Library]:  [http://portal.acm.org/toc.cfm?id=355017&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2000], [http://portal.acm.org/toc.cfm?id=507072&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2002], [http://portal.acm.org/toc.cfm?id=968363&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2004], [http://portal.acm.org/toc.cfm?id=1117309&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2006], [http://portal.acm.org/toc.cfm?id=1344471&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2008]&lt;br /&gt;
* [http://www.eyemovement.org/conference ECEM] Books of Abstracts: [http://www.phys.uu.nl/~ecem10/ECEM10.pdf 1999 (pdf)], [http://www.jemr.org/online/1/s2 2005], [http://www.jemr.org/online/1/s1 2007]&lt;br /&gt;
&lt;br /&gt;
=== Monographs, PhD and Master's Theses ===&lt;br /&gt;
&lt;br /&gt;
* Bates, R. (2006) ''Enhancing the Performance of Eye and Head Mice: A Validated Assessment Method and an Investigation into the Performance of Eye and Head Based Assistive Technology Pointing Devices''. PhD Thesis, De Montfort University, May 2006. [[http://www.cse.dmu.ac.uk/~rbates/Richard%20Thesis.pdf PDF]]&lt;br /&gt;
* Drewes, H. (2010) ''Eye Gaze Tracking for Human Computer Interaction''. PhD Thesis, der Ludwig-Maximilians-Universität, München. [[http://edoc.ub.uni-muenchen.de/11591/1/Drewes_Heiko.pdf PDF]]&lt;br /&gt;
* Glenstrup, A.J. and Engell-Nielsen, T. (1995) ''Eye Controlled Media: Present and Future State''. University of Copenhagen. [[http://www.diku.dk/~panic/eyegaze/article.html Available online]]&lt;br /&gt;
* Hemmert, F. (2007) ''Those who want to see must close their eyes''. Master's Thesis. Potsdam University of Applied Sciences. [[http://www.eyesclosed.org/cover.html Available online]]&lt;br /&gt;
* Holsanova, J., ''Picture Viewing and Picture Description: Two Windows on the Mind''. Doctoral dissertation. ISSN 1101-8453. ISBN 91-856843-76. [[http://en.scientificcommons.org/7602363 More information ...]]&lt;br /&gt;
* Hyrskykari, A. (2006) ''Eyes in Attentive Interfaces: Experiences from Creating iDict, a Gaze-Aware Reading Aid''. Dissertations in Interactive Technology, Number 4, University of Tampere, 192 pages. [[http://acta.uta.fi/english/teos.phtml?10850 Available online]]&lt;br /&gt;
* Jönsson, E. (2005) ''If Looks Could Kill - an evaluation of Eye Tracking in Computer Games''. Master's Thesis, Royal Institute of Technology, Stockholm, Sweden. [[http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2005/rapporter05/jonsson_erika_05125.pdf PDF]]&lt;br /&gt;
* Kumar, M. (2007) ''Gaze-enhanced User Interface Design''. Dissertation submitted to Stanford University for the degree of Doctor of Philosophy, May 2007. [[http://hci.stanford.edu/research/GUIDe/publications/Manu%20Kumar%20Dissertation%20-%20Gaze-enhanced%20User%20Interface%20Design.pdf PDF]]&lt;br /&gt;
* Li, D. (2006) ''Low-cost eye-tracking for human computer interaction''. Master's Thesis, Iowa State University Human Computer Interaction Program. [[http://thirtysixthspan.com/openEyes/MS-Dongheng-Li-2006.pdf PDF]]&lt;br /&gt;
* Lukander, K., ''Mobile Usability - Measuring Gaze Point on Handheld Devices''. Master's Thesis, Helsinki University of Technology, 2003. [[http://www.soberit.hut.fi/T-121/shared/thesis/Thesis_Lukander.pdf PDF]]&lt;br /&gt;
* Majaranta, P. (2009) ''Text Entry by Eye Gaze''. Dissertations in Interactive Technology, number 11, University of Tampere. Also available in Acta Electronica Universitatis Tamperensis; 869. [[http://acta.uta.fi/english/teos.php?id=11211 Available online]]&lt;br /&gt;
* Qvarfordt, P. (2004) ''Eyes on Multimodal Interaction''. Ph.D. thesis, Department of Computer and Information Science, Linköping University, Sweden, November 2004. [[http://www.ida.liu.se/~perqv/paper/thesis.pdf PDF]]&lt;br /&gt;
&lt;br /&gt;
=== Books written by the eyes ===&lt;br /&gt;
&lt;br /&gt;
* Vigand, Philippe and Stéphane, ''Only the Eyes Say Yes, A Love Story'', Arcade, 1997. [[http://www.amazon.com/Only-Eyes-Say-Philippe-Vigand/dp/1559705086 Amazon]] [[http://www.ebookmall.com/ebooks/only-the-eyes-say-yes-a-love-story-vigand-vigand-ebooks.htm eBookMall]]&lt;br /&gt;
* Martin, J. and Yockey, R., ''On Any Given Day.'' Blair, 2000. [[http://www.amazon.com/Any-Given-Day-Joe-Martin/dp/0895872331 Amazon]]&lt;br /&gt;
* Martin, J., ''Fire in the Rock'', Ballantine Books, 2003. [[http://www.amazon.com/Fire-Rock-Ballantine-Readers-Circle/dp/0345456912 Amazon]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&amp;lt;br /&amp;gt; Know books that are missing from the list? Please send any additions or corrections to [mailto:office@cogain.org office (at) cogain.org]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2725</id>
		<title>Bibliography Eye Tracking Books</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2725"/>
		<updated>2012-02-02T11:09:26Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Theory and Practice */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Reference]][[Category:Bibliography]]&lt;br /&gt;
&lt;br /&gt;
Books related to eye tracking, eye movement research, gaze interaction&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Theory and Practice ===&lt;br /&gt;
&lt;br /&gt;
* Duchowski, A. T. ''Eye Tracking Methodology, Theory and Practice'', Springer, 2003. [[http://andrewd.ces.clemson.edu/book/ More information ...]]&lt;br /&gt;
* Hammoud, R. I. ''Passive Eye Monitoring'', Springer, 2008. [[http://www.springerlink.com/content/x15261 SpringerLink]]&lt;br /&gt;
* Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., and Räihä, K-J. (Eds.) ''Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies'', IGI Global, 2011 [[http://www.igi-global.com/book/gaze-interaction-applications-eye-tracking/51941#table-of-contents contents]]&lt;br /&gt;
&lt;br /&gt;
=== Applied Eye Tracking, Gaze Interaction ===&lt;br /&gt;
&lt;br /&gt;
* Adam, P. S., Edmonds, R., Quinn, S., ''Eyetracking the News'', The Poynter Institute, 2007. [[http://www.poynter.org/shop/product_view.asp?id=1193 Poynter]]&lt;br /&gt;
* DiMattia, P. A., Curran, F. X., Gips, J. ''An Eye Control Teaching Device for Students Without Language Expressive Capacity: Eagleeyes'', Edwin Mellen Press, 2001. [[http://www.mellenpress.com/mellenpress.cfm?bookid=512&amp;amp;pc=9 Mellen Press]] [[http://www.amazon.com/Teaching-Students-Language-Expressive-Capacity/dp/0773476393/ref=sr_1_3?ie=UTF8&amp;amp;s=books&amp;amp;qid=1217493159&amp;amp;sr=1-3 Amazon]]&lt;br /&gt;
&lt;br /&gt;
=== Eye Movement Research, Vision Psychology, Perception ===&lt;br /&gt;
&lt;br /&gt;
* Findlay, J. M., Walker, R., and Kentridge, R. W. ''Eye Movement Research. Mechanisms, Processes and Applications'', Elsevier Science, 1995. [[http://www.elsevier.com/wps/find/bookdescription.cws_home/524669/description Elsevier]]&lt;br /&gt;
* Gregory, R. L. ''Eye and Brain, The Psychology of Seeing'', Fifth Edition. Princeton University Press, 1997. [[http://press.princeton.edu/titles/6016.html More information ...]]&lt;br /&gt;
* Hyönä, J., Radach, R., and Deubel, H. ''The mind's eye: Cognitive and applied aspects of eye movement research''. Amsterdam, The Netherlands: North-Holland, 2003. [[http://www.elsevier.com/locate/isbn/0444510206 Elsevier]]&lt;br /&gt;
* Underwood, G., ''Eye Guidance in Reading and Scene Perception'', Elsevier, 1998. [[http://www.elsevier.com/locate/inca/600997 Elsevier]]&lt;br /&gt;
* Wade, N. J. ''A Natural History of Vision'', Bradford Books (MIT), 1998. [[http://mitpress.mit.edu/catalog/author/default.asp?aid=369 MIT Press]]&lt;br /&gt;
* Alfred L. Yarbus (1967) ''Eye Movements and Vision''. Plenum Press: New York.&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking related Conference Proceedings ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Conference|COGAIN]] Proceedings: [[Media:COGAIN2005-proceedings.pdf|2005 (pdf)]], [[Media:COGAIN2006_Proceedings.pdf|2006 (pdf)]], [[Media:COGAIN2007Proceedings.pdf|2007 (pdf)]]&lt;br /&gt;
* [http://www.e-t-r-a.org ETRA] Proceedings in the [http://www.acm.org/dl ACM Digital Library]:  [http://portal.acm.org/toc.cfm?id=355017&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2000], [http://portal.acm.org/toc.cfm?id=507072&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2002], [http://portal.acm.org/toc.cfm?id=968363&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2004], [http://portal.acm.org/toc.cfm?id=1117309&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2006], [http://portal.acm.org/toc.cfm?id=1344471&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2008]&lt;br /&gt;
* [http://www.eyemovement.org/conference ECEM] Books of Abstracts: [http://www.phys.uu.nl/~ecem10/ECEM10.pdf 1999 (pdf)], [http://www.jemr.org/online/1/s2 2005], [http://www.jemr.org/online/1/s1 2007]&lt;br /&gt;
&lt;br /&gt;
=== Monographs, PhD and Master's Theses ===&lt;br /&gt;
&lt;br /&gt;
* Bates, R. (2006) ''Enhancing the Performance of Eye and Head Mice: A Validated Assessment Method and an Investigation into the Performance of Eye and Head Based Assistive Technology Pointing Devices''. PhD Thesis, De Montfort University, May 2006. [[http://www.cse.dmu.ac.uk/~rbates/Richard%20Thesis.pdf PDF]]&lt;br /&gt;
* Drewes, H. (2010) ''Eye Gaze Tracking for Human Computer Interaction''. PhD Thesis, der Ludwig-Maximilians-Universität, München. [[http://edoc.ub.uni-muenchen.de/11591/1/Drewes_Heiko.pdf PDF]]&lt;br /&gt;
* Glenstrup, A.J. and Engell-Nielsen, T. (1995) ''Eye Controlled Media: Present and Future State''. University of Copenhagen. [[http://www.diku.dk/~panic/eyegaze/article.html Available online]]&lt;br /&gt;
* Hemmert, F. (2007) ''Those who want to see must close their eyes''. Master's Thesis. Potsdam University of Applied Sciences. [[http://www.eyesclosed.org/cover.html Available online]]&lt;br /&gt;
* Holsanova, J., ''Picture Viewing and Picture Description: Two Windows on the Mind''. Doctoral dissertation. ISSN 1101-8453. ISBN 91-856843-76. [[http://en.scientificcommons.org/7602363 More information ...]]&lt;br /&gt;
* Hyrskykari, A. (2006) ''Eyes in Attentive Interfaces: Experiences from Creating iDict, a Gaze-Aware Reading Aid''. Dissertations in Interactive Technology, Number 4, University of Tampere, 192 pages. [[http://acta.uta.fi/english/teos.phtml?10850 Available online]]&lt;br /&gt;
* Jönsson, E. (2005) ''If Looks Could Kill - an evaluation of Eye Tracking in Computer Games''. Master's Thesis, Royal Institute of Technology, Stockholm, Sweden. [[http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2005/rapporter05/jonsson_erika_05125.pdf PDF]]&lt;br /&gt;
* Kumar, M. (2007) ''Gaze-enhanced User Interface Design''. Dissertation submitted to Stanford University for the degree of Doctor of Philosophy, May 2007. [[http://hci.stanford.edu/research/GUIDe/publications/Manu%20Kumar%20Dissertation%20-%20Gaze-enhanced%20User%20Interface%20Design.pdf PDF]]&lt;br /&gt;
* Li, D. (2006) ''Low-cost eye-tracking for human computer interaction''. Master's Thesis, Iowa State University Human Computer Interaction Program. [[http://thirtysixthspan.com/openEyes/MS-Dongheng-Li-2006.pdf PDF]]&lt;br /&gt;
* Lukander, K., ''Mobile Usability - Measuring Gaze Point on Handheld Devices''. Master's Thesis, Helsinki University of Technology, 2003. [[http://www.soberit.hut.fi/T-121/shared/thesis/Thesis_Lukander.pdf PDF]]&lt;br /&gt;
* Majaranta, P. (2009) ''Text Entry by Eye Gaze''. Dissertations in Interactive Technology, number 11, University of Tampere. Also available in Acta Electronica Universitatis Tamperensis; 869. [[http://acta.uta.fi/english/teos.php?id=11211 Available online]]&lt;br /&gt;
* Qvarfordt, P. (2004) ''Eyes on Multimodal Interaction''. Ph.D. thesis, Department of Computer and Information Science, Linköping University, Sweden, November 2004. [[http://www.ida.liu.se/~perqv/paper/thesis.pdf PDF]]&lt;br /&gt;
&lt;br /&gt;
=== Books written by the eyes ===&lt;br /&gt;
&lt;br /&gt;
* Vigand, Philippe and Stéphane, ''Only the Eyes Say Yes, A Love Story'', Arcade, 1997. [[http://www.amazon.com/Only-Eyes-Say-Philippe-Vigand/dp/1559705086 Amazon]] [[http://www.ebookmall.com/ebooks/only-the-eyes-say-yes-a-love-story-vigand-vigand-ebooks.htm eBookMall]]&lt;br /&gt;
* Martin, J. and Yockey, R., ''On Any Given Day.'' Blair, 2000. [[http://www.amazon.com/Any-Given-Day-Joe-Martin/dp/0895872331 Amazon]]&lt;br /&gt;
* Martin, J., ''Fire in the Rock'', Ballantine Books, 2003. [[http://www.amazon.com/Fire-Rock-Ballantine-Readers-Circle/dp/0345456912 Amazon]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&amp;lt;br /&amp;gt; Know books that are missing from the list? Please send any additions or corrections to [mailto:office@cogain.org office (at) cogain.org]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2724</id>
		<title>Bibliography Eye Tracking Books</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Bibliography_Eye_Tracking_Books&amp;diff=2724"/>
		<updated>2012-02-02T11:08:27Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Theory and Practice */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Reference]][[Category:Bibliography]]&lt;br /&gt;
&lt;br /&gt;
Books related to eye tracking, eye movement research, gaze interaction&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Theory and Practice ===&lt;br /&gt;
&lt;br /&gt;
* Duchowski, A. T. ''Eye Tracking Methodology, Theory and Practice'', Springer, 2003. [[http://andrewd.ces.clemson.edu/book/ More information ...]]&lt;br /&gt;
* Hammoud, R. I. ''Passive Eye Monitoring'', Springer, 2008. [[http://www.springerlink.com/content/x15261 SpringerLink]]&lt;br /&gt;
* Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., and Räihä, K-J. (Eds.) Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies, IGI Global. Release date: October 2011. Copyright © 2012. 382 pages. [[http://www.igi-global.com/book/gaze-interaction-applications-eye-tracking/51941#table-of-contents contents]]&lt;br /&gt;
&lt;br /&gt;
=== Applied Eye Tracking, Gaze Interaction ===&lt;br /&gt;
&lt;br /&gt;
* Adam, P. S., Edmonds, R., Quinn, S., ''Eyetracking the News'', The Poynter Institute, 2007. [[http://www.poynter.org/shop/product_view.asp?id=1193 Poynter]]&lt;br /&gt;
* DiMattia, P. A., Curran, F. X., Gips, J. ''An Eye Control Teaching Device for Students Without Language Expressive Capacity: Eagleeyes'', Edwin Mellen Press, 2001. [[http://www.mellenpress.com/mellenpress.cfm?bookid=512&amp;amp;pc=9 Mellen Press]] [[http://www.amazon.com/Teaching-Students-Language-Expressive-Capacity/dp/0773476393/ref=sr_1_3?ie=UTF8&amp;amp;s=books&amp;amp;qid=1217493159&amp;amp;sr=1-3 Amazon]]&lt;br /&gt;
&lt;br /&gt;
=== Eye Movement Research, Vision Psychology, Perception ===&lt;br /&gt;
&lt;br /&gt;
* Findlay, J. M., Walker, R., and Kentridge, R. W. ''Eye Movement Research. Mechanisms, Processes and Applications'', Elsevier Science, 1995. [[http://www.elsevier.com/wps/find/bookdescription.cws_home/524669/description Elsevier]]&lt;br /&gt;
* Gregory, R. L. ''Eye and Brain, The Psychology of Seeing'', Fifth Edition. Princeton University Press, 1997. [[http://press.princeton.edu/titles/6016.html More information ...]]&lt;br /&gt;
* Hyönä, J., Radach, R., and Deubel, H. ''The mind's eye: Cognitive and applied aspects of eye movement research''. Amsterdam, The Netherlands: North-Holland, 2003. [[http://www.elsevier.com/locate/isbn/0444510206 Elsevier]]&lt;br /&gt;
* Underwood, G., ''Eye Guidance in Reading and Scene Perception'', Elsevier, 1998. [[http://www.elsevier.com/locate/inca/600997 Elsevier]]&lt;br /&gt;
* Wade, N. J. ''A Natural History of Vision'', Bradford Books (MIT), 1998. [[http://mitpress.mit.edu/catalog/author/default.asp?aid=369 MIT Press]]&lt;br /&gt;
* Alfred L. Yarbus (1967) ''Eye Movements and Vision''. Plenum Press: New York.&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking related Conference Proceedings ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Conference|COGAIN]] Proceedings: [[Media:COGAIN2005-proceedings.pdf|2005 (pdf)]], [[Media:COGAIN2006_Proceedings.pdf|2006 (pdf)]], [[Media:COGAIN2007Proceedings.pdf|2007 (pdf)]]&lt;br /&gt;
* [http://www.e-t-r-a.org ETRA] Proceedings in the [http://www.acm.org/dl ACM Digital Library]:  [http://portal.acm.org/toc.cfm?id=355017&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2000], [http://portal.acm.org/toc.cfm?id=507072&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2002], [http://portal.acm.org/toc.cfm?id=968363&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2004], [http://portal.acm.org/toc.cfm?id=1117309&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2006], [http://portal.acm.org/toc.cfm?id=1344471&amp;amp;coll=ACM&amp;amp;dl=ACM&amp;amp;type=proceeding&amp;amp;idx=SERIES053&amp;amp;part=series&amp;amp;WantType=Proceedings&amp;amp;title=ETRA&amp;amp;CFID=38979864&amp;amp;CFTOKEN=83799797 2008]&lt;br /&gt;
* [http://www.eyemovement.org/conference ECEM] Books of Abstracts: [http://www.phys.uu.nl/~ecem10/ECEM10.pdf 1999 (pdf)], [http://www.jemr.org/online/1/s2 2005], [http://www.jemr.org/online/1/s1 2007]&lt;br /&gt;
&lt;br /&gt;
=== Monographs, PhD and Master's Theses ===&lt;br /&gt;
&lt;br /&gt;
* Bates, R. (2006) ''Enhancing the Performance of Eye and Head Mice: A Validated Assessment Method and an Investigation into the Performance of Eye and Head Based Assistive Technology Pointing Devices''. PhD Thesis, De Montfort University, May 2006. [[http://www.cse.dmu.ac.uk/~rbates/Richard%20Thesis.pdf PDF]]&lt;br /&gt;
* Drewes, H. (2010) ''Eye Gaze Tracking for Human Computer Interaction''. PhD Thesis, der Ludwig-Maximilians-Universität, München. [[http://edoc.ub.uni-muenchen.de/11591/1/Drewes_Heiko.pdf PDF]]&lt;br /&gt;
* Glenstrup, A.J. and Engell-Nielsen, T. (1995) ''Eye Controlled Media: Present and Future State''. University of Copenhagen. [[http://www.diku.dk/~panic/eyegaze/article.html Available online]]&lt;br /&gt;
* Hemmert, F. (2007) ''Those who want to see must close their eyes''. Master's Thesis. Potsdam University of Applied Sciences. [[http://www.eyesclosed.org/cover.html Available online]]&lt;br /&gt;
* Holsanova, J., ''Picture Viewing and Picture Description: Two Windows on the Mind''. Doctoral dissertation. ISSN 1101-8453. ISBN 91-856843-76. [[http://en.scientificcommons.org/7602363 More information ...]]&lt;br /&gt;
* Hyrskykari, A. (2006) ''Eyes in Attentive Interfaces: Experiences from Creating iDict, a Gaze-Aware Reading Aid''. Dissertations in Interactive Technology, Number 4, University of Tampere, 192 pages. [[http://acta.uta.fi/english/teos.phtml?10850 Available online]]&lt;br /&gt;
* Jönsson, E. (2005) ''If Looks Could Kill - an evaluation of Eye Tracking in Computer Games''. Master's Thesis, Royal Institute of Technology, Stockholm, Sweden. [[http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2005/rapporter05/jonsson_erika_05125.pdf PDF]]&lt;br /&gt;
* Kumar, M. (2007) ''Gaze-enhanced User Interface Design''. Dissertation submitted to Stanford University for the degree of Doctor of Philosophy, May 2007. [[http://hci.stanford.edu/research/GUIDe/publications/Manu%20Kumar%20Dissertation%20-%20Gaze-enhanced%20User%20Interface%20Design.pdf PDF]]&lt;br /&gt;
* Li, D. (2006) ''Low-cost eye-tracking for human computer interaction''. Master's Thesis, Iowa State University Human Computer Interaction Program. [[http://thirtysixthspan.com/openEyes/MS-Dongheng-Li-2006.pdf PDF]]&lt;br /&gt;
* Lukander, K., ''Mobile Usability - Measuring Gaze Point on Handheld Devices''. Master's Thesis, Helsinki University of Technology, 2003. [[http://www.soberit.hut.fi/T-121/shared/thesis/Thesis_Lukander.pdf PDF]]&lt;br /&gt;
* Majaranta, P. (2009) ''Text Entry by Eye Gaze''. Dissertations in Interactive Technology, number 11, University of Tampere. Also available in Acta Electronica Universitatis Tamperensis; 869. [[http://acta.uta.fi/english/teos.php?id=11211 Available online]]&lt;br /&gt;
* Qvarfordt, P. (2004) ''Eyes on Multimodal Interaction''. Ph.D. thesis, Department of Computer and Information Science, Linköping University, Sweden, November 2004. [[http://www.ida.liu.se/~perqv/paper/thesis.pdf PDF]]&lt;br /&gt;
&lt;br /&gt;
=== Books written by the eyes ===&lt;br /&gt;
&lt;br /&gt;
* Vigand, Philippe and Stéphane, ''Only the Eyes Say Yes, A Love Story'', Arcade, 1997. [[http://www.amazon.com/Only-Eyes-Say-Philippe-Vigand/dp/1559705086 Amazon]] [[http://www.ebookmall.com/ebooks/only-the-eyes-say-yes-a-love-story-vigand-vigand-ebooks.htm eBookMall]]&lt;br /&gt;
* Martin, J. and Yockey, R., ''On Any Given Day.'' Blair, 2000. [[http://www.amazon.com/Any-Given-Day-Joe-Martin/dp/0895872331 Amazon]]&lt;br /&gt;
* Martin, J., ''Fire in the Rock'', Ballantine Books, 2003. [[http://www.amazon.com/Fire-Rock-Ballantine-Readers-Circle/dp/0345456912 Amazon]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&amp;lt;br /&amp;gt; Know books that are missing from the list? Please send any additions or corrections to [mailto:office@cogain.org office (at) cogain.org]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Trackers&amp;diff=2723</id>
		<title>Eye Trackers</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Trackers&amp;diff=2723"/>
		<updated>2012-01-10T06:39:08Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eyetrackers for eye movement research, analysis and evaluation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
'''A catalogue of currently available eye trackers, categorized into systems for assistive technology, research purposes etc.'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Eye Trackers for Assistive Technology and AAC ==&lt;br /&gt;
&lt;br /&gt;
Commercial eye tracking systems that are used for controlling a computer or as communication aids by people with disabilities.&lt;br /&gt;
&lt;br /&gt;
* [[Eye Tracker Intelligaze|Alea Technologies Gmbh: Intelligaze IG-30]]&lt;br /&gt;
* [[Eye Tracker Eyemax|DynaVox Technologies: EyeMax System]]&lt;br /&gt;
* [[Eye Tracker Erica|Eye Response Technologies: ERICA]]&lt;br /&gt;
* [[Eye Tracker Eyetech|EyeTech Digital Systems: EyeTech TM3, TM4, VT1 and VT2]]&lt;br /&gt;
* [[Eye Tracker Eyecan|H.K. EyeCan: VisionKey (5+, 6V/H, 7)]]&lt;br /&gt;
* [http://eyecomcorp.com/eyecom-technology/assistive-communication/ Eye-Com] &lt;br /&gt;
* [[Eye Tracker Seetech|HumanElektronik GmbH: SeeTech]]&lt;br /&gt;
* [[Eye Tracker LCTechnologies|LC Technologies: The Eyegaze Communication System, Eyegaze Edge and Eyegaze Edge Tablet]]&lt;br /&gt;
* [[Eye Tracker Metrovision|Metrovision: VISIOBOARD]]&lt;br /&gt;
* [[Eye Tracker EagleEyes|Opportunity Foundation of America: EagleEyes]]&lt;br /&gt;
* [[Eye Tracker Ecopoint|PRC (Prentke Romich Company): ECOpoint]]&lt;br /&gt;
* [[Eye Tracker Technoworks|TechnoWorks CO.,LTD.: TE-9100 Nursing System for Enhancing Patients' Self-support]]&lt;br /&gt;
* [[Eye Tracker Tobii|Tobii Technology: Tobii C8, C12, CEye, MyTobii P10, D10]]&lt;br /&gt;
* [http://www.utechzone.com.tw/spring Utechzone: Spring]&lt;br /&gt;
&lt;br /&gt;
== Eyetrackers for eye movement research, analysis and evaluation ==&lt;br /&gt;
&lt;br /&gt;
* [http://www.amtech.de/ AmTech GmbH], Compact Integrated Pupillograph (CIP), Pupillographic Sleepiness Test (PST), table mounted, monocular, video based systems&lt;br /&gt;
* [http://www.a-s-l.com/ Applied Science Laboratories], ASL, eye tracking and pupillometry systems, both IROG (limbus tracker) and VOG (video) based systems, both head mounted and remote tracking, also mobile tracking!&lt;br /&gt;
* [http://arringtonresearch.com/ Arrington Research], ViewPoint EyeTracker, both remote and head mounted, video based&lt;br /&gt;
* [http://www.crsltd.com/ Cambridge Research Systems Ltd.], MR-Eyetracker, a low-cost, contact-free eyetracker for fMRI &amp;amp;amp; MEG&lt;br /&gt;
* [http://www.chronos-vision.de/ Chronos Vision] eye tracking devices are used in e.g. neuroscience, ophthalmology, refractive surgery or clinical research. The classic Chronos Eye Tracker was deployed on the International Space Station (ISS) in early 2004 and is in continuous use for the study of eye and head coordination during long-term stays in the weightlessness of spaceflight.&lt;br /&gt;
* [http://www.clsprofakt.com/ CLS ProFakt Ltd], offers eye tracking services, analysis software, and an integrated virtual-shopping eye-tracking tool for FMCG manufacturers&lt;br /&gt;
* [http://designinteractive.net/?p=517 easyGaze(R)], a low-cost high fidelity eye-tracker for research and training enhancement&lt;br /&gt;
* [http://www.interlog.com/~elmarinc/ EL-MAR Inc.], VISION 2000, portable head mounted video based eye-tracking systems&lt;br /&gt;
* [http://www.ergoneers.de/en/products/dlab-dikablis/overview.html Ergoneers Dikablis], software and hardware suite D-Lab &amp;amp; Dikablis for planning, performing and analyzing eye-tracking and behavioral experiments; fully automated gaze-data analysis in any environment, without restrictions on head and body movement, within a motion range of 500 m with the Dikablis Wireless eye-tracking system.&lt;br /&gt;
* [http://eyecomcorp.com/ Eye-Com] wearable eye tracking and head tracking for clinical and human factors research. &lt;br /&gt;
* [http://www.eyetechds.com/ EyeTech Digital Systems], EyeTech TM3 Eye Tracker Add-on, Research Package, and MegaTracker with free API with full access to raw gaze data and eye metrics&lt;br /&gt;
* [http://www.eyetracking.com/ EyeTracking, Inc.], technology developed by [http://www.sci.sdsu.edu/cerf/content/Eyestudies.html Marshall &amp;amp;amp; CERF, San Diego State University]&lt;br /&gt;
* [http://www.fourward.com/ Fourward Technologies, Inc.], Dual-Purkinje-Image (DPI) Eyetracker, mainly for research purposes&lt;br /&gt;
* [http://www.brain.northwestern.edu/ilab/ ILAB], eye movement analysis software; works with a number of common eye trackers by ASL, ISCAN, and SMI, and also reads CORTEX files&lt;br /&gt;
* [http://www.interactive-minds.com/ Interactive Minds], Eye tracking software and tools &lt;br /&gt;
* [http://www.is.cs.cmu.edu/mie/ Interactive Systems Labs], Model-based face and gaze tracking (from video image), Carnegie Mellon University&lt;br /&gt;
* [http://www.iota.se/ Iota AB], EyeTrace Systems, head mounted, binocular, video and IR based eye trace systems&lt;br /&gt;
* [http://www.iscaninc.com/ ISCAN], Eye &amp;amp;amp; Target Tracking Instrumentation, head mounted and remote eye tracking systems, single and multiple target video tracking systems&lt;br /&gt;
* [http://www.lctinc.com LC Technologies Inc.], a remote video based eyegaze development system for human factors research&lt;br /&gt;
* [http://www.mangold-international.com/ Mangold International], MangoldVision for lightweight, portable eye tracking, solutions for both remote and head-mounted eye tracking. Software for data recording and analysis.&lt;br /&gt;
* [http://www.metrovision.fr/ Metrovision], MonEOG: Electro-oculography (EOG) potential measurement based gaze tracking, MonVOG1&amp;amp;amp;2: video-oculography (VOG) based gaze tracking&lt;br /&gt;
* [http://www.mirametrix.com/ Mirametrix], Portable, remote, USB based eye tracking for academic and market research with the S1 eye tracker and easy to use open standard API &lt;br /&gt;
* [http://www.nacinc.com/ NAC Image Technology], NAC EMR-8 eye path tracking (IROG based)&lt;br /&gt;
* [http://www.ober-consulting.com/9/lang/1/ Ober Consulting Poland: JAZZ-novo], portable multisensor system with IR based eye-tracker (1 kHz temporal resolution), head rotation and tilt measurement, blood pulse monitoring, voice recording and optional video context recording, designed to study human interaction with the environment.&lt;br /&gt;
* [http://www.ober-consulting.com/11/lang/1/ Ober Consulting Poland: Saccadometer], portable eye movement laboratory for study on saccadic reactions using multiple diagnostic experiments, integrated stimulation and eye movement measurement and recording system, head mounted, IR based (1 kHz temporal resolution).&lt;br /&gt;
* [http://www.optom.de/ Optomotor Laboratories], Express-Eye, a stand-alone eye tracker with saccade analysis, and FixTrain, a small hand held device for daily training of saccadic eye movement control&lt;br /&gt;
* [http://www.primelec.ch/ Primelec, D. Florin], Angle-Meter NT, a digitally controlled scleral search coil system for the linear detection of 3D angular eye and head movements&lt;br /&gt;
* [http://www.seeingmachines.com/ Seeing Machines], faceLAB, a 3D head position and eye-gaze direction tracking system (VOG based)&lt;br /&gt;
* [http://www.smivision.com/en/eye-gaze-tracking-systems/home.html SensoMotoric Instruments GmbH], Remote (RED), head mounted (HED) and Hi-Speed eye and gaze tracking for research and applied science, open programming interface and comprehensive stimulus/analysis software.&lt;br /&gt;
* [http://www.skalar.nl/ Skalar Medical BV], head mounted Chronos and IRIS eye trackers, Scleral Search Coil Systems&lt;br /&gt;
* [http://www.smarteye.se/ Smart Eye AB], eye tracking analysis based on any standard camera(s), analog or digital&lt;br /&gt;
* [http://www.eyelinkinfo.com/ SR Research Ltd], EyeLink II, video based, head mounted eye tracking system&lt;br /&gt;
* [http://www.synthenv.com/eyetalk.htm Synthetic Environments, Inc.], EyeTalk integrates voice recognition and eye-tracking&lt;br /&gt;
* [http://www.testusability.com/ TestUsability], EyeCatcher system measures eye scanning and mouse clicking, a helmet fitted with cameras, optics and a microphone&lt;br /&gt;
* [http://www.thomasrecording.de/ Thomas RECORDING GmbH], Eye-Tracking-System (ET-49), constructed for neuroscientific purposes; enables a laboratory to record and correlate the monkey's eye position&lt;br /&gt;
* [http://www.tobii.com/ Tobii Technology], Tobii T60 and T120 Eye Trackers - both integrated into a 17&amp;quot; TFT monitor, and Tobii X120 Eye Tracker - a standalone eye tracking unit designed for eye tracking studies relative to any surface.&lt;br /&gt;
&lt;br /&gt;
== Open source gaze tracking and freeware eye tracking ==&lt;br /&gt;
&lt;br /&gt;
This list contains low-cost, free and open source eye tracking systems and research prototypes, and information that should help in building your own eye tracker. Some of them are targeted at people with disabilities (eye-control systems), some for more general eye tracking and research.&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyewriter.org/ EyeWriter], low-cost eye-tracking apparatus &amp;amp; custom open-source software that allows graffiti writers and artists with paralysis to draw using only their eyes.&lt;br /&gt;
* [http://www.gazegroup.org/downloads/23-gazetracker/ ITU Gaze Tracker], works with a webcam or a video camera with night vision and infrared illumination.&lt;br /&gt;
* [http://thirtysixthspan.com/openEyes/ openEyes], open-source open-hardware toolkit for low-cost real-time eye tracking&amp;lt;br /&amp;gt; See also the [http://joelclemens.colinr.ca/eyetrack/ Windows version] by Joel Clemens.&lt;br /&gt;
* [http://www.inference.phy.cam.ac.uk/opengazer/ Opengazer], open-source gaze tracker for ordinary webcams.&lt;br /&gt;
* [http://www.codeproject.com/KB/cpp/TrackEye.aspx TrackEye]&amp;lt;nowiki&amp;gt;: Real-Time Tracking Of Human Eyes Using a Webcam. Implemented in C++ using the &amp;lt;/nowiki&amp;gt;[http://sourceforge.net/project/showfiles.php?group_id=22870&amp;amp;package_id=16937 OpenCV library].&lt;br /&gt;
* [http://myeye.jimdo.com/ myEye], eye-tracking software to allow people with severe motor disabilities to use gaze as an input device for interacting with a computer. Beta version of the prototype software available for download.&lt;br /&gt;
&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/software.html Software for the automated classification of fixations and saccades], contains implementations of five popular eye movement classification algorithms by O. Komogortsev et al. at Texas State University.&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/2010/08/how-to-build-low-cost-eye-tracking.html How to build low cost eye tracker] - instructions.&lt;br /&gt;
&lt;br /&gt;
== Low-cost eye tracking ==&lt;br /&gt;
&lt;br /&gt;
* [http://www.ober-consulting.com/13/lang/1/ Bink-IT], a communication and environmental control system based on eye blinks by Ober Consulting.&lt;br /&gt;
* [http://www.youtube.com/user/dmardanbeigi Dias Eye Tracker], low cost eye tracker developed by Diako Mardanbeigi at the Iran University of Science &amp;amp; Technology.&lt;br /&gt;
* [http://cyber.felk.cvut.cz/i4c/en_system.html I4Control®], low-cost eye control system (under development)&lt;br /&gt;
* [http://www.magickey.ipg.pt/ Magic Key], low-cost eye control system (under development)&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyetrackingtest.com/ Eye Tracking Test], affordable eye tracking and usability services&lt;br /&gt;
&lt;br /&gt;
== Open source and freeware eye movement analysis tools ==&lt;br /&gt;
&lt;br /&gt;
* [http://thediemproject.wordpress.com/software/ CARPE] &amp;quot;Computational and Algorithmic Representation and Processing of Eye-movements&amp;quot; visualisation and analysis tool by the [http://thediemproject.wordpress.com/ DIEM] (Dynamic Images and Eye-Movements) Project&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/etud.html COGAIN ETU Driver], Eye-Tracking Universal (Standard) Driver, which helps the developer to '''build tracker-independent applications''' and test them off-line with a gaze data simulator!&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/icomp.html iComponent], tracker-independent analysis and visualization tool by Oleg Spakov.&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/gwm.html GWM: Gaze-to-Word Mapping Tool], a collection of gaze-to-word mapping engines, text mask creators and translation, and more...&lt;br /&gt;
&lt;br /&gt;
* [http://didaktik.physik.fu-berlin.de/projekte/ogama/ OGAMA (OpenGazeAndMouseAnalyzer)], an open source software designed to analyze eye and mouse movements in slideshow study designs&lt;br /&gt;
* [http://sourceforge.net/projects/ritcode/ RITCode] analysis tool for captured eye tracker video files, created by the Rochester Institute Of Technology Visual Perception Lab.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''See also:''' [[Eye Gaze Communication Board]], low-tech eye pointing, a cheap (self-made) gaze communication board, a &amp;quot;first aid&amp;quot; solution for acute communication needs&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
Please email any corrections or additions to [mailto:office@cogain.org office@cogain.org].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Trackers&amp;diff=2720</id>
		<title>Eye Trackers</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Trackers&amp;diff=2720"/>
		<updated>2012-01-09T08:43:54Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eyetrackers for eye movement research, analysis and evaluation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
'''A catalogue of currently available eye trackers, categorized into systems for assistive technology, research purposes etc.'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Eye Trackers for Assistive Technology and AAC ==&lt;br /&gt;
&lt;br /&gt;
Commercial eye tracking systems that are used for controlling a computer or as communication aids by people with disabilities.&lt;br /&gt;
&lt;br /&gt;
* [[Eye Tracker Intelligaze|Alea Technologies Gmbh: Intelligaze IG-30]]&lt;br /&gt;
* [[Eye Tracker Eyemax|DynaVox Technologies: EyeMax System]]&lt;br /&gt;
* [[Eye Tracker Erica|Eye Response Technologies: ERICA]]&lt;br /&gt;
* [[Eye Tracker Eyetech|EyeTech Digital Systems: EyeTech TM3]]&lt;br /&gt;
* [[Eye Tracker Eyecan|H.K. EyeCan: VisionKey (5+, 6V/H, 7)]]&lt;br /&gt;
* [http://eyecomcorp.com/eyecom-technology/assistive-communication/ Eye-Com] &lt;br /&gt;
* [[Eye Tracker Seetech|HumanElektronik GmbH: SeeTech]]&lt;br /&gt;
* [[Eye Tracker LCTechnologies|LC Technologies: The Eyegaze Communication System, Eyegaze Edge and Eyegaze Edge Tablet]]&lt;br /&gt;
* [[Eye Tracker Metrovision|Metrovision: VISIOBOARD]]&lt;br /&gt;
* [[Eye Tracker EagleEyes|Opportunity Foundation of America: EagleEyes]]&lt;br /&gt;
* [[Eye Tracker Ecopoint|PRC (Prentke Romich Company): ECOpoint]]&lt;br /&gt;
* [[Eye Tracker Technoworks|TechnoWorks CO.,LTD.: TE-9100 Nursing System for Enhancing Patients' Self-support]]&lt;br /&gt;
* [[Eye Tracker Tobii|Tobii Technology: Tobii C8, C12, CEye, MyTobii P10, D10]]&lt;br /&gt;
* [http://www.utechzone.com.tw/spring Utechzone: Spring]&lt;br /&gt;
&lt;br /&gt;
== Eyetrackers for eye movement research, analysis and evaluation ==&lt;br /&gt;
&lt;br /&gt;
* [http://www.amtech.de/ AmTech GmbH], Compact Integrated Pupillograph (CIP), Pupillographic Sleepiness Test (PST), table mounted, monocular, video based systems&lt;br /&gt;
* [http://www.a-s-l.com/ Applied Science Laboratories], ASL, eye tracking and pupillometry systems, both IROG (limbus tracker) and VOG (video) based systems, both head mounted and remote tracking, also mobile tracking!&lt;br /&gt;
* [http://arringtonresearch.com/ Arrington Research], ViewPoint EyeTracker, both remote and head mounted, video based&lt;br /&gt;
* [http://www.crsltd.com/ Cambridge Research Systems Ltd.], MR-Eyetracker, a low-cost, contact-free eyetracker for fMRI &amp;amp;amp; MEG&lt;br /&gt;
* [http://www.chronos-vision.de/ Chronos Vision] eye tracking devices are used in e.g. neuroscience, ophthalmology, refractive surgery or clinical research. The classic Chronos Eye Tracker was deployed on the International Space Station (ISS) in early 2004 and is in continuous use for the study of eye and head coordination during long-term stays in the weightlessness of spaceflight.&lt;br /&gt;
* [http://www.clsprofakt.com/ CLS ProFakt Ltd], offers eye tracking services, analysis software, and an integrated virtual-shopping eye-tracking tool for FMCG manufacturers&lt;br /&gt;
* [http://designinteractive.net/?p=517 easyGaze(R)], an eye tracker for evaluation and interaction.&lt;br /&gt;
* [http://www.interlog.com/~elmarinc/ EL-MAR Inc.], VISION 2000, portable head mounted video based eye-tracking systems&lt;br /&gt;
* [http://www.ergoneers.de/en/products/dlab-dikablis/overview.html Ergoneers Dikablis], software and hardware suite D-Lab &amp;amp; Dikablis for planning, performing and analyzing eye-tracking and behavioral experiments; fully automated gaze-data analysis in any environment, without restrictions on head and body movement, within a motion range of 500 m with the Dikablis Wireless eye-tracking system.&lt;br /&gt;
* [http://eyecomcorp.com/ Eye-Com] wearable eye tracking and head tracking for clinical and human factors research. &lt;br /&gt;
* [http://www.eyetechds.com/ EyeTech Digital Systems], EyeTech TM3 Eye Tracker Add-on, Research Package, and MegaTracker with free API with full access to raw gaze data and eye metrics&lt;br /&gt;
* [http://www.eyetracking.com/ EyeTracking, Inc.], technology developed by [http://www.sci.sdsu.edu/cerf/content/Eyestudies.html Marshall &amp;amp;amp; CERF, San Diego State University]&lt;br /&gt;
* [http://www.fourward.com/ Fourward Technologies, Inc.], Dual-Purkinje-Image (DPI) Eyetracker, mainly for research purposes&lt;br /&gt;
* [http://www.brain.northwestern.edu/ilab/ ILAB], eye movement analysis software; works with a number of common eye trackers by ASL, ISCAN, and SMI, and also reads CORTEX files&lt;br /&gt;
* [http://www.interactive-minds.com/ Interactive Minds], Eye tracking software and tools &lt;br /&gt;
* [http://www.is.cs.cmu.edu/mie/ Interactive Systems Labs], Model-based face and gaze tracking (from video image), Carnegie Mellon University&lt;br /&gt;
* [http://www.iota.se/ Iota AB], EyeTrace Systems, head mounted, binocular, video and IR based eye trace systems&lt;br /&gt;
* [http://www.iscaninc.com/ ISCAN], Eye &amp;amp;amp; Target Tracking Instrumentation, head mounted and remote eye tracking systems, single and multiple target video tracking systems&lt;br /&gt;
* [http://www.lctinc.com LC Technologies Inc.], a remote video based eyegaze development system for human factors research&lt;br /&gt;
* [http://www.mangold-international.com/ Mangold International], MangoldVision for lightweight, portable eye tracking, solutions for both remote and head-mounted eye tracking. Software for data recording and analysis.&lt;br /&gt;
* [http://www.metrovision.fr/ Metrovision], MonEOG: Electro-oculography (EOG) potential measurement based gaze tracking, MonVOG1&amp;amp;amp;2: video-oculography (VOG) based gaze tracking&lt;br /&gt;
* [http://www.mirametrix.com/ Mirametrix], Portable, remote, USB based eye tracking for academic and market research with the S1 eye tracker and easy to use open standard API &lt;br /&gt;
* [http://www.nacinc.com/ NAC Image Technology], NAC EMR-8 eye path tracking (IROG based)&lt;br /&gt;
* [http://www.ober-consulting.com/9/lang/1/ Ober Consulting Poland: JAZZ-novo], portable multisensor system with IR based eye-tracker (1 kHz temporal resolution), head rotation and tilt measurement, blood pulse monitoring, voice recording and optional video context recording, designed to study human interaction with the environment.&lt;br /&gt;
* [http://www.ober-consulting.com/11/lang/1/ Ober Consulting Poland: Saccadometer], portable eye movement laboratory for study on saccadic reactions using multiple diagnostic experiments, integrated stimulation and eye movement measurement and recording system, head mounted, IR based (1 kHz temporal resolution).&lt;br /&gt;
* [http://www.optom.de/ Optomotor Laboratories], Express-Eye, a stand-alone eye tracker with saccade analysis, and FixTrain, a small hand held device for daily training of saccadic eye movement control&lt;br /&gt;
* [http://www.primelec.ch/ Primelec, D. Florin], Angle-Meter NT, a digitally controlled scleral search coil system for the linear detection of 3D angular eye and head movements&lt;br /&gt;
* [http://www.seeingmachines.com/ Seeing Machines], faceLAB, a 3D head position and eye-gaze direction tracking system (VOG based)&lt;br /&gt;
* [http://www.smivision.com/en/eye-gaze-tracking-systems/home.html SensoMotoric Instruments GmbH], Remote (RED), head mounted (HED) and Hi-Speed eye and gaze tracking for research and applied science, open programming interface and comprehensive stimulus/analysis software.&lt;br /&gt;
* [http://www.skalar.nl/ Skalar Medical BV], head mounted Chronos and IRIS eye trackers, Scleral Search Coil Systems&lt;br /&gt;
* [http://www.smarteye.se/ Smart Eye AB], eye tracking analysis based on any standard camera(s), analog or digital&lt;br /&gt;
* [http://www.eyelinkinfo.com/ SR Research Ltd], EyeLink II, video based, head mounted eye tracking system&lt;br /&gt;
* [http://www.synthenv.com/eyetalk.htm Synthetic Environments, Inc.], EyeTalk integrates voice recognition and eye-tracking&lt;br /&gt;
* [http://www.testusability.com/ TestUsability], EyeCatcher system measures eye scanning and mouse clicking, a helmet fitted with cameras, optics and a microphone&lt;br /&gt;
* [http://www.thomasrecording.de/ Thomas RECORDING GmbH], Eye-Tracking-System (ET-49), constructed for neuro-scientific purposes, enabling a laboratory to record and correlate the monkey's eye position.&lt;br /&gt;
* [http://www.tobii.com/ Tobii Technology], Tobii T60 and T120 Eye Trackers - both integrated into a 17&amp;quot; TFT monitor, and Tobii X120 Eye Tracker - a standalone eye tracking unit designed for eye tracking studies relative to any surface.&lt;br /&gt;
&lt;br /&gt;
== Open source gaze tracking and freeware eye tracking ==&lt;br /&gt;
&lt;br /&gt;
This list contains low-cost, free and open source eye tracking systems and research prototypes, and information that should help in building your own eye tracker. Some of them are targeted at people with disabilities (eye-control systems), some for more general eye tracking and research.&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyewriter.org/ EyeWriter], low-cost eye-tracking apparatus &amp;amp; custom open-source software that allows graffiti writers and artists with paralysis to draw using only their eyes.&lt;br /&gt;
* [http://www.gazegroup.org/downloads/23-gazetracker/ ITU Gaze Tracker], works with a webcam or video camera with night vision and infrared illumination.&lt;br /&gt;
* [http://thirtysixthspan.com/openEyes/ openEyes], open-source open-hardware toolkit for low-cost real-time eye tracking&amp;lt;br /&amp;gt; See also the [http://joelclemens.colinr.ca/eyetrack/ Windows version] by Joel Clemens.&lt;br /&gt;
* [http://www.inference.phy.cam.ac.uk/opengazer/ Opengazer], open-source gaze tracker for ordinary webcams.&lt;br /&gt;
* [http://www.codeproject.com/KB/cpp/TrackEye.aspx TrackEye]&amp;lt;nowiki&amp;gt;: Real-Time Tracking Of Human Eyes Using a Webcam. Implemented in C++ using the &amp;lt;/nowiki&amp;gt;[http://sourceforge.net/project/showfiles.php?group_id=22870&amp;amp;package_id=16937 OpenCV library].&lt;br /&gt;
* [http://myeye.jimdo.com/ myEye], eye-tracking software to allow people with severe motor disabilities to use gaze as an input device for interacting with a computer. Beta version of the prototype software available for download.&lt;br /&gt;
&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/software.html Software for the automated classification of fixations and saccades], contains the implementation of five popular eye movement classification algorithms by O. Komogortsev et al. at Texas State University.&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/2010/08/how-to-build-low-cost-eye-tracking.html How to build low cost eye tracker] - instructions.&lt;br /&gt;
&lt;br /&gt;
== Low-cost eye tracking ==&lt;br /&gt;
&lt;br /&gt;
* [http://www.ober-consulting.com/13/lang/1/ Bink-IT], a communication and environmental control system based on eye blinks by Ober Consulting.&lt;br /&gt;
* [http://www.youtube.com/user/dmardanbeigi Dias Eye Tracker], low cost eye tracker developed by Diako Mardanbeigi at the Iran University of Science &amp;amp; Technology.&lt;br /&gt;
* [http://cyber.felk.cvut.cz/i4c/en_system.html I4Control®], low-cost eye control system (under development)&lt;br /&gt;
* [http://www.magickey.ipg.pt/ Magic Key], low-cost eye control system (under development)&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyetrackingtest.com/ Eye Tracking Test], affordable eye tracking and usability services&lt;br /&gt;
&lt;br /&gt;
== Open source and freeware eye movement analysis tools ==&lt;br /&gt;
&lt;br /&gt;
* [http://thediemproject.wordpress.com/software/ CARPE] &amp;quot;Computational and Algorithmic Representation and Processing of Eye-movements&amp;quot; visualisation and analysis tool by the [http://thediemproject.wordpress.com/ DIEM] (Dynamic Images and Eye-Movements) Project&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/etud.html COGAIN ETU Driver], Eye-Tracking Universal (Standard) Driver, which helps the developer to '''build tracker-independent applications''' and test them off-line with a gaze data simulator!&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/icomp.html iComponent], tracker-independent analysis and visualization tool by Oleg Spakov.&lt;br /&gt;
* [http://www.cs.uta.fi/~oleg/gwm.html GWM: Gaze-to-Word Mapping Tool], a collection of gaze-to-word mapping engines, text mask creators and translation, and more...&lt;br /&gt;
&lt;br /&gt;
* [http://didaktik.physik.fu-berlin.de/projekte/ogama/ OGAMA (OpenGazeAndMouseAnalyzer)], an open source software designed to analyze eye and mouse movements in slideshow study designs&lt;br /&gt;
* [http://sourceforge.net/projects/ritcode/ RITCode] analysis tool for captured eye tracker video files, created by the Rochester Institute Of Technology Visual Perception Lab.&lt;br /&gt;
&lt;br /&gt;
== See also ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* [[Eye Gaze Communication Board]], low-tech eye pointing: a cheap (self-made) gaze communication board, a &amp;quot;first aid&amp;quot; solution for acute communication needs&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
Please email any corrections or additions to [mailto:office@cogain.org office@cogain.org].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2719</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2719"/>
		<updated>2011-11-24T09:56:46Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures solely by eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a tool for supporting the design process with gaze-added CAD tool. Includes info &amp;amp; free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[http://??.??.?? ECEM 2013, to be organized in Lund in collaboration with COGAIN]] [[https://sites.google.com/a/univ-provence.fr/ecem2011/ ECEM 2011]] [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Austria 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], currently available systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and on complications affecting the eyes and eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackerball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. Refers and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability related stuff.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2718</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2718"/>
		<updated>2011-11-24T09:56:32Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures solely by eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a tool for supporting the design process with gaze-added CAD tool. Includes info &amp;amp; free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[http:// ECEM 2013, to be organized in Lund in collaboration with COGAIN]] [[https://sites.google.com/a/univ-provence.fr/ecem2011/ ECEM 2011]] [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Austria 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], currently available systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and on complications affecting the eyes and eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackerball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. Refers and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability related stuff.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2717</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2717"/>
		<updated>2011-11-24T09:55:18Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures solely by eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[ECEM 2013, to be organized in Lund in collaboration with COGAIN]] [[https://sites.google.com/a/univ-provence.fr/ecem2011/ ECEM 2011]] [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], currently available systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and eye-sight complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability related stuff.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on the web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2716</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2716"/>
		<updated>2011-11-24T09:55:08Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[?? ECEM 2013, to be organized in Lund in collaboration with COGAIN]] [[https://sites.google.com/a/univ-provence.fr/ecem2011/ ECEM 2011]] [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], currently available systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and eye-sight complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability related stuff.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on the web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2715</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2715"/>
		<updated>2011-11-24T09:54:32Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[ECEM 2013 - organized in Lund in collaboration with COGAIN]] [[https://sites.google.com/a/univ-provence.fr/ecem2011/ ECEM 2011]] [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], currently available systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and eye-sight complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability related stuff.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2714</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2714"/>
		<updated>2011-11-24T09:53:22Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool for supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[https://sites.google.com/a/univ-provence.fr/ecem2011/ ECEM 2011]] [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and eyesight complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related material.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal], linking to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2713</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2713"/>
		<updated>2011-11-24T09:52:38Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool for supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and eyesight complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related material.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal], linking to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2712</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2712"/>
		<updated>2011-11-24T09:52:23Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool for supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://etra.cs.uta.fi/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and eyesight complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related material.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2711</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2711"/>
		<updated>2011-11-24T09:51:23Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures with the eyes alone, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint: pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool for supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.etra2012.org/ ETRA 2012]] [[http://www.e-t-r-a.org/2010/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and common eye complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self-)training in gaze interaction and eye-tracking-related topics&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open-source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas, and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related material.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2710</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2710"/>
		<updated>2011-11-24T09:50:54Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures with the eyes alone, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint: pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool for supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.e-t-r-a.org/2012/ ETRA 2012]] [[http://www.e-t-r-a.org/2010/ ETRA 2010]] [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and common eye complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self-)training in gaze interaction and eye-tracking-related topics&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open-source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas, and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related material.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2709</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2709"/>
		<updated>2011-11-24T09:47:26Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures with the eyes alone, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint: pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool for supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[http://www.cogain.org/cogain2009 2009]] [[http://www.cogain.org/cogain2008 2008]] [[http://www.cogain.org/cogain2007 2007]] [[http://www.cogain.org/cogain2006 2006]] [[http://www.cogain.org/wiki/COGAIN_Camp2005_Program COGAIN 2005]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[http://www.ecem2009.org/ ECEM 2009]] [[http://www.ecem2007.org/ ECEM 2007]] [[http://www.ecem.ch/ ECEM13 2005]] [[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]] [[http://congress.utu.fi/ecem11/ ECEM11 2001]] [[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[http://www.e-t-r-a.org/2008/ ETRA 2008]] [[http://www.e-t-r-a.org/2006/ ETRA 2006]] [[http://www.e-t-r-a.org/2004/ ETRA 2004]] [[http://www.e-t-r-a.org/2002/ ETRA 2002]] [[http://www.e-t-r-a.org/2000/ ETRA 2000]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[http://www.ecvp2008.org/ ECVP 2008]] [[http://www.ecvp.org/meetings.html Archive of previous meetings]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]] [[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]] [[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]] [[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]] [[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and common eye complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self-)training in gaze interaction and eye-tracking-related topics&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2708</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2708"/>
		<updated>2011-11-24T09:46:27Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures solely by eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool that supports the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* '''ETA''' [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], resources on eye anatomy and the complications that can affect eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2707</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2707"/>
		<updated>2011-11-24T09:45:50Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures solely by eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool that supports the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012 (ETA2012)]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], resources on eye anatomy and the complications that can affect eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joysticks, trackballs, head pointers, or eye trackers.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of which are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2706</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2706"/>
		<updated>2011-11-24T09:45:31Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye Tracking Conferences and Meetings */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures solely by eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool that supports the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
* [http://www.cqu.edu.au/eyetrackaustralia EyeTrack Australia 2012 (ETA2012)]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], extensive information on eye anatomy and on complications affecting the eyes and eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources available here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. Links to and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eye-tracking research papers on the web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2705</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2705"/>
		<updated>2011-11-23T05:39:20Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno: FreeGaze tracker, navigation support by eye movements, fast menu selection method, etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-augmented CAD tool for supporting the design process. Includes info &amp;amp; free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], extensive information on eye anatomy and on complications affecting the eyes and eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources available here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. Links to and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eye-tracking research papers on the web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2704</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2704"/>
		<updated>2011-11-23T05:38:54Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno: FreeGaze tracker, navigation support by eye movements, fast menu selection method, etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-augmented CAD tool for supporting the design process. Includes info &amp;amp; free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], extensive information on eye anatomy and on complications affecting the eyes and eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources available here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD)], database of eye movement recordings&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. Links to and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web], gaze-enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eye-tracking research papers on the web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2703</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2703"/>
		<updated>2011-11-23T05:37:57Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno: FreeGaze tracker, navigation support by eye movements, fast menu selection method, etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-augmented CAD tool for supporting the design process. Includes info &amp;amp; free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and on complications affecting the eyes and eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackerball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD) v1], a database of 358 eye movement recordings collected from 59 unique individuals&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. Refers to and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keeps you up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2702</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2702"/>
		<updated>2011-11-23T05:37:36Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Software */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures solely by eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a tool that supports the design process with a gaze-enhanced CAD tool. Includes info &amp;amp; free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], currently available systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave Plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], information and resources on eye anatomy and on complications affecting the eyes and eyesight&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources found here on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackerball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
* [http://www.cs.txstate.edu/~ok11/embd_v1.html Eye Movement Biometric Database (EMBD) v1], a database of 358 eye movement recordings collected from 59 unique individuals&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. Refers to and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog], mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keeps you up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Gaze-Controlled_Games&amp;diff=2701</id>
		<title>Gaze-Controlled Games</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Gaze-Controlled_Games&amp;diff=2701"/>
		<updated>2011-11-11T11:21:03Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* YouTube video clips on gaze &amp;amp;amp; games */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]][[Category:Leisure Applications]] [[Category:Bibliography]]&lt;br /&gt;
Online information resources on how to use gaze for the control of games and other leisure applications&lt;br /&gt;
&lt;br /&gt;
=== Papers I: Evaluation of using gaze control in games and other leisure applications ===&lt;br /&gt;
&lt;br /&gt;
==== Gaze Controlled Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The quality and availability of eye tracking equipment has been increasing while costs have been decreasing. These trends increase the possibility of using eye trackers for entertainment purposes. Games that can be controlled solely through movement of the eyes would be accessible to persons with decreased limb mobility or control. On the other hand, use of eye tracking can change the gaming experience for all players, by offering richer input and enabling attention-aware games. Eye tracking is not currently widely supported in gaming, and games specifically developed for use with an eye tracker are rare. This paper reviews past work on eye tracker gaming and charts future development possibilities in different sub-domains within it. It argues that based on the user input requirements and gaming contexts, conventional computer games can be classified into groups that offer fundamentally different opportunities for eye tracker input. In addition to the inherent design issues, there are challenges and varying levels of support for eye tracker use in the technical implementations of the games.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Isokoski P., Joos, M., Spakov, O., &amp;amp;amp; Martin, B. (2009). Gaze Controlled Games. Universal Access in the Information Society 8(4). Springer.&amp;lt;br /&amp;gt; Link: [http://dx.doi.org/10.1007/s10209-009-0146-3]&lt;br /&gt;
&lt;br /&gt;
==== Eye Tracker Input in First Person Shooter Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We report ongoing work on using an eye tracker as an input device in first person shooter (FPS) games. In these games the player moves in a three-dimensional virtual world that is rendered from the player's point of view. The player interacts with the objects he or she encounters mainly by shooting at them. Typical game storylines reward killing and punish other forms of interaction. The reported work is a part of an effort to evaluate a range of input devices in this context. Our results on the other devices in the same game allow us to compare the efficiency of eye trackers as game controllers against more conventional devices. Our goal regarding eye trackers is to see whether they can help players perform better. Some FPS games are played competitively over the Internet. If using an eye tracker gives an edge in competitive play, players may want to acquire eye tracking equipment. Eye trackers as input devices in FPS games have been investigated before (Jönsson, 2005), but that investigation focused on user impressions rather than on the efficiency and effectiveness of eye trackers in this domain. However, Jönsson's results on eye tracker efficiency in a non-FPS game were encouraging.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Isokoski, P., &amp;amp;amp; Martin, B. (2006). Eye Tracker Input in First Person Shooter Games. In Proceedings of COGAIN 2006: Gazing into the Future, 78-81.&amp;lt;br /&amp;gt; Link: http://www.cs.uta.fi/~poika/cogain2006/cogain2006.pdf&lt;br /&gt;
&lt;br /&gt;
==== Use of Eye Movements for Video Game Control ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We present a study that explores the use of a commercially available eye tracker as a control device for video games. We examine its use across multiple gaming genres and present games that utilize the eye tracker in a variety of ways. First, we describe a first person shooter that uses the eyes to control orientation. Second, we study the use of eye movements for more natural interaction with characters in a role playing game. And lastly, we examine the use of eye tracking as a means to control a modified version of the classic action/arcade game Missile Command. Our results indicate that the use of an eye tracker can increase the immersion of a video game and can significantly alter the gameplay experience.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Smith, J. D., &amp;amp;amp; Graham, T. C. N. (2006). Use of Eye Movements for Video Game Control. In ACM Advancements in Computer Entertainment Technology (Hollywood, CA, USA, June 14 - 16, 2006). ACE 2006. ACM Press, New York, NY. &amp;lt;br /&amp;gt; Link: http://www.cs.queensu.ca/~smith/papers/ace2006.pdf&lt;br /&gt;
&lt;br /&gt;
==== If Looks Could Kill - An Evaluation of Eye Tracking in Computer Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The possibility to track human eye gaze is not new. Different eye tracking devices have been available for several years. The technology has for instance been used in psychological research, usability evaluation and in equipment for disabled people. The devices have often required the user to utilize a chinrest, a bite board or other cumbersome equipment. Hence, the use of eye tracking has been limited to restricted environments.&lt;br /&gt;
&lt;br /&gt;
In recent years, new non-intrusive eye tracking technology has become available. This has made it possible to use eye tracking in new, natural environments. The aim of this study was to evaluate the use of eye tracking in computer games. A literature study was made to gather information about eye tracker systems, existing eye gaze interfaces and computer games. The analysis phase included interviews with people working with human-computer interaction and game development, a focus group session and an evaluation of computer games. The result of the analysis consisted of a summary of interaction sequences presumably suitable for control with the eyes. Three different prototypes of eye controlled computer games were developed. The first was a shoot'em up game where the player aimed with his eyes to shoot monsters that appeared in random places. The two other prototypes were developed with the Half Life Software Development Kit. In the first Half Life prototype, the player aimed a weapon with his eyes. In the second, the view of sight was controlled with the eyes. The different eye controlled game prototypes were evaluated in a usability study. The subjects played the different prototypes with mouse and eyes respectively. Their experience was evaluated with the thinking aloud method, questionnaires and an interview. The result showed that interaction with the eyes is very fast, easy to learn and perceived to be natural and relaxed. According to the usability study, eye control can provide a more fun and committing gaming experience than ordinary mouse control. Eye-controlled computer games are a very new area that needs to be further developed and evaluated. The result of this study suggests that eye based interaction may be very successful in computer games.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Jönsson, E. (2005). If Looks Could Kill - An Evaluation of Eye Tracking in Computer Games. Master's Thesis, Department of Numerical Analysis and Computer Science, Royal Institute of Technology, Stockholm, Sweden.&amp;lt;br /&amp;gt; Link: &amp;lt;br /&amp;gt;http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2005/rapporter05/jonsson_erika_05125.pdf&lt;br /&gt;
&lt;br /&gt;
==== EyeChess: the tutoring game with visual attentive interface ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Advances in eye tracking have enabled physically challenged people to type, draw, and control the environment with their eyes. However, entertainment applications for this user group are still few. The EyeChess project described in this paper is a PC-based tutorial to assist novices in playing chess endgames. The player always starts first and has to checkmate the Black King in three moves. First, to make a move the player selects a piece and then its destination square. To indicate that some squares could be activated, while other ones were forbidden for selection, color highlighting was applied. A square with a green highlight indicated a valid action, and the red color denoted an invalid action. There were three options to make a selection: blinking, eye gesture (i.e., gazing at offscreen targets), and dwell time. If the player does not know how to solve the task, or s/he plays by making mistakes, the tutorial provides a hint, which appears as a blinking green highlight when the gaze points at the right square. Preliminary evaluation of the system revealed that dwell time was the preferred selection technique. The participants reported that the game was fun and easy to play using this method. Meanwhile, both the blinking and eye gesture methods were characterized as quite fatiguing. The tutorial was rated helpful in guiding the decision making process and training the novice users in gaze interaction.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Spakov, O. (2005). EyeChess: the tutoring game with visual attentive interface. Alternative Access: Feelings and Games 2005, Department of Computer Sciences, University of Tampere, Finland.&amp;lt;br /&amp;gt; Link: [http://www.cs.uta.fi/~oleg/docs/Spakov__2005__EyeChess_The_Tutoring_Game_With_Visual_Attentive_Interface_(Internal,%20AAFG).pdf http://www.cs.uta.fi/~oleg/]&lt;br /&gt;
&lt;br /&gt;
==== EyeDraw: A System for Drawing Pictures with Eye Movements ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper describes the design and development of EyeDraw, a software program that will enable children with severe mobility impairments to use an eye tracker to draw pictures with their eyes so that they can have the same creative developmental experiences as nondisabled children. EyeDraw incorporates computer-control and software application advances that address the special needs of people with motor impairments, with emphasis on the needs of children. The contributions of the project include (a) a new technique for using the eyes to control the computer when accomplishing a spatial task, (b) the crafting of task-relevant functionality to support this new technique in its application to drawing pictures, and (c) a user-tested implementation of the idea within a working computer program. User testing with nondisabled users suggests that we have designed and built an eye-cursor and eye-drawing control system that can be used by almost anyone with normal control of their eyes. The core technique will be generally useful for a range of computer control tasks such as selecting a group of icons on the desktop by drawing a box around them.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Hornof, A., Cavender, A., &amp;amp;amp; Hoselton, R. (2004). EyeDraw: A System for Drawing Pictures with Eye Movements. Proceedings of the ACM SIGACCESS Conference on Computers and Accessibility. &amp;lt;br /&amp;gt; Link: http://www.cs.uoregon.edu/~hornof/downloads/ASSETS04.pdf&lt;br /&gt;
&lt;br /&gt;
==== Design of a computer game using an eye-tracking device for eye's activity rehabilitation ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; An eye mouse interface that can be used to operate a computer using the movement of the eyes is described. We developed this eye-tracking system for eye motion disability rehabilitation. When the user watches the screen of a computer, a charge-coupled device will catch images of the user's eye and transmit them to the computer. A program, based on a new cross-line tracking and stabilizing algorithm, will locate the center point of the pupil in the images. The calibration factors and energy factors are designed for coordinate mapping and blink functions. After the system transfers the coordinates of the pupil center in the images to the display coordinate, it will determine the point at which the user gazed on the display, then transfer that location to the game subroutine program. We used this eye-tracking system as a joystick to play a game with an application program in a multimedia environment. The experimental results verify the feasibility and validity of this eye-game system and the rehabilitation effects for the user's visual movement.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Lin, C.-S., Huan C.-C., Chan C.-N., Yeh, M.-S., &amp;amp;amp; Chiu C.-C. (2004). Design of a computer game using an eye-tracking device for eye's activity rehabilitation, Optics and Lasers in Engineering, 42(1), 91-108, Elsevier. &amp;lt;br /&amp;gt; Link: &amp;lt;br /&amp;gt;http://www.foylearts.net/jmagee/Bdes/des514m1/06brf514/Lin%20et%20al%202002-%20eye%20tracking.pdf&lt;br /&gt;
&lt;br /&gt;
==== Eye Tracking as an Aiming Device in a Computer Game ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper describes an experiment in the application of eye tracking to facilitate aiming in computer gaming. A simple 3D computer game was run under varying conditions to test the effect of gaze-contingent gaming on a player’s performance, measured by accuracy in selecting targets within the game and completion time. The game was run using a traditional mouse and with the Tobii ET-1750 eye tracker as aiming devices in timed and un-timed trials. The results showed that subjects had better performance in completing their objectives when using the mouse instead of the eye-tracker as an aiming device. However, difficulties with the calibration process suggest that the experiment may yield different results if run with a modified calibration process.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Leyba, J. and Malcolm, J. (2004) Eye Tracking as an Aiming Device in a Computer Game. Course work ([http://andrewd.ces.clemson.edu/courses/cpsc412/fall04/ CPSC 412/612] Eye Tracking Methodology and Applications by A.Duchowski), Clemson University. &amp;lt;br /&amp;gt; Link: &amp;lt;br /&amp;gt;http://andrewd.ces.clemson.edu/courses/cpsc412/fall04/teams/reports/group2.pdf&lt;br /&gt;
&lt;br /&gt;
==== Eye movements in an Action Game Tutorial ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Action games are controversial and much discussed; at the same time they fascinate players all over the world. One way to find out what this attraction is about is to use eye tracking to explore them. This method can show explicit eye gaze direction within the game environment and at the same time point out what the mind determines as important in the different interactions in an action game tutorial. This study aims to lay out the foundations of players' eye behaviours in the light of training, learning, social behaviour and whether there are any visual reinforcements in interactive media compared to a natural situation. Action games are today classified as entertainment products with built-in simulation paths, at the same time as some organisations bring in commercial games for professional training or for evaluating their profit. A study made last year at Rochester University showed that non-video game players could improve their visual attention. In this study, eight subjects were playing and the recording tracked every eye movement and step in choice. The results revealed that facial interest is secondary in task progression, eye behaviour patterns are similar to eye behaviour in car driving, and re-fixations occurred after search and shooting partly independent of background.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Sennersten, C. (2004). Eye movements in an Action Game Tutorial. Student Paper. Department of Cognitive Science. Lund University, Sweden. &amp;lt;br /&amp;gt; Link: http://www.sol.lu.se/humlab/eyetracking/Studentpapers/CharlotteSennersten.pdf&lt;br /&gt;
&lt;br /&gt;
==== 3D First Person Shooting Game by Using Eye Gaze Tracking ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; In this paper, we propose a method of manipulating the gaze direction of a 3D FPS game's character by using eye gaze detection from successive images captured by a USB camera attached beneath the HMD. The proposed method is composed of three parts. First, we detect the user's pupil center with a real-time image processing algorithm applied to the successive input images. In the second, calibration part, while the user gazes at the monitor plane, the geometric relationship between the gaze position on the monitor and the detected position of the pupil center is determined. In the last part, the final gaze position on the HMD monitor is tracked and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with the view direction of the game character.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Lee, Eui-Chul and Park, Kang-Ryoung (2005) ''The KIPS transactions. Part B, Volume b12, Issue 4'', Korea Information Processing Society (August 2005, ISSN 1598-284x), pp. 465-472. &amp;lt;br /&amp;gt; Link: http://www.koreascience.or.kr/article/articleresultdetail.jsp?no=34663577&amp;amp;searchtype=JSB&amp;amp;listlen=18&amp;amp;listno=13&lt;br /&gt;
&lt;br /&gt;
==== A preliminary investigation into eye gaze data in a first person shooter game ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper describes a study carried out in which the eye gaze data of several users playing a simple First Person Shooter (FPS) game has been recorded. This work shows the design and implementation of a simple game and how the execution of the game can be synchronized with an eye tracking system. The motivation behind this work is to determine the existence of visual psycho-perceptual phenomena, which may be of some use in developing appropriate information limits for distributed interactive media compression algorithms. Only about 2 of the 140 degrees of human vision have a high level of detail. It may be possible to determine the areas of the screen that a user is focusing on and render them in high detail, or pay particular attention to their contents so as to set appropriate dead reckoning limits. Our experiment shows that eye tracking may allow for improvements in rendering and new compression algorithms to be created for an online FPS game.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Kenny, A., Koesling, H., Delaney, D., McLoone, S. and Ward, T. (2005) A preliminary investigation into eye gaze data in a first person shooter game, in Proceedings of the 19th European Conference on Modelling and Simulation (ECMS '05), Riga, Latvia, June 2005. &amp;lt;br /&amp;gt; Link: http://eprints.nuim.ie/282/&lt;br /&gt;
&lt;br /&gt;
==== Verification of an Experimental Platform Integrating a Tobii Eyetracking System with the HiFi Game Engine ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Playing a commercial PC or console game is a highly visual activity, regardless of whether the purpose is entertainment or situated learning as discussed in the Serious Games field. If more information about the visual attention of the player can be recorded and easily analysed, important design information can be extracted. A range of different eyetracking equipment exists on the market and has been used in many studies over the years. However, very few studies describe dynamic stimuli involving the visual interaction of the user/player with a moving 3D object displayed on a computer screen. The reasons for this are that methods and software developed for eyetracking studies of static 2D stimuli are inappropriate for dynamic 3D stimuli, and manual analysis of dynamic 3D visual interaction is extremely time consuming. In order to address this, the authors have developed a software interface between the Tobii(TM) eyetracking system and the HiFi Game Engine for use in automated logging of dynamic 3D objects of gaze attention. This report describes the verification study performed to assess the performance of this integration between the eyetracker, logging tools and game engine. Detailed analysis shows effective results within the derived accuracy range, which is certainly sufficient for studies from a small scale to large scales necessary for extensive statistical analysis. The work presented in the report has been conducted in collaboration between FOI, Blekinge Institute of Technology and Gotland College.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Sennersten, C., Alfredson, J., Castor, M., Hedström, J., Lindahl, B., Lindley, C., and Svensson, E. (2007) Verification of an Experimental Platform Integrating a Tobii Eyetracking System with the HiFi Game Engine. Command and Control Systems, Methodology Report, FOI-R--2227-SE, ISSN 1650-1942, FOI Defence Research Agency, February 2007. &amp;lt;br /&amp;gt; Link: http://www2.foi.se/rapp/foir2227.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The aim of this thesis is to investigate whether gaze-based interaction is a suitable means of input for problem solving games, where a player has to use his/her eyes not only to select objects, but also to visually perceive the puzzle and plan his/her next move in order to solve it. Two common problem solving puzzles were implemented: Sudoku and the Tile Slide puzzle (or 15-puzzle). Each puzzle can be played with eye gaze or with the mouse. Although test subjects found gaze interesting, the mouse was still the preferred mode of interaction. We found that gaze selection is more error-prone than mouse selection and that these errors can cause a player to lose concentration on the task at hand. We also found that the user interface and the interaction sequence influence both the planning strategy that the player uses and the amount of time it takes him/her to complete the task.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Gowases, T. (2007) Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games. Master’s thesis May 2, 2007. Department of Computer Science, University of Joensuu, Finland. &amp;lt;br /&amp;gt; Link: ftp://cs.joensuu.fi/pub/Theses/2007_MSc_Gowases_Tersia.pdf&lt;br /&gt;
&lt;br /&gt;
==== Eye gaze assistance for a Game-like interactive task ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Human beings communicate in abbreviated ways dependent on prior interactions and shared knowledge. Furthermore, humans share information about intentions and future actions using eye gaze. Among primates, humans are unique in the whiteness of the sclera and amount of sclera shown, essential for communication via interpretation of eye gaze. This paper extends our previous work in a Game-like interactive task by the use of computerised recognition of eye gaze and fuzzy signature based interpretation of possible intentions. This extends our notion of robot instinctive behaviour to intentional behaviour. We show a good improvement of speed of response in a simple use of eye gaze information. We also show a significant and more sophisticated use of the eye gaze information, which eliminates the need for control actions on the user’s part. We also make a suggestion as to returning visibility of control to the user in these cases.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Tom Gedeon, Dingyun Zhu, and Sumudu Mendis (2008) Eye gaze assistance for a Game-like interactive task. International Journal of Computer Games Technology. &amp;lt;br /&amp;gt; Link: http://www.hindawi.com/journals/ijcgt/aip.623725.html&lt;br /&gt;
&lt;br /&gt;
==== Invisible eni: using gaze and pupil size to control a game ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We present an eyes-only computer game, Invisible Eni, which uses gaze, blinking and, as a novelty, pupil size to affect game state. Pupil size can be indirectly controlled by physical activation, strong emotional experiences and cognitive effort. Invisible Eni maps the pupil size variations to the game mechanics and allows players to control game objects by use of willpower. We present the design rationale behind the interaction in Invisible Eni and consider the design implications of using pupil measurements in the interface. We discuss limitations of pupil based interaction and provide suggestions for using pupil size as an active input modality.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Ekman, I. M., Poikola, A. W., and Mäkäräinen, M. K. (2008) Invisible eni: using gaze and pupil size to control a game. In ''CHI '08 Extended Abstracts on Human Factors in Computing Systems, CHI '08''. ACM, New York, NY, 3135-3140. &amp;lt;br /&amp;gt; Link: http://doi.acm.org/10.1145/1358628.1358820&lt;br /&gt;
&lt;br /&gt;
==== Voluntary pupil size change as control in eyes only interaction ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We investigate consciously controlled pupil size as an input modality. Pupil size is affected by various processes, e.g., physical activation, strong emotional experiences and cognitive effort. Our hypothesis is that given continuous feedback, users can learn to control pupil size via physical and psychological self-regulation. We test this by measuring the magnitude of self-evoked pupil size changes following seven different instructions, while providing real-time graphical feedback on pupil size. Results show that some types of voluntary effort affect pupil size on a statistically significant level. A second controlled experiment confirms that subjects can produce pupil dilation and constriction on demand during paced tasks. Applications and limitations of using voluntary pupil size manipulation as an input modality are discussed.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Ekman, I., Poikola, A., Mäkäräinen, M., Takala, T., and Hämäläinen, P. (2008) Voluntary pupil size change as control in eyes only interaction. In ''Proceedings of the 2008 Symposium on Eye Tracking Research &amp;amp;amp; Applications - ETRA '08''. ACM, New York, NY, 115-118. &amp;lt;br /&amp;gt; Link: http://doi.acm.org/10.1145/1344471.1344501&lt;br /&gt;
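As a rough illustration of the idea in the two papers above, the sketch below turns sustained pupil dilation above a resting baseline into a binary input signal. This is not the authors' method; the PupilInput class, the 15% threshold and the 5-sample median window are invented for the example:

```python
from collections import deque

class PupilInput:
    """Treat sustained pupil dilation as a binary input signal.

    A rolling median smooths measurement noise; the signal is "on"
    while the smoothed diameter exceeds the resting baseline by a
    relative threshold. All parameter values are illustrative guesses.
    """
    def __init__(self, baseline_mm, threshold=0.15, window=5):
        self.baseline = baseline_mm
        self.threshold = threshold
        self.samples = deque(maxlen=window)

    def update(self, diameter_mm):
        """Feed one diameter sample; return True while dilation is 'on'."""
        self.samples.append(diameter_mm)
        ordered = sorted(self.samples)
        smoothed = ordered[len(ordered) // 2]   # rolling median
        return (smoothed - self.baseline) / self.baseline > self.threshold
```

In practice the baseline would be re-estimated per session, since pupil diameter also varies with lighting and arousal, which is exactly the confound both papers discuss.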
&lt;br /&gt;
==== Snap Clutch, a Moded Approach to Solving the Midas Touch Problem ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper proposes a simple approach to an old problem, that of the 'Midas Touch'. This uses modes to enable different types of mouse behavior to be emulated with gaze, and gestures to switch between these modes. A lightweight gesture is also used to switch gaze control off when it is not needed, thereby removing a major cause of the problem. The ideas have been trialed in Second Life, which is characterized by a feature-rich set of interaction techniques and a 3D graphical world. The use of gaze with this type of virtual community is of great relevance to severely disabled people as it can enable them to be in the community on a similar basis to able-bodied participants. The assumption here though is that this group will use gaze as a single modality and that dwell will be an important selection technique. The Midas Touch Problem needs to be considered in the context of fast dwell-based interaction. The solution proposed here, Snap Clutch, is incorporated into the mouse emulator software. The user trials reported here show this to be a very promising way of dealing with some of the interaction problems that users of these complex interfaces face when using gaze by dwell.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Istance, H. O., Bates, R., Hyrskykari, A. and Vickers, S. (2008) Snap Clutch, a Moded Approach to Solving the Midas Touch Problem. ''Proceedings of the 2008 symposium on Eye tracking research &amp;amp;amp; applications ETRA '08'', ACM Press, Savannah, March 2008. &amp;lt;br /&amp;gt;&amp;lt;br /&amp;gt;'''See also''' the related article on &amp;quot;Eye-tracking interface means gamers' looks can kill&amp;quot; in New Scientist Tech, 5 May 2008. &amp;lt;br /&amp;gt;http://technology.newscientist.com/article/dn13830-eyetracking-interface-means-gamers-looks-can-kill.html&lt;br /&gt;
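The clutch idea above, suspending gaze control entirely rather than making dwell selection slower, can be outlined in a few lines. This is only an illustrative sketch, not the Snap Clutch mouse-emulator implementation; the DwellSelector class and the dwell-time and radius values are invented for the example:

```python
import time

DWELL_TIME = 0.5     # seconds of stable fixation needed to trigger a "click"
DWELL_RADIUS = 40    # pixels of gaze jitter tolerated around the fixation point

class DwellSelector:
    """Minimal dwell-based selection with an on/off clutch mode.

    Emits a selection when gaze stays within DWELL_RADIUS of a point
    for DWELL_TIME seconds. Setting `engaged = False` (the clutch)
    suspends gaze control, avoiding Midas-touch activations while the
    user is merely looking around.
    """
    def __init__(self):
        self.engaged = True
        self._anchor = None   # (x, y) where the current fixation began
        self._start = None    # timestamp when it began

    def feed(self, x, y, now=None):
        """Feed one gaze sample; return (x, y) when a dwell completes."""
        if not self.engaged:
            self._anchor = None   # clutch disengaged: discard any fixation
            return None
        now = time.monotonic() if now is None else now
        moved = (self._anchor is not None and
                 (x - self._anchor[0]) ** 2 + (y - self._anchor[1]) ** 2
                 > DWELL_RADIUS ** 2)
        if self._anchor is None or moved:
            self._anchor, self._start = (x, y), now   # fixation restarts
            return None
        if now - self._start >= DWELL_TIME:
            self._anchor = None                       # fire once, then reset
            return (x, y)
        return None
```

In the paper's terms, each mode would map a completed dwell to a different emulated mouse action (left click, right click, drag), with gaze gestures switching between modes.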
&lt;br /&gt;
==== Evaluation of Real-time Eye Gaze Logging by a 3D Game Engine ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Human Computer Interaction studies of visual attention in dynamic 3D computer gameplay can be greatly facilitated by automated gaze object logging implemented by integration of eye gaze tracking systems with game engines. This verification study reports the spatial and temporal accuracy of such an integrated system.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Charlotte Sennersten and Craig Lindley (2008) Evaluation of Real-time Eye Gaze Logging by a 3D Game Engine. In Proc. 12th IMEKO TC1 &amp;amp;amp; TC7 Joint Symposium on Man Science &amp;amp;amp; Measurement, Annecy, 2008.&amp;lt;br /&amp;gt; Link: http://www.bth.se/fou/forskinfo.nsf/alfs/14fc0ac35cfea843c1257464003e9022&lt;br /&gt;
&lt;br /&gt;
==== A Psychophysiological Logging System for a Digital Game Modification ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This student thesis intends to facilitate cognitive experiments for gameplay experience studies. To achieve this a psychophysiological logging framework was developed, which automatically reports the occurrence of specific game events to a log file and to the parallel port. Via the parallel port the communication with psychophysiological systems is possible.&lt;br /&gt;
&lt;br /&gt;
Thus, psychophysiological data can be correlated with in-game data in real time. In addition, this framework is able to log viewed game objects via an eye tracker integration. This gives some information on how certain game elements affect the player's attention. For the development of this system the Source SDK, the game engine of Half-Life 2, has been used. Consequently, custom-built Half-Life 2 levels had to be developed, which are suitable for cognitive experiments. In this context, tools for level editing will be introduced.&lt;br /&gt;
&lt;br /&gt;
This thesis forms the basis for further research work in the area of psychophysiological software development and is intended to facilitate this for future scholars facing these issues.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Stellmach, S. (2007). A Psychophysiological Logging System for a Digital Game Modification. Technical Bachelor's Report. &amp;lt;br /&amp;gt; Link: http://www.gamecareerguide.com/thesis/080527_stellmach.pdf&lt;br /&gt;
&lt;br /&gt;
==== A Framework for Psychophysiological Data Acquisition in Digital Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; In order to rapidly develop digital games for psychophysiological experiments, a coherent and flexible development environment is required. Something that allows researchers to design their experiments, build the stimulus game and easily integrate all required data acquisition functionality into it.&lt;br /&gt;
&lt;br /&gt;
This thesis shows the design and implementation of such a framework. Methods for gathering player-related data are compared to establish a theoretical foundation for the framework. The logging framework is implemented as a set of Torque X components and an example game is developed in order to demonstrate the framework and the different logging components.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Sasse, D. (2008). A Framework for Psychophysiological Data Acquisition in Digital Games. Master's Thesis. &amp;lt;br /&amp;gt; Link: http://www.gamecareerguide.com/thesis/080520_sasse.pdf&lt;br /&gt;
&lt;br /&gt;
==== Keeping an Eye on the Game: Eye Gaze Interaction with Massively Multiplayer Online Games and Virtual Communities for Motor Impaired Users ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Online virtual communities are becoming increasingly popular both within the able-bodied and disabled user communities. These games assume the use of keyboard and mouse as standard input devices, which in some cases is not appropriate for users with a disability. This paper explores gaze-based interaction methods and highlights the problems associated with gaze control of online virtual worlds. The paper then presents a novel 'Snap Clutch' software tool that addresses these problems and enables gaze control. The tool is tested with an experiment showing that effective gaze control is possible although task times are longer. Errors caused by gaze control are identified and potential methods for reducing these are discussed. Finally, the paper demonstrates that gaze driven locomotion can potentially achieve parity with mouse and keyboard driven locomotion, and shows that gaze is a viable modality for game based locomotion both for able-bodied and disabled users alike.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Vickers, S., Istance, H., Hyrskykari, A., Ali, N., and Bates, R. (2008). Keeping an Eye on the Game: Eye Gaze Interaction with Massively Multiplayer Online Games and Virtual Communities for Motor Impaired Users. Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technologies; ICDVRAT 2008, Maia, Portugal, 8th-10th September 2008. &amp;lt;br /&amp;gt; Link: http://www.icdvrat.reading.ac.uk/2008/papers/ICDVRAT2008_S04_N05_Vickers_Istance_et_al.pdf&amp;lt;br /&amp;gt;(according to the [http://www.icdvrat.reading.ac.uk/2008/abstracts.htm ICDVRAT2008 web page], the link will activate on 1 March 2009)&lt;br /&gt;
&lt;br /&gt;
==== Gaze and voice based game interaction: the revenge of the killer penguins ====&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Wilcox, T., Evans, M., Pearce, C., Pollard, N., and Sundstedt, V. 2008. Gaze and voice based game interaction: the revenge of the killer penguins. In ACM SIGGRAPH 2008 Posters (Los Angeles, California, August 11 - 15, 2008). SIGGRAPH '08. ACM, New York, NY, 1-1. &amp;lt;br /&amp;gt; Link: http://doi.acm.org/10.1145/1400885.1400972&lt;br /&gt;
&lt;br /&gt;
==== Gaze vs. Mouse in Games: The Effects on User Experience ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The possibilities of eye-tracking technologies in educational gaming are seemingly endless. The question we need to ask is what the effects of gaze-based interaction on user experience, strategy during learning and problem solving are. In this paper we evaluate the effects of two gaze-based input techniques and mouse-based interaction on user experience and immersion. In a between-subject study we found that although mouse interaction is the easiest and most natural way to interact during problem solving, gaze-based interaction brings more subjective immersion. The findings provide support for introducing gaze interaction methods into computer-based educational environments.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Gowases, T., Bednarik, R., and Tukiainen, M. (2008) Gaze vs. Mouse in Games: The Effects on User Experience. In Proceedings of the International Conference on Computers in Education, ICCE 2008, pp. 773-777. &amp;lt;br /&amp;gt; Link: http://www.apsce.net/icce2008/papers/ICCE2008-paper280.pdf&amp;lt;br /&amp;gt;See also the video presentation of the paper at http://www.youtube.com/watch?v=HeHh3y3Z4Do&lt;br /&gt;
&lt;br /&gt;
==== EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as music instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable and light-weight system which allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with equal performance to a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games which exploit unconscious eye movements and show which possibilities this new input modality may open up.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Bulling, A., Roggen, D., and Tröster, G. (2008) EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography. In the Second International Conference on Fun and Games, LNCS 5294. pp 33-45. Springer. &amp;lt;br /&amp;gt; Link: http://dx.doi.org/10.1007/978-3-540-88322-7_4&lt;br /&gt;
&lt;br /&gt;
==== Measuring and defining the experience of immersion in games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Despite the word's common usage by gamers and reviewers alike, it is still not clear what immersion means. This paper explores immersion further by investigating whether immersion can be defined quantitatively, describing three experiments in total. The first experiment investigated participants’ abilities to switch from an immersive to a non-immersive task. The second experiment investigated whether there were changes in participants’ eye movements during an immersive task. The third experiment investigated the effect of an externally imposed pace of interaction on immersion and affective measures (state anxiety, positive affect, negative affect). Overall the findings suggest that immersion can be measured subjectively (through questionnaires) as well as objectively (task completion time, eye movements). Furthermore, immersion is not only viewed as a positive experience: negative emotions and uneasiness (i.e. anxiety) also run high.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Jennett, C., Cox, A.L., Cairns, P., Dhoparee, S., Epps, A., Tijs, T., and Walton, A. (2008) Measuring and defining the experience of immersion in games. Int. J. Hum.-Comput. Stud. 66(9), 641-661. &amp;lt;br /&amp;gt; Link: http://dx.doi.org/10.1016/j.ijhcs.2008.04.004&lt;br /&gt;
&lt;br /&gt;
=== Papers II: Tracking Gaze in Virtual Environments ===&lt;br /&gt;
&lt;br /&gt;
==== Computational mechanisms for gaze direction in interactive visual environments ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Next-generation immersive virtual environments and video games will require virtual agents with human-like visual attention and gaze behaviors. A critical step is to devise efficient visual processing heuristics to select locations that would attract human gaze in complex dynamic environments. One promising approach to designing such heuristics draws on ideas from computational neuroscience. We compared several such heuristics with eye movement recordings from five observers playing video games, and found that heuristics which detect outliers from the global distribution of visual features were better predictors of human gaze than were purely local heuristics. Heuristics sensitive to dynamic events performed best overall. Further, heuristic prediction power differed more between games than between different human observers. Our findings suggest simple neurally inspired algorithmic methods to predict where humans look while playing video games.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Peters, R. J., &amp;amp;amp; Itti, L. (2006). Computational mechanisms for gaze direction in interactive visual environments. Proceedings of the ACM Eye Tracking Research and Applications (ETRA) Symposium, 2006. 20-27. &amp;lt;br /&amp;gt; Link: http://ilab.usc.edu/publications/doc/Peters_Itti06etra.pdf&lt;br /&gt;
&lt;br /&gt;
==== Towards eye based virtual environment interaction for users with high-level motor disabilities ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; An experiment is reported which extends earlier work on the enhancement of eye pointing in 2D environments through the addition of a zoom facility, to its use in virtual 3D environments using a similar enhancement. A comparison between hand pointing and eye pointing without any enhancement shows a performance advantage for hand based pointing. However, the addition of a 'fly' or 'zoom' enhancement increases both eye and hand based performance, and reduces greatly the difference between these devices. Initial attempts at 'intelligent' fly mechanisms and further enhancements are evaluated.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Bates, R., &amp;amp;amp; Istance, H. O. (2005). Towards eye based virtual environment interaction for users with high-level motor disabilities. Special Issue of International Journal of Disability &amp;amp;amp; Human Development: The International Conference Series on Disability, Virtual Reality and Associated Technologies, Vol. 4(3).&amp;lt;br /&amp;gt; Link: http://www.icdvrat.rdg.ac.uk/2004/papers/S09_N2_Bates_Istance_ICDVRAT2004.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gaze- vs. Hand-Based Pointing in Virtual Environment ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper contributes to the nascent body of literature on pointing performance in Virtual Environments (VEs), comparing gaze- and hand-based pointing. Contrary to previous findings, preliminary results indicate that gaze-based pointing is slower than hand-based pointing for distant objects.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Cournia, N., Smith, J.D., &amp;amp;amp; Duchowski, A.T. (2003). Gaze- vs. Hand-Based Pointing in Virtual Environment, in Proc. SIGCHI 2003 (Short Talks &amp;amp;amp; Interactive Posters), April 5-10, 2003, Ft. Lauderdale, FL. &amp;lt;br /&amp;gt; Link: http://andrewd.ces.clemson.edu/research/vislab/docs/chi03-short.pdf&lt;br /&gt;
&lt;br /&gt;
==== Evaluating gaze-contingent level of detail rendering of virtual environments using visual search ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Level of detail rendering reduces the geometric complexity of objects in virtual reality in order to reduce the computational load on the rendering system. Although the resultant increase in rendering speed is desirable, the behavioral consequences of these techniques for humans performing realistic tasks in complex virtual environments are not well understood. The current study examines the behavior of human observers in virtual environments rendered using a gaze-contingent level of detail criterion. This method takes advantage of the fact that the visual sensitivity of the human visual system is greater at the point of gaze than in the periphery by rendering objects in the periphery with less detail than objects at the point of gaze. In the experiment, participants performed a &amp;quot;virtual search&amp;quot; task, i.e. a visual search task where participants are required to pan the viewport to find a target object among distractors in a virtual environment. Gaze-contingent rendering was employed where the level of detail dropped continuously from the point of gaze. The time to detect and localize the target was measured as a function of the rate of decline in visual detail. Frame rates were allowed to increase with decreasing detail, thus keeping computational load approximately constant. Reaction times to detect the target increased with decreasing detail while reaction times to localize the target decreased with decreasing detail. These results suggest that reduced detail impedes target identification while the increased frame rates due to the reduction in detail facilitates interaction with virtual environments. Overall, these results indicate that the behavioural performance costs of gaze-contingent level of detail techniques can be offset by the behavioural performance gains due to increased rendering speed.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Parkhurst, D., Law, I., &amp;amp;amp; Niebur, E. (2001). Evaluating gaze-contingent level of detail rendering of virtual environments using visual search. In Lab Technical Report 2001-02, 1-6. &amp;lt;br /&amp;gt; Link: http://cnslab.mb.jhu.edu/pubs/Parkhurst_etal01c.pdf&lt;br /&gt;
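The core computation in a gaze-contingent level-of-detail scheme is mapping each object's angular distance from the point of gaze to a detail level. The sketch below illustrates this under assumed viewing geometry; the function names, the 5-degree bands and the screen/viewing-distance constants are not the study's actual parameters:

```python
import math

# Illustrative viewing geometry; real values come from the display setup.
SCREEN_PX_PER_CM = 37.8
VIEW_DIST_CM = 60.0

def eccentricity_deg(gaze_px, obj_px):
    """Angular distance (degrees) between gaze point and object on screen."""
    dist_cm = math.hypot(obj_px[0] - gaze_px[0],
                         obj_px[1] - gaze_px[1]) / SCREEN_PX_PER_CM
    return math.degrees(math.atan2(dist_cm, VIEW_DIST_CM))

def lod_level(gaze_px, obj_px, levels=4):
    """Map eccentricity to a discrete LOD index: 0 = full detail at the fovea.

    Detail falls off in ~5 degree bands, echoing the paper's idea of
    rendering peripheral objects with less geometric detail than objects
    at the point of gaze. The band width is an assumption.
    """
    return min(levels - 1, int(eccentricity_deg(gaze_px, obj_px) // 5))
```

Each frame, the renderer would recompute `lod_level` per object from the latest gaze sample and pick the corresponding mesh, trading peripheral detail for the frame-rate gains the study measures.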
&lt;br /&gt;
==== Interacting with Eye Movements in Virtual Environments ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Eye movement-based interaction offers the potential of easy, natural, and fast ways of interacting in virtual environments. However, there is little empirical evidence about the advantages or disadvantages of this approach. We developed a new interaction technique for eye movement interaction in a virtual environment and compared it to more conventional 3-D pointing. We conducted an experiment to compare performance of the two interaction types and to assess their impacts on spatial memory of subjects and to explore subjects' satisfaction with the two types of interactions. We found that the eye movement based interaction was faster than pointing, especially for distant objects. However, subjects' ability to recall spatial information was weaker in the eye condition than the pointing one. Subjects reported equal satisfaction with both types of interactions, despite the technology limitations of current eye tracking equipment.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Tanriverdi, V., &amp;amp;amp; Jacob, R. J. K. (2000). Interacting with Eye Movements in Virtual Environments. In CHI '00 Proceedings, ACM, 265-272.&amp;lt;br /&amp;gt; Link: http://www.cs.tufts.edu/~jacob/papers/chi00.tanriverdi.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users ====&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Bates, R., Istance, H. O. and Vickers, S. (2008) Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users. In Proceedings of the 4th Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT), University of Cambridge, 13th-16th April 2008.&amp;lt;br /&amp;gt; Link: http://www.cse.dmu.ac.uk/~svickers/pdf/CWUAAT%202008.pdf&lt;br /&gt;
&lt;br /&gt;
==== User Performance of Gaze-based Interaction with On-line Virtual Communities ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We present the results of an investigation into gaze-based interaction techniques with on-line virtual communities. The purpose of this study was to gain a better understanding of user performance with a gaze interaction technique developed for interacting with 3D graphical on-line communities and games. The study involved 12 participants, each of whom carried out 2 equivalent sets of 3 tasks in a world created in Second Life. One set was carried out using a keystroke and mouse emulator driven by gaze, and the other set was carried out with the normal keyboard and mouse. The study demonstrates that subjects were easily able to perform a set of tasks with eye gaze with only a minimal amount of training. It has also identified the causes of user errors and the amount of performance improvement that could be expected if the causes of these errors can be designed out.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Istance, H., Hyrskykari, A., Vickers, S., and Ali, N. (2008) User Performance of Gaze-based Interaction with On-line Virtual Communities. In Proceedings of the 4th Conference on Communication by Gaze Interaction; COGAIN 2008, Prague, CZ, 2nd-3rd September, pp. 28-32. &amp;lt;br /&amp;gt; Link: http://www.cogain.org/cogain2008/COGAIN2008-Proceedings.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gazing into a Second Life: Gaze-Driven Adventures, Control Barriers, and the Need for Disability Privacy in an Online Virtual World ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Online virtual worlds such as Second Life and World of Warcraft offer users the chance to participate in potentially limitless virtual worlds, all via a standard desktop pc, mouse and keyboard. This paper addresses some of the interaction barriers and privacy concerns that people with disabilities may encounter when using these worlds, and introduces an avatar Turing test that should be passed for worlds to be accessible for all users. The paper then focuses on the needs of high-level motor disabled users who may use gaze control as an input modality for computer interaction. A taxonomy and survey of interaction are introduced, and an experiment in gaze based interaction is conducted within these virtual worlds. The results of the survey highlight the barriers where people with disabilities cannot interact as efficiently as able-bodied users. Finally, the paper discusses methods for enabling gaze based interaction for high-level motor disabled users and calls for game designers to consider disabled users when designing game interfaces.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Vickers, S., Bates, R., Istance, H. (2008). Gazing into a Second Life: Gaze-Driven Adventures, Control Barriers, and the Need for Disability Privacy in an Online Virtual World. Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technologies; ICDVRAT 2008, Maia, Portugal, 8th-10th September 2008. &amp;lt;br /&amp;gt; Link: http://www.icdvrat.reading.ac.uk/2008/papers/ICDVRAT2008_S04_N04_Vickers_Bates_et_al.pdf&amp;lt;br /&amp;gt;(according to the [http://www.icdvrat.reading.ac.uk/2008/abstracts.htm ICDVRAT2008 web page], the link will activate on 1 March 2009)&lt;br /&gt;
&lt;br /&gt;
=== Relevant Websites ===&lt;br /&gt;
&lt;br /&gt;
==== COGAIN - Leisure Applications ====&lt;br /&gt;
&lt;br /&gt;
Gaze-controlled games and leisure applications available via the COGAIN web portal&amp;lt;br /&amp;gt;Link: http://www.cogain.org/downloads/leisure-applications&lt;br /&gt;
&lt;br /&gt;
==== SpecialEffect GameBase ====&lt;br /&gt;
&lt;br /&gt;
The [http://www.specialeffect.org.uk/ SpecialEffect] GameBase provides links to accessible computer games. A group of youngsters, all with disabilities, reviewed these games and helped in the creation of the website. It is hoped that the information on this site will help other young players to find games that are suitable for their interests and abilities. Each game review provides information about how a game is controlled, how fast it is, how much it is likely to cost etc., which saves our players from having to spend time/money on games that would not be suitable for them in the first place. Our 'Comments' section provides a format where players can keep adding tips and tricks for each other. &amp;lt;br /&amp;gt; Link: http://www.specialeffect.org.uk/pages/gamebase.htm&lt;br /&gt;
&lt;br /&gt;
==== Game Accessibility ====&lt;br /&gt;
&lt;br /&gt;
All about game accessibility. For ALL disabled and interested gamers. Includes contributions on gaze-controlled computer games.&amp;lt;br /&amp;gt; See especially:&amp;lt;br /&amp;gt; Resources (papers, videos etc.): http://www.game-accessibility.com/index.php?pagefile=papers&amp;lt;br /&amp;gt; Forum: http://www.game-accessibility.com/forum/index.php&lt;br /&gt;
&lt;br /&gt;
==== OneSwitch.Org ====&lt;br /&gt;
&lt;br /&gt;
A resource of fun ideas and 'assistive technology' aimed at moderate to severely learning/physically disabled people.&amp;lt;br /&amp;gt; Link: http://www.oneswitch.org.uk/&lt;br /&gt;
&lt;br /&gt;
'''Design Tips For: Eye Tracker Games'''&amp;lt;br /&amp;gt;Link: http://switchgaming.blogspot.com/2008/08/design-tips-for-eye-tracker-games.html&lt;br /&gt;
&lt;br /&gt;
'''Head, Mouth and Eye Controls'''&amp;lt;br /&amp;gt; At oneswitch.org you can find a list detailing a number of different styles of head, mouth and eye operated controllers. Most of these are for PCs and Apple computers, but there are alternatives for games consoles too. Some of these devices are very expensive, so it is always worth trying to track down some way of trying them out before you buy. &amp;lt;br /&amp;gt; See http://www.oneswitch.org.uk/1/AGS/AGS-head.htm.&lt;br /&gt;
&lt;br /&gt;
'''Software Downloads'''&amp;lt;br /&amp;gt; Oneswitch.org provides more than 70 one-switch games of different types (adventure, arcade classics, platformers, puzzle &amp;amp;amp; skill games, race games, shoot-em-ups…) for free download. In all likelihood a large number of these are suitable for gaze control. Alongside this you will find articles, instructions and more at &amp;lt;br /&amp;gt;http://www.oneswitch.org.uk/2library.htm&amp;lt;br /&amp;gt;http://www.oneswitch.org.uk/4/games/0index.htm&lt;br /&gt;
&lt;br /&gt;
==== levelgames.net ====&lt;br /&gt;
&lt;br /&gt;
A website focusing on switch games designed to be widely accessible for players who have Muscular Dystrophy, Cerebral Palsy, Spinal Injury, Head Injury or other physical disabilities.&amp;lt;br /&amp;gt; Link: http://www.levelgames.net/&lt;br /&gt;
&lt;br /&gt;
==== Games designed for the MyTobii Eye Tracking System (by Oleg Špakov) ====&lt;br /&gt;
&lt;br /&gt;
This page contains a list of applications developed to run specifically in the MyTobii environment. Each application registers itself on installation so that MyTobii recognizes it as a MyTobii Partner Application.&amp;lt;br /&amp;gt; Link: http://www.cs.uta.fi/~oleg/mytobii.html&lt;br /&gt;
&lt;br /&gt;
==== World of Warcraft Percept Interface (by Oleg Komogortsev) ====&lt;br /&gt;
&lt;br /&gt;
Oleg Komogortsev created an interface that allows playing computer games with gaze control, without the use of a mouse or keyboard. The interface was tested with the virtual reality game World of Warcraft.&amp;lt;br /&amp;gt; Link: http://www.cs.kent.edu/~okomogor/wowpercept/wowpercept.htm&lt;br /&gt;
&lt;br /&gt;
==== Adventure Game Studio ====&lt;br /&gt;
&lt;br /&gt;
Adventure Game Studio (AGS for short) allows you to create your own point-and-click adventure games, similar to the early 90's Sierra and Lucasarts adventures. It consists of an easy-to-use development environment and a run-time engine. AGS is free. You need no programming experience to make a game using AGS - setting most game options is just a matter of point-and-click (though scripting is of course available if you prefer). &amp;lt;br /&amp;gt; Link: http://www.adventuregamestudio.co.uk/&lt;br /&gt;
&lt;br /&gt;
==== Entertainment Software designed for an EOG based Eye Tracking System: EagleEyes ====&lt;br /&gt;
&lt;br /&gt;
This website contains various applications software designed to be run with EagleEyes and Camera Mouse and other similar systems. (Includes Games, Spell and Speak, System)&amp;lt;br /&amp;gt; Link: http://www.bc.edu/schools/csom/eagleeyes/downloads.html&lt;br /&gt;
&lt;br /&gt;
==== List of open source games ====&lt;br /&gt;
&lt;br /&gt;
Open source games are computer games assembled using open-source software and open content. These games are open to modifications, such as implementing gaze control.&amp;lt;br /&amp;gt; Link: http://en.wikipedia.org/wiki/List_of_open_source_games&lt;br /&gt;
&lt;br /&gt;
==== Game Accessibility Suite ====&lt;br /&gt;
&lt;br /&gt;
Code library and utilities to enhance accessibility to existing and future games. &amp;lt;br /&amp;gt; Link: http://sourceforge.net/projects/gameaccess/&lt;br /&gt;
&lt;br /&gt;
==== Retro Remakes Forum ====&lt;br /&gt;
&lt;br /&gt;
Forum on game accessibility containing threads on eye- and head control. &amp;lt;br /&amp;gt; Link: http://www.retroremakes.com/forum2/forumdisplay.php?f=84&lt;br /&gt;
&lt;br /&gt;
==== Eye Trackers ====&lt;br /&gt;
&lt;br /&gt;
Catalogue of currently available eye trackers for interactive applications. &amp;lt;br /&amp;gt; Link: [http://www.cogain.org/wiki/Eye_Trackers http://www.cogain.org/wiki/Eye_Trackers]&lt;br /&gt;
&lt;br /&gt;
==== Gaze-aware Space Vampires (by Chris Schmelzle) ====&lt;br /&gt;
&lt;br /&gt;
Chris Schmelzle trialled how adding information from eye movements to the game's artificial intelligence can enhance the gaming experience: the game enemies know where the player is looking. &amp;lt;br /&amp;gt; Link: http://www.cschmelzle.net/eye.html&lt;br /&gt;
&lt;br /&gt;
=== Multimedia ===&lt;br /&gt;
&lt;br /&gt;
==== YouTube video clips on gaze &amp;amp;amp; games ====&lt;br /&gt;
&lt;br /&gt;
* [http://www.getacd.org/listen_YPbhSoj3ZFM/bejeweled_2_advanced_player_eye_tracking_study Bejeweled 2 Advanced Player Eye-tracking Study], video &amp;amp; information&lt;br /&gt;
* [http://www.youtube.com/watch?v=6PZpsWzjnvE Dreamhack Eye-Tracking Experiment], by the BTH game research group.&lt;br /&gt;
* [http://www.youtube.com/watch?v=3pRWYE2LRhk Eye Based Video Game Control: Quake 2]&lt;br /&gt;
* [http://www.youtube.com/watch?v=IX6H83ZgYGE Eye Based Video Game Control: Neverwinter Nights]&lt;br /&gt;
* [http://www.youtube.com/watch?v=3JkdFFxdlsw Eye Based Video Game Control: Missile Command]&lt;br /&gt;
* [http://www.youtube.com/watch?v=qbotg30L0rc Eye Gaze Computer Game Crazy Taxi via Tobii PCEye]&lt;br /&gt;
* Eye Gaze Driven Second Life - [http://www.youtube.com/watch?v=ClPAFITx9yY Camera], [http://www.youtube.com/watch?v=UFrvl-eFsAQ Locomotion]. &amp;lt;br /&amp;gt; See also http://www.cse.dmu.ac.uk/~svickers/scvideos.html&lt;br /&gt;
* [http://www.youtube.com/watch?v=NBIjWA8CHls Eye Gaze Interaction with World of Warcraft]&lt;br /&gt;
* [http://www.youtube.com/watch?v=0Nz68kz51Os Gaze-Controlled Applications at University of Tampere: Board Games]&lt;br /&gt;
* [http://www.youtube.com/watch?v=ldw3HugJ2rE Gaze-Controlled First-Person-Shooter], Eye Tracking with IntelliGaze(tm) technology&lt;br /&gt;
* [http://www.youtube.com/watch?v=lGehsY7pcrc House of the Dead with Eye Tracking]&lt;br /&gt;
* [http://www.youtube.com/watch?v=OfnQDJW6xXA Interaction using Gaze Direction], Gaze (for direction changes) + key press&lt;br /&gt;
* [http://www.youtube.com/watch?v=3j2bEWGkDKE Playing Angry Birds with Gaze Control], IntelliGaze, Desktop 2.0&lt;br /&gt;
* [http://www.youtube.com/watch?v=QmvrR5z4NOA Playing Unreal Tournament with an eye-tracker]&lt;br /&gt;
* [http://www.youtube.com/watch?v=XwMoAqgikRM Solitaire with an eye-tracker]&lt;br /&gt;
* [http://www.youtube.com/watch?v=i0xgyBNVrHk Tobii EyeAsteroids eye-controlled arcade game]&lt;br /&gt;
* [http://www.youtube.com/watch?v=tfxxoN_RbJ8 Low Cost Eye Tracking for Commercial Gaming: eyeAsteroids and eyeShoot in the Dark!]&lt;br /&gt;
&lt;br /&gt;
=== Organisations ===&lt;br /&gt;
==== SpecialEffect ====&lt;br /&gt;
SpecialEffect is a charitable organisation dedicated to helping ALL young people with disabilities to enjoy computer games. For these children, the majority of computer games are simply too quick or too difficult to play, and we can help them and their parents to find out which games they CAN play, and how to adapt those games that they can't. &amp;lt;br /&amp;gt; Link: http://www.specialeffect.org.uk/&lt;br /&gt;
==== IGDA (International Game Developers Association) Game Accessibility Special Interest group (GA-SIG) ====&lt;br /&gt;
The GA-SIG was formed to help the game community strive towards creating mainstream games that are universally accessible to all, regardless of disability.&amp;lt;br /&amp;gt; Link: http://www.igda.org/wiki/index.php/Game_Accessibility_SIG&lt;br /&gt;
==== Pin Interactive ====&lt;br /&gt;
Game Accessibility Development company&amp;lt;br /&amp;gt; Link: http://www.pininteractive.com/&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
Please email any additions or corrections to [mailto:office@cogain.org office (at) cogain (dot) org]&lt;br /&gt;
&lt;br /&gt;
NOTE: This web resource is part of the ''COGAIN Deliverable D4.5 Online information resources on how to use gaze for the control of selected games'' by Michael Heubner (Technical University of Dresden), Fiona Mulvey (Technical University of Dresden) and Päivi Majaranta (University of Tampere). Thanks to: Faten Ahmed (Technical University of Dresden), Oleg Špakov (University of Tampere) and Barrie Ellis ([http://oneswitch.org.uk/ Oneswitch.org.uk]). The original version was prepared in August 2007 and delivered in October 2007. New material is added as it appears. &lt;br /&gt;
&lt;br /&gt;
See also the more general '''[[Bibliography Gaze Interaction|Gaze Interaction Bibliography]]'''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Gaze-Controlled_Games&amp;diff=2700</id>
		<title>Gaze-Controlled Games</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Gaze-Controlled_Games&amp;diff=2700"/>
		<updated>2011-11-09T05:23:57Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* YouTube video clips on gaze &amp;amp;amp; games */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]][[Category:Leisure Applications]] [[Category:Bibliography]]&lt;br /&gt;
Online information resources on how to use gaze for the control of games and other leisure applications&lt;br /&gt;
&lt;br /&gt;
=== Papers I: Evaluation of using gaze control in games and other leisure applications ===&lt;br /&gt;
&lt;br /&gt;
==== Gaze Controlled Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The quality and availability of eye tracking equipment has been increasing while costs have been decreasing. These trends increase the possibility of using eye trackers for entertainment purposes. Games that can be controlled solely through movement of the eyes would be accessible to persons with decreased limb mobility or control. On the other hand, use of eye tracking can change the gaming experience for all players, by offering richer input and enabling attention-aware games. Eye tracking is not currently widely supported in gaming, and games specifically developed for use with an eye tracker are rare. This paper reviews past work on eye tracker gaming and charts future development possibilities in different sub-domains within. It argues that based on the user input requirements and gaming contexts, conventional computer games can be classified into groups that offer fundamentally different opportunities for eye tracker input. In addition to the inherent design issues, there are challenges and varying levels of support for eye tracker use in the technical implementations of the games.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Isokoski, P., Joos, M., Spakov, O., &amp;amp;amp; Martin, B. (2009). Gaze Controlled Games. Universal Access in the Information Society 8(4). Springer.&amp;lt;br /&amp;gt; Link: [http://dx.doi.org/10.1007/s10209-009-0146-3]&lt;br /&gt;
&lt;br /&gt;
==== Eye Tracker Input in First Person Shooter Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We report ongoing work on using an eye tracker as an input device in first person shooter (FPS) games. In these games the player moves in a three-dimensional virtual world that is rendered from the player's point of view. The player interacts with the objects he or she encounters mainly by shooting at them. Typical game storylines reward killing and punish other forms of interaction. The reported work is a part of an effort to evaluate a range of input devices in this context. Our results on the other devices in the same game allow us to compare the efficiency of eye trackers as game controllers against more conventional devices. Our goal regarding eye trackers is to see whether they can help players perform better. Some FPS games are played competitively over the Internet. If using an eye tracker gives an edge in competitive play, players may want to acquire eye tracking equipment. Eye trackers as input devices in FPS games have been investigated before (Jönsson, 2005), but that investigation focused on user impressions rather than on the efficiency and effectiveness of eye trackers in this domain. However, Jönsson's results on eye tracker efficiency in a non-FPS game were encouraging.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Isokoski, P., &amp;amp;amp; Martin, B. (2006). Eye Tracker Input in First Person Shooter Games. In Proceedings of COGAIN 2006: Gazing into the Future, 78-81.&amp;lt;br /&amp;gt; Link: http://www.cs.uta.fi/~poika/cogain2006/cogain2006.pdf&lt;br /&gt;
&lt;br /&gt;
==== Use of Eye Movements for Video Game Control ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We present a study that explores the use of a commercially available eye tracker as a control device for video games. We examine its use across multiple gaming genres and present games that utilize the eye tracker in a variety of ways. First, we describe a first person shooter that uses the eyes to control orientation. Second, we study the use of eye movements for more natural interaction with characters in a role playing game. And lastly, we examine the use of eye tracking as a means to control a modified version of the classic action/arcade game Missile Command. Our results indicate that the use of an eye tracker can increase the immersion of a video game and can significantly alter the gameplay experience.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Smith, J. D., &amp;amp;amp; Graham, T. C. N. (2006). Use of Eye Movements for Video Game Control. In ACM Advancements in Computer Entertainment Technology (Hollywood, CA, USA, June 14 - 16, 2006). ACE 2006. ACM Press, New York, NY. &amp;lt;br /&amp;gt; Link: http://www.cs.queensu.ca/~smith/papers/ace2006.pdf&lt;br /&gt;
&lt;br /&gt;
==== If Looks Could Kill - An Evaluation of Eye Tracking in Computer Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The possibility to track human eye gaze is not new. Different eye tracking devices have been available for several years. The technology has for instance been used in psychological research, usability evaluation and in equipment for disabled people. The devices have often required the user to utilize a chinrest, a bite board or other cumbersome equipment. Hence, the use of eye tracking has been limited to restricted environments.&lt;br /&gt;
&lt;br /&gt;
In recent years, new non-intrusive eye tracking technology has become available. This has made it possible to use eye tracking in new, natural environments. The aim of this study was to evaluate the use of eye tracking in computer games. A literature study was made to gather information about eye tracker systems, existing eye gaze interfaces and computer games. The analysis phase included interviews with people working with human-computer interaction and game development, a focus group session and an evaluation of computer games. The result from the analysis consisted of a summary of interaction sequences, presumably suitable to control with the eyes. Three different prototypes of eye controlled computer games were developed. The first was a shoot'em up game where the player aimed with his eyes to shoot monsters that appeared in random places. The two other prototypes were developed with the Half Life Software Development Kit. In the first Half Life prototype, the player aimed a weapon with his eyes. In the second, the view of sight was controlled with the eyes. The different eye controlled game prototypes were evaluated in a usability study. The subjects played the different prototypes with mouse and eyes respectively. Their experience was evaluated with the thinking aloud method, questionnaires and an interview. The result showed that interaction with the eyes is very fast, easy to learn and perceived to be natural and relaxed. According to the usability study, eye control can provide a more fun and committing gaming experience than ordinary mouse control. Eye controlled computer games are a very new area that needs to be further developed and evaluated. The result of this study suggests that eye based interaction may be very successful in computer games.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Jönsson, E. (2005). If Looks Could Kill - An Evaluation of Eye Tracking in Computer Games. Master's Thesis, Department of Numerical Analysis and Computer Science, Royal Institute of Technology, Stockholm, Sweden.&amp;lt;br /&amp;gt; Link: &amp;lt;br /&amp;gt;http://www.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/2005/rapporter05/jonsson_erika_05125.pdf&lt;br /&gt;
&lt;br /&gt;
==== EyeChess: the tutoring game with visual attentive interface ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Advances in eye tracking have enabled physically challenged people to type, draw, and control the environment with their eyes. However, entertainment applications for this user group are still few. The EyeChess project described in this paper is a PC based tutorial to assist novices in playing chess endgames. The player always starts first and has to checkmate the Black King in three moves. First, to make a move the player selects a piece and then its destination square. To indicate that some squares could be activated, while other ones were forbidden for selection, color highlighting was applied. A square with a green highlight indicated a valid action, and the red color denoted invalid action. There were three options to make a selection: blinking, eye gesture (i.e., gazing at offscreen targets), and dwell time. If the player does not know how to solve the task, or s/he plays by making mistakes, the tutorial provides a hint. The hint appears as a blinking green highlight when the gaze points at the right square. Preliminary evaluation of the system revealed that dwell time was the preferred selection technique. The participants reported that the game was fun and easy to play using this method. Meanwhile, both the blinking and eye gesture methods were characterized as quite fatiguing. The tutorial was rated helpful in guiding the decision making process and training the novice users in gaze interaction.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Spakov, O. (2005). EyeChess: the tutoring game with visual attentive interface. Alternative Access: Feelings and Games 2005, Department of Computer Sciences, University of Tampere, Finland.&amp;lt;br /&amp;gt; Link: [http://www.cs.uta.fi/~oleg/docs/Spakov__2005__EyeChess_The_Tutoring_Game_With_Visual_Attentive_Interface_(Internal,%20AAFG).pdf http://www.cs.uta.fi/~oleg/]&lt;br /&gt;
&lt;br /&gt;
==== EyeDraw: A System for Drawing Pictures with Eye Movements ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper describes the design and development of EyeDraw, a software program that will enable children with severe mobility impairments to use an eye tracker to draw pictures with their eyes so that they can have the same creative developmental experiences as nondisabled children. EyeDraw incorporates computer-control and software application advances that address the special needs of people with motor impairments, with emphasis on the needs of children. The contributions of the project include (a) a new technique for using the eyes to control the computer when accomplishing a spatial task, (b) the crafting of task-relevant functionality to support this new technique in its application to drawing pictures, and (c) a user-tested implementation of the idea within a working computer program. User testing with nondisabled users suggests that we have designed and built an eye-cursor and eye-drawing control system that can be used by almost anyone with normal control of their eyes. The core technique will be generally useful for a range of computer control tasks such as selecting a group of icons on the desktop by drawing a box around them.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Hornof, A., Cavender, A., &amp;amp;amp; Hoselton, R. (2004). EyeDraw: A System for Drawing Pictures with Eye Movements. Proceedings of the ACM SIGACCESS Conference on Computers and Accessibility. &amp;lt;br /&amp;gt; Link: http://www.cs.uoregon.edu/~hornof/downloads/ASSETS04.pdf&lt;br /&gt;
&lt;br /&gt;
==== Design of a computer game using an eye-tracking device for eye's activity rehabilitation ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; An eye mouse interface that can be used to operate a computer using the movement of the eyes is described. We developed this eye-tracking system for eye motion disability rehabilitation. When the user watches the screen of a computer, a charge-coupled device will catch images of the user's eye and transmit them to the computer. A program, based on a new cross-line tracking and stabilizing algorithm, will locate the center point of the pupil in the images. The calibration factors and energy factors are designed for coordinate mapping and blink functions. After the system transfers the coordinates of pupil center in the images to the display coordinate, it will determine the point at which the user gazed on the display, then transfer that location to the game subroutine program. We used this eye-tracking system as a joystick to play a game with an application program in a multimedia environment. The experimental results verify the feasibility and validity of this eye-game system and the rehabilitation effects for the user's visual movement.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Lin, C.-S., Huan C.-C., Chan C.-N., Yeh, M.-S., &amp;amp;amp; Chiu C.-C. (2004). Design of a computer game using an eye-tracking device for eye's activity rehabilitation, Optics and lasers in engineering, 42(1), 91-108, Elsevier. &amp;lt;br /&amp;gt; Link: &amp;lt;br /&amp;gt;http://www.foylearts.net/jmagee/Bdes/des514m1/06brf514/Lin%20et%20al%202002-%20eye%20tracking.pdf&lt;br /&gt;
&lt;br /&gt;
==== Eye Tracking as an Aiming Device in a Computer Game ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper describes an experiment in the application of eye tracking to facilitate aiming in computer gaming. A simple 3D computer game was run under varying conditions to test the effect of gaze-contingent gaming on a player’s performance, measured by accuracy in selecting targets within the game and completion time. The game was run using a traditional mouse and with the Tobii ET-1750 eye tracker as aiming devices in timed and un-timed trials. The results showed that subjects had better performance in completing their objectives when using the mouse instead of the eye-tracker as an aiming device. However, difficulties with the calibration process suggest that the experiment may yield different results if run with a modified calibration process.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Leyba, J. and Malcolm, J. (2004) Eye Tracking as an Aiming Device in a Computer Game. Course work ([http://andrewd.ces.clemson.edu/courses/cpsc412/fall04/ CPSC 412/612] Eye Tracking Methodology and Applications by A.Duchowski), Clemson University. &amp;lt;br /&amp;gt; Link: &amp;lt;br /&amp;gt;http://andrewd.ces.clemson.edu/courses/cpsc412/fall04/teams/reports/group2.pdf&lt;br /&gt;
&lt;br /&gt;
==== Eye movements in an Action Game Tutorial ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Action games are controversial and discussed; at the same time they fascinate players all over the world. One way to find out what this attraction is about is to use eye tracking to explore them. This method can show explicit eye gaze direction within the game environment and at the same time point out what the mind determines as important in the different interactions in an action game tutorial. This study wants to lay out the foundations of players' eye behaviours in the light of training, learning, social behaviour and if there are any visual reinforcements between interactive media compared to a natural situation. Action games are today classified as entertainment products with built in simulation paths at the same time as some organisations bring in commercial games for professional training or evaluating its profit. A study made last year at Rochester University showed that non-video game players could improve their visual attention. In this study, eight subjects were playing and the recording tracked every eye movement and step in choice. The results revealed that facial interest is secondary in task progression, eye behaviour patterns are similar to eye behaviour in car driving and re-fixations occurred after search and shooting partly independent of background.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Sennersten, C. (2004). Eye movements in an Action Game Tutorial. Student Paper. Department of Cognitive Science. Lund University, Sweden. &amp;lt;br /&amp;gt; Link: http://www.sol.lu.se/humlab/eyetracking/Studentpapers/CharlotteSennersten.pdf&lt;br /&gt;
&lt;br /&gt;
==== 3D First Person Shooting Game by Using Eye Gaze Tracking ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; In this paper, we propose the method of manipulating the gaze direction of 3D FPS game's character by using eye gaze detection from the successive images captured by USB camera, which is attached beneath the HMD. The proposed method is composed of 3 parts. At first, we detect user's pupil center by real-time image processing algorithm from the successive input images. In the second part of calibration, when the user gazes on the monitor plane, the geometric relationship between the gazing position of monitor and the detected position of pupil center is determined. In the last part, the final gaze position on the HMD monitor is tracked and the 3D view in game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used for the handicapped game player who cannot use his (or her) hand. Also, it can increase the interest and the immersion by synchronizing the gaze direction of game player and the view direction of game character.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Lee, Eui-Chul and Park, Kang-Ryoung (2005) 3D First Person Shooting Game by Using Eye Gaze Tracking. ''The KIPS transactions. Part B, Volume b12, Issue 4'', Korea Information Processing Society (August 2005, ISSN 1598-284x), pp. 465-472. &amp;lt;br /&amp;gt; Link: [http://www.koreascience.or.kr/article/articleresultdetail.jsp?no=34663577&amp;amp;searchtype=JSB&amp;amp;listlen=18&amp;amp;listno=13 http://www.koreascience.or.kr/article/articleresultdetail.jsp?no=34663577&amp;amp;amp;searchtype=JSB&amp;amp;amp;listlen=18&amp;amp;amp;listno=13]&lt;br /&gt;
&lt;br /&gt;
==== A preliminary investigation into eye gaze data in a first person shooter game ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper describes a study carried out in which the eye gaze data of several users playing a simple First Person Shooter (FPS) game has been recorded. This work shows the design and implementation of a simple game and how the execution of the game can be synchronized with an eye tracking system. The motivation behind this work is to determine the existence of visual psycho-perceptual phenomena, which may be of some use in developing appropriate information limits for distributed interactive media compression algorithms. Only 2 degrees of the 140 degrees of human vision have a high level of detail. It may be possible to determine the areas of the screen that a user is focusing on and render them in high detail or pay particular attention to their contents so as to set appropriate dead reckoning limits. Our experiment shows that eye tracking may allow for improvements in rendering and new compression algorithms to be created for an online FPS game.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Kenny, A., Koesling, H., Delaney, D., McLoone, S. and Ward, T. (2005) A preliminary investigation into eye gaze data in a first person shooter game, in Proceedings of the 19th European Conference on Modelling and Simulation (ECMS '05), Riga, Latvia, June 2005. &amp;lt;br /&amp;gt; Link: http://eprints.nuim.ie/282/&lt;br /&gt;
&lt;br /&gt;
==== Verification of an Experimental Platform Integrating a Tobii Eyetracking System with the HiFi Game Engine ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Playing a commercial PC or console game is a highly visual activity, regardless of whether the purpose is entertainment or situated learning as discussed in the Serious Games field. If more information about the visual attention of the player can be recorded and easily analysed, important design information can be extracted. A range of different eyetracking equipment exists on the market and has been used in many studies over the years. However, very few studies describe dynamic stimuli involving the visual interaction of the user/player with a moving 3D object displayed on a computer screen. The reasons for this are that methods and software developed for eyetracking studies of static 2D stimuli are inappropriate for dynamic 3D stimuli, and manual analysis of dynamic 3D visual interaction is extremely time consuming. In order to address this, the authors have developed a software interface between the Tobii(TM) eyetracking system and the HiFi Game Engine for use in automated logging of dynamic 3D objects of gaze attention. This report describes the verification study performed to assess the performance of this integration between the eyetracker, logging tools and game engine. Detailed analysis shows effective results within the derived accuracy range, which is certainly sufficient for studies from a small scale to large scales necessary for extensive statistical analysis. The work presented in the report has been conducted in collaboration between FOI, Blekinge Institute of Technology and Gotland College.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Sennersten, C., Alfredson, J., Castor, M., Hedström, J., Lindahl, B., Lindley, C., and Svensson, E. (2007) Verification of an Experimental Platform Integrating a Tobii Eyetracking System with the HiFi Game Engine. Command and Control Systems, Methodology Report, FOI-R--2227-SE, ISSN 1650-1942, FOI Defence Research Agency, February 2007. &amp;lt;br /&amp;gt; Link: http://www2.foi.se/rapp/foir2227.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The aim of this thesis is to investigate whether gaze-based interaction is a suitable means of input for problem solving games, where a player has to use his/her eyes not only to select objects, but also to visually perceive the puzzle and plan his/her next move in order to solve it. Two common problem solving puzzles were implemented, the Sudoku and the Tile Slide puzzle (or 15 puzzle). Each puzzle can be played with eye gaze or with the mouse. Although test subjects found gaze interesting, the mouse was still the preferred mode of interaction. We found that gaze selection is more erroneous than mouse selection and that these errors can cause a player to lose concentration on the task at hand. We also found that the user interface and the interaction sequence influence both the planning strategy that the player would use and the amount of time it takes him/her to complete the task.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Gowases, T. (2007) Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games. Master’s thesis May 2, 2007. Department of Computer Science, University of Joensuu, Finland. &amp;lt;br /&amp;gt; Link: ftp://cs.joensuu.fi/pub/Theses/2007_MSc_Gowases_Tersia.pdf&lt;br /&gt;
&lt;br /&gt;
==== Eye gaze assistance for a Game-like interactive task ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Human beings communicate in abbreviated ways dependent on prior interactions and shared knowledge. Furthermore, humans share information about intentions and future actions using eye gaze. Among primates, humans are unique in the whiteness of the sclera and amount of sclera shown, essential for communication via interpretation of eye gaze. This paper extends our previous work in a Game-like interactive task by the use of computerised recognition of eye gaze and fuzzy signature based interpretation of possible intentions. This extends our notion of robot instinctive behaviour to intentional behaviour. We show a good improvement of speed of response in a simple use of eye gaze information. We also show a significant and more sophisticated use of the eye gaze information, which eliminates the need for control actions on the user’s part. We also make a suggestion as to returning visibility of control to the user in these cases.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Tom Gedeon, Dingyun Zhu, and Sumudu Mendis (2008) Eye gaze assistance for a Game-like interactive task. International Journal of Computer Games Technology. &amp;lt;br /&amp;gt; Link: http://www.hindawi.com/journals/ijcgt/aip.623725.html&lt;br /&gt;
&lt;br /&gt;
==== Invisible eni: using gaze and pupil size to control a game ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We present an eyes-only computer game, Invisible Eni, which uses gaze, blinking and, as a novelty, pupil size to affect game state. Pupil size can be indirectly controlled by physical activation, strong emotional experiences and cognitive effort. Invisible Eni maps pupil size variations to the game mechanics and allows players to control game objects by use of willpower. We present the design rationale behind the interaction in Invisible Eni and consider the design implications of using pupil measurements in the interface. We discuss limitations of pupil-based interaction and provide suggestions for using pupil size as an active input modality.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Ekman, I. M., Poikola, A. W., and Mäkäräinen, M. K. (2008) Invisible eni: using gaze and pupil size to control a game. In ''CHI '08 Extended Abstracts on Human Factors in Computing Systems, CHI '08''. ACM, New York, NY, 3135-3140. &amp;lt;br /&amp;gt; Link: http://doi.acm.org/10.1145/1358628.1358820&lt;br /&gt;
&lt;br /&gt;
==== Voluntary pupil size change as control in eyes only interaction ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We investigate consciously controlled pupil size as an input modality. Pupil size is affected by various processes, e.g., physical activation, strong emotional experiences and cognitive effort. Our hypothesis is that, given continuous feedback, users can learn to control pupil size via physical and psychological self-regulation. We test it by measuring the magnitude of self-evoked pupil size changes following seven different instructions, while providing real-time graphical feedback on pupil size. Results show that some types of voluntary effort affect pupil size at a statistically significant level. A second controlled experiment confirms that subjects can produce pupil dilation and constriction on demand during paced tasks. Applications and limitations of using voluntary pupil size manipulation as an input modality are discussed.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Ekman, I., Poikola, A., Mäkäräinen, M., Takala, T., and Hämäläinen, P. (2008) Voluntary pupil size change as control in eyes only interaction. In ''Proceedings of the 2008 Symposium on Eye Tracking Research &amp;amp;amp; Applications - ETRA '08''. ACM, New York, NY, 115-118. &amp;lt;br /&amp;gt; Link: http://doi.acm.org/10.1145/1344471.1344501&lt;br /&gt;
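The paper above turns pupil size into an input channel. As a rough sketch of what such a pipeline needs, the following shows one plausible way to convert a noisy pupil-diameter stream into discrete "dilate"/"constrict" events; the smoothing constant, threshold and all names are invented assumptions, not the authors' method.

```python
# Illustrative sketch: exponentially smooth a pupil-diameter stream and
# flag relative deviations from a resting baseline as discrete events.
# All parameter values and names are invented for illustration.

def pupil_events(samples, baseline, alpha=0.3, threshold=0.15):
    """Return one of "dilate", "constrict" or None per sample.
    Diameters and the baseline share the same units (e.g. mm)."""
    events, ema = [], baseline
    for d in samples:
        ema = alpha * d + (1 - alpha) * ema   # smooth out sensor noise
        rel = (ema - baseline) / baseline     # relative deviation
        if rel > threshold:
            events.append("dilate")
        elif rel < -threshold:
            events.append("constrict")
        else:
            events.append(None)
    return events
```

The smoothing deliberately delays detection by a few samples, trading responsiveness for robustness against the pupil's constant small fluctuations.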
&lt;br /&gt;
==== Snap Clutch, a Moded Approach to Solving the Midas Touch Problem ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper proposes a simple approach to an old problem, that of the 'Midas Touch'. It uses modes to enable different types of mouse behavior to be emulated with gaze, and gestures to switch between these modes. A lightweight gesture is also used to switch gaze control off when it is not needed, thereby removing a major cause of the problem. The ideas have been trialed in Second Life, which is characterized by a feature-rich set of interaction techniques and a 3D graphical world. The use of gaze with this type of virtual community is of great relevance to severely disabled people, as it can enable them to be in the community on a similar basis to able-bodied participants. The assumption here, though, is that this group will use gaze as a single modality and that dwell will be an important selection technique. The Midas Touch problem needs to be considered in the context of fast dwell-based interaction. The solution proposed here, Snap Clutch, is incorporated into the mouse emulator software. The user trials reported here show this to be a very promising way of dealing with some of the interaction problems that users of these complex interfaces face when using gaze by dwell.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Istance, H. O., Bates, R., Hyrskykari, A. and Vickers, S. (2008) Snap Clutch, a Moded Approach to Solving the Midas Touch Problem. ''Proceedings of the 2008 Symposium on Eye Tracking Research &amp;amp;amp; Applications, ETRA '08'', ACM Press, Savannah, March 2008. &amp;lt;br /&amp;gt;&amp;lt;br /&amp;gt;'''See also''' the related article &amp;quot;Eye-tracking interface means gamers' looks can kill&amp;quot; in New Scientist Tech, 5 May 2008. &amp;lt;br /&amp;gt;http://technology.newscientist.com/article/dn13830-eyetracking-interface-means-gamers-looks-can-kill.html&lt;br /&gt;
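The moded idea the abstract describes, where gestures switch mouse-emulation modes and an "off" mode suppresses dwell clicks entirely, can be sketched as a small state machine. This is a hypothetical illustration of the concept, not the authors' Snap Clutch software; the mode list, gesture, class and timing values are all invented.

```python
# Hypothetical sketch: glance gestures cycle between emulation modes,
# and an "off" mode suppresses dwell selection, which is what removes
# the Midas Touch. Names, modes and timings are invented assumptions.

MODES = ["left_click", "drag", "off"]  # illustrative subset of modes

class ModedDwellEmulator:
    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms
        self.mode = "left_click"
        self._target = None
        self._entered = 0.0

    def glance_gesture(self, direction):
        """A quick glance off the top edge of the screen cycles modes."""
        if direction == "up":
            self.mode = MODES[(MODES.index(self.mode) + 1) % len(MODES)]

    def fixate(self, target, t_ms):
        """Feed fixation samples; return an action when dwell completes."""
        if target != self._target:
            self._target, self._entered = target, t_ms  # new fixation
            return None
        if self.mode != "off" and t_ms - self._entered >= self.dwell_ms:
            self._entered = t_ms  # re-arm for a repeat action
            return (self.mode, target)
        return None  # dwell not yet complete, or gaze control is off
```

The key property is that in the "off" mode looking at anything for any length of time produces no action, so the user can inspect the scene freely.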
&lt;br /&gt;
==== Evaluation of Real-time Eye Gaze Logging by a 3D Game Engine ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Human Computer Interaction studies of visual attention in dynamic 3D computer gameplay can be greatly facilitated by automated gaze object logging implemented by integration of eye gaze tracking systems with game engines. This verification study reports the spatial and temporal accuracy of such an integrated system.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Charlotte Sennersten and Craig Lindley (2008) Evaluation of Real-time Eye Gaze Logging by a 3D Game Engine. In Proc. 12th IMEKO TC1 &amp;amp;amp; TC7 Joint Symposium on Man Science &amp;amp;amp; Measurement, Annecy, 2008.&amp;lt;br /&amp;gt; Link: http://www.bth.se/fou/forskinfo.nsf/alfs/14fc0ac35cfea843c1257464003e9022&lt;br /&gt;
&lt;br /&gt;
==== A Psychophysiological Logging System for a Digital Game Modification ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This student thesis intends to facilitate cognitive experiments for gameplay experience studies. To achieve this a psychophysiological logging framework was developed, which automatically reports the occurrence of specific game events to a log file and to the parallel port. Via the parallel port the communication with psychophysiological systems is possible.&lt;br /&gt;
&lt;br /&gt;
Thus, psychophysiological data can be correlated with in-game data in real time. In addition, this framework is able to log viewed game objects via an eye tracker integration. This gives some information on how certain game elements affect the player's attention. For the development of this system the Source SDK, the game engine of Half-Life 2, has been used. Consequently, custom-built Half-Life 2 levels had to be developed, which are suitable for cognitive experiments. In this context, tools for level editing will be introduced.&lt;br /&gt;
&lt;br /&gt;
This thesis forms the basis for further research work in the area of psychophysiological software development and is intended to ease this work for future scholars facing these issues.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Stellmach, S. (2007). A Psychophysiological Logging System for a Digital Game Modification. Technical Bachelor's Report. &amp;lt;br /&amp;gt; Link: http://www.gamecareerguide.com/thesis/080527_stellmach.pdf&lt;br /&gt;
&lt;br /&gt;
==== A Framework for Psychophysiological Data Acquisition in Digital Games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; In order to rapidly develop digital games for psychophysiological experiments, a coherent and flexible development environment is required: something that allows researchers to design their experiments, build the stimulus game and easily integrate all required data acquisition functionality into it.&lt;br /&gt;
&lt;br /&gt;
This thesis shows the design and implementation of such a framework. Methods for gathering player-related data are compared to establish a theoretical foundation for the framework. The logging framework is implemented as a set of Torque X components and an example game is developed in order to demonstrate the framework and the different logging components.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Sasse, D. (2008). A Framework for Psychophysiological Data Acquisition in Digital Games. Master's Thesis. &amp;lt;br /&amp;gt; Link: http://www.gamecareerguide.com/thesis/080520_sasse.pdf&lt;br /&gt;
&lt;br /&gt;
==== Keeping an Eye on the Game: Eye Gaze Interaction with Massively Multiplayer Online Games and Virtual Communities for Motor Impaired Users ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Online virtual communities are becoming increasingly popular both within the able-bodied and disabled user communities. These games assume the use of keyboard and mouse as standard input devices, which in some cases is not appropriate for users with a disability. This paper explores gaze-based interaction methods and highlights the problems associated with gaze control of online virtual worlds. The paper then presents a novel 'Snap Clutch' software tool that addresses these problems and enables gaze control. The tool is tested with an experiment showing that effective gaze control is possible although task times are longer. Errors caused by gaze control are identified and potential methods for reducing these are discussed. Finally, the paper demonstrates that gaze driven locomotion can potentially achieve parity with mouse and keyboard driven locomotion, and shows that gaze is a viable modality for game based locomotion both for able-bodied and disabled users alike.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Vickers, S., Istance, H., Hyrskykari, A., Ali, N., and Bates, R. (2008). Keeping an Eye on the Game: Eye Gaze Interaction with Massively Multiplayer Online Games and Virtual Communities for Motor Impaired Users. Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technologies; ICDVRAT 2008, Maia, Portugal, 8th-10th September 2008. &amp;lt;br /&amp;gt; Link: http://www.icdvrat.reading.ac.uk/2008/papers/ICDVRAT2008_S04_N05_Vickers_Istance_et_al.pdf&amp;lt;br /&amp;gt;(according to the [http://www.icdvrat.reading.ac.uk/2008/abstracts.htm ICDVRAT2008 web page], the link will activate on 1 March 2009)&lt;br /&gt;
&lt;br /&gt;
==== Gaze and voice based game interaction: the revenge of the killer penguins ====&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Wilcox, T., Evans, M., Pearce, C., Pollard, N., and Sundstedt, V. 2008. Gaze and voice based game interaction: the revenge of the killer penguins. In ACM SIGGRAPH 2008 Posters (Los Angeles, California, August 11 - 15, 2008). SIGGRAPH '08. ACM, New York, NY, 1-1. &amp;lt;br /&amp;gt; Link: http://doi.acm.org/10.1145/1400885.1400972&lt;br /&gt;
&lt;br /&gt;
==== Gaze vs. Mouse in Games: The Effects on User Experience ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; The possibilities of eye-tracking technologies in educational gaming are seemingly endless. The question we need to ask is what the effects of gaze-based interaction are on user experience, strategy during learning, and problem solving. In this paper we evaluate the effects of two gaze-based input techniques and mouse-based interaction on user experience and immersion. In a between-subjects study we found that although mouse interaction is the easiest and most natural way to interact during problem solving, gaze-based interaction brings more subjective immersion. The findings provide support for incorporating gaze interaction methods into computer-based educational environments.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Gowases, T., Bednarik, R., and Tukiainen, M. (2008) Gaze vs. Mouse in Games: The Effects on User Experience. In Proceedings of the International Conference on Computers in Education, ICCE 2008, pp. 773-777. &amp;lt;br /&amp;gt; Link: http://www.apsce.net/icce2008/papers/ICCE2008-paper280.pdf&amp;lt;br /&amp;gt;See also the video presentation of the paper at http://www.youtube.com/watch?v=HeHh3y3Z4Do&lt;br /&gt;
&lt;br /&gt;
==== EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as music instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable and light-weight system which allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with equal performance to a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games which exploit unconscious eye movements and show which possibilities this new input modality may open up.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Bulling, A., Roggen, D., and Tröster, G. (2008) EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography. In the Second International Conference on Fun and Games, LNCS 5294. pp 33-45. Springer. &amp;lt;br /&amp;gt; Link: http://dx.doi.org/10.1007/978-3-540-88322-7_4&lt;br /&gt;
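The EyeMote work recognises eye gestures from EOG signals. A much-simplified sketch of that idea is to reduce the horizontal and vertical channel deltas to a sequence of saccade directions and match it against gesture templates; the thresholds, gesture names and template table below are all invented assumptions, and the paper's actual recogniser is considerably more sophisticated.

```python
# Illustrative sketch: classify a coarse eye gesture as a sequence of
# saccade directions derived from horizontal/vertical EOG deltas.
# Threshold, gesture names and templates are invented assumptions.

GESTURES = {("L", "R"): "horizontal_sweep",
            ("U", "R", "D", "L"): "box"}

def saccade_dirs(h_deltas, v_deltas, thresh=50.0):
    """Collapse per-sample channel deltas into a direction sequence."""
    dirs = []
    for dh, dv in zip(h_deltas, v_deltas):
        if abs(dh) < thresh and abs(dv) < thresh:
            continue  # below threshold: fixation or slow drift, ignore
        if abs(dh) >= abs(dv):
            d = "R" if dh > 0 else "L"
        else:
            d = "U" if dv > 0 else "D"
        if not dirs or dirs[-1] != d:  # merge repeated directions
            dirs.append(d)
    return tuple(dirs)

def recognise(h_deltas, v_deltas):
    """Return the matching gesture name, or None if no template fits."""
    return GESTURES.get(saccade_dirs(h_deltas, v_deltas))
```

Real EOG processing also has to suppress blink artefacts and baseline drift, which this sketch ignores.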
&lt;br /&gt;
==== Measuring and defining the experience of immersion in games ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Despite the word's common usage by gamers and reviewers alike, it is still not clear what immersion means. This paper explores immersion further by investigating whether immersion can be defined quantitatively, describing three experiments in total. The first experiment investigated participants’ abilities to switch from an immersive to a non-immersive task. The second experiment investigated whether there were changes in participants’ eye movements during an immersive task. The third experiment investigated the effect of an externally imposed pace of interaction on immersion and affective measures (state anxiety, positive affect, negative affect). Overall the findings suggest that immersion can be measured subjectively (through questionnaires) as well as objectively (task completion time, eye movements). Furthermore, immersion is not only viewed as a positive experience: negative emotions and uneasiness (i.e. anxiety) also run high.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Jennett, C., Cox, A.L., Cairns, P., Dhoparee, S., Epps, A., Tijs, T., and Walton, A. (2008) Measuring and defining the experience of immersion in games. Int. J. Hum.-Comput. Stud. 66(9), 641-661. &amp;lt;br /&amp;gt; Link: http://dx.doi.org/10.1016/j.ijhcs.2008.04.004&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
=== Papers II: Tracking Gaze in Virtual Environments ===&lt;br /&gt;
&lt;br /&gt;
==== Computational mechanisms for gaze direction in interactive visual environments ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Next-generation immersive virtual environments and video games will require virtual agents with human-like visual attention and gaze behaviors. A critical step is to devise efficient visual processing heuristics to select locations that would attract human gaze in complex dynamic environments. One promising approach to designing such heuristics draws on ideas from computational neuroscience. We compared several such heuristics with eye movement recordings from five observers playing video games, and found that heuristics which detect outliers from the global distribution of visual features were better predictors of human gaze than were purely local heuristics. Heuristics sensitive to dynamic events performed best overall. Further, heuristic prediction power differed more between games than between different human observers. Our findings suggest simple neurally inspired algorithmic methods to predict where humans look while playing video games.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Peters, R. J., &amp;amp;amp; Itti, L. (2006). Computational mechanisms for gaze direction in interactive visual environments. Proceedings of the ACM Eye Tracking Research and Applications (ETRA) Symposium, 2006. 20-27. &amp;lt;br /&amp;gt; Link: http://ilab.usc.edu/publications/doc/Peters_Itti06etra.pdf&lt;br /&gt;
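The abstract's best-performing heuristics score locations by how much they deviate from the global distribution of visual features. A toy single-channel version of that outlier idea can be sketched as follows; a real saliency model combines many feature channels and scales, and all names here are invented.

```python
# Minimal sketch of the "global outlier" idea: score each location by
# how far its feature value sits from the map's global distribution.
# Single channel only; a real model combines many feature channels.

def global_outlier_saliency(feature_map):
    """Return a same-shaped map of |value - mean| / stddev scores."""
    vals = [v for row in feature_map for v in row]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    std = var ** 0.5 or 1.0  # guard against a perfectly uniform map
    return [[abs(v - mean) / std for v in row] for row in feature_map]
```

A single bright or fast-moving location then dominates the saliency map, matching the intuition that rare feature values attract gaze.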
&lt;br /&gt;
==== Towards eye based virtual environment interaction for users with high-level motor disabilities ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; An experiment is reported which extends earlier work on the enhancement of eye pointing in 2D environments through the addition of a zoom facility, to its use in virtual 3D environments using a similar enhancement. A comparison between hand pointing and eye pointing without any enhancement shows a performance advantage for hand based pointing. However, the addition of a 'fly' or 'zoom' enhancement increases both eye and hand based performance, and reduces greatly the difference between these devices. Initial attempts at 'intelligent' fly mechanisms and further enhancements are evaluated.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Bates, R., &amp;amp;amp; Istance, H. O. (2005). Towards eye based virtual environment interaction for users with high-level motor disabilities. Special Issue of International Journal of Disability &amp;amp;amp; Human Development: The International Conference Series on Disability, Virtual Reality and Associated Technologies, Vol. 4(3).&amp;lt;br /&amp;gt; Link: http://www.icdvrat.rdg.ac.uk/2004/papers/S09_N2_Bates_Istance_ICDVRAT2004.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gaze- vs. Hand-Based Pointing in Virtual Environment ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; This paper contributes to the nascent body of literature on pointing performance in Virtual Environments (VEs), comparing gaze- and hand-based pointing. Contrary to previous findings, preliminary results indicate that gaze-based pointing is slower than hand-based pointing for distant objects.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Cournia, N., Smith, J.D., &amp;amp;amp; Duchowski, A.T. (2003). Gaze- vs. Hand-Based Pointing in Virtual Environment, in Proc. SIGCHI 2003 (Short Talks &amp;amp;amp; Interactive Posters), April 5-10, 2003, Ft. Lauderdale, FL. &amp;lt;br /&amp;gt; Link: http://andrewd.ces.clemson.edu/research/vislab/docs/chi03-short.pdf&lt;br /&gt;
&lt;br /&gt;
==== Evaluating gaze-contingent level of detail rendering of virtual environments using visual search ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Level of detail rendering reduces the geometric complexity of objects in virtual reality in order to reduce the computational load on the rendering system. Although the resultant increase in rendering speed is desirable, the behavioral consequences of these techniques for humans performing realistic tasks in complex virtual environments are not well understood. The current study examines the behavior of human observers in virtual environments rendered using a gaze-contingent level of detail criterion. This method takes advantage of the fact that the sensitivity of the human visual system is greater at the point of gaze than in the periphery by rendering objects in the periphery with less detail than objects at the point of gaze. In the experiment, participants performed a &amp;quot;virtual search&amp;quot; task, i.e. a visual search task where participants are required to pan the viewport to find a target object among distractors in a virtual environment. Gaze-contingent rendering was employed where the level of detail dropped continuously from the point of gaze. The time to detect and localize the target was measured as a function of the rate of decline in visual detail. Frame rates were allowed to increase with decreasing detail, thus keeping computational load approximately constant. Reaction times to detect the target increased with decreasing detail, while reaction times to localize the target decreased with decreasing detail. These results suggest that reduced detail impedes target identification, while the increased frame rates due to the reduction in detail facilitate interaction with virtual environments. Overall, these results indicate that the behavioral performance costs of gaze-contingent level of detail techniques can be offset by the behavioral performance gains due to increased rendering speed.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Parkhurst, D., Law, I., &amp;amp;amp; Niebur, E. (2001). Evaluating gaze-contingent level of detail rendering of virtual environments using visual search. In Lab Technical Report 2001-02, 1-6. &amp;lt;br /&amp;gt; Link: http://cnslab.mb.jhu.edu/pubs/Parkhurst_etal01c.pdf&lt;br /&gt;
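The gaze-contingent criterion the study uses, detail dropping with angular distance from the point of gaze, can be sketched as a small pair of functions. The linear falloff rate and level count below are invented parameters for illustration, not the study's actual values.

```python
# Minimal sketch of gaze-contingent level of detail: detail falls off
# linearly with angular eccentricity from the point of gaze. Falloff
# rate and level count are invented parameters, not the study's values.
import math

def eccentricity_deg(gaze_px, obj_px, viewer_dist_px):
    """Visual angle between the gaze point and an object, with all
    positions and the viewing distance in the same pixel units."""
    d = math.hypot(obj_px[0] - gaze_px[0], obj_px[1] - gaze_px[1])
    return math.degrees(math.atan2(d, viewer_dist_px))

def lod_level(ecc_deg, levels=4, falloff_deg=10.0):
    """0 = full detail at the point of gaze, coarser with eccentricity."""
    return min(int(ecc_deg / falloff_deg), levels - 1)
```

Each rendered object would query `lod_level` every frame with its current eccentricity and swap in the corresponding mesh, so detail follows the gaze as it moves.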
&lt;br /&gt;
==== Interacting with Eye Movements in Virtual Environments ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Eye movement-based interaction offers the potential of easy, natural, and fast ways of interacting in virtual environments. However, there is little empirical evidence about the advantages or disadvantages of this approach. We developed a new interaction technique for eye movement interaction in a virtual environment and compared it to more conventional 3-D pointing. We conducted an experiment to compare performance of the two interaction types and to assess their impacts on spatial memory of subjects and to explore subjects' satisfaction with the two types of interactions. We found that the eye movement based interaction was faster than pointing, especially for distant objects. However, subjects' ability to recall spatial information was weaker in the eye condition than the pointing one. Subjects reported equal satisfaction with both types of interactions, despite the technology limitations of current eye tracking equipment.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Tanriverdi, V., &amp;amp;amp; Jacob, R. J. K. (2000). Interacting with Eye Movements in Virtual Environments. In CHI '00 Proceedings, ACM, 265-272.&amp;lt;br /&amp;gt; Link: http://www.cs.tufts.edu/~jacob/papers/chi00.tanriverdi.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users ====&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; R Bates, H.O. Istance and S. Vickers (2008) Gaze Interaction with Virtual On-Line Communities: Levelling the Playing Field for Disabled Users. In Proceedings of the 4th Cambridge Workshop on Universal Access and Assistive Technology (CWUAAT), University of Cambridge, 13th-16th April 2008.&amp;lt;br /&amp;gt; Link: http://www.cse.dmu.ac.uk/~svickers/pdf/CWUAAT%202008.pdf&lt;br /&gt;
&lt;br /&gt;
==== User Performance of Gaze-based Interaction with On-line Virtual Communities ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; We present the results of an investigation into gaze-based interaction techniques with on-line virtual communities. The purpose of this study was to gain a better understanding of user performance with a gaze interaction technique developed for interacting with 3D graphical on-line communities and games. The study involved 12 participants, each of whom carried out 2 equivalent sets of 3 tasks in a world created in Second Life. One set was carried out using a keystroke and mouse emulator driven by gaze, and the other set was carried out with the normal keyboard and mouse. The study demonstrates that subjects were easily able to perform a set of tasks with eye gaze with only a minimal amount of training. It has also identified the causes of user errors and the amount of performance improvement that could be expected if the causes of these errors can be designed out.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Istance, H., Hyrskykari, A., Vickers, S., and Ali, N. (2008) User Performance of Gaze-based Interaction with On-line Virtual Communities. In Proceedings of the 4th Conference on Communication by Gaze Interaction; COGAIN 2008, Prague, CZ, 2nd-3rd September, pp. 28-32. &amp;lt;br /&amp;gt; Link: http://www.cogain.org/cogain2008/COGAIN2008-Proceedings.pdf&lt;br /&gt;
&lt;br /&gt;
==== Gazing into a Second Life: Gaze-Driven Adventures, Control Barriers, and the Need for Disability Privacy in an Online Virtual World ====&lt;br /&gt;
&lt;br /&gt;
'''Abstract:'''&amp;lt;br /&amp;gt; Online virtual worlds such as Second Life and World of Warcraft offer users the chance to participate in potentially limitless virtual worlds, all via a standard desktop PC, mouse and keyboard. This paper addresses some of the interaction barriers and privacy concerns that people with disabilities may encounter when using these worlds, and introduces an avatar Turing test that should be passed for worlds to be accessible for all users. The paper then focuses on the needs of high-level motor disabled users who may use gaze control as an input modality for computer interaction. A taxonomy and survey of interaction are introduced, and an experiment in gaze-based interaction is conducted within these virtual worlds. The results of the survey highlight the barriers where people with disabilities cannot interact as efficiently as able-bodied users. Finally, the paper discusses methods for enabling gaze-based interaction for high-level motor disabled users and calls for game designers to consider disabled users when designing game interfaces.&lt;br /&gt;
&lt;br /&gt;
'''Reference:'''&amp;lt;br /&amp;gt; Vickers, S., Bates, R., Istance, H. (2008). Gazing into a Second Life: Gaze-Driven Adventures, Control Barriers, and the Need for Disability Privacy in an Online Virtual World. Proceedings of the 7th International Conference on Disability, Virtual Reality and Associated Technologies; ICDVRAT 2008, Maia, Portugal, 8th-10th September 2008. &amp;lt;br /&amp;gt; Link: http://www.icdvrat.reading.ac.uk/2008/papers/ICDVRAT2008_S04_N04_Vickers_Bates_et_al.pdf&amp;lt;br /&amp;gt;(according to the [http://www.icdvrat.reading.ac.uk/2008/abstracts.htm ICDVRAT2008 web page], the link will activate on 1 March 2009)&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
=== Relevant Websites ===&lt;br /&gt;
&lt;br /&gt;
==== COGAIN - Leisure Applications ====&lt;br /&gt;
&lt;br /&gt;
Gaze-controlled games and leisure applications available via the COGAIN web portal&amp;lt;br /&amp;gt;Link: http://www.cogain.org/downloads/leisure-applications&lt;br /&gt;
&lt;br /&gt;
==== SpecialEffect GameBase ====&lt;br /&gt;
&lt;br /&gt;
The [http://www.specialeffect.org.uk/ SpecialEffect] GameBase provides links to accessible computer games. A group of youngsters, all with disabilities, reviewed these games and helped in the creation of the website. It is hoped that the information on this site will help other young players to find games that are suitable for their interests and abilities. Each game review provides information about how a game is controlled, how fast it is, how much it is likely to cost, etc., which saves players from having to spend time or money on games that would not be suitable for them in the first place. The 'Comments' section provides a space where players can keep adding tips and tricks for each other. &amp;lt;br /&amp;gt; Link: http://www.specialeffect.org.uk/pages/gamebase.htm&lt;br /&gt;
&lt;br /&gt;
==== Game Accessibility ====&lt;br /&gt;
&lt;br /&gt;
All about game accessibility, for ALL disabled and interested gamers. Contains contributions on gaze-controlled computer games.&amp;lt;br /&amp;gt; See especially:&amp;lt;br /&amp;gt; Resources (papers, videos etc.): http://www.game-accessibility.com/index.php?pagefile=papers&amp;lt;br /&amp;gt; Forum: http://www.game-accessibility.com/forum/index.php&lt;br /&gt;
&lt;br /&gt;
==== OneSwitch.Org ====&lt;br /&gt;
&lt;br /&gt;
A resource of fun ideas and 'assistive technology' aimed at people with moderate to severe learning or physical disabilities.&amp;lt;br /&amp;gt; Link: http://www.oneswitch.org.uk/&lt;br /&gt;
&lt;br /&gt;
'''Design Tips For: Eye Tracker Games'''&amp;lt;br /&amp;gt;Link: http://switchgaming.blogspot.com/2008/08/design-tips-for-eye-tracker-games.html&lt;br /&gt;
&lt;br /&gt;
'''Head, Mouth and Eye Controls'''&amp;lt;br /&amp;gt; At oneswitch.org you can find a list detailing a number of different styles of head-, mouth- and eye-operated controllers. Most of these are for PCs and Apple computers, but there are alternatives for games consoles too. Some of these devices are very expensive, so it is always worth tracking down some way of trying them out before you buy. &amp;lt;br /&amp;gt; See http://www.oneswitch.org.uk/1/AGS/AGS-head.htm.&lt;br /&gt;
&lt;br /&gt;
'''Software Downloads'''&amp;lt;br /&amp;gt; Oneswitch.org provides more than 70 one-switch games of different types (adventure, arcade classics, platformers, puzzle &amp;amp;amp; skill games, race games, shoot-em-ups…) for free download. In all likelihood a large number of these are suitable for gaze control. Alongside these you will find articles, instructions and more at &amp;lt;br /&amp;gt;http://www.oneswitch.org.uk/2library.htm&amp;lt;br /&amp;gt;http://www.oneswitch.org.uk/4/games/0index.htm&lt;br /&gt;
&lt;br /&gt;
==== levelgames.net ====&lt;br /&gt;
&lt;br /&gt;
A website focusing on switch games designed to be widely accessible for players who have Muscular Dystrophy, Cerebral Palsy, Spinal Injury, Head Injury or other physical disabilities.&amp;lt;br /&amp;gt; Link: http://www.levelgames.net/&lt;br /&gt;
&lt;br /&gt;
==== Games designed for the MyTobii Eye Tracking System (by Oleg Špakov) ====&lt;br /&gt;
&lt;br /&gt;
This page contains a list of applications developed to run specifically in the MyTobii environment. Each application registers itself on installation so that MyTobii recognizes it as a MyTobii Partner Application.&amp;lt;br /&amp;gt; Link: http://www.cs.uta.fi/~oleg/mytobii.html&lt;br /&gt;
&lt;br /&gt;
==== World of Warcraft Percept Interface (by Oleg Komogortsev) ====&lt;br /&gt;
&lt;br /&gt;
Oleg Komogortsev created an interface that allows users to play computer games with gaze control, without a mouse or keyboard. The interface was tested with the virtual world game World of Warcraft.&amp;lt;br /&amp;gt; Link: http://www.cs.kent.edu/~okomogor/wowpercept/wowpercept.htm&lt;br /&gt;
&lt;br /&gt;
==== Adventure Game Studio ====&lt;br /&gt;
&lt;br /&gt;
Adventure Game Studio (AGS for short) allows you to create your own point-and-click adventure games, similar to the early-1990s Sierra and LucasArts adventures. It consists of an easy-to-use development environment and a run-time engine. AGS is free. You need no programming experience to make a game with AGS: setting most game options is just a matter of point-and-click (though scripting is available if you prefer). &amp;lt;br /&amp;gt; Link: http://www.adventuregamestudio.co.uk/&lt;br /&gt;
&lt;br /&gt;
==== Entertainment Software designed for an EOG based Eye Tracking System: EagleEyes ====&lt;br /&gt;
&lt;br /&gt;
This website contains various application software designed to run with EagleEyes, Camera Mouse and other similar systems (includes Games, Spell and Speak, and System).&amp;lt;br /&amp;gt; Link: http://www.bc.edu/schools/csom/eagleeyes/downloads.html&lt;br /&gt;
&lt;br /&gt;
==== List of open source games ====&lt;br /&gt;
&lt;br /&gt;
Open source games are computer games assembled using open-source software and open content. These games are open to modifications, such as implementing gaze control.&amp;lt;br /&amp;gt; Link: http://en.wikipedia.org/wiki/List_of_open_source_games&lt;br /&gt;
&lt;br /&gt;
==== Game Accessibility Suite ====&lt;br /&gt;
&lt;br /&gt;
Code library and utilities to enhance the accessibility of existing and future games. &amp;lt;br /&amp;gt; Link: http://sourceforge.net/projects/gameaccess/&lt;br /&gt;
&lt;br /&gt;
==== Retro Remakes Forum ====&lt;br /&gt;
&lt;br /&gt;
Forum on game accessibility containing threads on eye- and head control. &amp;lt;br /&amp;gt; Link: http://www.retroremakes.com/forum2/forumdisplay.php?f=84&lt;br /&gt;
&lt;br /&gt;
==== Eye Trackers ====&lt;br /&gt;
&lt;br /&gt;
Catalogue of currently available eye trackers for interactive applications. &amp;lt;br /&amp;gt; Link: [http://www.cogain.org/wiki/Eye_Trackers http://www.cogain.org/wiki/Eye_Trackers]&lt;br /&gt;
&lt;br /&gt;
==== Gaze-aware Space Vampires (by Chris Schmelzle) ====&lt;br /&gt;
&lt;br /&gt;
Chris Schmelzle trialled whether feeding eye-movement information into a game's artificial intelligence can enhance the gaming experience: the game enemies know where the player is looking. &amp;lt;br /&amp;gt; Link: http://www.cschmelzle.net/eye.html&lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
 &lt;br /&gt;
&lt;br /&gt;
=== Multimedia ===&lt;br /&gt;
&lt;br /&gt;
==== YouTube video clips on gaze &amp;amp;amp; games ====&lt;br /&gt;
&lt;br /&gt;
* [http://www.getacd.org/listen_YPbhSoj3ZFM/bejeweled_2_advanced_player_eye_tracking_study Bejeweled 2 Advanced Player Eye-tracking Study], video &amp;amp; information&lt;br /&gt;
* [http://www.youtube.com/watch?v=6PZpsWzjnvE Dreamhack Eye-Tracking Experiment], by the BTH game research group.&lt;br /&gt;
* [http://www.youtube.com/watch?v=3pRWYE2LRhk Eye Based Video Game Control: Quake 2]&lt;br /&gt;
* [http://www.youtube.com/watch?v=IX6H83ZgYGE Eye Based Video Game Control: Neverwinter Nights]&lt;br /&gt;
* [http://www.youtube.com/watch?v=3JkdFFxdlsw Eye Based Video Game Control: Missile Command]&lt;br /&gt;
* [http://www.youtube.com/watch?v=qbotg30L0rc Eye Gaze Computer Game Crazy Taxi via Tobii PCEye]&lt;br /&gt;
* Eye Gaze Driven Second Life - [http://www.youtube.com/watch?v=ClPAFITx9yY Camera], [http://www.youtube.com/watch?v=UFrvl-eFsAQ&amp;amp;eurl Locomotion]. &amp;lt;br /&amp;gt; See also http://www.cse.dmu.ac.uk/~svickers/scvideos.html&lt;br /&gt;
* [http://www.youtube.com/watch?v=NBIjWA8CHls Eye Gaze Interaction with World of Warcraft]&lt;br /&gt;
* [http://www.youtube.com/watch?v=0Nz68kz51Os Gaze-Controlled Applications at University of Tampere: Board Games]&lt;br /&gt;
* [http://www.youtube.com/watch?v=ldw3HugJ2rE Gaze-Controlled First-Person-Shooter], Eye Tracking with IntelliGaze(tm) technology&lt;br /&gt;
* [http://www.youtube.com/watch?v=lGehsY7pcrc House of the Dead with Eye Tracking]&lt;br /&gt;
* [http://www.youtube.com/watch?v=OfnQDJW6xXA Interaction using Gaze Direction], Gaze (for direction changes) + key press&lt;br /&gt;
* [http://www.youtube.com/watch?v=3j2bEWGkDKE Playing Angry Birds with Gaze Control], IntelliGaze, Desktop 2.0&lt;br /&gt;
* [http://www.youtube.com/watch?v=QmvrR5z4NOA Playing Unreal Tournament with an eye-tracker]&lt;br /&gt;
* [http://www.youtube.com/watch?v=XwMoAqgikRM Solitaire with an eye-tracker]&lt;br /&gt;
* [http://www.youtube.com/watch?v=i0xgyBNVrHk Tobii EyeAsteroids eye-controlled arcade game]&lt;br /&gt;
&lt;br /&gt;
=== Organisations ===&lt;br /&gt;
==== SpecialEffect ====&lt;br /&gt;
SpecialEffect is a charitable organisation dedicated to helping ALL young people with disabilities to enjoy computer games. For these children, the majority of computer games are simply too quick or too difficult to play, and we can help them and their parents to find out which games they CAN play, and how to adapt those games that they can't. &amp;lt;br /&amp;gt; Link: http://www.specialeffect.org.uk/&lt;br /&gt;
==== IGDA (International Game Developers Association) Game Accessibility Special Interest group (GA-SIG) ====&lt;br /&gt;
The GA-SIG was formed to help the game community strive towards creating mainstream games that are universally accessible to all, regardless of disability.&amp;lt;br /&amp;gt; Link: http://www.igda.org/wiki/index.php/Game_Accessibility_SIG&lt;br /&gt;
==== Pin Interactive ====&lt;br /&gt;
Game Accessibility Development company&amp;lt;br /&amp;gt; Link: http://www.pininteractive.com/&lt;br /&gt;
&lt;br /&gt;
----&lt;br /&gt;
&lt;br /&gt;
Please email any additions or corrections to [mailto:office@cogain.org office (at) cogain (dot) org]&lt;br /&gt;
&lt;br /&gt;
NOTE: This web resource is part of the ''COGAIN Deliverable D4.5 Online information resources on how to use gaze for the control of selected games'' by Michael Heubner (Technical University of Dresden), Fiona Mulvey (Technical University of Dresden) and Päivi Majaranta (University of Tampere). Thanks to: Faten Ahmed (Technical University of Dresden), Oleg Špakov (University of Tampere) and Barrie Ellis ([http://oneswitch.org.uk/ Oneswitch.org.uk]). The original version was prepared in August 2007 and delivered in October 2007. New material is added as it appears. &lt;br /&gt;
&lt;br /&gt;
See also the more general '''[[Bibliography Gaze Interaction|Gaze Interaction Bibliography]]'''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2699</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2699"/>
		<updated>2011-11-02T11:53:05Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
'''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. There are several options that can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold the board in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The communication board or frame can be made from transparent plexiglass, or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the other person's eye movements.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter the other person is pointing at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important here to confirm the selection and to allow error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter within a group has a different color code. To select a letter, the user first looks at the letter's group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors, though the two-step selection is somewhat slower than direct pointing: it is hard to interpret gaze direction accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
== Self-Made Eye Gaze Frames ==&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here ]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, who is a speech and language therapist in Stockholm, thanks Helena! &amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], free, downloadable communication book, with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== More Information ==&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant or communication partner does not need to remember the spelled letters but can simply click on the colors the user is looking at; the spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2698</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2698"/>
		<updated>2011-11-02T11:52:46Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Self-Made Eye Gaze Frames */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
'''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. There are several options that can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold the board in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The communication board or frame can be made from transparent plexiglass, or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the other person's eye movements.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter the other person is pointing at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important here to confirm the selection and to allow error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter within a group has a different color code. To select a letter, the user first looks at the letter's group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors, though the two-step selection is somewhat slower than direct pointing: it is hard to interpret gaze direction accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
== Self-Made Eye Gaze Frames ==&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here ]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, who is a speech and language therapist in Stockholm, thanks Helena! &amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], free, downloadable communication book, with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
== More Information ==&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant or communication partner does not need to remember the spelled letters but can simply click on the colors the user is looking at; the spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2697</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2697"/>
		<updated>2011-11-02T11:52:25Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
'''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. There are several options that can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold the board in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The communication board or frame can be made from transparent plexiglass, or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the other person's eye movements.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter the other person is pointing at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important here to confirm the selection and to allow error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter within a group has a different color code. To select a letter, the user first looks at the letter's group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors, though the two-step selection is somewhat slower than direct pointing: it is hard to interpret gaze direction accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
== Self-Made Eye Gaze Frames ==&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here ]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, who is a speech and language therapist in Stockholm, thanks Helena! &amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], free, downloadable communication book, with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
== More Information ==&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant or communication partner does not need to remember the spelled letters but can simply click on the colors the user is looking at; the spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2696</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2696"/>
		<updated>2011-11-02T11:52:08Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using an E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self-made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. Several ready-made options can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the eye movements of the person using the board.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets her/his gaze direction and speaks the letter out loud to confirm it. The next letter is then interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking out the letter) is important here to confirm the selection and to allow errors to be corrected in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters in this way reduces errors, although the two-step selection is somewhat slower than direct pointing: gaze direction is hard to interpret accurately when the targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
== Self-Made Eye Gaze Frames ==&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP package containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm, thanks Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== More Information ==&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant / communication partner does not need to remember the spelled letters, but can simply click on the colors the user is looking at. The spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. (1998). Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.), ''Augmentative Communication in Practice: An Introduction'' (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2695</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2695"/>
		<updated>2011-11-02T11:51:06Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using an E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self-made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. Several ready-made options can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the eye movements of the person using the board.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets her/his gaze direction and speaks the letter out loud to confirm it. The next letter is then interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking out the letter) is important here to confirm the selection and to allow errors to be corrected in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters in this way reduces errors, although the two-step selection is somewhat slower than direct pointing: gaze direction is hard to interpret accurately when the targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP package containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm, thanks Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== More Information ==&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant / communication partner does not need to remember the spelled letters, but can simply click on the colors the user is looking at. The spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. (1998). Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.), ''Augmentative Communication in Practice: An Introduction'' (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2694</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2694"/>
		<updated>2011-11-02T11:50:38Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using an E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self-made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. Several ready-made options can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the eye movements of the person using the board.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets her/his gaze direction and speaks the letter out loud to confirm it. The next letter is then interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking out the letter) is important here to confirm the selection and to allow errors to be corrected in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters in this way reduces errors, although the two-step selection is somewhat slower than direct pointing: gaze direction is hard to interpret accurately when the targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP package containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm, thanks Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
== More Information ==&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant / communication partner does not need to remember the spelled letters, but can simply click on the colors the user is looking at. The spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. (1998). Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.), ''Augmentative Communication in Practice: An Introduction'' (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2693</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2693"/>
		<updated>2011-11-02T11:49:49Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* SPEAKBOOK */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using an E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self-made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. Several ready-made options can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the eye movements of the person using the board.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets her/his gaze direction and speaks the letter out loud to confirm it. The next letter is then interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking out the letter) is important here to confirm the selection and to allow errors to be corrected in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters in this way reduces errors, although the two-step selection is somewhat slower than direct pointing: gaze direction is hard to interpret accurately when the targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP package containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm, thanks Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
=== More Information ===&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant / communication partner does not need to remember the spelled letters, but can simply click on the colors the user is looking at. The spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. (1998). Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.), ''Augmentative Communication in Practice: An Introduction'' (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2692</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2692"/>
		<updated>2011-11-02T11:47:15Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using an E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self-made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' or '''Eye Communication Frame''' is a cheap, fast, easy-to-use eye communication method. Several ready-made options can be bought from stores that sell communication aids and assistive technology, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. The board is placed between the two people who want to have a conversation using it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that the person on the other side can easily see the eye movements of the person using the board.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets her/his gaze direction and speaks the letter out loud to confirm it. The next letter is then interpreted in the same way: one person looks at a letter and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking out the letter) is important here to confirm the selection and to allow errors to be corrected in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-Tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters in this way reduces errors, although the two-step selection is somewhat slower than direct pointing: gaze direction is hard to interpret accurately when the targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
It is possible to make an eye communication frame from cardboard (see the picture above). Below are two different versions of the eye gaze frame, with instructions for making them and printable templates for both.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway), thanks Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP package containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm, thanks Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. This is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== More Information ===&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant / communication partner does not need to remember the spelled letters, but can simply click on the colors the user is looking at. The spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2691</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2691"/>
		<updated>2011-11-02T11:46:54Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' (or '''Eye Communication Frame''') is a cheap, fast, easy-to-use eye communication method. Several ready-made options, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below, can be bought from stores that sell communication aids and assistive technology.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. It is placed between the two people who want to converse with it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that each person can easily see the eye movements of the person on the other side.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter, and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important for confirming the selection and for allowing error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors: gaze direction is hard to interpret accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at. The two-step selection is, however, a bit slower than direct pointing.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
An eye communication frame can be made from cardboard (see the picture above). Below are two versions of the eye gaze frame, each with instructions for making it and a printable template.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway). Thanks, Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm. Thanks, Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. It is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== More Information ===&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant or communication partner does not need to remember the spelled letters but can simply click the colors the user is looking at; the spelled letters then appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2690</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2690"/>
		<updated>2011-11-02T11:46:25Z</updated>

		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' (or '''Eye Communication Frame''') is a cheap, fast, easy-to-use eye communication method. Several ready-made options, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below, can be bought from stores that sell communication aids and assistive technology.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. It is placed between the two people who want to converse with it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that each person can easily see the eye movements of the person on the other side.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter, and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important for confirming the selection and for allowing error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors: gaze direction is hard to interpret accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at. The two-step selection is, however, a bit slower than direct pointing.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
An eye communication frame can be made from cardboard (see the picture above). Below are two versions of the eye gaze frame, each with instructions for making it and a printable template.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway). Thanks, Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm. Thanks, Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== SPEAKBOOK ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. It is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
=== More Information ===&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant or communication partner does not need to remember the spelled letters but can simply click the colors the user is looking at; the spelled letters then appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2689</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2689"/>
		<updated>2011-11-02T11:45:27Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* More Information */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' (or '''Eye Communication Frame''') is a cheap, fast, easy-to-use eye communication method. Several ready-made options, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below, can be bought from stores that sell communication aids and assistive technology.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. It is placed between the two people who want to converse with it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that each person can easily see the eye movements of the person on the other side.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter, and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important for confirming the selection and for allowing error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors: gaze direction is hard to interpret accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at. The two-step selection is, however, a bit slower than direct pointing.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
An eye communication frame can be made from cardboard (see the picture above). Below are two versions of the eye gaze frame, each with instructions for making it and a printable template.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway). Thanks, Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm. Thanks, Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== More Information ===&lt;br /&gt;
&lt;br /&gt;
'''SPEAKBOOK'''&lt;br /&gt;
&lt;br /&gt;
* [http://www.speakbook.org/ SPEAKBOOK], a free, downloadable communication book with instructions. It is an extended version of the simple communication frames illustrated above.&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant or communication partner does not need to remember the spelled letters but can simply click the colors the user is looking at; the spelled letters then appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2688</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2688"/>
		<updated>2011-11-02T11:42:25Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* More Information */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' (or '''Eye Communication Frame''') is a cheap, fast, easy-to-use eye communication method. Several ready-made options, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below, can be bought from stores that sell communication aids and assistive technology.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. It is placed between the two people who want to converse with it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that each person can easily see the eye movements of the person on the other side.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter, and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important for confirming the selection and for allowing error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors: gaze direction is hard to interpret accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at. The two-step selection is, however, a bit slower than direct pointing.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
An eye communication frame can be made from cardboard (see the picture above). Below are two versions of the eye gaze frame, each with instructions for making it and a printable template.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway). Thanks, Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm. Thanks, Helena!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== More Information ===&lt;br /&gt;
&lt;br /&gt;
'''Videos'''&lt;br /&gt;
&lt;br /&gt;
YouTube videos explaining how to use a gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
'''Guides'''&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame]&lt;br /&gt;
&lt;br /&gt;
'''Electronic communication board'''&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant or communication partner does not need to remember the spelled letters but can simply click the colors the user is looking at; the spelled letters then appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
'''Articles'''&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2687</id>
		<title>Eye Gaze Communication Board</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Eye_Gaze_Communication_Board&amp;diff=2687"/>
		<updated>2011-11-02T11:41:18Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Eye gaze frame 2 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Eye Trackers]]&lt;br /&gt;
== Eye Gaze Communication Board ==&lt;br /&gt;
&lt;br /&gt;
[[Image:esa_interacting.jpg|thumb|Esa interacts with his Mum using E-tran frame with his gaze]]&lt;br /&gt;
&lt;br /&gt;
[[Image:COGAIN2006-Plenary_lets-try-lowtech2.jpg|thumb|Low-tech gaze-based communication aid, self made from cardboard]]&lt;br /&gt;
&lt;br /&gt;
'''Low-tech eye pointing, cheap (self-made) gaze communication board, &amp;quot;first aid&amp;quot; solution for acute communication needs'''&lt;br /&gt;
&lt;br /&gt;
A '''Gaze Communication Board''' (or '''Eye Communication Frame''') is a cheap, fast, easy-to-use eye communication method. Several ready-made options, such as the &amp;quot;E-tran (eye transfer) frame&amp;quot; illustrated below, can be bought from stores that sell communication aids and assistive technology.&lt;br /&gt;
&lt;br /&gt;
The gaze communication board is a see-through frame with letters or pictures on it. It is placed between the two people who want to converse with it. The board can either be attached to a table or wheelchair, or, if one of the conversation partners is able-bodied, s/he can hold it in front of the disabled person.&lt;br /&gt;
&lt;br /&gt;
The board or frame can be made either from transparent plexiglass or from cardboard with a big hole in the middle. In either case, it is important that each person can easily see the eye movements of the person on the other side.&lt;br /&gt;
&lt;br /&gt;
The disabled person looks at one of the letters, and the other person interprets his or her gaze direction and speaks the letter aloud to confirm it. After confirmation, the next letter is interpreted in the same way: one person looks at a letter, and the other acts as a human &amp;quot;eye tracker&amp;quot;, interpreting which letter is being pointed at. Just as with a computer-based eye tracking system, feedback (speaking the letter aloud) is important for confirming the selection and for allowing error correction in case of a misinterpretation.&lt;br /&gt;
&lt;br /&gt;
In the E-tran frame, the letters are grouped into the corners of the frame, and each letter in a group has a different color code. To select a letter, the user first looks at the letter group and then at the color &amp;quot;button&amp;quot; that matches the color of the letter. Grouping the letters this way reduces errors: gaze direction is hard to interpret accurately when targets are small and close together, but it is very easy to see which corner of the frame the gaze is pointed at. The two-step selection is, however, a bit slower than direct pointing.&lt;br /&gt;
&lt;br /&gt;
= Self-Made Eye Gaze Frames =&lt;br /&gt;
&lt;br /&gt;
An eye communication frame can be made from cardboard (see the picture above). Below are two versions of the eye gaze frame, each with instructions for making it and a printable template.&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 1 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template1.jpg|thumb|picture of the eye gaze frame template 1]]&lt;br /&gt;
&lt;br /&gt;
The idea for this frame came from Anette Dinesen (Tønsberg, Norway). Thanks, Anette!&amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and the instructions in PDF format, [[File:DART-eye-gaze-frames-with-instructions.pdf|click here]]&lt;br /&gt;
* To download a ZIP packet containing the frames in PowerPoint (ppt) format and the instructions in Microsoft Word (doc) format, [[File:DART-eye-gaze-frames-with-instructions.zip|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Eye gaze frame 2 ===&lt;br /&gt;
&lt;br /&gt;
[[Image:eye-gaze-frame-template2.jpg|thumb|picture of the eye gaze frame template 2]]&lt;br /&gt;
&lt;br /&gt;
This frame was made by Helena Hörkeby, a speech and language therapist in Stockholm, thanks Helena! &amp;lt;br /&amp;gt; Instructions and a printable template for this eye gaze frame are available in two different formats:&lt;br /&gt;
&lt;br /&gt;
* To download the eye gaze frame and instructions in PDF format, [[File:DART-eye-gaze-frame.pdf|click here]]&lt;br /&gt;
* To download the eye gaze frame and instructions in Microsoft PowerPoint (ppt) format, [[File:DART-eye-gaze-frame.ppt|click here]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== More Information ===&lt;br /&gt;
&lt;br /&gt;
'''See also'''&lt;br /&gt;
&lt;br /&gt;
YouTube video explaining how to use gaze communication frame:&lt;br /&gt;
* [http://www.youtube.com/watch?v=6_PgPFSV_hs in English] (same as the video below) &lt;br /&gt;
* [http://www.youtube.com/watch?v=dlblMVK6sf0 in German]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The Call Centre has prepared an excellent [http://callcentre.education.ed.ac.uk/downloads/quickguides/aac/etran.pdf Quick Guide for Eye Pointing and using an Etran Frame ]&lt;br /&gt;
&lt;br /&gt;
[http://www.megabee.net/ MegaBee] is an electronic version of the see-through communication board. With it, the assistant / communication partner does not need to remember the spelled letters but can simply click the colors the user is looking at. The spelled letters appear in the text field below the letter board.&lt;br /&gt;
&lt;br /&gt;
Goossens', C. A., &amp;amp;amp; Crain, S. S. (1987). Overview of nonelectronic eye gaze communication techniques. ''Augmentative and Alternative Communication, 3,'' 77-89. [http://www.informaworld.com/smpp/content~content=a714043468~db=alhe Order the article online]&lt;br /&gt;
&lt;br /&gt;
Scott, J. 1998. Low Tech Methods of Augmentative Communication. In Allan Wilson (Ed.) ''Augmentative Communication in Practice: An Introduction'', (2nd ed.), 13-18. [http://www.acipscotland.org.uk/Scott.pdf Available online (PDF)]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
	<entry>
		<id>https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2686</id>
		<title>Links</title>
		<link rel="alternate" type="text/html" href="https://www.wiki.cogain.org/index.php?title=Links&amp;diff=2686"/>
		<updated>2011-10-25T05:29:42Z</updated>

		<summary type="html">&lt;p&gt;Admin: /* Gaze-Based Interaction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:Links]][[Category:Reference]]&lt;br /&gt;
== Links ==&lt;br /&gt;
&lt;br /&gt;
Links and useful resources&lt;br /&gt;
&lt;br /&gt;
=== Eye Gaze Communication and Control ===&lt;br /&gt;
&lt;br /&gt;
* '''[[Eye Typing Systems]]''', a ''comprehensive'' list of Eye Typing / Gaze Writing systems and related links&lt;br /&gt;
* '''[[Gaze-Controlled Games]]''', a list of online information resources on using gaze for the control of games and other leisure applications&lt;br /&gt;
* [http://www.als-communication.dk/ ALS Communication], video clips of communication by gaze, information about ALS and much more.&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeDraw/ EyeDraw], draw pictures using only the eyes, University of Oregon&lt;br /&gt;
* [http://www.cs.uoregon.edu/research/cm-hci/EyeMusic/ EyeMusic], translates eye movements into sound and music, University of Oregon&lt;br /&gt;
* [http://www.specialeffect.org.uk/ SpecialEffect], database of computer games and leisure software for special access technology (including eye tracking)&lt;br /&gt;
* [http://www.research.linst.ac.uk/eyemouse/index.htm The Eye Mouse Project], study of eye movements during [http://www.research.linst.ac.uk/drawing_cognition/eyecontrol.htm drawing &amp;amp;amp; eye painting] for disabled people, Camberwell College of Arts&lt;br /&gt;
* [http://www.it-c.dk/research/EyeGazeInteraction/ Eye Gaze Interaction] research at the IT University of Copenhagen, see also [http://www.itu.dk/people/malte/eyeTrackEng.html Eye-Based IT]&lt;br /&gt;
* [http://www.cse.dmu.ac.uk/~rbates/research/research.htm Research on zooming interfaces and eye command interfaces] by R. Bates.&lt;br /&gt;
* [http://www.brl.ntt.co.jp/people/takehiko/ Gaze interaction research] by T. Ohno, FreeGaze tracker, navigation support by eye movements, fast menu selection method etc.&lt;br /&gt;
* [http://www.archimuse.com/mw2003/papers/milekic/milekic.html The More You Look the More You Get]&amp;lt;nowiki&amp;gt;: Intention-Based Interface using Gaze-tracking by Slavko Milekic in MW2003.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gaze-Based Interaction ===&lt;br /&gt;
&lt;br /&gt;
* [http://vret.ces.clemson.edu/sigcourse/ Eye-Based Interaction in Graphical Systems]&amp;lt;nowiki&amp;gt;: Theory &amp;amp;amp; Practice. A SIGGRAPH 2000 Course by A. Duchowski &amp;amp;amp; R. Vertegaal.&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
* [http://www.eyes-tea.net/history/021017/seifert.html Design of Gaze-Based Interaction as Part of Multimodal Human-Computer Interaction], a presentation by K. Seifert at [http://www.eyes-tea.net/ Eyes-Tea.NET]&lt;br /&gt;
* [http://www.almaden.ibm.com/cs/blueeyes/magic.html BlueEyes Magic Pointing] at IBM Almaden Research Center&lt;br /&gt;
* [http://main.cs.qub.ac.uk/~fmurtagh/eyemouse/ Eye-Gaze Tracking Research] by F. Murtagh. Eye mouse research.&lt;br /&gt;
* [http://hci.stanford.edu/research/GUIDe/ GUIDe: Gaze-enhanced UI Design], EyePoint, pointing and selection, application switching, scrolling and panning by gaze&lt;br /&gt;
* [http://design.open.ac.uk/DV/ Designing with Vision], a gaze-enhanced CAD tool for supporting the design process. Includes info &amp;amp; a free download of the software!&lt;br /&gt;
&lt;br /&gt;
=== Eye Tracking Conferences and Meetings ===&lt;br /&gt;
&lt;br /&gt;
* '''COGAIN Conference''', [[[cogain2009 2009]]] [[[cogain2008 2008]]] [[[cogain2007 2007]]] [[[cogain2006.1 2006]]] [[[events/camp2005/conference-program 2005]]]&lt;br /&gt;
* '''ECEM''', European Conference on Eye Movements [[[http://www.ecem2009.org/ ECEM 2009]]] [[[http://www.ecem2007.org/ ECEM 2007]]] [[[http://www.ecem.ch/ ECEM13 2005]]] [[[http://www.dundee.ac.uk/psychology/ecem12/ ECEM12 2003]]] [[[http://congress.utu.fi/ecem11/ ECEM11 2001]]] [[[http://www.phys.uu.nl/~ecem10/Home.html ECEM10 1999]]]&lt;br /&gt;
* '''ETRA''', Eye Tracking Research &amp;amp;amp; Applications Symposium [[[http://www.e-t-r-a.org/2008/ ETRA 2008]]] [[[http://www.e-t-r-a.org/2006/ ETRA 2006]]] [[[http://www.e-t-r-a.org/2004/ ETRA 2004]]] [[[http://www.e-t-r-a.org/2002/ ETRA 2002]]] [[[http://www.e-t-r-a.org/2000/ ETRA 2000]]]&lt;br /&gt;
* '''ECVP''', European Conference on Visual Perception [[[http://www.ecvp2008.org/ ECVP 2008]]] [[[http://www.ecvp.org/meetings.html Archive of previous meetings]]]. See also [http://www.perceptionweb.com/ECVP.html ECVP Abstracts archive]&lt;br /&gt;
* '''SWAET''', the Scandinavian Workshop on Applied Eye-tracking [[[http://www.humlab.se/conference/swaet/2008/ SWAET 2008]]] [[[http://www.sol.lu.se/humlab/eyetracking/swaet2007/ SWAET 2007]]] [[[http://www.sol.lu.se/humlab/eyetracking/conference/ SWAET 2006]]]&lt;br /&gt;
* '''VIV''', Vision in Vehicles [[[http://www.lboro.ac.uk/research/esri/applied-vision/projects/visioninvehicles/viv11.html VIV11 2006]]] [[[http://ibs.derby.ac.uk/viv/viv10.html VIV10 2003]]] [[[http://ibs.derby.ac.uk/viv9/ VIV9 2001]]]&lt;br /&gt;
* [http://www.inb.uni-luebeck.de/bip2005/ '''BIP 2005'''], International Workshop on Bioinspired Information Processing: Cognitive modeling and gaze-based communication&lt;br /&gt;
* [http://isg.cs.tcd.ie/iwet/ The First Irish Workshop on Eye-Tracking]&lt;br /&gt;
* [http://ckir.hkkk.fi/events_eyesymposium.htm Second Finnish Symposium for Eye-movement Research]&lt;br /&gt;
* [http://www.eyes-tea.net/ Eyes-Tea meetings]&lt;br /&gt;
&lt;br /&gt;
=== Technology ===&lt;br /&gt;
&lt;br /&gt;
* [[Eye Trackers|Catalogue of currently available eye trackers for interactive applications within AAC]], currently available systems targeted at people with disabilities&lt;br /&gt;
* [[Eye Trackers#Eyetrackers for eye movement research, analysis and evaluation|Eye trackers for eye movement research, analysis and evaluation]]&lt;br /&gt;
* [[Eye Trackers#Open source gaze tracking, freeware and low cost eye tracking|Open-source and low-cost eye tracking systems]]&lt;br /&gt;
* [http://www.eyetrackawards.com/ Tobii EyeTrack Awards] - got a good idea and need an eye tracker to do the research?&lt;br /&gt;
&lt;br /&gt;
=== General Information on Eye Anatomy, Physiology and Movements ===&lt;br /&gt;
&lt;br /&gt;
* [http://faculty.washington.edu/chudler/eyetr.html Eye Anatomy and Function], learn how the sense of sight works&lt;br /&gt;
* [http://cim.ucdavis.edu/eyes/version1/eyesim.htm Eye Simulation Application], simulates eye motion and demonstrates the effects of disabling eye muscles and/or nerves (Macromedia Shockwave plug-in needed)&lt;br /&gt;
* [http://lasereyesurgeons.net/eye-anatomy-sight-complications Eye Anatomy and Eye Sight Complications], a collection of resources on eye anatomy, eyesight, and eye complications&lt;br /&gt;
* [http://www.yorku.ca/eye/ The Joy of Visual Perception], A Web Book by Peter K. Kaiser&lt;br /&gt;
* [http://www.journalofvision.org/ Journal of Vision], papers available in HTML or PDF format!&lt;br /&gt;
* [http://www.jiscmail.ac.uk/files/pupil/index.html The Pupil Page] and PUPIL mailing list Archives&lt;br /&gt;
* [http://www.elsevier.com/wps/find/journaldescription.cws_home/263/description#description Vision Research Journal]&lt;br /&gt;
See also the educational resources on the COGAIN website, e.g.&lt;br /&gt;
* [[Training]], educational materials for (self)training of gaze interaction and eye tracking related issues&lt;br /&gt;
* [[User Involvement]], filled with information and useful tips for new users!&lt;br /&gt;
* [http://www.cogain.org/wiki/Category:Bibliography Bibliography], references for gaze interaction articles&lt;br /&gt;
&lt;br /&gt;
=== Software ===&lt;br /&gt;
&lt;br /&gt;
* [[COGAIN Applications|Applications developed within COGAIN]]&lt;br /&gt;
* [http://www.oatsoft.org/Software/SpecialAccessToWindows SAW (Special Access to Windows)] enables Windows software to be controlled by alternative access devices such as switches, joystick, trackerball, headpointer, or eye tracker.&lt;br /&gt;
* [http://www.oatsoft.org OATS], open source assistive technology software&lt;br /&gt;
* [http://scanpaths.org Scanpaths.org], online repository of scanpath (eye-movement) data&lt;br /&gt;
* [http://appmarket.tobii.com/wiki/index.php/Application_Market_for_Tobii_Eye_Trackers Application Market for Tobii Eye Trackers], a list of apps developed for Tobii trackers, many of them are freely available for download&lt;br /&gt;
&lt;br /&gt;
=== Blogs on Eye Tracking ===&lt;br /&gt;
&lt;br /&gt;
* [http://gazeinteraction.blogspot.com/ Gaze Interaction Blog] by Martin Tall. References and discusses interesting research on gaze interaction.&lt;br /&gt;
* [http://eyetracking.me/ EyeTracking.Me], a blog by Tommy Strandvall. News, ideas and Tommy's training material on how to use eye tracking in different kinds of human behavioral research.&lt;br /&gt;
* [http://www.eyetrackingblog.com/ Eye Tracking Blog] Mostly usability-related content.&lt;br /&gt;
* [http://igazeweb.blogspot.com/ iGaze Web] Gaze enhanced web browsing (Dan's M.A. thesis blog).&lt;br /&gt;
&lt;br /&gt;
=== Portals and Links Collections ===&lt;br /&gt;
&lt;br /&gt;
* [http://www.eyemovementresearch.com/ Eye Movement Research Portal] linking you to eye movement information and resources around the world&lt;br /&gt;
* [http://www.eyemovement.org/ EYE MOVEMENT Knowledge Base], joint portal for ECEM, JEMR, and EYEMOVEMENT, with a knowledge base for sharing experiences.&lt;br /&gt;
* [http://eyetrackingupdate.com/ Eye-Tracking Update], keep up to date with the latest eye-tracking news and trends&lt;br /&gt;
&amp;lt;!-- &lt;br /&gt;
* [http://www.eyetracking.net/ EyeTracking.net], Internet resource dedicated to eye movement related research and applications ([http://www.eyetracking.net/bueye/etsystem.htm Systems], [http://www.eyetracking.net/bueye/etlabs.htm Labs &amp;amp;amp; people], [http://www.eyetracking.net/bueye/etapps.htm Apps &amp;amp;amp; projects], [http://www.eyetracking.net/bueye/etpapers.htm Papers &amp;amp;amp; courses], [http://www.eyetracking.net/bueye/etjobs.htm Events &amp;amp;amp; jobs])&lt;br /&gt;
* [http://www.cs.ucl.ac.uk/staff/J.McCarthy/library.htm Eyetracking Papers], John Dylan McCarthy's list of eyetracking research papers on web&lt;br /&gt;
* [http://www.cs.uta.fi/hci/gaze/iui2005-tutorial/bibliography.php Gaze-Based HCI Bibliography], tutorial bibliography with links to on-line papers&lt;br /&gt;
--&amp;gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
		
	</entry>
</feed>