4. Bibliography (eye-tracking research)
Abdallahi Ould Mohamed, Matthieu Perreira da Silva, Vincent Courboulay. A history of eye gaze tracking. 2007. hal-00215967 (Pdf)
Duchowski A. (2007) Eye Tracking Methodology. Springer, London. https://doi.org/10.1007/978-1-84628-609-4_1 (Link)
Fitts, P. M., Jones, R. E., Milton, J. L. Eye movements of aircraft pilots during instrument landing approaches. Aero. Engng. Review, 1950, IX(2), 1–6. (Looking for publication)
Fitts, P. M., Jones, R. E., Milton, J. L. Eye fixations of aircraft pilots: III. Frequency, duration, and sequence of fixations when flying Air Force Ground-Controlled Approach System (GCA). USAF Technical Report No. 5967, Wright Air Development Center, Wright-Patterson AFB, Ohio, February 1950. (Looking for publication)
Green CS, Bavelier D. Action video game modifies visual selective attention. Nature. 2003 May 29;423(6939):534-7. doi: 10.1038/nature01647. PMID: 12774121. nature.com/articles/nature01647
Huey, Edmund B. “Preliminary Experiments in the Physiology and Psychology of Reading.” The American Journal of Psychology, Vol 9, No. 4 (Jul., 1898): 575-86. Accessed December 5, 2020. https://doi.org/10.2307/1412192
L. Itti, N. Dhavale, and F. Pighin. Realistic avatar eye and head animation using a neurobiological model of visual attention. In B. Bosacchi, D. B. Fogel, and J. C. Bezdek, editors, Proc. SPIE 48th Annual International Symposium on Optical Science and Technology, SPIE Press, volume 5200, pages 64–78, Bellingham, WA, August 2003. DOI: https://doi.org/10.1117/12.512618 (Pdf)
Laura Florea, Corneliu Florea, Ruxandra Vranceanu, and Constantin Vertan. 2013. Can Your Eyes Tell Me How You Think? A Gaze Directed Estimation of the Mental Activity. In BMVC 2013 – Electronic Proceedings of the British Machine Vision Conference 2013. BMVA Press, 60.1–60.11. (Pdf)
Wolfgang Fuhl, Thiago Santini, Gjergji Kasneci, Wolfgang Rosenstiel, and Enkelejda Kasneci. 2017. PupilNet v2.0: Convolutional Neural Networks for CPU based real time Robust Pupil Detection. CoRR abs/1711.00112 (2017). http://arxiv.org/abs/1711.00112 (Pdf)
Hansen DW, Ji Q. In the eye of the beholder: a survey of models for eyes and gaze. IEEE Trans Pattern Anal Mach Intell. 2010 Mar;32(3):478-500. doi: 10.1109/TPAMI.2009.30. PMID: 20075473. (Link)
Kenneth Alberto Funes Mora, Florent Monay, and Jean-Marc Odobez. 2014. EYEDIAP: A Database for the Development and Evaluation of Gaze Estimation Algorithms from RGB and RGB-D Cameras. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA ’14). ACM, New York, NY, USA, 255–258. https://doi.org/10.1145/2578153.2578190 (Pdf)
Lin, Chern-Sheng, Chia-Chin Huang, C. Chan, M. Yeh and Chuang-Chien Chiu. “Design of a computer game using an eye-tracking device for eye’s activity rehabilitation.” Optics and Lasers in Engineering 42 (2004): 91–108. https://doi.org/10.1016/S0143-8166(03)00075-7
Robert J. K. Jacob. 1991. The use of eye movements in human-computer interaction techniques: what you look at is what you get. ACM Trans. Inf. Syst. 9, 2 (April 1991), 152–169. DOI:https://doi.org/10.1145/123078.128728 (Pdf)
Javal, Louis-Émile. Essai sur la physiologie de la lecture. Annales d’Oculistique, tome LXXIX, série 11, T. 9, Mars et Avril, pp. 97–117, 1878. http://hdl.handle.net/11858/00-001M-0000-002B-B163-5 (Pdf)
Oliver Jesorsky, Klaus J. Kirchberg, and Robert Frischholz. 2001. Robust Face Detection Using the Hausdorff Distance. In Proceedings of the Third International Conference on Audio- and Video-Based Biometric Person Authentication (AVBPA ’01). Springer-Verlag, Berlin, Heidelberg, 90–95. https://www.bioid.com/facedb/ (Pdf)
Jonsson, E. If Looks Could Kill – An Evaluation of Eye Tracking in Computer Games. Master’s Thesis in Human-Computer Interaction, Department of Numerical Analysis and Computer Science, School of Computer Science and Engineering, Royal Institute of Technology, Sweden, 2005. (Pdf)
Joohwan Kim, Michael Stengel, Alexander Majercik, Shalini De Mello, David Dunn, Samuli Laine, Morgan McGuire, and David Luebke. 2019. NVGaze: An Anatomically-Informed Dataset for Low-Latency, Near-Eye Gaze Estimation. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland UK. ACM, New York, NY, USA, 12 pages. https://doi.org/10.1145/3290605.3300780 (Pdf)
Just, M. A. and Carpenter, P. A. Eye fixations and cognitive processes. Cognitive Psychology, 8(4), 441–480, 1976. (Pdf)
K. Krafka, A. Khosla, P. Kellnhofer, H. Kannan, S. Bhandarkar, W. Matusik, and A. Torralba. 2016. Eye Tracking for Everyone. In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Vol. 1. 2176–2184. https://doi.org/10.1109/CVPR.2016.239 (Pdf)
Karen Simonyan and Andrew Zisserman. 2014. Very Deep Convolutional Networks for Large-Scale Image Recognition. CoRR abs/1409.1556 (2014). arXiv:1409.1556 http://arxiv.org/abs/1409.1556 (Pdf)
Merchant, John. Oculometer: Interim Technical Report. Contract No. NASW-1159, February 25, 1965 – December 25, 1965. Honeywell Inc. Radiation Center, Boston, 1965. (Pdf)
Monty RA. An advanced eye-movement measuring and recording system. Am Psychol. 1975 Mar;30(3):331-5. DOI: 10.1037//0003-066x.30.3.331
C. H. Morimoto and M. R. H. Mimica, Eye gaze tracking techniques for interactive applications, Comput. Vis. Image Underst., Vol. 98, no. 1, pp. 4-24, 2005. (Pdf)
Grillon, H., Riquier, F., Herbelin, B., and Thalmann, D. Use of virtual reality as therapeutic tool for behavioural exposure in the ambit of social anxiety disorder treatment. In ICDVRAT, International Conference Series on Disability, Virtual Reality and Associated Technologies, pages 105–112, Esbjerg, 18–20 September 2006. (Pdf)
Poole, A., & Ball, L. (2004). Eye Tracking in Human-Computer Interaction and Usability Research: Current Status and Future Prospects. (Pdf)
Rayner, Keith. Eye Movements in Reading and Information Processing: 20 Years of Research. Psychological Bulletin, 124(3), 372–422, 1998. (Pdf)
Sennersten, Charlotte. Eye Movements in an Action Game Tutorial. Master’s Thesis, Department of Cognitive Science, Lund University, 2004. (Link)
Y. Sugano, Y. Matsushita, and Y. Sato. 2014. Learning-by-Synthesis for Appearance-Based 3D Gaze Estimation. 2014 IEEE Conference on Computer Vision and Pattern Recognition 1, 1 (June 2014), 1821–1828. https://doi.org/10.1109/CVPR.2014.235 (Pdf)
Lech Świrski and Neil A. Dodgson. 2013. A fully-automatic, temporal approach to single camera, glint-free 3D eye model fitting. In Proceedings of ECEM 2013. http://www.cl.cam.ac.uk/research/rainbow/projects/eyemodelfit/ (Pdf)
Marc Tonsen, Julian Steil, Yusuke Sugano, and Andreas Bulling. 2017. InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 3, Article 106 (Sept. 2017), 21 pages. https://doi.org/10.1145/3130971 (Pdf)
Marc Tonsen, Xucong Zhang, Yusuke Sugano, and Andreas Bulling. 2016. Labelled Pupils in the Wild: A Dataset for Studying Pupil Detection in Unconstrained Environments. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA ’16). ACM, New York, NY, USA, 139–142. https://doi.org/10.1145/2857491.2857520 (Pdf)
Erroll Wood, Tadas Baltrušaitis, Xucong Zhang, Yusuke Sugano, Peter Robinson, and Andreas Bulling. 2015. Rendering of Eyes for Eye-Shape Registration and Gaze Estimation. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV) (ICCV ’15). IEEE Computer Society, Washington, DC, USA, 3756–3764. https://doi.org/10.1109/ICCV.2015.428 (Pdf)
Wade, Nicholas J. “Pioneers of eye movement research.” i-Perception, vol. 1, no. 2 (2010): 33–68. doi:10.1068/i0389
Xucong Zhang, Yusuke Sugano, Mario Fritz, and Andreas Bulling. 2017. MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation. CoRR abs/1711.09017 (2017). arXiv:1711.09017 http://arxiv.org/abs/1711.09017 (Pdf)
Yarbus, Al’fred Luk’yanovich, Eye Movements and Vision. New York: Plenum Press; 1967. (Pdf)