[1] 董卫华, 廖华, 詹智成, 等. 2008年以来地图学眼动与视觉认知研究新进展[J]. 地理学报, 2019, 74(3): 599-614.
DONG Weihua, LIAO Hua, ZHAN Zhicheng, et al. New research progress of eye tracking-based map cognition in cartography since 2008[J]. Acta Geographica Sinica, 2019, 74(3): 599-614.
[2] OOMS K, DE MAEYER P, FACK V. Analyzing eye movement patterns to improve map design[C]//Proceedings of the 18th International Research Symposium on Computer-based Cartography and GIScience (AutoCarto 2010). Orlando, FL: Cartography and Geographic Information Society, 2010.
[3] KIEFER P, GIANNOPOULOS I, RAUBAL M. Using eye movements to recognize activities on cartographic maps[C]//Proceedings of the 21st ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems. Orlando, FL: Association for Computing Machinery, 2013: 488-491.
[4] TATEOSIAN L G, GLATZ M, SHUKUNOBE M, et al. GazeGIS: a gaze-based reading and dynamic geographic information system[C]//Proceedings of the Workshop on Eye Tracking and Visualization. Chicago, USA: Springer, 2015: 129-147.
[5] GÖBEL F, BAKOGIANNIS N, HENGGELER K, et al. A public gaze-controlled campus map[M]. Zurich: ETH Zurich, 2018.
[6] MANSON S M, KNE L, DYKE K R, et al. Using eye-tracking and mouse metrics to test usability of web mapping navigation[J]. Cartography and Geographic Information Science, 2012, 39(1): 48-60.
[7] DONG Weihua, LIAO Hua, XU Fang, et al. Using eye tracking to evaluate the usability of animated maps[J]. Science China Earth Sciences, 2014, 57(3): 512-522.
[8] 高俊. 图到用时方恨少, 重绘河山待后生——《测绘学报》60年纪念与前瞻[J]. 测绘学报, 2017, 46(10): 1219-1225. DOI: 10.11947/j.AGCS.2017.20170503.
GAO Jun. The 60 anniversary and prospect of Acta Geodaetica et Cartographica Sinica[J]. Acta Geodaetica et Cartographica Sinica, 2017, 46(10): 1219-1225. DOI: 10.11947/j.AGCS.2017.20170503.
[9] 王家耀. 时空大数据时代的地图学[J]. 测绘学报, 2017, 46(10): 1226-1237. DOI: 10.11947/j.AGCS.2017.20170308.
WANG Jiayao. Cartography in the age of spatio-temporal big data[J]. Acta Geodaetica et Cartographica Sinica, 2017, 46(10): 1226-1237. DOI: 10.11947/j.AGCS.2017.20170308.
[10] EPPLER M J, MENGIS J. The concept of information overload: a review of literature from organization science, accounting, marketing, MIS, and related disciplines[J]. The Information Society, 2004, 20(5): 325-344.
[11] BUNCH R L, LLOYD R E. The cognitive load of geographic information[J]. The Professional Geographer, 2006, 58(2): 209-220.
[12] HARROWER M. The cognitive limits of animated maps[J]. Cartographica: The International Journal for Geographic Information and Geovisualization, 2007, 42(4): 349-357.
[13] JACOB R J K, KARN K S. Eye tracking in human-computer interaction and usability research: ready to deliver the promises[M]//The Mind's Eye. Amsterdam: Elsevier, 2003: 573-605.
[14] ÇÖLTEKIN A, HEIL B, GARLANDINI S, et al. Evaluating the effectiveness of interactive map interface designs: a case study integrating usability metrics with eye-movement analysis[J]. Cartography and Geographic Information Science, 2009, 36(1): 5-17.
[15] DONG Weihua, WANG Shengkai, CHEN Yizhuou, et al. Using eye tracking to evaluate the usability of flow maps[J]. ISPRS International Journal of Geo-Information, 2018, 7(7): 281.
[16] KIEFER P, GIANNOPOULOS I. Gaze map matching: mapping eye tracking data to geographic vector features[C]//Proceedings of the 20th International Conference on Advances in Geographic Information Systems. Redondo Beach, California: Association for Computing Machinery, 2012: 359-368.
[17] QVARFORDT P, ZHAI Shumin. Conversing with the user based on eye-gaze patterns[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Portland, Oregon, USA: Association for Computing Machinery, 2005: 221-230.
[18] LOBBEN A K. Tasks, strategies, and cognitive processes associated with navigational map reading: a review perspective[J]. The Professional Geographer, 2004, 56(2): 270-281.
[19] BULLING A, WARD J A, GELLERSEN H, et al. Eye movement analysis for activity recognition using electrooculography[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2011, 33(4): 741-753.
[20] PENG Hanchuan, LONG Fuhui, DING C. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(8): 1226-1238.
[21] SALVUCCI D D, GOLDBERG J H. Identifying fixations and saccades in eye-tracking protocols[C]//Proceedings of the 2000 Symposium on Eye Tracking Research & Applications. Palm Beach Gardens, Florida, USA: Association for Computing Machinery, 2000: 71-78.
[22] GOLDBERG J H, KOTVAL X P. Computer interface evaluation using eye movements: methods and constructs[J]. International Journal of Industrial Ergonomics, 1999, 24(6): 631-645.
[23] HOLMQVIST K, NYSTRÖM M, ANDERSSON R, et al. Eye tracking: a comprehensive guide to methods and measures[M]. Oxford: Oxford University Press, 2011.
[24] OOMS K, DE MAEYER P, FACK V, et al. Investigating the effectiveness of an efficient label placement method using eye movement data[J]. The Cartographic Journal, 2012, 49(3): 234-246.
[25] LIAO Hua, WANG Xueyuan, DONG Weihua, et al. Measuring the influence of map label density on perceived complexity: a user study using eye tracking[J]. Cartography and Geographic Information Science, 2019, 46(3): 210-227.
[26] ZANGEMEISTER W H, SHERMAN K, STARK L. Evidence for a global scanpath strategy in viewing abstract compared with realistic images[J]. Neuropsychologia, 1995, 33(8): 1009-1025.