Abstract:
|
Eye-tracking is an experimental method for recording the point of gaze of a person viewing an image. Most analytic approaches to eye-tracking data focus either on dynamic properties of viewing, such as the identification of fixations and saccades, or on semantic properties of the image being viewed, captured through a topographical representation of visual attention called a saliency map, but not both. This work presents a method that incorporates both aspects into a single Bayesian model that jointly estimates dynamic properties and a saliency map, providing a comprehensive strategy for eye-tracking analysis. The model is implemented using a Gibbs sampler and assessed on simulated data, where estimation of the model parameters is highly accurate. The model is also applied to eye-tracking data from children with autism spectrum disorder and typically developing controls. Saliency differences between the autism and control groups were found in both social images (faces) and non-social images (blocks), but differences in dynamic features were evident in only some images. The results in this clinical sample are consistent with previous region-based analyses as well as previous fixation parameter models.
|
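The abstract states that the joint model is fit with a Gibbs sampler, i.e., by cycling through draws from each parameter's full conditional distribution. As a minimal illustrative sketch only (not the authors' implementation, which targets the full saliency-plus-dynamics model), the following toy example runs a Gibbs sampler on a zero-mean bivariate normal with correlation `rho`, where both full conditionals are univariate normals:

```python
import random

def gibbs_bivariate_normal(rho, n_iter=5000, burn_in=500, seed=0):
    """Gibbs sampler for a zero-mean, unit-variance bivariate normal.

    The full conditionals are univariate normal:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    """
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5  # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for i in range(n_iter):
        x = rng.gauss(rho * y, sd)  # draw x from its full conditional
        y = rng.gauss(rho * x, sd)  # draw y from its full conditional
        if i >= burn_in:            # discard burn-in draws
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8)

# The empirical correlation of the retained draws should approach rho.
mx = sum(x for x, _ in samples) / len(samples)
my = sum(y for _, y in samples) / len(samples)
cov = sum((x - mx) * (y - my) for x, y in samples) / len(samples)
vx = sum((x - mx) ** 2 for x, _ in samples) / len(samples)
vy = sum((y - my) ** 2 for _, y in samples) / len(samples)
corr = cov / (vx * vy) ** 0.5
print(corr)
```

The same alternating structure, with far more elaborate full conditionals, underlies Gibbs samplers for hierarchical Bayesian models such as the one described in the abstract.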
Copyright © American Statistical Association.