Machine learning for distinguishing Saudi children with and without autism via eye-tracking data
Is part of
Child and adolescent psychiatry and mental health, 2023-09, Vol.17 (1), p.1-112, Article 112
Place / Publisher
London: BioMed Central Ltd
Year of publication
2023
Source
2022 ECC (Springer)
Descriptions/Notes
Background: Despite the global prevalence of Autism Spectrum Disorder (ASD), there is a knowledge gap pertaining to autism in Arabic nations. Recognizing the need for validated biomarkers for ASD, our study leverages eye-tracking technology to understand gaze patterns associated with ASD, focusing on joint attention (JA) and atypical gaze patterns during face perception. While previous studies typically evaluate a single eye-tracking metric, our research combines multiple metrics to capture the multidimensional nature of autism, focusing on dwell times on the eyes, the left facial side, and joint attention.

Methods: We recorded data from 104 participants (41 neurotypical, mean age 8.21 ± 4.12 years; 63 with ASD, mean age 8 ± 3.89 years). Data collection consisted of a series of visual stimuli of cartoon faces of humans and animals, presented to the participants in a controlled environment. During each stimulus, the participants' eye movements were recorded and analyzed, extracting metrics such as time to first fixation and dwell time. We then used these data to train a number of machine learning classification algorithms to determine whether these biomarkers can be used to diagnose ASD.

Results: We found no significant difference in eye-dwell time between the autistic and control groups on human or animal eyes. However, autistic individuals focused less on the left side of both human and animal faces, indicating a reduced left visual field (LVF) bias. They also showed slower response times and shorter dwell times on congruent objects during joint attention (JA) tasks, indicating diminished reflexive joint attention. No significant difference was found in time spent on incongruent objects during JA tasks. These results suggest potential eye-tracking biomarkers for autism. The best-performing algorithm was the random forest, which achieved accuracy = 0.76 ± 0.08, precision = 0.78 ± 0.13, recall = 0.84 ± 0.07, and F1 = 0.80 ± 0.09.

Conclusions: Although the autism group displayed notable differences in reflexive joint attention and left visual field bias, dwell time on eyes was not significantly different. Nevertheless, the machine learning model trained on these data proved effective at diagnosing ASD, demonstrating the potential of these biomarkers. Our study shows promising results and opens up potential for further exploration in this under-researched geographical context.

Keywords: Autism spectrum disorder, Eye-tracking, Face processing, Prediction, Screening
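The abstract reports cross-validated random forest performance (mean ± standard deviation) on participant-level eye-tracking metrics. The following is a minimal sketch of that kind of evaluation, not the authors' actual pipeline: the file name, feature columns, and label column are hypothetical stand-ins for the metrics described above (dwell times on eyes and the left facial side, JA response time and congruent-object dwell time, time to first fixation).

```python
# Hypothetical sketch: cross-validated random forest on per-participant eye-tracking features.
# File name, column names, and hyperparameters are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_validate

# Assumed CSV: one row per participant with eye-tracking metrics and a 0/1 ASD label.
df = pd.read_csv("eye_tracking_features.csv")
feature_cols = [
    "dwell_time_eyes",        # dwell time on eye regions
    "dwell_time_left_face",   # dwell time on the left facial side (LVF bias)
    "ja_congruent_dwell",     # dwell time on the congruent object in the JA task
    "ja_response_time",       # latency to follow the gaze cue
    "time_to_first_fixation",
]
X = df[feature_cols].to_numpy()
y = df["asd_label"].to_numpy()  # 1 = ASD, 0 = neurotypical

clf = RandomForestClassifier(n_estimators=200, random_state=42)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_validate(clf, X, y, cv=cv,
                        scoring=["accuracy", "precision", "recall", "f1"])

# Report each metric as mean ± standard deviation across folds,
# matching the format used in the Results section.
for metric in ["accuracy", "precision", "recall", "f1"]:
    vals = scores[f"test_{metric}"]
    print(f"{metric}: {vals.mean():.2f} ± {vals.std():.2f}")
```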