AI eye‑tracking and emotion recognition to improve attention in children with autism

Researchers at Matrouh University published a paper in the International Journal of Human–Computer Interaction describing an AI system that combines eye‑tracking and emotion recognition to detect and support attention in children with autism spectrum disorder (ASD).

The paper is credited to Marwa A. Marzouk, a lecturer in the Faculty of Computers and Artificial Intelligence, and Sherif Sh. Ibrahim, an associate professor in the Faculty of Specific Education and vice dean of that faculty. The authors frame the system as a tool for monitoring attention and emotional responses across different storytelling modalities.

Eye‑tracking records where a child looks and for how long. Emotion recognition here means using algorithms to infer affective state from facial cues or gaze patterns. The authors propose combining those signals with AI to identify lapses in attention and to adapt storytelling formats or prompts in response.

The source material provided lists the article title and contributors but does not include methods, sample sizes, quantitative results, or a publication date. In the excerpt available, the paper presents a design and approach rather than reported clinical outcomes.

For developers and clinicians, the work is notable for applying multimodal sensing — visual attention plus affective inference — to learning content for children with ASD. The approach could inform adaptive learning interfaces and assistive story formats, but its efficacy and safety will depend on validation in real-world or clinical studies.

Readers seeking implementation details or performance data should consult the full article in the International Journal of Human–Computer Interaction or contact the authors at Matrouh University.

Photo credit: covers.tandf.co.uk

Tags: autism spectrum disorder, eye-tracking, emotion recognition, storytelling, AI

Topics: Wearable neurotech, Biofeedback & neurofeedback, Mental health technology