Evaluating the autonomy of children with autism spectrum disorder in washing hands: A deep-learning approach

Moccia S.;
2020-01-01

Abstract

Monitoring children with Autism Spectrum Disorder (ASD) during the execution of the Applied Behaviour Analysis (ABA) program is crucial to assess their progress while performing actions. Despite its importance, this monitoring procedure still relies on ABA operators' visual observation and manual annotation of significant events. In this work, a deep learning (DL)-based approach is proposed to evaluate the autonomy of children with ASD while performing the hand-washing task. The goal of the algorithm is the automatic detection of RGB frames in which the child with ASD washes his/her hands autonomously (no-aid frames) or is supported by the operator (aid frames). The proposed approach relies on a pre-trained VGG16 convolutional neural network (CNN) modified to fulfil the binary classification task. The performance of the fine-tuned VGG16 was compared against that of other CNN architectures. The fine-tuned VGG16 achieved the best performance, with a recall of 0.92 and 0.89 for the no-aid and aid classes, respectively. These results suggest that the presented methodology could be translated into actual monitoring practice. The integration of the presented tool with other computer-aided monitoring systems into a single framework will provide full support to ABA operators during the therapy session.
2020
ISBN: 978-1-7281-8086-1
Files in this record:
File: ICTS4eHealth_2020.pdf (open access)
Type: Pre-print/Submitted manuscript
Licence: PUBLIC - Public with copyright
Size: 474.87 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11382/536512

Citations
  • Scopus: 4