We're working to develop robots to help children with autism in ways humans can’t
27 June 2017
2 min read
New research involving the University of Portsmouth is aiming to develop robots to help children with autism in ways humans can’t.
The Development of Robot-Enhanced therapy for children with AutisM spectrum disorders (DREAM) project will design robots that can operate autonomously and help the therapist to improve the child’s social interaction skills, such as turn-taking, imitation and joint attention.
Robot-assisted therapies (RAT) have shown promise as potential assessment and therapeutic tools, as research has shown that children with an autism spectrum disorder (ASD) engage more readily with robots than with humans, because robots are simple and predictable.
However, current social robots are simply remote-controlled by therapists and, like standard therapies, still require a lot of time, energy and human resources.
The DREAM project aims to develop an autonomous robot that minimises the therapist’s intervention, so they can focus more on the child and improve the outcome of the therapy. The DREAM robot will also function as a diagnostic tool by collecting clinical data during therapy.
The main task of the University of Portsmouth research group is to capture and analyse sensory data from the children – body motion, gestures, gaze, facial expressions, sound and voice – so that the robot can understand what the child is doing and interact with them better.
The team has substantial experience in multi-sensory data fusion, especially sensing and analytics for multi-camera systems, and has developed a multi-camera smart environment consisting of a NAO robot, Microsoft Kinect® cameras and high-resolution cameras, which tracks and measures the child’s movements, facial expressions and interactions with the robot.
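The project has not published its code, but a minimal sketch gives a flavour of the kind of multi-camera fusion described above: each camera reports a joint position in its own frame, and the estimates are combined in a shared room frame, weighted by tracking confidence. The camera poses, joint names and weighting scheme here are illustrative assumptions, not the DREAM system itself.

```python
# Illustrative sketch only: fusing 3-D joint estimates from several depth
# cameras into one position in a shared world frame. Camera poses, joint
# names and the confidence weighting are assumptions, not the DREAM code.
from dataclasses import dataclass
import numpy as np

@dataclass
class JointObservation:
    name: str              # e.g. "right_hand"
    position: np.ndarray   # 3-D position in the camera's own frame
    confidence: float      # tracking confidence reported by the sensor

def to_world(position, rotation, translation):
    """Map a point from a camera frame into the shared room frame."""
    return rotation @ position + translation

def fuse_joint(observations, poses):
    """Confidence-weighted average of one joint seen by several cameras."""
    points, weights = [], []
    for cam_id, obs in observations.items():
        rotation, translation = poses[cam_id]
        points.append(to_world(obs.position, rotation, translation))
        weights.append(obs.confidence)
    return np.average(np.stack(points), axis=0, weights=np.asarray(weights))

# Example: two cameras observing the child's right hand from different angles.
poses = {
    "kinect_left": (np.eye(3), np.zeros(3)),
    "kinect_right": (np.eye(3), np.array([0.0, 0.0, 2.5])),
}
observations = {
    "kinect_left": JointObservation("right_hand", np.array([0.30, 1.10, 1.80]), 0.9),
    "kinect_right": JointObservation("right_hand", np.array([0.28, 1.12, -0.68]), 0.6),
}
print(fuse_joint(observations, poses))
```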
Honghai Liu, Professor of Intelligent Systems and Portsmouth research lead for DREAM, said: “DREAM is a project that will deliver the next generation RAT robot, and its core is its cognitive model which interprets sensory data (body movement and emotion appearance cues), uses these perceptions to assess the child’s behaviour by learning to map them to therapist-specific behavioural classes, and then learns to map these child behaviours to appropriate robot actions as specified by the therapists.
“The multi-sensory data that we are capturing will be used to provide quantitative support for the diagnosis and care and treatment of ASD, replacing current labour-intensive techniques involving paper and pencil, or manual video analysis.”
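To make the two-stage mapping Professor Liu describes more concrete, a hypothetical sketch might look like the following: perception features are mapped to therapist-labelled behaviour classes by a trained classifier, and each recognised behaviour is then mapped to a robot action chosen in advance by the therapist. The feature values, behaviour labels, action table and use of scikit-learn are assumptions for illustration only.

```python
# Illustrative sketch only: sensory features -> therapist-defined behaviour
# class -> robot action. Not the DREAM cognitive model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Stage 1: learn to map perception features (e.g. gaze towards robot,
# amount of movement, vocalisation level) to behaviour classes that
# therapists have labelled in recorded sessions.
features = np.array([
    [0.9, 0.2, 0.1],   # looking at robot, little movement, quiet
    [0.1, 0.8, 0.6],   # looking away, lots of movement, vocalising
    [0.8, 0.3, 0.7],   # looking at robot and vocalising
])
labels = ["joint_attention", "disengaged", "turn_taking"]
behaviour_model = RandomForestClassifier(random_state=0).fit(features, labels)

# Stage 2: map the recognised behaviour to a robot action specified in
# advance by the therapist.
action_for_behaviour = {
    "joint_attention": "praise_and_continue",
    "disengaged": "prompt_with_gesture",
    "turn_taking": "take_turn",
}

def choose_action(current_features):
    behaviour = behaviour_model.predict([current_features])[0]
    return behaviour, action_for_behaviour[behaviour]

print(choose_action([0.85, 0.25, 0.15]))
```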
The next stage of the project will involve 40 children with ASD taking part in a study at Babeş-Bolyai University (UBB) in Romania, in which half will receive robot-assisted therapy and the other half standard therapy only.
DREAM is funded by the European Commission and developed by seven partners: the University of Skövde (Sweden), the University of Portsmouth, Plymouth University, De Montfort University, Vrije Universiteit Brussel (Belgium), Babeş-Bolyai University (Romania) and Aldebaran Robotics, a French company that designs, develops, manufactures and commercialises humanoid robots.