Automatic prediction of therapeutic activities during newborn resuscitation combining video and signal data

Jarle Urdal, University of Stavanger, Norway
Kjersti Engan, University of Stavanger, Norway
Trygve Eftestøl, University of Stavanger, Norway
Øyvind Meinich-Bache, University of Stavanger, Norway
Ingunn Anda Haug, Strategic Research, Laerdal Medical AS, Norway
Paschal Mdoe, Haydom Lutheran Hospital, Tanzania
Esto Mduma, Haydom Lutheran Hospital, Tanzania
Ladislaus Yarrot, Haydom Lutheran Hospital, Tanzania
Hussein Kidanto, Aga Khan University
Hege Ersdal, University of Stavanger, Norway


Abstract: Newborn mortality is a global challenge, with around 2.4 million neonatal deaths in 2019. One third of these occur within the first, and only, day of life, with labour complications and birth asphyxia being the primary causes. Existing guidelines for newborn resuscitation are based on limited scientific evidence, and evidence-based research is sought. To increase our knowledge of newborn resuscitation, it is crucial to first quantify what is currently being done in terms of therapeutic activities, such as ventilation and stimulation, and how they affect resuscitation outcomes. In the current study, the therapeutic activities during newborn resuscitation are quantified by estimating a timeline describing the start and stop of each activity. The proposed approach combines methods using video and time-series data recorded during resuscitation, producing predictions from whichever sources are available. Activity recognition from video is performed with a 3D convolutional neural network (CNN). For the signal data, features are extracted from ECG and accelerometer signals, after which machine learning is applied to detect stimulation. We show that the best results are achieved when both signals and video are available: for the activity "stimulation" we obtain an AUC of 0.86, a sensitivity of 82.32%, a specificity of 82.23%, and a precision of 57.59%. If only signals or only video is available, we still obtain good results, with AUCs of 0.80 and 0.84, respectively.
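To illustrate the signal-based branch of the pipeline described above, the sketch below extracts simple per-window statistics from an accelerometer magnitude signal, the kind of features that could feed a stimulation classifier. The sampling rate, window length, hop, and the specific features (mean, standard deviation, energy) are illustrative assumptions, not the exact feature set used in the study.

```python
import numpy as np

def window_features(signal, fs=50, win_s=2.0, hop_s=1.0):
    """Compute per-window statistics from a 1-D accelerometer magnitude
    signal. Each row of the returned array holds [mean, std, energy]
    for one window. All parameter values are illustrative assumptions."""
    win = int(win_s * fs)   # samples per window
    hop = int(hop_s * fs)   # samples between consecutive window starts
    feats = []
    for start in range(0, len(signal) - win + 1, hop):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), np.sum(w ** 2) / win])
    return np.array(feats)

# Synthetic example: 10 s of accelerometer magnitude at 50 Hz, with a
# higher-amplitude burst in the middle mimicking a stimulation episode.
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.1, 500)
sig[200:300] += rng.normal(0.0, 1.0, 100)
X = window_features(sig)  # shape (9, 3): 9 windows, 3 features each
```

Windows overlapping the burst show a markedly larger standard deviation and energy than the quiet windows, which is the kind of separation a downstream classifier would exploit.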