Automatic prediction of therapeutic activities during newborn resuscitation combining video and signal data
Obstetrics and Gynaecology (East Africa)
Newborn mortality is a global challenge, with around 2.4 million neonatal deaths in 2019. One third of these occur within the first day of life, with labour complications and birth asphyxia being the primary causes. Existing guidelines for newborn resuscitation are based on limited scientific evidence, and evidence-based research is needed. To increase our knowledge of newborn resuscitation, it is crucial to first quantify what is currently being done in terms of therapeutic activities, such as ventilation and stimulation, and how they affect resuscitation outcomes. In the current study, the therapeutic activities during newborn resuscitation are quantified by estimating a timeline describing the start and stop of each activity. The proposed approach combines methods using both video and time series data recorded during resuscitation, with predictions based on the available sources. From video, activity recognition is performed with a 3D CNN. For the signal data, features are extracted from ECG and accelerometer signals, and machine learning is then applied to detect stimulation. We show that the best results are achieved when all signals and video are available: for the activity ‘‘stimulation’’ we obtain an AUC of 0.86, sensitivity of 82.32%, specificity of 82.23%, and precision of 57.59%. If only signals or only video are available, we still obtain good results, with AUCs of 0.80 and 0.84, respectively.
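The abstract's pipeline produces per-source activity probabilities that are combined into a start/stop timeline. As a minimal sketch (not the authors' implementation), the late-fusion step and the conversion from per-second probabilities to intervals could look as follows; all function names, probability values, and the averaging rule are illustrative assumptions:

```python
# Hedged sketch of the fusion-and-timeline idea described in the abstract.
# Assumptions: each model emits one probability per second for an activity
# (e.g. "stimulation"); fusion is a simple average; a fixed threshold marks
# the activity as active. None of this is taken from the paper itself.

def fuse_probabilities(p_video, p_signal):
    """Average the two per-second probability streams when both are
    available; fall back to whichever single source exists."""
    if p_video is None:
        return list(p_signal)
    if p_signal is None:
        return list(p_video)
    return [(v + s) / 2.0 for v, s in zip(p_video, p_signal)]

def probabilities_to_timeline(probs, threshold=0.5):
    """Turn a per-second probability stream into (start, stop) intervals,
    in seconds, where the activity is predicted as ongoing."""
    intervals, start = [], None
    for t, p in enumerate(probs):
        if p >= threshold and start is None:
            start = t                      # activity begins
        elif p < threshold and start is not None:
            intervals.append((start, t))   # activity ends
            start = None
    if start is not None:                  # still active at recording end
        intervals.append((start, len(probs)))
    return intervals

# Hypothetical per-second "stimulation" probabilities from each model
p_video  = [0.1, 0.7, 0.8, 0.9, 0.2, 0.1]
p_signal = [0.2, 0.6, 0.9, 0.7, 0.3, 0.1]
fused = fuse_probabilities(p_video, p_signal)
print(probabilities_to_timeline(fused))  # -> [(1, 4)]
```

When one source is missing, the same timeline function runs on the single remaining probability stream, mirroring the abstract's point that predictions are made from whichever sources are available.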
Publication (Name of Journal)
Biomedical Signal Processing and Control
Bache, Ø. M., Haug, I. A., Mdoe, P. F., Yarrot, L. B., et al. (2023). Automatic prediction of therapeutic activities during newborn resuscitation combining video and signal data. Biomedical Signal Processing and Control, 86(Part C), 1-9.
Available at: https://ecommons.aku.edu/eastafrica_fhs_mc_obstet_gynaecol/645
Creative Commons License
This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.