Information

Service information.
Automated Analysis

This service uses machine learning (ML) to analyse nocturnal pulse oximetry data in children. Statistical, non-linear, spectral, and oximetric features are computed from each recording and used in the prediction process.
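The exact feature set is not public. As a sketch, here is one representative feature from each named family, computed from an SpO2 series; all names, thresholds, and band limits are illustrative assumptions, not the service's definitions:

```python
import math
from statistics import mean, pstdev

def desat_events(spo2, baseline, drop=3.0):
    """Count entries into a desaturation run (>= `drop`% below baseline)."""
    events, below = 0, False
    for v in spo2:
        if v <= baseline - drop:
            if not below:
                events += 1
            below = True
        else:
            below = False
    return events

def sample_entropy(x, m=2, r=None):
    """A common non-linear feature; quadratic-time reference version."""
    r = 0.2 * pstdev(x) if r is None else r
    def matches(mm):
        n = len(x) - mm
        return sum(
            max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r
            for i in range(n) for j in range(i + 1, n))
    a, b = matches(m + 1), matches(m)
    return -math.log(a / b) if a and b else float("inf")

def band_power(x, fs, lo, hi):
    """Spectral power in [lo, hi] Hz via a naive DFT (mean removed)."""
    n, mu, power = len(x), mean(x), 0.0
    for k in range(1, n // 2):
        if lo <= k * fs / n <= hi:
            re = sum((x[t] - mu) * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum((x[t] - mu) * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n ** 2
    return power

def oximetry_features(spo2, fs=1.0):
    """Illustrative feature vector from an SpO2 series (fs samples/second)."""
    mu, sd = mean(spo2), pstdev(spo2)
    hours = len(spo2) / fs / 3600
    return {
        "mean": mu, "std": sd,                             # statistical
        "ct90": sum(v < 90 for v in spo2) / len(spo2),     # oximetric: time < 90%
        "odi3": desat_events(spo2, mu) / hours,            # oximetric: 3% ODI
        "sampen": sample_entropy(spo2),                    # non-linear
        "lf_power": band_power(spo2, fs, 0.01, 0.1),       # spectral
    }
```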

Sleep staging of pulse oximetry data is performed using binary classification. Thirty-second segments (epochs) of the recording are classified as Sleep or Wake, and wake epochs are filtered out before the analysis of sleep-disordered breathing. Sleep staging enables the computation of sleep statistics traditionally obtained from overnight polysomnography, such as sleep duration, latency, efficiency, and wake after sleep onset (WASO). While this step may be disabled, it is recommended for data collected in uncontrolled environments (e.g., in the home).
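As a sketch of how these statistics follow from per-epoch labels (the definitions below follow common polysomnography conventions; the service's exact definitions may differ):

```python
EPOCH_S = 30  # each epoch covers 30 seconds

def sleep_statistics(epochs):
    """Summary statistics from per-epoch Sleep/Wake labels (True = Sleep)."""
    total_min = len(epochs) * EPOCH_S / 60
    sleep_min = sum(epochs) * EPOCH_S / 60
    # Sleep onset latency: time until the first Sleep epoch
    onset = next((i for i, s in enumerate(epochs) if s), None)
    latency = None if onset is None else onset * EPOCH_S / 60
    # WASO: wake time between sleep onset and the final Sleep epoch
    if onset is None:
        waso = None
    else:
        last = max(i for i, s in enumerate(epochs) if s)
        waso = sum(not s for s in epochs[onset:last + 1]) * EPOCH_S / 60
    return {
        "tst_min": sleep_min,                 # total sleep time
        "latency_min": latency,
        "efficiency": sleep_min / total_min,  # fraction of recording asleep
        "waso_min": waso,
    }
```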

Analysis of a recording uses either a regressor, which produces a point estimate of the apnoea-hypopnoea index (AHI) with accompanying uncertainty bounds, or a classifier, which predicts whether the AHI is ≥5. Configurable cluster analysis enables characterisation of endotypes, assignment of endotypes to recordings, and identification of genuine anomalies or outliers.
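One plausible way these outputs might be consumed is sketched below: mapping an AHI estimate with uncertainty bounds to a screening decision, and nearest-centroid endotype assignment with a distance-based outlier flag. The threshold logic, centroid representation, and `max_dist` parameter are illustrative assumptions, not the service's method:

```python
import math

def screen_ahi(point, lower, upper, threshold=5.0):
    """Turn a point estimate and its bounds into a screening call."""
    if lower >= threshold:
        return "positive"      # whole interval at or above the threshold
    if upper < threshold:
        return "negative"      # whole interval below the threshold
    return "indeterminate"     # interval straddles the threshold

def assign_endotype(features, centroids, max_dist):
    """Nearest-centroid endotype assignment with an outlier flag.

    Returns (endotype index, distance, is_anomaly)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(range(len(centroids)), key=lambda i: dist(features, centroids[i]))
    d = dist(features, centroids[best])
    return best, d, d > max_dist
```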

A peer-reviewed article containing out-of-sample performance data is coming soon.
Online Retraining

This service offers a streamlined pipeline for training and validating models using a bank of pre-computed features and datasets. Newly trained models are available for use immediately after saving, enabling rapid prototyping and rollout of updated models. All trained models are automatically calibrated and provide estimates of uncertainty alongside their predictions.
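The calibration and uncertainty method is not specified here; split-conformal prediction is one standard, model-agnostic way to attach calibrated intervals to any point regressor, sketched below as an assumption:

```python
import math

def conformal_interval(residuals, point, alpha=0.1):
    """Split-conformal prediction interval around a point estimate.

    `residuals` are absolute errors |y - yhat| on a held-out calibration
    set; the interval covers the true value with probability >= 1 - alpha
    under exchangeability."""
    n = len(residuals)
    # Rank of the conformal quantile among n calibration residuals
    k = math.ceil((n + 1) * (1 - alpha))
    q = sorted(residuals)[min(k, n) - 1]
    return point - q, point + q
```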

Model training uses a set of custom solvers. Gaussian Processes (GPs) use an exact solver. Gradient Boosting Machines (GBMs) are gradient-boosted decision trees that follow an XGBoost-like algorithm and are trained with a histogram-based solver. Linear models are trained using the closed-form solution for linear regression and Iteratively Reweighted Least Squares (IRLS) for logistic regression. Support Vector Machines (SVMs) are trained using Sequential Minimal Optimization (SMO) with heuristics that leverage second-order information to accelerate convergence. The performance of these solvers is comparable to that of other widely used machine learning libraries.
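As an illustration of the IRLS solver named above, here is a minimal version for logistic regression with one feature plus an intercept (the service's solver is general-purpose; this sketch solves the 2×2 weighted normal equations directly):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def irls_logistic(xs, ys, iters=25):
    """Logistic regression via Iteratively Reweighted Least Squares.

    Each step solves the weighted normal equations X'WX b = X'W z,
    where W = diag(p(1-p)) and z is the working response."""
    b0 = b1 = 0.0
    for _ in range(iters):
        # Accumulate the 2x2 system A @ (b0, b1) = c
        a00 = a01 = a11 = c0 = c1 = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(b0 + b1 * x)
            w = max(p * (1 - p), 1e-12)
            z = b0 + b1 * x + (y - p) / w   # working response
            a00 += w; a01 += w * x; a11 += w * x * x
            c0 += w * z; c1 += w * x * z
        det = a00 * a11 - a01 * a01
        b0, b1 = (a11 * c0 - a01 * c1) / det, (a00 * c1 - a01 * c0) / det
    return b0, b1
```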