Deep learning of Parkinson’s movement from video, without human-defined measures.

Publication date: Jun 10, 2024

The core clinical sign of Parkinson’s disease (PD) is bradykinesia, for which a standard test is finger tapping: the clinician observes a person repetitively tapping finger and thumb together. This requires an expert eye, a scarce resource, and even experts show variability and inaccuracy. Existing applications of technology to finger tapping reduce the tapping signal to one-dimensional measures, with researcher-defined features derived from those measures.

The aims were (1) to apply a deep learning neural network directly to video of finger tapping, without human-defined measures or features, and to determine classification accuracy for idiopathic PD versus controls; and (2) to visualise the features learned by the model.

152 smartphone videos of 10 s finger tapping were collected from 40 people with PD and 37 controls. We down-sampled the pixel dimensions and split the videos into 1 s clips. A 3D convolutional neural network was trained on these clips.

For discriminating PD from controls, our model showed training accuracy 0.91 and test accuracy 0.69, with test precision 0.73, test recall 0.76 and test AUROC 0.76. We also report class activation maps for the five most predictive features. These show the spatial and temporal sections of video on which the network focuses attention to make a prediction, including an apparent dropping-thumb movement distinctive of the PD group.

A deep learning neural network can be applied directly to standard video of finger tapping to distinguish PD from controls, without a requirement to extract a one-dimensional signal from the video or to pre-define tapping features.
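The preprocessing step described above (spatially down-sampling each video and splitting it into 1 s clips) can be sketched as follows. The frame rate, output resolution and nearest-neighbour striding here are illustrative assumptions for a minimal example, not the study's actual parameters:

```python
import numpy as np

def preprocess_video(frames, fps=30, out_hw=(64, 64)):
    """Down-sample a video (T, H, W, C) spatially and split it into
    non-overlapping 1 s clips of `fps` frames each.

    fps, out_hw and the nearest-neighbour striding are illustrative
    assumptions, not values taken from the paper."""
    t, h, w, c = frames.shape
    # spatial down-sampling by integer striding (nearest-neighbour)
    sh, sw = max(1, h // out_hw[0]), max(1, w // out_hw[1])
    small = frames[:, ::sh, ::sw, :][:, :out_hw[0], :out_hw[1], :]
    # keep only whole 1 s clips, then reshape to (n_clips, fps, H', W', C)
    n_clips = t // fps
    return small[: n_clips * fps].reshape(n_clips, fps, *small.shape[1:])

# a dummy 10 s, 30 fps, 256x256 RGB video yields ten 1 s clips
video = np.zeros((300, 256, 256, 3), dtype=np.uint8)
clips = preprocess_video(video)
print(clips.shape)  # (10, 30, 64, 64, 3)
```

Each resulting clip is a small spatio-temporal block of the kind a 3D convolutional network can consume directly, with no hand-crafted tapping features extracted along the way.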

Concepts: Clinician, Expert, Parkinson, Smartphone, Tap, Video

Keywords: Artificial intelligence, Bradykinesia, Computer vision, Deep learning, Parkinson’s disease

Semantics

Type Source Name
disease MESH Parkinson’s disease

Original Article

