Visual Pain Assessment

Computer Vision, Healthcare

Automated Pain Assessment

Business problem: An innovative company specializing in pain assessment and health tracking for elderly people affected by dementia and Alzheimer's disease wanted to automate its pain analysis system in mobile apps.

Facial expression recognition is a sub-field of Computer Vision that enables automated analysis of human facial expressions, extracting and classifying key facial attributes such as emotions and intentions. Patients' facial expressions can provide a variety of information about the underlying disease, enabling effective diagnosis and treatment. This is especially valuable for small children, elderly patients suffering from neurodegenerative diseases, and patients in emergency departments, who are often unable to express their feelings verbally.

To preserve patient privacy, pain assessment had to run locally on mobile devices, without sending patient images or videos to the cloud. Using state-of-the-art Computer Vision and Deep Learning technologies, we automated the visual detection of the micro-facial expressions, known as Action Units (AUs), that form the building blocks of a pain expression. Tracking and pain-evolution follow-up were supported by a scalable back-end solution.

Solution

Identified a machine-learning SDK capable of running offline inference on iOS and Android mobile devices. Built and adapted convolutional models for optimal performance on mobile devices. Made efficient use of mobile device cameras to handle different image aspect ratios and changing lighting conditions. Achieved consistent performance goals across different mobile device families, from low-end to high-end devices.
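As an illustration of the kind of compact convolutional model that suits on-device inference, the sketch below defines a small multi-label Keras classifier for Action Unit detection. The architecture, input resolution, and number of AUs are assumptions made for illustration, not the production model shipped with the NVISO SDK.

```python
# Illustrative sketch only: a compact multi-label CNN for Action Unit (AU)
# detection, sized for mobile inference. Input resolution, depth, and the
# number of AUs are assumptions, not the production model.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_ACTION_UNITS = 12       # assumed number of pain-related AUs
INPUT_SHAPE = (96, 96, 1)   # assumed grayscale face-crop resolution

def build_au_classifier() -> tf.keras.Model:
    inputs = layers.Input(shape=INPUT_SHAPE)
    x = inputs
    # Depthwise-separable blocks keep parameter count and latency low,
    # which matters on low-end mobile devices.
    for filters in (32, 64, 128):
        x = layers.SeparableConv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.BatchNormalization()(x)
        x = layers.MaxPooling2D()(x)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    # Sigmoid outputs: several AUs can be active in the same frame.
    outputs = layers.Dense(NUM_ACTION_UNITS, activation="sigmoid")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["binary_accuracy"])
    return model

if __name__ == "__main__":
    build_au_classifier().summary()
```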

Built the mobile application, integrated the chosen machine-learning SDK, and configured the inference pipeline consisting of several stages: face detection, face-landmark detection, tiling, micro-facial expression classification, and pain scoring. Developed the application for both platforms (iOS and Android) for automated pain assessment. Added integration test procedures for production-level integrity tests of the built-in machine-learning models.
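The staged pipeline can be sketched as a chain of processing steps, each consuming the output of the previous one. The class and function names below are hypothetical placeholders rather than the NVISO SDK API, and the scoring rule is an assumed simple aggregation of AU intensities.

```python
# Illustrative sketch of the staged on-device inference pipeline:
# face detection -> face-landmark detection -> tiling -> AU classification -> pain scoring.
# All interfaces are hypothetical placeholders, not the NVISO SDK API.
from dataclasses import dataclass
from typing import Callable, Dict, List, Sequence

import numpy as np

@dataclass
class Face:
    x: int
    y: int
    w: int
    h: int

@dataclass
class Assessment:
    action_units: Dict[str, float]  # AU name -> predicted intensity in [0, 1]
    pain_score: float               # aggregated score on an assumed 0-10 scale

class PainPipeline:
    def __init__(
        self,
        detect_faces: Callable[[np.ndarray], List[Face]],
        detect_landmarks: Callable[[np.ndarray, Face], np.ndarray],
        classify_aus: Callable[[Sequence[np.ndarray]], Dict[str, float]],
    ):
        self.detect_faces = detect_faces
        self.detect_landmarks = detect_landmarks
        self.classify_aus = classify_aus

    def run(self, frame: np.ndarray) -> List[Assessment]:
        results = []
        for face in self.detect_faces(frame):                  # stage 1: face detection
            landmarks = self.detect_landmarks(frame, face)     # stage 2: landmarks
            tiles = self._tile(frame, landmarks)               # stage 3: tiling
            aus = self.classify_aus(tiles)                     # stage 4: AU classification
            results.append(Assessment(aus, self._score(aus)))  # stage 5: pain scoring
        return results

    @staticmethod
    def _tile(frame: np.ndarray, landmarks: np.ndarray, patch: int = 32):
        # Crop small patches around each landmark (eyes, brows, mouth corners)
        # so the AU classifier sees localized facial regions.
        half = patch // 2
        return [
            frame[max(ly - half, 0): ly + half, max(lx - half, 0): lx + half]
            for lx, ly in landmarks.astype(int)
        ]

    @staticmethod
    def _score(aus: Dict[str, float]) -> float:
        # Assumed aggregation: sum of AU intensities, clipped to a 0-10 scale.
        return float(min(10.0, sum(aus.values())))
```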

Adapted the application to cover different target population segments (infants, adults), whose pain representations involve different micro-facial expressions. Retrained the models, measured their performance, and integrated them into the target application. Fine-tuned model performance and optimized their execution on mobile devices.
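A hypothetical sketch of this retraining step is shown below: it fine-tunes a pretrained AU classifier on a segment-specific dataset and evaluates it before integration. The function name, frozen-layer count, and hyperparameters are illustrative assumptions; the actual training took place with Keras in the Bonseyes environment.

```python
# Illustrative sketch: fine-tuning a pretrained AU classifier for a new
# population segment (e.g., infants vs. adults). Paths, layer counts, and
# hyperparameters are hypothetical.
import tensorflow as tf

def fine_tune_for_segment(
    base_model_path: str,
    train_ds: tf.data.Dataset,   # yields (face_crop, multi-hot AU labels) batches
    val_ds: tf.data.Dataset,
    epochs: int = 10,
) -> tf.keras.Model:
    model = tf.keras.models.load_model(base_model_path)

    # Freeze the early feature-extraction layers and adapt only the top of the
    # network to the segment-specific expression statistics.
    for layer in model.layers[:-4]:
        layer.trainable = False

    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss="binary_crossentropy",
        metrics=["binary_accuracy"],
    )
    model.fit(train_ds, validation_data=val_ds, epochs=epochs)

    # Measure segment-specific performance before shipping the model.
    print(model.evaluate(val_ds, return_dict=True))
    return model
```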

Technology Stack

  • NVISO machine-learning SDK for pain assessment and a configurable pain-detection pipeline
  • Android (Java/Kotlin) and iOS (Swift) mobile development platforms and tools
  • Python prototyping and machine-learning frameworks (Keras)
  • Bonseyes training environment
  • Python Django and PostgreSQL for back-end development
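As a rough sketch of the back-end side, the Django model below shows how per-assessment pain scores could be stored in PostgreSQL for pain-evolution follow-up. The model names and fields are assumptions, not the production schema.

```python
# Illustrative sketch of a back-end model for pain-evolution follow-up
# (Django + PostgreSQL). Model and field names are assumptions, not the
# production schema.
from django.db import models

class Patient(models.Model):
    external_id = models.CharField(max_length=64, unique=True)  # pseudonymous ID only
    created_at = models.DateTimeField(auto_now_add=True)

class PainAssessment(models.Model):
    patient = models.ForeignKey(Patient, on_delete=models.CASCADE,
                                related_name="assessments")
    recorded_at = models.DateTimeField()
    pain_score = models.FloatField()                # aggregated on-device score
    action_units = models.JSONField(default=dict)   # AU name -> intensity, computed on device

    class Meta:
        ordering = ["-recorded_at"]
```

Only derived scores and AU intensities (never raw images or video) would be uploaded in such a scheme, consistent with the on-device privacy constraint described above.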

Project Value Outcome

Embedding a pain-presence detection and classification algorithm in a mobile device allows care and medical professionals to deliver the best possible support to patients remotely, in a secure and private manner.

  • Software using pain-assessment AI has been approved and is in production as a medical device.
  • The iOS and Android applications have been approved, deployed, and are in use in Australia, the UK, and North America.
  • Successfully used in care centers for elderly people across three continents.