Citation

BibTeX format

@article{Tan:2022:10.1038/s41598-022-08115-1,
author = {Tan, Y and Rerolle, S and Lalitharathne, TD and van Zalk, N and Jack, R and Nanayakkara, T},
doi = {10.1038/s41598-022-08115-1},
journal = {Scientific Reports},
title = {Simulating dynamic facial expressions of pain from visuo-haptic interactions with a robotic patient},
url = {http://dx.doi.org/10.1038/s41598-022-08115-1},
volume = {12},
year = {2022}
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB - Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. An important source of information for physicians is the visual feedback of involuntary pain facial expressions in response to physical palpation on an affected area of a patient. However, most existing robotic medical training simulators that can capture physical examination behaviours in real-time cannot display facial expressions and comprise a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of medical training simulators because they do not provide medical students with a representative sample of pain facial expressions and face identities, which could result in biased practices. Further, these limitations restrict the utility of such medical simulators to detect and correct early signs of bias in medical training. Here, for the first time, we present a robotic system that can simulate facial expressions of pain in response to palpations, displayed on a range of patient face identities. We use the unique approach of modelling dynamic pain facial expressions using a data-driven perception-based psychophysical method combined with the visuo-haptic inputs of users performing palpations on a robot medical simulator. Specifically, participants performed palpation actions on the abdomen phantom of a simulated patient, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo randomly generated transient parameters: rate of change β and activation delay τ. Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale from “strongly disagree” to “strongly agree”. Each participant (n=16, 4 Asian females, 4 Asian males, 4 White females and 4 White males) performed 200 palpation
AU - Tan,Y
AU - Rerolle,S
AU - Lalitharathne,TD
AU - van Zalk,N
AU - Jack,R
AU - Nanayakkara,T
DO - 10.1038/s41598-022-08115-1
PY - 2022///
SN - 2045-2322
TI - Simulating dynamic facial expressions of pain from visuo-haptic interactions with a robotic patient
T2 - Scientific Reports
UR - http://dx.doi.org/10.1038/s41598-022-08115-1
UR - http://hdl.handle.net/10044/1/95855
VL - 12
ER -

Contact the Lab Director

Dr Nejra van Zalk

Dyson School of Design Engineering
Imperial College London
25 Exhibition Road, South Kensington
London, SW7 2AZ