Simulating Dynamic Facial Expressions of Pain From Visuo-Haptic Interactions With a Robotic Patient
Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. Visual feedback of involuntary pain expressions in response to physical palpation on an affected area of a patient is an important source of information for physicians. However, most existing robotic medical training simulators that can capture physical examination behaviours in real time cannot display facial expressions, or offer only a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of medical training simulators because they do not provide medical students with a representative diversity of both pain facial expressions and face identities, which could result in biased practice. Further, these limitations prevent such simulators from being used to detect and correct early signs of bias in medical training. Here, for the first time, we present a robotic system that can simulate facial expressions of pain in response to palpations, displayed on a range of patient face identities. Our approach is unique in modelling dynamic pain facial expressions with the data-driven psychophysical method of reverse correlation, and in incorporating the visuo-haptic interactions of users performing palpation on a robotic medical simulator. Specifically, participants performed palpation actions on the abdomen phantom of simulated patients, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo-randomly generated transient parameters: rate of change β and activation delay τ. Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale. Each participant (n = 16; 4 Asian female, 4 Asian male, 4 White female and 4 White male) performed 200 palpation trials on 4 patient identities (Black female, Black male, White female and White male) simulated using MorphFace. Results showed that a gradual decrease of β and increase of τ from upper-face AUs (around the eyes) to lower-face AUs (around the mouth) was rated as appropriate by all participants.
We also found that the transient parameter values rated by participants as generating appropriate pain facial expressions, as well as the palpation forces and the delays between palpation actions, varied across the gender and ethnicity of participant-simulated patient pairs. These findings suggest that gender and ethnicity biases affect both the participants' palpation strategies and their perception of the pain facial expressions displayed on MorphFace. We anticipate that our approach could be utilised to generate physical examination models with diverse patient demographic groups, thereby reducing erroneous judgements in medical students and providing focused training to address these errors.
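The transient-parameter model described above, in which each pain-related AU is governed by a rate of change β and an activation delay τ, can be illustrated with a minimal sketch. The abstract does not specify the functional form of an AU's activation trajectory, so the logistic ramp, the function name `au_activation`, and the example β/τ values below are all illustrative assumptions, chosen only to reflect the reported finding that upper-face AUs activate faster and earlier than lower-face AUs.

```python
import numpy as np

def au_activation(t, beta, tau):
    """Activation (0..1) of a single pain-related Action Unit over time.

    Assumed logistic ramp: effectively inactive before the activation
    delay tau, then rising at rate-of-change beta toward full activation.
    The exact form used by MorphFace is not given in the abstract; this
    is an illustrative sketch, not the authors' implementation.
    """
    t = np.asarray(t, dtype=float)
    return 1.0 / (1.0 + np.exp(-beta * (t - tau)))

# Hypothetical parameters reflecting the reported gradient: upper-face
# AUs (around the eyes) get high beta / low tau, lower-face AUs (around
# the mouth) get low beta / high tau.
t = np.linspace(0.0, 2.0, 201)            # seconds after palpation onset
upper = au_activation(t, beta=8.0, tau=0.2)
lower = au_activation(t, beta=3.0, tau=0.8)
```

With these assumed parameters the upper-face trajectory both starts rising sooner and saturates more steeply than the lower-face one, mimicking the decrease of β and increase of τ from eyes to mouth that participants rated as appropriate.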