Evolutionary Deep Learning Using Hybrid EEG-fNIRS-ECG Signals for Cognitive Workload Classification in Laparoscopic Surgeries
Tags: Cognitive Workload, Deep learning, Evolutionary Computing, Laparoscopic Surgery, Multilayer Extreme Learning Machine and Multilayer Neural Networks
Abstract:
Deep learning classifiers have demonstrated robust accuracy when processing combined signals such as electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) [1, 2]. In this work, an evolutionary deep learning strategy is applied to classify the different cognitive workload states that surgeons experience during laparoscopic surgery. The proposed strategy is used to train an Evolutionary Multilayer Perceptron Neural Network (E-MLPNN) on multimodal raw EEG, fNIRS and electrocardiogram (ECG) signals collected and concatenated across a series of ten experiments using the back-end platform Multi-sensing AI Environment for Surgical Task & Role Optimisation (MAESTRO), as shown in Figure 1(a). In each experiment, surgical trainees performed a simulated laparoscopic cholecystectomy (LCH), i.e. removal of the gallbladder from a porcine model using a minimally invasive surgical technique, as demonstrated in Figure 1(b). Within each experiment, the level of Cognitive Workload (CWL) is assumed to increase as the mental activity demanded by the surgical operation increases. As presented in Figure 1(c), a set of tasks performed during the LCH was defined to measure the level of CWL.
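To make the training scheme concrete, the minimal sketch below illustrates one plausible form of the approach: windowed EEG, fNIRS and ECG features are concatenated into a single feature vector, and a small multilayer perceptron is optimised with a simple (mu + lambda) evolution strategy instead of gradient descent. The feature dimensions, the synthetic data and labels, and the mutation-and-selection loop are illustrative assumptions only; they are not the MAESTRO recording pipeline or the exact E-MLPNN training procedure proposed in this work.

# Illustrative sketch only: a toy evolution strategy training a small MLP
# classifier on synthetic "concatenated multimodal" feature vectors.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for windowed EEG, fNIRS and ECG features; the channel
# counts and the random labels are placeholder assumptions, not study data.
n_windows, n_classes = 600, 3                  # e.g. low / medium / high CWL
eeg = rng.normal(size=(n_windows, 32))         # 32 EEG-derived features (assumed)
fnirs = rng.normal(size=(n_windows, 16))       # 16 fNIRS-derived features (assumed)
ecg = rng.normal(size=(n_windows, 4))          # 4 ECG-derived features (assumed)
X = np.concatenate([eeg, fnirs, ecg], axis=1)  # multimodal concatenation
y = rng.integers(0, n_classes, size=n_windows) # placeholder CWL labels

dims = (X.shape[1], 16, n_classes)             # input, hidden, output sizes

def init_params(n_in, n_hidden, n_out):
    # Flat parameter vector for a one-hidden-layer MLP.
    w1 = rng.normal(scale=0.1, size=n_in * n_hidden)
    b1 = np.zeros(n_hidden)
    w2 = rng.normal(scale=0.1, size=n_hidden * n_out)
    b2 = np.zeros(n_out)
    return np.concatenate([w1, b1, w2, b2])

def forward(theta, X, n_in, n_hidden, n_out):
    # Unpack the flat vector and run the forward pass; returns class logits.
    i = 0
    w1 = theta[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = theta[i:i + n_hidden]; i += n_hidden
    w2 = theta[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = theta[i:i + n_out]
    hidden = np.tanh(X @ w1 + b1)
    return hidden @ w2 + b2

def fitness(theta):
    # Fitness = training-set classification accuracy of the decoded network.
    logits = forward(theta, X, *dims)
    return float(np.mean(np.argmax(logits, axis=1) == y))

# Simple (mu + lambda) evolution strategy over the flattened MLP weights:
# mutate sampled parents with Gaussian noise, keep the fittest individuals.
mu, lam, sigma, generations = 10, 40, 0.05, 50
population = [init_params(*dims) for _ in range(mu)]
for _ in range(generations):
    parents = [population[i] for i in rng.integers(0, mu, size=lam)]
    offspring = [p + sigma * rng.normal(size=p.shape) for p in parents]
    population = sorted(population + offspring, key=fitness, reverse=True)[:mu]

print(f"Best training accuracy: {fitness(population[0]):.3f}")

In practice, the synthetic arrays above would be replaced by windowed features and CWL labels recorded from the LCH tasks, and the mutation and selection operators would follow the proposed evolutionary strategy rather than this toy Gaussian-mutation loop.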