Head of Group

Dr George Mylonas

About us

We use perceptual methods, AI, and frugal robotics innovation to deliver transformative diagnostic and treatment solutions.

What we do

The HARMS lab leverages perceptually enabled methodologies, artificial intelligence, and frugal innovation in robotics (such as soft surgical robots) to deliver transformative solutions for diagnosis and treatment. Our research is driven by both problem-solving and curiosity, aiming to build a comprehensive understanding of the actions, interactions, and reactions occurring in the operating room. We focus on using robotic technologies to facilitate procedures that are not yet widely adopted, particularly in endoluminal surgery, such as advanced treatments for gastrointestinal cancer.

Why is it important?

......

How can it benefit patients?

....

Meet the team

Dr Adrian Rubio Solis
Research Associate in Sensing and Machine Learning

Citation

BibTeX format

@article{Ezzat:2021:10.1007/s00464-021-08569-w,
author = {Ezzat, A and Kogkas, A and Holt, J and Thakkar, R and Darzi, A and Mylonas, G},
doi = {10.1007/s00464-021-08569-w},
journal = {SURGICAL ENDOSCOPY AND OTHER INTERVENTIONAL TECHNIQUES},
pages = {5381--5391},
title = {An eye-tracking based robotic scrub nurse: proof of concept},
url = {http://dx.doi.org/10.1007/s00464-021-08569-w},
volume = {35},
year = {2021}
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB - Background: Within surgery, assistive robotic devices (ARD) have reported improved patient outcomes. ARD can offer the surgical team a “third hand” to perform wider tasks and more degrees of motion in comparison with conventional laparoscopy. We test an eye-tracking based robotic scrub nurse (RSN) in a simulated operating room based on a novel real-time framework for theatre-wide 3D gaze localization in a mobile fashion. Methods: Surgeons performed segmental resection of pig colon and handsewn end-to-end anastomosis while wearing eye-tracking glasses (ETG) assisted by distributed RGB-D motion sensors. To select instruments, surgeons (ST) fixed their gaze on a screen, initiating the RSN to pick up and transfer the item. Comparison was made between the task with the assistance of a human scrub nurse (HSNt) versus the task with the assistance of robotic and human scrub nurse (R&HSNt). Task load (NASA-TLX), technology acceptance (Van der Laan’s), metric data on performance and team communication were measured. Results: Overall, 10 ST participated. NASA-TLX feedback for ST on HSNt vs R&HSNt usage revealed no significant difference in mental, physical or temporal demands and no change in task performance. ST reported significantly higher frustration score with R&HSNt. Van der Laan’s scores showed positive usefulness and satisfaction scores in using the RSN. No significant difference in operating time was observed. Conclusions: We report initial findings of our eye-tracking based RSN. This enables mobile, unrestricted hands-free human–robot interaction intra-operatively. Importantly, this platform is deemed non-inferior to HSNt and accepted by ST and HSN test users.
AU - Ezzat,A
AU - Kogkas,A
AU - Holt,J
AU - Thakkar,R
AU - Darzi,A
AU - Mylonas,G
DO - 10.1007/s00464-021-08569-w
EP - 5391
PY - 2021///
SN - 0930-2794
SP - 5381
TI - An eye-tracking based robotic scrub nurse: proof of concept
T2 - SURGICAL ENDOSCOPY AND OTHER INTERVENTIONAL TECHNIQUES
UR - http://dx.doi.org/10.1007/s00464-021-08569-w
UR - http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000658984900002&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=1ba7043ffcc86c417c072aa74d649202
UR - https://link.springer.com/article/10.1007%2Fs00464-021-08569-w
UR - http://hdl.handle.net/10044/1/91861
VL - 35
ER -
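
The abstract above outlines the selection protocol: the surgeon fixes their gaze on an on-screen instrument, and a sustained fixation triggers the robotic scrub nurse to pick up and transfer that item. The Python sketch below is a minimal, hypothetical illustration of that dwell-based triggering logic, not the published system's implementation; the dwell threshold, the screen regions, and the pick_and_transfer callback are assumptions made only for illustration.

    # Minimal sketch of dwell-time gaze selection for instrument hand-over.
    # Hypothetical illustration; thresholds, regions and the robot callback
    # are assumptions, not the published system's implementation.
    import time
    from dataclasses import dataclass

    @dataclass
    class Region:
        name: str
        x0: float
        y0: float
        x1: float
        y1: float

        def contains(self, x: float, y: float) -> bool:
            return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

    class DwellSelector:
        """Fires a callback when gaze stays inside one region long enough."""

        def __init__(self, regions, on_select, dwell_s=1.0):
            self.regions = regions
            self.on_select = on_select      # e.g. robot pick-and-transfer routine
            self.dwell_s = dwell_s          # assumed dwell threshold in seconds
            self._current = None
            self._entered_at = None

        def update(self, gaze_x, gaze_y, now=None):
            """Feed one gaze sample (normalized screen coordinates) per frame."""
            now = time.monotonic() if now is None else now
            hit = next((r for r in self.regions if r.contains(gaze_x, gaze_y)), None)
            if hit is not self._current:    # gaze moved to a new region (or left all)
                self._current, self._entered_at = hit, now
                return
            if hit is not None and now - self._entered_at >= self.dwell_s:
                self.on_select(hit.name)    # trigger hand-over of the selected item
                self._current, self._entered_at = None, None

    # Example usage with a stand-in for the robot arm:
    def pick_and_transfer(instrument):
        print(f"RSN: picking up and transferring '{instrument}'")

    selector = DwellSelector(
        regions=[Region("scissors", 0.0, 0.0, 0.5, 1.0),
                 Region("needle holder", 0.5, 0.0, 1.0, 1.0)],
        on_select=pick_and_transfer,
    )
    for t in range(45):                     # simulated 30 Hz gaze stream on one item
        selector.update(gaze_x=0.25, gaze_y=0.5, now=t / 30.0)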

Contact Us

General enquiries
hamlyn@imperial.ac.uk

Facility enquiries
hamlyn.facility@imperial.ac.uk


The Hamlyn Centre
Bessemer Building
South Kensington Campus
Imperial College
London, SW7 2AZ