
Head of Group

Dr George Mylonas

About us

We use perceptual methods, AI, and frugal robotics innovation to deliver transformative diagnostic and treatment solutions.


What we do

The HARMS lab leverages perceptually enabled methodologies, artificial intelligence, and frugal innovation in robotics (such as soft surgical robots) to deliver transformative solutions for diagnosis and treatment. Our research is driven by both problem-solving and curiosity, aiming to build a comprehensive understanding of the actions, interactions, and reactions occurring in the operating room. We focus on using robotic technologies to facilitate procedures that are not yet widely adopted, particularly in endoluminal surgery, such as advanced treatments for gastrointestinal cancer.

Why is it important?

......

How can it benefit patients?

....

Meet the team


Publications

  • Conference paper
    Kogkas A, Darzi A, Mylonas GP, 2016, Gaze-Driven Human-Robot Interaction in the Operating Theatre, 6th Joint Workshop on New Technologies for Computer/Robot Assisted Surgery (CRAS 2016)
  • Conference paper
    Khan DZ, Oude Vrielink TJC, Marcus H, Darzi A, Mylonas G et al., 2016, NeuroCYCLOPS: development and preclinical validation of a robotic platform for endoscopic neurosurgery, European Association of Neurosurgical Societies (EANS 2016), Publisher: European Association of Neurosurgical Societies
  • Conference paper
    Oude Vrielink TJC, Khan DZ, Marcus H, Darzi A, Mylonas G et al., 2016, NeuroCYCLOPS: a novel system for endoscopic neurosurgery, The Hamlyn Symposium on Medical Robotics, London, Publisher: Imperial College London, Pages: 36-37
  • Conference paper
    Kogkas AA, Sodergren M, Darzi A, Mylonas G et al., 2016, Macro- and micro-scale 3D gaze tracking in the operating theatre, The Hamlyn Symposium on Medical Robotics 2016, Publisher: Imperial College London, Pages: 100-101
  • Journal article
    Leff DR, James D, Orihuela-Espina F, Kwok KW, Sun L, Mylonas G, Athanasiou T, Darzi A, Yang GZ et al., 2015, The impact of expert visual guidance on trainee visual search strategy, visual attention and motor skills, Frontiers in Human Neuroscience, Vol: 9, ISSN: 1662-5161

    Minimally invasive and robotic surgery changes the capacity for surgical mentors to guide their trainees with the control customary to open surgery. This neuroergonomic study aims to assess a "Collaborative Gaze Channel" (CGC), which detects trainer gaze behaviour and displays the point of regard to the trainee. A randomised crossover study was conducted in which twenty subjects performed a simulated robotic surgical task necessitating collaboration with either verbal guidance (control condition) or visual guidance with the CGC (study condition). Trainee occipito-parietal (O-P) cortical function was assessed with optical topography (OT) and gaze behaviour was evaluated using video-oculography. Performance during gaze assistance was significantly superior [biopsy number (mean ± SD): control = 5.6 ± 1.8 vs. CGC = 6.6 ± 2.0; p < 0.05] and was associated with significantly lower O-P cortical activity [ΔHbO2 mMol × cm, median (IQR): control = 2.5 (12.0) vs. CGC = 0.63 (11.2); p < 0.001]. A random-effect model confirmed the association between guidance mode and O-P excitation. Network cost and global efficiency were not significantly influenced by guidance mode. A gaze channel thus enhances performance, modulates visual search, and alleviates the burden on brain centres subserving visual attention, without inducing changes in the trainee's O-P functional network observable with the current OT technique. The results imply that, through visual guidance, attentional resources may be liberated, potentially improving the capability of trainees to attend to other safety-critical events during the procedure.

  • Patent
    Avila Rencoret FB, Elson DS, Mylonas G, 2015, Probe Deployment Device

  • Conference paper
    Avila-Rencoret FB, Elson DS, Mylonas G, 2015, Towards a robotic-assisted cartography of the colon: a proof of concept, Publisher: IEEE Computer Society, Pages: 1757-1763
  • Journal article
    Paggetti G, Leff DR, Orihuela-Espina F, Mylonas G, Darzi A, Yang G-Z, Menegaz G et al., 2014, The role of the posterior parietal cortex in stereopsis and hand-eye coordination during motor task behaviours, Cognitive Processing, Vol: 16, Pages: 177-190, ISSN: 1612-4790
  • Conference paper
    Mylonas GP, Vitiello V, Cundy TP, Darzi A, Yang G-Z et al., 2014, CYCLOPS: A versatile robotic tool for bimanual single-access and natural-orifice endoscopic surgery, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE, Pages: 2436-2442

    This paper introduces the CYCLOPS, a novel robotic tool for single-access and natural-orifice endoscopic surgery. Based on the concept of tendon-driven parallel robots, this highly original design gives the system some of its unique capabilities, including force exertion of up to 65 N, a large and adjustable workspace, and bimanual instrument triangulation. Owing to the simplicity and nature of the design, the system could be adapted to an existing laparoscope or flexible endoscope. This promises a more immediate and accelerated route to clinical translation, not only through its low-cost and adaptive features, but also by directly addressing several major barriers faced by existing designs.

  • Conference paper
    Zhang L, Lee S-L, Yang G-Z, Mylonas GP et al., 2014, Semi-autonomous navigation for robot assisted tele-echography using generalized shape models and co-registered RGB-D cameras, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), Publisher: IEEE, Pages: 3496-3502

    This paper proposes a semi-autonomous navigated master-slave system for robot-assisted remote echography for early trauma assessment. Two RGB-D sensors are used to capture real-time 3D information of the scene at the slave side, where the patient is located. A 3D statistical shape model is built and used to generate a customized patient model based on the point cloud generated by the RGB-D sensors. The customized patient model can be updated and adaptively fitted to the patient. The model is also used to generate a trajectory to navigate a KUKA robotic arm and safely conduct the ultrasound examination. Extensive validation of the proposed system shows promising results in terms of accuracy and robustness.

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.


Contact Us

General enquiries
hamlyn@imperial.ac.uk

Facility enquiries
hamlyn.facility@imperial.ac.uk


The Hamlyn Centre
Bessemer Building
South Kensington Campus
Imperial College
London, SW7 2AZ