We use perceptual methods, AI, and frugal robotics innovation to deliver transformative diagnostic and treatment solutions.

Head of Group

Dr George Mylonas

B415B Bessemer Building
South Kensington Campus

+44 (0)20 3312 5145

YouTube: HARMS Lab

What we do

The HARMS lab leverages perceptually enabled methodologies, artificial intelligence, and frugal innovation in robotics (such as soft surgical robots) to deliver transformative solutions for diagnosis and treatment. Our research is driven by both problem-solving and curiosity, aiming to build a comprehensive understanding of the actions, interactions, and reactions occurring in the operating room. We focus on using robotic technologies to facilitate procedures that are not yet widely adopted, particularly in endoluminal surgery, such as advanced treatments for gastrointestinal cancer.

Meet the team

Publications

  • Book chapter
    Mylonas G, Yang GZ, 2008, Eye Tracking and Depth from Vergence, Next Generation Artificial Vision Systems: Reverse Engineering the Human Visual System, Editors: Bharath, Petrou, Publisher: Artech House, Pages: 187-211
  • Conference paper
    Stoyanov D, Mylonas GP, Yang G-Z, 2008, Gaze-Contingent 3D Control for Focused Energy Ablation in Robotic Assisted Surgery, 11th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2008), Publisher: Springer-Verlag Berlin, Pages: 347-355, ISSN: 0302-9743
  • Conference paper
    Leong JJH, Atallah L, Mylonas GP, Leff DR, Emery RJ, Darzi AW, Yang G-Z et al., 2008, Investigation of partial directed coherence for hand-eye coordination in laparoscopic training, 4th International Workshop on Medical Imaging and Augmented Reality, Publisher: Springer-Verlag Berlin, Pages: 270-+, ISSN: 0302-9743
  • Conference paper
    Noonan D, Mylonas G, Darzi A, Yang GZ et al., 2008, Gaze Contingent Articulated Robot Control for Robot Assisted Minimally Invasive Surgery, International Conference on Intelligent Robots and Systems, Publisher: IEEE/RSJ, Pages: 1186-1191

    This paper introduces a novel technique for controlling an articulated robotic device through the eyes of the surgeon during minimally invasive surgery. The system consists of a binocular eye-tracking unit and a robotic instrument featuring a long, rigid shaft with an articulated distal tip for minimally invasive interventions. Both have been integrated into a daVinci surgical robot to provide seamless, non-invasive localization of the surgeon's eye fixations. Using a gaze-contingent framework, the surgeon's fixations in 3D are converted into commands that direct the robotic probe to the desired location. Experimental results illustrate the ability of the system to perform real-time gaze-contingent robot control and open up a new avenue for improving current human-robot interfaces.

  • Conference paper
    Mylonas GP, Stoyanov D, Darzi A, Yang GZ et al., 2007, Assessment of perceptual quality for gaze-contingent motion stabilization in robotic assisted minimally invasive surgery, Pages: 660-667, ISSN: 0302-9743

    With the increasing sophistication of surgical robots, the use of motion stabilisation for enhancing the performance of micro-surgical tasks is an actively pursued research topic. The use of mechanical stabilisation devices has certain advantages, in terms of both simplicity and consistency. The technique, however, can complicate the existing surgical workflow and interfere with an already crowded MIS operated cavity. With the advent of reliable vision-based, real-time, in situ and in vivo techniques for 3D deformation recovery, current effort is being directed towards the use of optical techniques for achieving adaptive motion stabilisation. The purpose of this paper is to assess the effect of virtual stabilisation on foveal/parafoveal vision during robotic assisted MIS. Detailed psychovisual experiments have been performed. Results show that stabilisation of the whole visual field is not necessary: it is sufficient to perform accurate motion tracking and deformation compensation within a relatively small area directly under foveal vision. The results have also confirmed that, under the current motion stabilisation regime, deformation of the periphery does not affect visual acuity, and there is no indication that the deformation velocity of the periphery affects foveal sensitivity. These findings are expected to have a direct implication on the future design of visual stabilisation methods for robotic assisted MIS.

  • Conference paper
    Leong JJ, Nicolaou M, Atallah L, Mylonas GP, Darzi AW, Yang GZ et al., 2007, HMM assessment of quality of movement trajectory in laparoscopic surgery, Medical Image Computing and Computer-Assisted Intervention - MICCAI 2006, Pages: 335-346
  • Conference paper
    Lerotic M, Chung A, Mylonas G, Yang G et al., 2007, pq-space Based Non-Photorealistic Rendering for Augmented Reality, Medical Image Computing and Computer-Assisted Intervention - MICCAI 2007, Publisher: Springer Berlin Heidelberg, Pages: 102-109
  • Journal article
    Mylonas GP, Darzi A, Yang GZ, 2006, Gaze-contingent control for minimally invasive robotic surgery, Comput Aided Surg, Vol: 11, Pages: 256-266, ISSN: 1092-9088

    OBJECTIVE: Recovering tissue depth and deformation during robotically assisted minimally invasive procedures is an important step towards motion compensation, stabilization and co-registration with preoperative data. This work demonstrates that eye gaze derived from binocular eye tracking can be effectively used to recover 3D motion and deformation of the soft tissue. METHODS: A binocular eye-tracking device was integrated into the stereoscopic surgical console. After calibration, the 3D fixation point of the participating subjects could be accurately resolved in real time. A CT-scanned phantom heart model was used to demonstrate the accuracy of gaze-contingent depth extraction and motion stabilization of the soft tissue. The dynamic response of the oculomotor system was assessed with the proposed framework by using autoregressive modeling techniques. In vivo data were also used to perform gaze-contingent decoupling of cardiac and respiratory motion. RESULTS: Depth reconstruction, deformation tracking, and motion stabilization of the soft tissue were possible with binocular eye tracking. The dynamic response of the oculomotor system was able to cope with frequencies likely to occur under most routine minimally invasive surgical operations. CONCLUSION: The proposed framework presents a novel approach towards the tight integration of a human and a surgical robot where interaction in response to sensing is required to be under the control of the operating surgeon.

  • Conference paper
    Leong J, Nicolaou M, Atallah L, Mylonas G, Darzi A, Yang G et al., 2006, HMM Assessment of Quality of Movement Trajectory in Laparoscopic Surgery, Medical Image Computing and Computer-Assisted Intervention - MICCAI 2006
  • Journal article
    Leong JJ, Nicolaou M, Atallah L, Mylonas GP, Darzi AW, Yang GZ et al., 2006, HMM assessment of quality of movement trajectory in laparoscopic surgery, Med Image Comput Comput Assist Interv, Vol: 9, Pages: 752-759
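Several of the publications above, such as "Eye Tracking and Depth from Vergence" and the gaze-contingent control papers, rest on recovering a 3D fixation point from binocular gaze. A minimal sketch of the underlying triangulation is shown below: given each eye's position and gaze direction, the fixation is estimated as the midpoint of the shortest segment between the two gaze rays. The geometry and function name here are illustrative assumptions, not code from the lab.

```python
import numpy as np

def fixation_from_gaze(p_left, d_left, p_right, d_right):
    """Estimate a 3D fixation point from two gaze rays.

    Each ray is an eye position p and a gaze direction d.
    The two rays rarely intersect exactly, so we return the midpoint
    of the shortest segment between them (closest-approach midpoint).
    """
    d1 = d_left / np.linalg.norm(d_left)    # unit gaze direction, left eye
    d2 = d_right / np.linalg.norm(d_right)  # unit gaze direction, right eye
    r = p_right - p_left                    # baseline between the eyes
    a = np.dot(d1, d2)
    denom = 1.0 - a * a
    if denom < 1e-9:                        # near-parallel gaze: no vergence
        return None
    # Ray parameters of the closest-approach points on each gaze ray
    t1 = (np.dot(r, d1) - a * np.dot(r, d2)) / denom
    t2 = (a * np.dot(r, d1) - np.dot(r, d2)) / denom
    c1 = p_left + t1 * d1                   # closest point on left ray
    c2 = p_right + t2 * d2                  # closest point on right ray
    return (c1 + c2) / 2.0
```

For a subject fixating a nearby point, the two rays converge there and the midpoint recovers the target; parallel gaze (zero vergence) has no finite fixation, which the sketch reports as None.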

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.


Contact Us

General enquiries

Facility enquiries


The Hamlyn Centre
Bessemer Building
South Kensington Campus
Imperial College
London, SW7 2AZ
Map location