We use perceptual methods, AI, and frugal robotics innovation to deliver transformative diagnostic and treatment solutions.
Head of Group
B415B Bessemer Building
South Kensington Campus
+44 (0)20 3312 5145
What we do
The HARMS lab leverages perceptually enabled methods, artificial intelligence, and frugal innovation in robotics (such as soft surgical robots) to deliver transformative solutions for diagnosis and treatment. Our research is driven by both problem-solving and curiosity, and aims to build a comprehensive understanding of the actions, interactions, and reactions that occur in the operating room. We focus on robotic technologies that enable procedures not yet widely adopted, particularly in endoluminal surgery, such as advanced treatments for gastrointestinal cancer.
Meet the team
Publications
- Conference paper: James DRC, Orihuela-Espina F, Leff DR, et al., 2010, "Cognitive Burden Estimation for Visuomotor Learning with fNIRS", 13th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), Springer-Verlag Berlin, pp. 319-326, ISSN: 0302-9743. Citations: 17.
- Journal article: Noonan D, Elson D, Mylonas G, et al., 2009, "Laser Induced Fluorescence and Reflected White Light Imaging for Robot-Assisted MIS", IEEE Transactions on Biomedical Engineering, vol. 56, pp. 889-892.
- Conference paper: Visentini-Scarzanella M, Mylonas GP, Stoyanov D, et al., 2009, "i-BRUSH: A Gaze-Contingent Virtual Paintbrush for Dense 3D Reconstruction in Robotic Assisted Surgery", 12th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2009), Springer-Verlag Berlin, pp. 353+, ISSN: 0302-9743. Citations: 16.
- Conference paper: Kwok KW, Sun LW, Vitiello V, et al., 2009, "Perceptually docked control environment for multiple microbots: application to the gastric wall biopsy", IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2783-2788.
- Conference paper: Kwok K-W, Mylonas GP, Sun LW, et al., 2009, "Dynamic Active Constraints for Hyper-Redundant Flexible Robots", 12th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2009), Springer-Verlag Berlin, pp. 410+, ISSN: 0302-9743. Citations: 22.
- Journal article: Stoyanov D, Mylonas GP, Lerotic M, et al., 2008, "Intra-Operative Visualizations: Perceptual Fidelity and Human Factors", Journal of Display Technology, vol. 4, pp. 491-501, ISSN: 1551-319X. Citations: 19.
- Conference paper: Lo B, Chung AJ, Stoyanov D, et al., 2008, "Real-time intra-operative 3D tissue deformation recovery", pp. 1387-1390.
- Book chapter: Mylonas GP, Yang G-Z, 2008, "Eye Tracking and Depth from Vergence", in Next Generation Artificial Vision Systems: Reverse Engineering the Human Visual System, eds. Bharath and Petrou, Artech House, pp. 191-215, ISBN: 978-1-59693-224-1.
- Conference paper: Yang G-Z, Mylonas GP, Kwok K-W, et al., 2008, "Perceptual docking for robotic control", 4th International Workshop on Medical Imaging and Augmented Reality, Springer-Verlag Berlin, pp. 21-30, ISSN: 0302-9743. Citations: 17.
- Conference paper: Mylonas GP, Kwok K-W, Darzi A, et al., 2008, "Gaze-Contingent Motor Channelling and Haptic Constraints for Minimally Invasive Robotic Surgery", 11th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2008), Springer-Verlag Berlin, pp. 676-683, ISSN: 0302-9743. Citations: 17.
This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
Contact Us
The Hamlyn Centre
Bessemer Building
South Kensington Campus
Imperial College
London, SW7 2AZ