Citation

BibTeX format

@inproceedings{Johns:2020:10.1109/CVPR.2019.00197,
author = {Liu, S and Johns, E and Davison, A},
booktitle = {2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
doi = {10.1109/CVPR.2019.00197},
publisher = {IEEE},
title = {End-to-end multi-task learning with attention},
url = {http://dx.doi.org/10.1109/CVPR.2019.00197},
year = {2020}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB  - We propose a novel multi-task learning architecture, which allows learning of task-specific feature-level attention. Our design, the Multi-Task Attention Network (MTAN), consists of a single shared network containing a global feature pool, together with a soft-attention module for each task. These modules allow for learning of task-specific features from the global features, whilst simultaneously allowing for features to be shared across different tasks. The architecture can be trained end-to-end and can be built upon any feed-forward neural network, is simple to implement, and is parameter efficient. We evaluate our approach on a variety of datasets, across both image-to-image predictions and image classification tasks. We show that our architecture is state-of-the-art in multi-task learning compared to existing methods, and is also less sensitive to various weighting schemes in the multi-task loss function. Code is available at https://github.com/lorenmt/mtan.
AU  - Liu,S
AU  - Johns,E
AU  - Davison,A
DO  - 10.1109/CVPR.2019.00197
PB  - IEEE
PY  - 2020///
TI  - End-to-end multi-task learning with attention
UR  - http://dx.doi.org/10.1109/CVPR.2019.00197
UR  - http://hdl.handle.net/10044/1/77515
ER  -
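
The abstract above outlines the MTAN design: a single shared network holding a global feature pool, plus a soft-attention module per task that learns task-specific features from the shared ones. For intuition only, the Python (PyTorch) sketch below shows what one such per-task attention block might look like, with an element-wise sigmoid mask applied to shared features; the class name, layer choices, and sizes are assumptions for illustration, not the authors' implementation (which is available at https://github.com/lorenmt/mtan).

# Illustrative sketch (not the authors' code): one MTAN-style per-task attention
# block that soft-masks shared backbone features. Names and sizes are hypothetical.
import torch
import torch.nn as nn

class TaskAttentionBlock(nn.Module):
    """Selects task-specific features from shared (global) features via a soft mask."""

    def __init__(self, channels: int):
        super().__init__()
        # Small conv sub-network predicting an element-wise attention mask in [0, 1].
        self.attention = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
            nn.Sigmoid(),
        )

    def forward(self, shared_features: torch.Tensor) -> torch.Tensor:
        mask = self.attention(shared_features)
        return mask * shared_features  # task-specific features, same shape as input

# Usage: one attention module per task, all reading the same shared feature pool.
shared = torch.randn(2, 64, 32, 32)                                   # from the shared backbone
task_modules = nn.ModuleList(TaskAttentionBlock(64) for _ in range(3))
task_features = [m(shared) for m in task_modules]

Because each task owns its own attention block while the backbone is shared, features can specialise per task without duplicating the whole network, which is the parameter-efficiency point the abstract makes.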

Contact us

Artificial Intelligence Network
South Kensington Campus
Imperial College London
SW7 2AZ

To reach the elected speaker of the network, Dr Rossella Arcucci, please contact:

ai-speaker@imperial.ac.uk

To reach the network manager, Diana O'Malley (including enquiries about joining the network), please contact:

ai-net-manager@imperial.ac.uk