BibTeX format
@article{Creswell:2016,
author = {Creswell, A and Bharath, AA},
title = {Task Specific Adversarial Cost Function},
url = {http://arxiv.org/abs/1609.08661v1},
year = {2016}
}
Several of our current PhD candidates and fellow researchers at the Data Science Institute have published, or are in the process of publishing, papers presenting their research.
RIS format
TY - JOUR
AB - The cost function used to train a generative model should fit the purpose of the model. If the model is intended for tasks such as generating perceptually correct samples, it is beneficial to maximise the likelihood of a sample drawn from the model, Q, coming from the same distribution as the training data, P. This is equivalent to minimising the Kullback-Leibler (KL) distance, KL[Q||P]. However, if the model is intended for tasks such as retrieval or classification it is beneficial to maximise the likelihood that a sample drawn from the training data is captured by the model, equivalent to minimising KL[P||Q]. The cost function used in adversarial training optimises the Jensen-Shannon entropy which can be seen as an even interpolation between KL[Q||P] and KL[P||Q]. Here, we propose an alternative adversarial cost function which allows easy tuning of the model for either task. Our task specific cost function is evaluated on a dataset of hand-written characters in the following tasks: Generation, retrieval and one-shot learning.
AU - Creswell,A
AU - Bharath,AA
PY - 2016///
TI - Task Specific Adversarial Cost Function
UR - http://arxiv.org/abs/1609.08661v1
ER -
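The abstract above contrasts the two directions of the KL divergence, KL[Q||P] and KL[P||Q], and notes that the Jensen-Shannon divergence weighs both directions evenly. As a minimal illustration of that asymmetry (the function names and the example distributions here are our own, not taken from the paper):

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence KL[p || q] for discrete distributions.
    # Asymmetric: kl(p, q) != kl(q, p) in general.
    return float(np.sum(p * np.log(p / q)))

def jsd(p, q):
    # Jensen-Shannon divergence: the average of both KL directions,
    # each measured against the mixture m = (p + q) / 2.
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Two arbitrary discrete distributions over three outcomes
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.4, 0.2])

print("KL[P||Q] =", kl(p, q))
print("KL[Q||P] =", kl(q, p))
print("JSD      =", jsd(p, q))
```

Running this shows the two KL values differ while the JSD is the same regardless of argument order, which is the sense in which the adversarial (Jensen-Shannon) objective sits "evenly" between the two KL-based objectives discussed in the abstract.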