Tom Hosking


Education

University of Edinburgh Sep 2019 – Present

PhD Candidate, CDT in NLP

Supervised by Mirella Lapata

University College London Sep 2017 – Sep 2018

MSc, Machine Learning (Distinction)

Distinction in all taught modules and thesis

Commended for highest project mark

University of Oxford, Worcester College 2006 – 2010

2.1 MPhys (Master of Physics)

Winchester College 2001 – 2006

Awarded a scholarship

A Levels: Physics (A), Maths (A), Further Maths (A), Chemistry (A), French (A), Latin (A)


Experience

Contractor, Associo Ltd Mar 2019 – Sep 2019

  • Document management at scale for the legal profession
  • Designed and built the whole NLP stack

Intern, Bloomsbury AI Mar 2018 – Sep 2018

Wedding Photographer, self-employed Aug 2017 – Present

Trader, DRW Aug 2015 – May 2017

  • Primary trader for USD rates products
  • Improved trading models and converted core pricing spreadsheets to a Python app

Quant Trader, Financial Market Engineering Feb 2011 – Jul 2015

  • Proprietary trader of STIR, bond and ICE Swapnote futures
  • Built simulation platform for researching quantitative ideas
  • Combined trading, quant and programming knowledge to facilitate communication and coordinate strategy rollout

Developer, Orbis Investment Advisory Sep 2010 – Feb 2011

  • Supported and improved valuation and pricing systems built in VB.NET

Intern, HSBC (Rates Options Trading) Jul – Sep 2009

Content Author, Wunderman Jul – Sep 2008

Web Developer, Pod1 Creative Agency Jul – Sep 2007 and Jul – Aug 2006

  • Maintenance developer for PHP/MySQL web projects

Work Experience, Diamond Light Source Synchrotron Aug 2005

  • Built an app to automate survey measurements using a theodolite


Publications

Factorising Meaning and Form for Intent-Preserving Paraphrasing • Tom Hosking and Mirella Lapata
ACL 2021

@inproceedings{hosking-lapata-2021-factorising,
    title = "Factorising Meaning and Form for Intent-Preserving Paraphrasing",
    author = "Hosking, Tom  and
        Lapata, Mirella",
    booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "",
    pages = "1405--1418",
    abstract = "We propose a method for generating paraphrases of English questions that retain the original intent but use a different surface form. Our model combines a careful choice of training objective with a principled information bottleneck, to induce a latent encoding space that disentangles meaning and form. We train an encoder-decoder model to reconstruct a question from a paraphrase with the same meaning and an exemplar with the same surface form, leading to separated encoding spaces. We use a Vector-Quantized Variational Autoencoder to represent the surface form as a set of discrete latent variables, allowing us to use a classifier to select a different surface form at test time. Crucially, our method does not require access to an external source of target exemplars. Extensive experiments and a human evaluation show that we are able to generate paraphrases with a better tradeoff between semantic preservation and syntactic novelty compared to previous methods.",
}

Querent Intent in Multi-Sentence Questions • Laurie Burchell*, Jie Chi*, Tom Hosking*, Nina Markl* and Bonnie Webber
Linguistic Annotation Workshop, COLING 2020 (* = equal contribution)

@inproceedings{burchell-etal-2020-querent,
    title = "Querent Intent in Multi-Sentence Questions",
    author = "Burchell, Laurie  and
        Chi, Jie  and
        Hosking, Tom  and
        Markl, Nina  and
        Webber, Bonnie",
    booktitle = "Proceedings of the 14th Linguistic Annotation Workshop",
    month = dec,
    year = "2020",
    address = "Barcelona, Spain",
    publisher = "Association for Computational Linguistics",
    url = "",
    pages = "138--147",
    abstract = "Multi-sentence questions (MSQs) are sequences of questions connected by relations which, unlike sequences of standalone questions, need to be answered as a unit. Following Rhetorical Structure Theory (RST), we recognise that different {``}question discourse relations{''} between the subparts of MSQs reflect different speaker intents, and consequently elicit different answering strategies. Correctly identifying these relations is therefore a crucial step in automatically answering MSQs. We identify five different types of MSQs in English, and define five novel relations to describe them. We extract over 162,000 MSQs from Stack Exchange to enable future research. Finally, we implement a high-precision baseline classifier based on surface features.",
}

Evaluating Rewards for Question Generation Models • Tom Hosking and Sebastian Riedel
NAACL-HLT 2019

@inproceedings{hosking-riedel-2019-evaluating,
    title = "Evaluating Rewards for Question Generation Models",
    author = "Hosking, Tom  and
        Riedel, Sebastian",
    booktitle = "Proceedings of the 2019 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)",
    month = jun,
    year = "2019",
    address = "Minneapolis, Minnesota",
    publisher = "Association for Computational Linguistics",
    url = "",
    pages = "2278--2283",
}

Skills & Languages

  • Proficient in Python, PyTorch, SQL, HTML/CSS/JS
  • Knowledge of React/Redux, Gremlin, TensorFlow, Docker, MongoDB, C++
  • English (native)
  • French (advanced)


Other

  • Volunteer project supervisor with the Nuffield Future Researchers programme
  • Mentor with Target Oxbridge
  • Coxed the winning Oxford Women’s Boat Race crew in 2010 and the Oxford Men’s Lightweight Boat Race crew in 2009
  • Volunteered as a teaching assistant and coach in Nemato Township, South Africa
  • Volunteered as a developer with Full Fact
  • Keen photographer