Publications

2022

Learning to Model Editing Processes
Machel Reid and Graham Neubig
Preprint 2022 [paper]


PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
Machel Reid and Mikel Artetxe
The 2022 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2022). July 2022. Association for Computational Linguistics [paper]
To be presented at the 7th Workshop on Representation Learning for NLP (RepL4NLP 2022), ACL 2022


A Few Thousand Translations Can Go a Long Way: Leveraging Pre-trained Models for African News Translation
David Adelani, Jesujoba Alabi, Angela Fan, Julia Kreutzer, Xiaoyu Shen, Machel Reid, …, Valencia Wagner, Idris Abdulmumin, Ayodele Awokoya
The 2022 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2022). July 2022. Association for Computational Linguistics [paper]


Can Wikipedia Help Offline Reinforcement Learning?
Machel Reid, Yutaro Yamada and Shixiang Shane Gu
Preprint 2022 [paper]

2021


AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
Machel Reid, Junjie Hu, Graham Neubig and Yutaka Matsuo
The 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021). November 2021. Association for Computational Linguistics [paper] [code]
Oral Presentation; Best Paper Nomination


LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer
Machel Reid and Victor Zhong
Findings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Findings of ACL-IJCNLP 2021). August 2021. Association for Computational Linguistics [paper] [code]


Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers
Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo
Findings of The 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021). November 2021. Association for Computational Linguistics [paper] [code]
To be presented at the 2nd Workshop on Simple and Efficient Natural Language Processing (SustaiNLP 2021), EMNLP 2021


Variational Inference for Learning Representations of Natural Language Edits
Edison Marrese-Taylor, Machel Reid and Yutaka Matsuo
Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21) [paper] [code]
Presented at the 5th Workshop on Representation Learning for NLP (RepL4NLP 2020), ACL 2020

2020


VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling
Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo
The 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020). November 2020. Association for Computational Linguistics [paper] [code]

Workshop


On the Impact of Data Augmentation on Downstream Performance in Natural Language Processing
Itsuki Okimura, Machel Reid, Makoto Kawano and Yutaka Matsuo
Proceedings of the Third Workshop on Insights from Negative Results in NLP, Online and Dublin, Ireland. Association for Computational Linguistics [paper]


Low-Resource Machine Translation Using Cross-Lingual Language Model Pretraining
Francis Zheng, Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo
Shared Task, 1st Workshop on NLP for Indigenous Languages of the Americas, NAACL 2021 [paper]


Combining Pretrained High Resource Embeddings and Subword Representations for Low-Resource Languages
Machel Reid, Edison Marrese-Taylor and Yutaka Matsuo
AfricaNLP Workshop, ICLR 2020 [paper] [poster] [blog]