Hi There! Name's Machel
I'm a research scientist at Google DeepMind working on NLP, with a focus on multilingual NLP. I will also be a PhD student at the University of Washington, co-advised by Luke Zettlemoyer and Noah A. Smith.
I was previously a visiting student at Carnegie Mellon University, advised by Graham Neubig, and a researcher at the University of Tokyo's Matsuo Lab with Yutaka Matsuo (where I still collaborate).
I am currently working on multilingual NLP (Reid and Artetxe, 2022), low-resource NLP (Reid et al., 2021), and edit models (Marrese-Taylor et al., 2021; Reid and Zhong, 2021; Reid and Neubig, 2022), among other topics. Feel free to reach out via email or Twitter!
In our paper "Combining Pretrained High Resource Embeddings And Subword Representations For Low Resource Languages" (accepted to the ICLR 2020 AfricaNLP Work...