Hi there! Name's Machel
I’m a staff research scientist at Google DeepMind in London, working on post-training and scaling RL on the Gemini team. I co-led the effort that became the Gemini 1.5 Pro 1M-context release and drove the Gemini 2.5 Flash Thinking release. I have been a core contributor to the Gemini 1.0, 1.5, 2.0, and 2.5 releases, and to the Gemma 1 & 2 open-source LLM releases.
I was previously a visiting researcher at Carnegie Mellon University, advised by Graham Neubig, and a researcher at Matsuo Lab at the University of Tokyo with Yutaka Matsuo.
You can reach me at machelreid -at- google -dot- com.
Recent Posts
In our paper “Combining Pretrained High Resource Embeddings And Subword Representations For Low Resource Languages” (Accepted to the ICLR 2020 AfricaNLP Work...