Machel Reid
Research Scientist, Google DeepMind
Verified email at google.com
Title · Cited by · Year
Large Language Models are Zero-Shot Reasoners
T Kojima, SS Gu, M Reid, Y Matsuo, Y Iwasawa
NeurIPS 2022, 2022
Cited by 1542 · 2022
Gemini: a family of highly capable multimodal models
Gemini Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
arXiv preprint arXiv:2312.11805, 2023
Cited by 194 · 2023
Can Wikipedia help offline reinforcement learning?
M Reid, Y Yamada, SS Gu
arXiv preprint arXiv:2201.12122, 2022
Cited by 81 · 2022
LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer
M Reid, V Zhong
Findings of the Annual Meeting of the Association for Computational Linguistics, 2021
Cited by 55 · 2021
A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
DI Adelani, JO Alabi, A Fan, J Kreutzer, X Shen, M Reid, D Ruiter, ...
NAACL 2022, 2022
Cited by 32* · 2022
DiffusER: Diffusion via Edit-based Reconstruction
M Reid, VJ Hellendoorn, G Neubig
The Eleventh International Conference on Learning Representations, 2022
Cited by 31* · 2022
Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers
M Reid, E Marrese-Taylor, Y Matsuo
Findings of Empirical Methods in Natural Language Processing (EMNLP), 2021
Cited by 27 · 2021
Learning to Model Editing Processes
M Reid, G Neubig
Findings of Empirical Methods in Natural Language Processing (EMNLP), 2022
Cited by 21 · 2022
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
M Reid, J Hu, G Neubig, Y Matsuo
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Cited by 21 · 2021
VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling
M Reid, E Marrese-Taylor, Y Matsuo
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
Cited by 19 · 2020
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
M Reid, M Artetxe
Conference of the North American Chapter of the Association for Computational Linguistics, 2021
Cited by 15 · 2021
Low-Resource Machine Translation Using Cross-Lingual Language Model Pretraining
F Zheng, M Reid, E Marrese-Taylor, Y Matsuo
AmericasNLP Workshop, NAACL 2021, 2021
Cited by 11 · 2021
M2D2: A Massively Multi-domain Language Modeling Dataset
M Reid, V Zhong, S Gururangan, L Zettlemoyer
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
Cited by 10 · 2022
On the impact of data augmentation on downstream performance in natural language processing
I Okimura, M Reid, M Kawano, Y Matsuo
Proceedings of the Third Workshop on Insights from Negative Results in NLP …, 2022
Cited by 10 · 2022
mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations
J Pfeiffer, F Piccinno, M Nicosia, X Wang, M Reid, S Ruder
arXiv preprint arXiv:2305.14224, 2023
Cited by 8 · 2023
Variational Inference for Learning Representations of Natural Language Edits
E Marrese-Taylor, M Reid, Y Matsuo
AAAI 2021, 2020
Cited by 8 · 2020
BUFFET: Benchmarking Large Language Models for Few-shot Cross-lingual Transfer
A Asai, S Kudugunta, XV Yu, T Blevins, H Gonen, M Reid, Y Tsvetkov, ...
arXiv preprint arXiv:2305.14857, 2023
Cited by 2 · 2023
On the Role of Parallel Data in Cross-lingual Transfer Learning
M Reid, M Artetxe
arXiv preprint arXiv:2212.10173, 2022
Cited by 2 · 2022
Combining Pretrained High-Resource Embeddings and Subword Representations for Low-Resource Languages
M Reid, E Marrese-Taylor, Y Matsuo
AfricaNLP Workshop, ICLR 2020, 2020
Cited by 2 · 2020
Edit Aware Representation Learning via Levenshtein Prediction
E Marrese-Taylor, M Reid, A Solano
The Fourth Workshop on Insights from Negative Results in NLP, 53-58, 2023
2023
Articles 1–20