Ivan Montero
Title · Cited by · Year
Plug and play autoencoders for conditional text generation
F Mai, N Pappas, I Montero, NA Smith, J Henderson
arXiv preprint arXiv:2010.02983, 2020
Cited by 30 · 2020
Sentence Bottleneck Autoencoders from Transformer Language Models
I Montero, N Pappas, NA Smith
arXiv preprint arXiv:2109.00055, 2021
Cited by 21 · 2021
How much does attention actually attend? Questioning the importance of attention in pretrained transformers
M Hassid, H Peng, D Rotem, J Kasai, I Montero, NA Smith, R Schwartz
arXiv preprint arXiv:2211.03495, 2022
Cited by 18 · 2022
Pivot through English: Reliably answering multilingual questions without document retrieval
I Montero, S Longpre, N Lao, AJ Frank, C DuBois
arXiv preprint arXiv:2012.14094, 2020
Cited by 2 · 2020