Llion Jones
Verified email at google.com
Title · Cited by · Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems, 5998-6008, 2017
Cited by 12230 · 2017
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
Cited by 275 · 2018
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
Cited by 216 · 2018
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
Cited by 197 · 2019
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
Cited by 195 · 2017
Character-level language modeling with deeper self-attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI Conference on Artificial Intelligence 33, 3159-3166, 2019
Cited by 110 · 2019
WikiReading: A novel large-scale language understanding task over Wikipedia
D Hewlett, A Lacoste, L Jones, I Polosukhin, A Fandrianto, J Han, ...
arXiv preprint arXiv:1608.03542, 2016
Cited by 100 · 2016
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in neural information processing systems, 5998-6008, 2017
Cited by 59 · 2017
Lingvo: a modular and scalable framework for sequence-to-sequence modeling
J Shen, P Nguyen, Y Wu, Z Chen, MX Chen, Y Jia, A Kannan, T Sainath, ...
arXiv preprint arXiv:1902.08295, 2019
Cited by 48 · 2019
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in neural information processing systems, 5998-6008, 2017
Cited by 42 · 2017
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
Advances in neural information processing systems, 5998-6008, 2017
Cited by 17 · 2017
Accurate supervised and semi-supervised machine reading for long documents
D Hewlett, L Jones, A Lacoste, I Gur
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
Cited by 15 · 2017
Byte-Level Machine Reading Across Morphologically Varied Languages.
T Kenter, L Jones, D Hewlett
AAAI, 5820-5827, 2018
Cited by 8 · 2018
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, Ł Kaiser, I Polosukhin
arXiv preprint arXiv:1706.03762, 2017
Cited by 4 · 2017
ProtTrans: Towards Cracking the Language of Life's Code Through Self-Supervised Deep Learning and High Performance Computing
A Elnaggar, M Heinzinger, C Dallago, G Rihawi, Y Wang, L Jones, ...
arXiv preprint arXiv:2007.06225, 2020
Cited by 1 · 2020
Multi-task multi-modal machine learning system
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent App. 16/689,025, 2020
Cited by 1 · 2020
Machine translation using neural network models
Z Chen, MR Hughes, Y Wu, M Schuster, X Chen, LO Jones, NJ Parmar, ...
US Patent App. 16/521,780, 2020
Cited by 1 · 2020
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent 10,452,978, 2019
Cited by 1 · 2019
WikiReading: A novel large-scale language understanding task over Wikipedia
A Lacoste, A Fandrianto, D Hewlett, D Berthelot, I Polosukhin, J Han, ...
Cited by 1 · 2016
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent 10,719,764, 2020
Cited by 0 · 2020