Lei Yu
Google DeepMind
Verified email at google.com - Homepage
Title · Cited by · Year
Neural Variational Inference for Text Processing
Y Miao, L Yu, P Blunsom
ICML 2016, 2016
785 · 2016
Deep learning for answer sentence selection
L Yu, KM Hermann, P Blunsom, S Pulman
NIPS Deep Learning and Representation Learning Workshop, 2014
530 · 2014
Systems Analysis of Auxin Transport in the Arabidopsis Root Apex
LR Band, DM Wells, JA Fozard, T Ghetiu, AP French, MP Pound, ...
The Plant Cell 26 (3), 862-875, 2014
229 · 2014
Learning and evaluating general linguistic intelligence
D Yogatama, CM d'Autume, J Connor, T Kocisky, M Chrzanowski, L Kong, ...
arXiv preprint arXiv:1901.11373, 2019
202* · 2019
A Mutual Information Maximization Perspective of Language Representation Learning
L Kong, CM d'Autume, W Ling, L Yu, Z Dai, D Yogatama
ICLR 2020, 2019
163* · 2019
Unsupervised Recurrent Neural Network Grammars
Y Kim, AM Rush, L Yu, A Kuncoro, C Dyer, G Melis
NAACL 2019, 2019
154 · 2019
Online Segment to Segment Neural Transduction
L Yu, J Buys, P Blunsom
EMNLP 2016, 2016
103 · 2016
Variational Smoothing in Recurrent Neural Network Language Models
L Kong, G Melis, W Ling, L Yu, D Yogatama
ICLR 2019, 2018
91* · 2018
IsarStep: a Benchmark for High-level Mathematical Reasoning
W Li, L Yu, Y Wu, LC Paulson
ICLR 2021, 2021
80* · 2021
The Neural Noisy Channel
L Yu, P Blunsom, C Dyer, E Grefenstette, T Kocisky
ICLR 2017, 2017
78 · 2017
Better Document-Level Machine Translation with Bayes' Rule
L Yu, L Sartran, W Stokowiec, W Ling, L Kong, P Blunsom, C Dyer
TACL 2020, 2020
51* · 2020
A reparameterized discrete diffusion model for text generation
L Zheng, J Yuan, L Yu, L Kong
arXiv preprint arXiv:2302.05737, 2023
40 · 2023
Pretraining the Noisy Channel Model for Task-Oriented Dialogue
Q Liu, L Yu, L Rimell, P Blunsom
TACL 2021, 2021
30 · 2021
Capturing document context inside sentence-level neural machine translation models with self-training
E Mansimov, G Melis, L Yu
arXiv preprint arXiv:2003.05259, 2020
17 · 2020
The DeepMind Chinese–English Document Translation System at WMT2020
L Yu, L Sartran, PS Huang, W Stokowiec, D Donato, S Srinivasan, ...
WMT 2020, 2020
17 · 2020
Diverse Pretrained Context Encodings Improve Document Translation
D Donato, L Yu, C Dyer
ACL 2021, 2021
10 · 2021
A formal model of IEEE floating point arithmetic
L Yu
Archive of Formal Proofs, 91-104, 2013
10 · 2013
MAD for robust reinforcement learning in machine translation
D Donato, L Yu, W Ling, C Dyer
arXiv preprint arXiv:2207.08583, 2022
7 · 2022
A Natural Bias for Language Generation Models
C Meister, W Stokowiec, T Pimentel, L Yu, L Rimell, A Kuncoro
ACL 2023, 2022
6 · 2022
Sequence transduction neural networks
L Yu, CJ Dyer, T Kocisky, P Blunsom
US Patent 10,572,603, 2020
5 · 2020
Articles 1–20