Yongchang Hao
Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation
W Wang, W Jiao, Y Hao, X Wang, S Shi, Z Tu, M Lyu
Annual Meeting of the Association for Computational Linguistics (ACL) 1 …, 2022
Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation
Y Hao, S He, W Jiao, Z Tu, M Lyu, X Wang
North American Chapter of the Association for Computational Linguistics (NAACL), 2021
Teacher Forcing Recovers Reward Functions for Text Generation
Y Hao, Y Liu, L Mou
Advances in Neural Information Processing Systems (NeurIPS), 2022
An equal-size hard EM algorithm for diverse dialogue generation
Y Wen, Y Hao, Y Cao, L Mou
International Conference on Learning Representations (ICLR), 2023
Flora: Low-Rank Adapters Are Secretly Gradient Compressors
Y Hao, Y Cao, L Mou
International Conference on Machine Learning (ICML), 2024
LLMR: Knowledge Distillation with a Large Language Model-Induced Reward
D Li, Y Hao, L Mou
Proceedings of the 2024 Joint International Conference on Computational …, 2024
Ginger: An Efficient Curvature Approximation with Linear Complexity for General Neural Networks
Y Hao, Y Cao, L Mou
arXiv preprint arXiv:2402.03295, 2024
Discovering Reward Functions for Language Models
Y Hao
University of Alberta, 2023