Junyi Li
Ph.D. student, Universite de Montreal, Renmin University of China
Verified email at umontreal.ca - Homepage
Title
Cited by
Year
A survey of large language models
WX Zhao, K Zhou, J Li, T Tang, X Wang, Y Hou, Y Min, B Zhang, J Zhang, ...
arXiv preprint arXiv:2303.18223, 2023
1637* · 2023
Pretrained language models for text generation: A survey
J Li, T Tang, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2201.05273, 2022
242* · 2022
A survey of vision-language pre-trained models
Y Du, Z Liu, J Li, WX Zhao
arXiv preprint arXiv:2202.10936, 2022
143 · 2022
Halueval: A large-scale hallucination evaluation benchmark for large language models
J Li, X Cheng, WX Zhao, JY Nie, JR Wen
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
119* · 2023
WenLan: Bridging vision and language by large-scale multi-modal pre-training
Y Huo, M Zhang, G Liu, H Lu, Y Gao, G Yang, J Wen, H Zhang, B Xu, ...
arXiv preprint arXiv:2103.06561, 2021
116 · 2021
Few-shot knowledge graph-to-text generation with pretrained language models
J Li, T Tang, WX Zhao, Z Wei, NJ Yuan, JR Wen
Findings of The 59th Annual Meeting of the Association for Computational …, 2021
46 · 2021
Mining implicit entity preference from user-item interaction data for knowledge graph completion via adversarial learning
G He, J Li, WX Zhao, P Liu, JR Wen
Proceedings of the Web Conference 2020, 740-751, 2020
41 · 2020
Generating long and informative reviews with aspect-aware coarse-to-fine decoding
J Li, WX Zhao, JR Wen, Y Song
The 57th Annual Meeting of the Association for Computational Linguistics (ACL), 2019
34 · 2019
Knowledge-enhanced personalized review generation with capsule graph neural network
J Li, S Li, WX Zhao, G He, Z Wei, NJ Yuan, JR Wen
Proceedings of the 29th ACM International Conference on Information …, 2020
32 · 2020
Textbox 2.0: A text generation library with pre-trained language models
T Tang, J Li, Z Chen, Y Hu, Z Yu, W Dai, Z Dong, X Cheng, Y Wang, ...
arXiv preprint arXiv:2212.13005, 2022
29* · 2022
Learning to Transfer Prompts for Text Generation
J Li, T Tang, JY Nie, JR Wen, WX Zhao
NAACL 2022, 2022
29 · 2022
A survey on long text modeling with transformers
Z Dong, T Tang, L Li, WX Zhao
arXiv preprint arXiv:2302.14502, 2023
25 · 2023
Mvp: Multi-task supervised pre-training for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2206.12131, 2022
23 · 2022
Knowledge-based review generation by coherence enhanced text planning
J Li, WX Zhao, Z Wei, NJ Yuan, JR Wen
The 44th International ACM SIGIR Conference on Research and Development in …, 2021
21 · 2021
Context-tuning: Learning contextualized prompts for natural language generation
T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2201.08670, 2022
19 · 2022
ELMER: A non-autoregressive pre-trained language model for efficient and effective text generation
J Li, T Tang, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2210.13304, 2022
13 · 2022
The dawn after the dark: An empirical study on factuality hallucination in large language models
J Li, J Chen, R Ren, X Cheng, WX Zhao, JY Nie, JR Wen
arXiv preprint arXiv:2401.03205, 2024
11 · 2024
Bamboo: A comprehensive benchmark for evaluating long text modeling capacities of large language models
Z Dong, T Tang, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2309.13345, 2023
8 · 2023
Learning to imagine: Visually-augmented natural language generation
T Tang, Y Chen, Y Du, J Li, WX Zhao, JR Wen
arXiv preprint arXiv:2305.16944, 2023
6 · 2023
The Web Can Be Your Oyster for Improving Large Language Models
J Li, T Tang, WX Zhao, J Wang, JY Nie, JR Wen
arXiv preprint arXiv:2305.10998, 2023
6* · 2023
Articles 1–20