Yichun Yin
Noah's Ark Lab, Huawei
TinyBERT: Distilling BERT for Natural Language Understanding
X Jiao, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
Findings of EMNLP 2020 (most influential paper of EMNLP 2020), 2019
Unsupervised word and dependency path embeddings for aspect term extraction
Y Yin, F Wei, L Dong, K Xu, M Zhang, M Zhou
IJCAI 2016, 2016
TernaryBERT: Distillation-aware Ultra-low Bit BERT
W Zhang, L Hou, Y Yin, L Shang, X Chen, X Jiang, Q Liu
EMNLP 2020, 2020
Generate & Rank: A Multi-task Framework for Math Word Problems
J Shen, Y Yin, L Li, L Shang, X Jiang, M Zhang, Q Liu
EMNLP 2021 findings, 2021
Document-level multi-aspect sentiment classification as machine comprehension
Y Yin, Y Song, M Zhang
EMNLP 2017, 2044-2054, 2017
bert2BERT: Towards Reusable Pretrained Language Models
C Chen, Y Yin, L Shang, X Jiang, Y Qin, F Wang, Z Wang, X Chen, Z Liu, ...
ACL 2022, 2021
AutoTinyBERT: Automatic Hyper-parameter Optimization for Efficient Pre-trained Language Models
Y Yin, C Chen, L Shang, X Jiang, X Chen, Q Liu
ACL 2021, 2021
NNEMBs at SemEval-2017 Task 4: Neural Twitter sentiment classification: a simple ensemble method with different embeddings
Y Yin, Y Song, M Zhang
Proceedings of the 11th International Workshop on Semantic Evaluation …, 2017
Dialog State Tracking with Reinforced Data Augmentation
Y Yin, L Shang, X Jiang, X Chen, Q Liu
AAAI 2020, 2019
PoD: Positional Dependency-Based Word Embedding for Aspect Term Extraction
Y Yin, C Wang, M Zhang
COLING 2020, 2019
Socialized word embeddings
Z Zeng, Y Yin, Y Song, M Zhang
IJCAI, 3915-3921, 2017
DT-Solver: Automated theorem proving with dynamic-tree sampling guided by proof-level value function
H Wang, Y Yuan, Z Liu, J Shen, Y Yin, J Xiong, E Xie, H Shi, Y Li, L Li, ...
ACL 2023, 12632-12646, 2023
Splusplus: a feature-rich two-stage classifier for sentiment analysis of tweets
L Dong, F Wei, Y Yin, M Zhou, K Xu
Proceedings of the 9th international workshop on semantic evaluation …, 2015
FIMO: A challenge formal dataset for automated theorem proving
C Liu, J Shen, H Xin, Z Liu, Y Yuan, H Wang, W Ju, C Zheng, Y Yin, L Li, ...
arXiv preprint arXiv:2309.04295, 2023
LightMBERT: A simple yet effective method for multilingual BERT distillation
X Jiao, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
arXiv preprint arXiv:2103.06418, 2021
Improving Task-Agnostic BERT Distillation with Layer Mapping Search
X Jiao, H Chang, Y Yin, L Shang, X Jiang, X Chen, L Li, F Wang, Q Liu
Neurocomputing 2021, 2020
One Cannot Stand for Everyone! Leveraging Multiple User Simulators to train Task-oriented Dialogue Systems
Y Liu, X Jiang, Y Yin, Y Wang, F Mi, Q Liu, X Wan, B Wang
ACL 2023, 1-21, 2023
DQ-LoRe: Dual queries with low rank approximation re-ranking for in-context learning
J Xiong, Z Li, C Zheng, Z Guo, Y Yin, E Xie, Z Yang, Q Cao, H Wang, ...
ICLR 2024, 2023
Text processing model training method, and text processing method and apparatus
Y Yin, L Shang, X Jiang, X Chen
US Patent App. 17/682,145, 2022
Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation
C Chen, Y Yin, L Shang, Z Wang, X Jiang, X Chen, Q Liu
ICANN 2021, 2021