Zhirui Zhang (张志锐)
Algorithm Expert at Alibaba DAMO Academy

About Me: I am currently an Algorithm Expert in the Language Technology Lab at Alibaba DAMO Academy. Before that, I received my Ph.D. degree from the University of Science and Technology of China (USTC) in 2019, supervised by Prof. Enhong Chen from USTC and Prof. Harry Shum from MSRA. My general research interests lie in natural language processing, machine translation, dialogue systems, and deep learning.

Location: Hangzhou, China | Email: zrustc11@gmail.com
[Google Scholar] [GitHub]

Feel free to contact me about intern positions and possible collaboration!

News


Research Interest

I work in the fields of natural language processing, machine translation, dialogue systems, and machine learning. Currently, I focus on the following research topics:

Education


Publications

Conference/Journal Papers:

    [15] Xin Zheng, Zhirui Zhang, Junliang Guo, Shujian Huang, Boxing Chen, Weihua Luo and Jiajun Chen, “Adaptive Nearest Neighbor Machine Translation”. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021).

    [14] Junliang Guo, Zhirui Zhang, Linli Xu, Boxing Chen and Enhong Chen, “Adaptive Adapter: an Efficient Way to Incorporate BERT into Neural Machine Translation”. IEEE/ACM Transactions on Audio, Speech and Language Processing (TASLP 2021). [PDF]

    [13] Junliang Guo, Zhirui Zhang, Linli Xu, Hao-Ran Wei, Boxing Chen and Enhong Chen, “Incorporating BERT into Parallel Sequence Decoding with Adapters”. The 34th Conference on Neural Information Processing Systems (NeurIPS 2020). [PDF] [Code]

    [12] Hao-Ran Wei, Zhirui Zhang, Boxing Chen and Weihua Luo, “Iterative Domain-Repaired Back-Translation”. The 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020). [PDF] [Code]

    [11] Baijun Ji, Zhirui Zhang, Xiangyu Duan, Min Zhang, Boxing Chen and Weihua Luo, “Cross-lingual Pre-training Based Transfer for Zero-shot Neural Machine Translation”. The 34th AAAI Conference on Artificial Intelligence (AAAI 2020). [PDF]

    [10] Zhirui Zhang, Xiujun Li, Jianfeng Gao and Enhong Chen, “Budgeted Policy Learning for Task-Oriented Dialogue Systems”. The 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019). [PDF]

    [9] Zhirui Zhang, Shuangzhi Wu, Shujie Liu, Mu Li, Ming Zhou and Tong Xu, “Regularizing Neural Machine Translation by Target-bidirectional Agreement”. The 33rd AAAI Conference on Artificial Intelligence (AAAI 2019). [PDF]

    [8] Zhirui Zhang*, Shuo Ren*, Shujie Liu, Ming Zhou and Shuai Ma, “Unsupervised Neural Machine Translation with SMT as Posterior Regularization”. The 33rd AAAI Conference on Artificial Intelligence (AAAI 2019). (* equal contribution) [PDF] [Code]

    [7] Shuangzhi Wu, Dongdong Zhang, Zhirui Zhang, Nan Yang, Mu Li and Ming Zhou, “Dependency-to-Dependency Neural Machine Translation”. IEEE/ACM Transactions on Audio, Speech and Language Processing (TASLP 2018).

    [6] Zhirui Zhang, Shujie Liu, Mu Li, Ming Zhou and Enhong Chen, “Bidirectional Generative Adversarial Networks for Neural Machine Translation”. The SIGNLL Conference on Computational Natural Language Learning (CoNLL 2018). [PDF]

    [5] Zhirui Zhang, Shujie Liu, Mu Li, Ming Zhou and Enhong Chen, “Coarse-To-Fine Learning for Neural Machine Translation”. The 7th CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC 2018). [PDF]

    [4] Zhirui Zhang, Shujie Liu, Mu Li, Ming Zhou and Enhong Chen, “Joint Training for Neural Machine Translation Models with Monolingual Data”. The 32nd AAAI Conference on Artificial Intelligence (AAAI 2018). [PDF]

    [3] Wenhu Chen, Guanlin Li, Shuo Ren, Shujie Liu, Zhirui Zhang, Mu Li and Ming Zhou, “Generative Bridging Network in Neural Sequence Prediction”. The 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2018). [PDF]

    [2] Duyu Tang, Nan Duan, Zhao Yan, Zhirui Zhang, Yibo Sun, Shujie Liu, Yuanhua Lv and Ming Zhou, “Learning to Collaborate for Question Answering and Asking”. The 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2018). [PDF]

    [1] Zhirui Zhang, Shujie Liu, Mu Li, Ming Zhou and Enhong Chen, “Stack-based Multi-layer Attention for Transition-based Dependency Parsing”. The 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 2017). [PDF]


Preprints:

    [4] Junliang Guo, Zhirui Zhang, Linlin Zhang, Linli Xu, Boxing Chen, Enhong Chen and Weihua Luo, “Towards Variable-Length Textual Adversarial Attacks”. arXiv:2104.08139 [cs]. [PDF]

    [3] Zhirui Zhang*, Shuo Ren*, Shujie Liu, Jianyong Wang, Peng Chen, Mu Li, Ming Zhou and Enhong Chen, “Style Transfer as Unsupervised Machine Translation”. arXiv:1808.07894 [cs]. (* equal contribution) [PDF]

    [2] Wenhu Chen, Guanlin Li, Shujie Liu, Zhirui Zhang, Mu Li and Ming Zhou, “Approximate Distribution Matching for Sequence-to-Sequence Learning”. arXiv:1808.08003 [cs]. [PDF]

    [1] Hany Hassan, Anthony Aue, Chang Chen, Vishal Chowdhary, Jonathan Clark, Christian Federmann, Xuedong Huang, Marcin Junczys-Dowmunt, William Lewis, Mu Li, Shujie Liu, Tie-Yan Liu, Renqian Luo, Arul Menezes, Tao Qin, Frank Seide, Xu Tan, Fei Tian, Lijun Wu, Shuangzhi Wu, Yingce Xia, Dongdong Zhang, Zhirui Zhang and Ming Zhou, “Achieving Human Parity on Automatic Chinese to English News Translation”. arXiv:1803.05567 [cs]. [PDF]


Experience

  • 2019.7-Now: Algorithm Expert, Language Technology Lab, Alibaba DAMO Academy.
  • 2019.2-2019.6: Research Intern, Natural Language Computing Group, Microsoft Research Asia. Supervisor: Shujie Liu.
  • 2018.7-2019.1: Research Intern, Deep Learning Group, Microsoft AI & Research. Supervisor: Xiujun Li & Jianfeng Gao.
  • 2015.7-2018.7: Research Intern, Natural Language Computing Group, Microsoft Research Asia. Supervisor: Mu Li.
  • 2013.7-2014.6: Research Intern, Natural Language Computing Group, Microsoft Research Asia. Supervisor: Mu Li.

Services

PC Member:

  • I’ve served as a PC member for ICML 2021, ACL 2021, NAACL 2021, AAAI 2021, NeurIPS 2020, ACL 2020, AAAI 2020, IJCAI 2020, COLING 2020, INLG 2020, NAACL 2019, ACL 2019, EMNLP 2019, and INLG 2019.

Awards

  • 2019, Award of Excellence in MSRA Star of Tomorrow Internship Program
  • 2013, Google Excellent Scholarship
  • 2013, National Scholarship