Hongfei Xu
Lipschitz constrained parameter initialization for deep transformers
H Xu, Q Liu, J van Genabith, D Xiong, J Zhang
arXiv preprint arXiv:1911.03179, 2019
Cited by 24* · 2019
CMeIE: Construction and evaluation of Chinese medical information extraction dataset
T Guan, H Zan, X Zhou, H Xu, K Zhang
Natural Language Processing and Chinese Computing: 9th CCF International …, 2020
Cited by 20 · 2020
Efficient context-aware neural machine translation with layer-wise weighting and input-aware gating
H Xu, D Xiong, J van Genabith, Q Liu
Proceedings of the Twenty-Ninth International Conference on International …, 2021
Cited by 16 · 2021
Neutron: An implementation of the Transformer translation model and its variants
H Xu, Q Liu
arXiv preprint arXiv:1903.07402, 2019
Cited by 15 · 2019
Learning source phrase representations for neural machine translation
H Xu, J van Genabith, D Xiong, Q Liu, J Zhang
arXiv preprint arXiv:2006.14405, 2020
Cited by 10 · 2020
USAAR-DFKI – The transference architecture for English–German automatic post-editing
S Pal, H Xu, N Herbig, A Krüger, J van Genabith
Proceedings of the Fourth Conference on Machine Translation (Volume 3 …, 2019
Cited by 9 · 2019
Multi-head highly parallelized LSTM decoder for neural machine translation
H Xu, Q Liu, J van Genabith, D Xiong, M Zhang
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
Cited by 8 · 2021
Dynamically adjusting transformer batch size by monitoring gradient direction change
H Xu, J van Genabith, D Xiong, Q Liu
arXiv preprint arXiv:2005.02008, 2020
Cited by 8 · 2020
UdS submission for the WMT19 automatic post-editing task
H Xu, Q Liu, J van Genabith
arXiv preprint arXiv:1908.03402, 2019
Cited by 5 · 2019
Self-supervised curriculum learning for spelling error correction
Z Gan, H Xu, H Zan
Proceedings of the 2021 Conference on Empirical Methods in Natural Language …, 2021
Cited by 4 · 2021
Modeling task-aware MIMO cardinality for efficient multilingual neural machine translation
H Xu, Q Liu, J van Genabith, D Xiong
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
Cited by 4 · 2021
Transformer with depth-wise LSTM
H Xu, Q Liu, D Xiong, J van Genabith
arXiv preprint arXiv:2007.06257, 2020
Cited by 4 · 2020
Probing word translations in the transformer and trading decoder for encoder layers
H Xu, J van Genabith, Q Liu, D Xiong
arXiv preprint arXiv:2003.09586, 2020
Cited by 4 · 2020
Automatic acquisition of semantic selectional restriction knowledge based on neural networks
Y Jia, H Xu, H Zan
Journal of Chinese Information Processing 31 (1), 155-161, 2017
Cited by 4 · 2017
Analyzing word translation of transformer layers
H Xu, J van Genabith, D Xiong, Q Liu
arXiv preprint arXiv:2003.09586, 2020
Cited by 3 · 2020
Improving Chinese-English neural machine translation with detected usages of function words
K Zhang, H Xu, D Xiong, Q Liu, H Zan
Natural Language Processing and Chinese Computing: 6th CCF International …, 2018
Cited by 3 · 2018
Automatic identification of usages of the auxiliary word 的 (de)
Q Liu, K Zhang, H Xu, S Yu, H Zan
Beijing Da Xue Xue Bao 54 (3), 466-474, 2018
Cited by 3 · 2018
The transference architecture for automatic post-editing
S Pal, H Xu, N Herbig, SK Naskar, A Krüger, J van Genabith
arXiv preprint arXiv:1908.06151, 2019
Cited by 2 · 2019
ParaZh-22M: A Large-Scale Chinese Parabank via Machine Translation
W Hao, H Xu, D Xiong, H Zan, L Mu
Proceedings of the 29th International Conference on Computational …, 2022
Cited by 1 · 2022
Transformer-based NMT: modeling, training and implementation
H Xu
Saarländische Universitäts-und Landesbibliothek, 2021
Cited by 1 · 2021