Touch Editing: A Flexible One-Time Interaction Approach for Translation

Qian Wang1, Jiajun Zhang2, Lemao Liu3, Guoping Huang3, Chengqing Zong1
1,2Institute of Automation, Chinese Academy of Sciences, 3Tencent AI Lab


We propose a touch-based editing method for translation that is more flexible than traditional keyboard-and-mouse post-editing. The approach relies on touch actions that users perform to indicate translation errors. We present a dual-encoder model that handles these actions and generates refined translations. To mimic user feedback, we adopt the TER algorithm to compare draft translations with references and automatically extract simulated actions for training data construction. Experiments on translation datasets with simulated editing actions show that our method significantly improves the original translations of the Transformer (by up to 25.31 BLEU) and outperforms existing interactive translation methods (by up to 16.64 BLEU). We also conduct experiments on a post-editing dataset to further demonstrate the robustness and effectiveness of our method.
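The simulated-action construction described above can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the paper uses the TER algorithm's alignment, while here Python's `difflib.SequenceMatcher` stands in as a simplified edit-alignment proxy, and the action names (`revise`, `delete`, `insert_at`) are invented for the sketch.

```python
# Hypothetical sketch: simulate touch actions by aligning a draft
# translation against its reference. NOTE: a stand-in for TER alignment;
# difflib only approximates the shift-aware TER edit operations.
from difflib import SequenceMatcher


def simulate_touch_actions(draft, reference):
    """Return (action, payload) pairs marking error regions in the draft.

    draft, reference: lists of tokens.
    """
    sm = SequenceMatcher(a=draft, b=reference, autojunk=False)
    actions = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "replace":
            # Wrong tokens: the user would touch this span to revise it.
            actions.append(("revise", draft[i1:i2]))
        elif tag == "delete":
            # Extra tokens: the user would touch this span to delete it.
            actions.append(("delete", draft[i1:i2]))
        elif tag == "insert":
            # Missing tokens: the user would touch the gap at position i1.
            actions.append(("insert_at", (i1, reference[j1:j2])))
    return actions


draft = "the cat sat on mat".split()
reference = "the cat sat on the mat".split()
print(simulate_touch_actions(draft, reference))
# → [('insert_at', (4, ['the']))]
```

Pairs of (draft with simulated actions, reference) produced this way can then serve as training data for the dual-encoder refinement model.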