Abstract: In current Neural Machine Translation (NMT) research, translating low-resource language pairs remains a significant challenge. This work proposes an LGE-Transformer method for Chinese-Malay ...
Abstract: The prevalent use of Byte Pair Encoding (BPE) in Large Language Models (LLMs) enables robust handling of subword units and avoids out-of-vocabulary issues. Despite its success, ...