Retrieval-based language model adaptation for handwritten Chinese text recognition

Shuying Hu, Qiufeng Wang*, Kaizhu Huang, Min Wen, Frans Coenen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

In handwritten text recognition, computers fall far short of humans in linguistic context knowledge, especially domain-matched knowledge. In this paper, we present a novel retrieval-based method to obtain an adaptive language model for offline recognition of unconstrained handwritten Chinese texts. The content of the handwritten texts to be recognized is varied and usually unknown a priori, so we adopt a two-pass recognition strategy. In the first pass, we use a common language model to obtain initial recognition results, which are then used to retrieve related content from the Internet. For content retrieval, we evaluate several types of semantic representation derived from BERT outputs as well as the traditional TF–IDF representation. We then dynamically generate an adaptive language model from the retrieved content, which is combined with the common language model and applied in the second-pass recognition. We evaluate the proposed method on two benchmark unconstrained handwriting datasets, CASIA-HWDB and ICDAR-2013. Experimental results show that the proposed retrieval-based language model adaptation improves recognition performance, despite the limited amount of Internet content employed.
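The abstract describes the pipeline only at a high level. The following Python sketch illustrates the general idea under stated assumptions: the function names (`retrieve_related`, `build_adaptive_lm`, `combined_log_prob`), the TF–IDF retrieval over a candidate corpus, the character-bigram adaptive model, and the interpolation weight `lam` are all illustrative choices, not the authors' actual implementation.

```python
# Minimal sketch of the two-pass adaptation idea described in the abstract.
# All names and parameters here are hypothetical; the paper's actual
# recognizer, retrieval backend, and language models are not reproduced.

import math
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def retrieve_related(first_pass_text, corpus, top_k=5):
    """Rank candidate documents by TF-IDF cosine similarity to the
    first-pass recognition result (one of the representations the paper
    evaluates; BERT embeddings could be substituted here)."""
    vec = TfidfVectorizer(analyzer="char")  # character features suit Chinese
    mat = vec.fit_transform([first_pass_text] + corpus)
    sims = cosine_similarity(mat[0:1], mat[1:]).ravel()
    return [corpus[i] for i in sims.argsort()[::-1][:top_k]]

def build_adaptive_lm(texts, alpha=1.0, vocab=8000):
    """Estimate an add-alpha-smoothed character-bigram LM from the
    retrieved texts (vocab is an assumed character-set size)."""
    bigrams, unigrams = Counter(), Counter()
    for t in texts:
        for a, b in zip(t, t[1:]):
            bigrams[(a, b)] += 1
            unigrams[a] += 1
    def log_prob(a, b):
        return math.log((bigrams[(a, b)] + alpha) /
                        (unigrams[a] + alpha * vocab))
    return log_prob

def combined_log_prob(a, b, common_lp, adaptive_lp, lam=0.5):
    """Linearly interpolate the common and adaptive LMs in probability
    space for second-pass decoding (lam is an assumed weight)."""
    p = ((1 - lam) * math.exp(common_lp(a, b))
         + lam * math.exp(adaptive_lp(a, b)))
    return math.log(p)
```

In this sketch the second pass would simply rescore recognition hypotheses with `combined_log_prob` in place of the common model alone; how the paper actually integrates the combined model into decoding is not shown here.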

Original language: English
Pages (from-to): 109-119
Number of pages: 11
Journal: International Journal on Document Analysis and Recognition
Volume: 26
Issue number: 2
DOIs
Publication status: Accepted/In press - 2022

Keywords

  • Handwritten Chinese text recognition
  • Information retrieval
  • Internet content
  • Language model adaptation
  • Recognition
