Neural CAPTCHA networks

Ying Ma, Guoqiang Zhong*, Wen Liu, Jinxuan Sun, Kaizhu Huang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

To protect against attacks by malicious computer programs, many websites apply the CAPTCHA (short for completely automated public Turing test to tell computers and humans apart) technique for security protection. The distortion, rotation and deformation of the characters or puzzles in CAPTCHAs make them difficult for machines to recognize automatically. State-of-the-art CAPTCHA recognition algorithms generally use convolutional neural networks (CNNs) without considering the spatially sequential property of the characters/image features. To address this problem, we propose a new CAPTCHA recognition algorithm called neural CAPTCHA networks (NCNs). NCNs use a convolutional structure to extract CAPTCHA image features, and bidirectional recurrent modules to learn the spatially sequential information in CAPTCHAs. We have applied NCNs to text-based CAPTCHAs, including arithmetic operation, character recognition and character matching CAPTCHAs, as well as to puzzle-based CAPTCHAs. For arithmetic operation and character recognition CAPTCHAs, we obtained 100% accuracy on the SOIEC CAPTCHA dataset; for the character matching task, we obtained 99.3% accuracy on the SOIEC CAPTCHA dataset; and for the puzzle-based CAPTCHAs, we obtained 98.13% accuracy. These experimental results demonstrate the advantages of NCNs over related state-of-the-art approaches for CAPTCHA recognition.
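The keywords indicate that NCNs are trained with a connectionist temporal classification (CTC) loss for character recognition. At inference time, a CTC-trained recognizer's per-timestep label predictions are typically collapsed by greedy decoding: merge consecutive repeated labels, then drop the blank symbol. The sketch below illustrates only that generic collapse rule; the `BLANK` index and function name are illustrative assumptions, not code from the paper.

```python
BLANK = 0  # index conventionally reserved for the CTC blank label (assumption)

def ctc_greedy_collapse(timestep_labels):
    """Collapse a per-timestep label sequence: merge repeats, then drop blanks."""
    out = []
    prev = None
    for label in timestep_labels:
        # Keep a label only when it differs from the previous timestep
        # and is not the blank symbol.
        if label != prev and label != BLANK:
            out.append(label)
        prev = label
    return out

# A blank between two identical labels keeps both genuine occurrences:
print(ctc_greedy_collapse([1, 1, 0, 2, 2, 2, 0, 3, 3, 0, 3]))  # → [1, 2, 3, 3]
```

This collapse step is what lets a CTC model emit a variable-length character string (e.g. a CAPTCHA answer) from a fixed number of feature-map timesteps without per-character segmentation.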

Original language: English
Article number: 106769
Journal: Applied Soft Computing
Volume: 97
Publication status: Published - Dec 2020

Keywords

  • Bidirectional long short-term memory
  • Connectionist temporal classification loss
  • Contrastive loss
  • Convolutional neural networks
  • Neural CAPTCHA networks
