A Novel Dynamic Attack on Classical Ciphers Using an Attention-Based LSTM Encoder-Decoder Model

Ezat Ahmadzadeh, Hyunil Kim, Ongee Jeong, Inkyu Moon

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

Information security has become an intrinsic part of data communication. Cryptanalysis that uses deep learning-based methods to identify weaknesses in ciphers has not been thoroughly studied. Recently, long short-term memory (LSTM) networks have shown promising performance on sequential data by modeling dependencies and dynamics in the data. Given a ciphertext sequence and its corresponding plaintext, an LSTM can exploit sequential processing to adaptively discover the decryption function regardless of its complexity, substantially outperforming traditional methods. However, a lengthy ciphertext sequence causes an LSTM to lose important information along the sequence, degrading network performance. To tackle this problem, we propose adding an attention mechanism to enhance the LSTM's sequential processing power. This paper presents a novel, dynamic way to attack classical ciphers using an attention-based LSTM encoder-decoder that handles varying ciphertext sequence lengths. The proposed model takes in a ciphertext sequence and outputs the corresponding plaintext sequence. Its effectiveness and flexibility were evaluated on several classical ciphers, where it achieved close to 100% accuracy in breaking all of them in both character-level and word-level attacks. We empirically provide further insights into our results on two datasets with short and long ciphertext lengths, and we compare the proposed method against state-of-the-art methods. The proposed approach also has the potential to attack modern ciphers. To the best of our knowledge, this is the first time an attention-based LSTM encoder-decoder has been applied to attacking classical ciphers.
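The architecture the abstract describes — an LSTM encoder over the ciphertext, an LSTM decoder over the plaintext, and an attention step that lets each decoding position look back over all encoder states — can be sketched roughly as below. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the class name, hidden size, and the choice of Luong-style dot-product attention are assumptions for illustration.

```python
import torch
import torch.nn as nn


class AttnSeq2Seq(nn.Module):
    """Hypothetical minimal sketch of an attention-based LSTM
    encoder-decoder for character-level ciphertext -> plaintext."""

    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, hidden)
        self.enc = nn.LSTM(hidden, hidden, batch_first=True)
        self.dec = nn.LSTM(hidden, hidden, batch_first=True)
        # Output layer sees decoder state concatenated with context.
        self.out = nn.Linear(2 * hidden, vocab_size)

    def forward(self, cipher, plain_in):
        # Encode the full ciphertext sequence: (B, S, H)
        enc_out, state = self.enc(self.emb(cipher))
        # Decode the (teacher-forced) plaintext prefix: (B, T, H)
        dec_out, _ = self.dec(self.emb(plain_in), state)
        # Dot-product attention: each decoder step scores all
        # encoder positions, so long sequences are not bottlenecked
        # through a single fixed-size state.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))   # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                  # (B, T, H)
        return self.out(torch.cat([dec_out, context], dim=-1))  # (B, T, V)


# Toy forward pass with random token indices.
vocab_size = 30
model = AttnSeq2Seq(vocab_size)
cipher = torch.randint(0, vocab_size, (2, 12))    # batch of 2, length 12
plain_in = torch.randint(0, vocab_size, (2, 10))  # shifted plaintext, length 10
logits = model(cipher, plain_in)                  # shape (2, 10, 30)
```

Trained with cross-entropy against the plaintext, a model of this shape learns the decryption mapping directly from ciphertext/plaintext pairs; the attention weights are what keeps performance from degrading on long sequences.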

Original language: English
Article number: 9408655
Pages (from-to): 60960-60970
Number of pages: 11
Journal: IEEE Access
Volume: 9
DOIs
State: Published - 2021

Bibliographical note

Publisher Copyright:
© 2013 IEEE.

Keywords

  • Cryptanalysis
  • attention-based LSTM encoder-decoder
  • classical ciphers
  • recurrent neural network
