EXPLORING AND COMPARING THE APPLICATION OF AI TRANSFORMER TECHNIQUES AND LONG-SHORT TERM MEMORY IN NETWORK INTRUSION DETECTION SYSTEMS

Regent University College of Science and Technology, Ghana

Abstract

The increasing complexity and frequency of cyber-attacks have made Network Intrusion Detection Systems (NIDS) a critical component of modern cybersecurity. Traditional machine learning approaches have shown promise in detecting anomalies, but they often struggle to capture long-term dependencies and complex patterns in network traffic. This study explores and compares the effectiveness of two advanced artificial intelligence techniques, Long Short-Term Memory (LSTM) networks and Transformer-based models, in the context of NIDS. LSTMs, a type of recurrent neural network, are designed to handle sequential data and retain temporal dependencies, making them suitable for identifying patterns over time. Transformers, leveraging self-attention mechanisms, excel at modeling global relationships in sequences, enabling them to capture intricate dependencies and interactions across network traffic features. By evaluating both techniques on benchmark intrusion detection datasets, this study highlights differences in detection accuracy, computational efficiency, and adaptability to evolving attack patterns. The findings suggest that while LSTMs provide robust temporal analysis, Transformers demonstrate superior performance in recognizing complex and context-dependent intrusion patterns, offering a promising direction for next-generation NIDS.
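To make the contrast concrete, the following is a minimal NumPy sketch of the two mechanisms the abstract compares: an LSTM cell consuming a sequence of network-flow feature vectors one step at a time (temporal dependencies carried through a cell state), versus self-attention relating every step to every other step in one shot (global dependencies). All weights, dimensions, and function names here are illustrative assumptions for exposition, not the study's actual models or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 5, 4  # toy sequence: 5 flow-feature vectors, 4 features each
x = rng.normal(size=(T, d))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- LSTM: processes the sequence step by step, carrying a cell state ---
def lstm_forward(x, d_h=4, seed=1):
    rng = np.random.default_rng(seed)
    d_in = x.shape[1]
    # one weight matrix per gate: input (i), forget (f), output (o), candidate (c)
    W = {g: rng.normal(scale=0.1, size=(d_in + d_h, d_h)) for g in "ifoc"}
    h = np.zeros(d_h)
    c = np.zeros(d_h)
    for t in range(x.shape[0]):
        z = np.concatenate([x[t], h])
        i = sigmoid(z @ W["i"])
        f = sigmoid(z @ W["f"])
        o = sigmoid(z @ W["o"])
        g = np.tanh(z @ W["c"])
        c = f * c + i * g       # cell state accumulates long-term memory
        h = o * np.tanh(c)      # hidden state summarises everything seen so far
    return h  # final state: earlier steps reach it only through the recurrence

# --- Self-attention: every position attends to every other position directly ---
def self_attention(x, seed=2):
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(d)                      # pairwise similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # row-wise softmax
    return weights @ V, weights  # each output mixes information from all steps

h_final = lstm_forward(x)
attended, attn_w = self_attention(x)
print(h_final.shape, attended.shape)  # → (4,) (5, 4)
```

The shapes show the structural difference the abstract alludes to: the LSTM compresses the whole sequence into one final state, while self-attention produces a context-aware representation for every step, with an explicit weight matrix showing which steps influenced which.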


David Laud Amenyo Fiase, Kwadwo Opoku Attah, Perry Opoku Agyeman and Nathaniel Nelson (2026); EXPLORING AND COMPARING THE APPLICATION OF AI TRANSFORMER TECHNIQUES AND LONG-SHORT TERM MEMORY IN NETWORK INTRUSION DETECTION SYSTEMS, Int. J. of Adv. Res. (Feb), ISSN 2320-5407. DOI URL: https://dx.doi.org/


Corresponding Author: David Laud Amenyo Fiase, Regent University College of Science and Technology, Ghana