Alkhafaji, Nadia (ORCID: https://orcid.org/0009-0009-7818-8596), Viana, Thiago (ORCID: https://orcid.org/0000-0001-9380-4611) and Al-Sherbaz, Ali (2025)
Integrated Genetic Algorithm and Deep Learning Approach for Effective Cyber-Attack Detection and Classification in Industrial Internet of Things (IIoT) Environments.
Arabian Journal for Science and Engineering, 50 (15), pp. 12071-12095. doi:10.1007/s13369-024-09663-6
Full text (PDF, 3MB): Published Version, available under a Creative Commons Attribution 4.0 License.
Abstract
Cyber-attack detection within Industrial Internet of Things (IIoT) environments presents unique challenges due to the complex, resource-constrained, and real-time nature of these networks. Traditional detection techniques often struggle to adapt to the dynamic environment of IIoT. For instance, many existing methods rely on signature-based detection, which fails to identify evolving threats. Other approaches, such as anomaly-based detection, can generate a high rate of false positives, leading to inefficiencies in threat management. To address these challenges, we propose a novel detection and classification model specifically tailored for IIoT environments. The proposed model integrates Genetic Algorithms (GA) and Deep Learning (DL) to enhance cyber-attack detection within IIoT environments. The GA component optimises feature selection from raw network data, ensuring the extraction of meaningful and relevant features. Leveraging these selected features, the DL component constructs a robust model capable of accurately detecting and classifying various cyber-attack patterns across IIoT devices. Through experimentation on real-world IIoT network traffic (the UNSW-NB15 dataset), the proposed approach demonstrates its efficacy in improving attack detection accuracy and adaptability. The integration of GA and DL offers a synergistic solution that addresses the complexities of IIoT cybersecurity, contributing to a more secure and resilient IIoT ecosystem. The integrated GA–DL classification model developed in this work achieved 98% precision, 96% accuracy, 94% recall, and a 12% loss while using fewer than 50% of the features of the UNSW-NB15 dataset. This reduction in the features required to identify and classify cyber-attacks cuts processing time by 50%.
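The abstract describes the pipeline only at a high level: a GA searches for a compact subset of features, and a DL model is trained and evaluated on the selected features. Below is a minimal sketch of that idea, assuming a binary feature mask as the chromosome, validation accuracy (lightly penalised by feature count) as the fitness function, and scikit-learn's MLPClassifier standing in for the paper's unspecified deep learning architecture; the GA hyperparameters, penalty weight, and data split are illustrative assumptions, not the authors' published configuration.

```python
# Sketch of GA-driven feature selection feeding a small neural classifier.
# MLPClassifier is a stand-in for the paper's DL model (not specified in the
# abstract); all hyperparameters below are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler


def fitness(mask, X_tr, y_tr, X_val, y_val, penalty=0.01):
    """Validation accuracy on the selected features, lightly penalised
    by the fraction of features kept (to favour smaller subsets)."""
    if mask.sum() == 0:
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=200, random_state=0)
    clf.fit(X_tr[:, mask], y_tr)
    return clf.score(X_val[:, mask], y_val) - penalty * mask.mean()


def ga_feature_selection(X, y, pop_size=20, generations=10,
                         crossover_rate=0.8, mutation_rate=0.02, seed=0):
    rng = np.random.default_rng(seed)
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=seed)
    scaler = StandardScaler().fit(X_tr)
    X_tr, X_val = scaler.transform(X_tr), scaler.transform(X_val)

    n_features = X.shape[1]
    pop = rng.random((pop_size, n_features)) < 0.5                 # random binary masks
    for _ in range(generations):
        scores = np.array([fitness(m, X_tr, y_tr, X_val, y_val) for m in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            if rng.random() < crossover_rate:                      # one-point crossover
                cut = rng.integers(1, n_features)
                child = np.concatenate([a[:cut], b[cut:]])
            else:
                child = a.copy()
            flip = rng.random(n_features) < mutation_rate          # bit-flip mutation
            children.append(np.where(flip, ~child, child))
        pop = np.vstack([parents] + children)

    scores = np.array([fitness(m, X_tr, y_tr, X_val, y_val) for m in pop])
    return pop[scores.argmax()]                                    # best boolean feature mask
```

In this sketch, the returned mask would then be applied to the full training set before fitting the final classifier, mirroring the reported effect of training on fewer than 50% of the UNSW-NB15 features.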
Item Type: Article
Article Type: Article
Uncontrolled Keywords: IIoT; Genetic Algorithm; Deep Learning; Cyber-security; Cyber-attacks; Artificial intelligence; Prediction; Classification; Feature selection
Subjects: H Social Sciences > HD Industries. Land use. Labor > HD28 Management. Industrial Management > HD61 Risk in industry. Risk management; Q Science > QA Mathematics > QA75 Electronic computers. Computer science; Q Science > QA Mathematics > QA76 Computer software; Q Science > QA Mathematics > QA76 Computer software > QA76.9 Other topics > QA76.9.A43 Algorithms
Divisions: Schools and Research Institutes > School of Business, Computing and Social Sciences
Depositing User: Kamila Niekoraniec
Date Deposited: 02 Oct 2025 09:40
Last Modified: 02 Oct 2025 09:45
URI: https://eprints.glos.ac.uk/id/eprint/15349