Complex & Intelligent Systems (IF 5.0) Pub Date: 2024-11-13, DOI: 10.1007/s40747-024-01606-w. Xuguang Wu, Yiliang Han, Minqing Zhang, Yu Li, Su Cui
Pseudo-random number generators (PRNGs) are deterministic algorithms that produce number sequences approximating the statistical properties of true random numbers and are widely used across many fields. In this paper, we present a Genetic Algorithm Optimized Generative Adversarial Network (GAGAN), designed for the effective training of discrete generative adversarial networks. Because the generator employs non-differentiable activation functions, such as the modulo operation, traditional gradient-based backpropagation is inapplicable; a genetic algorithm is therefore used to optimize the parameters of the generator network. Building on this framework, we propose a novel recursive PRNG. Since a PRNG can be constructed from one-way functions and their associated hardcore predicates, the proposed generator consists of two neural networks that simulate these functions and serve as the state transition function and the output function, respectively. The proposed PRNG has been rigorously evaluated against stringent benchmarks, including the NIST Statistical Test Suite (SP800-22) and the Chinese standard for random number generation (GM/T 0005-2021), and it also performs well on Hamming-distance tests. The results indicate that the proposed GAN-based PRNG achieves a high degree of randomness and is highly sensitive to variations in its input.
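The recursive construction described in the abstract lends itself to a compact illustration. The following Python sketch is not the authors' implementation: the layer sizes, the modulus of 256, the byte-diversity fitness standing in for the adversarial discriminator, and the simple perturb-and-select loop standing in for the paper's genetic algorithm are all illustrative assumptions. It shows only the overall structure, a state-transition network F and an output network G driven recursively, with a non-differentiable modulo output layer that rules out backpropagation and motivates gradient-free training.

```python
# Minimal sketch, not the authors' implementation: a recursive PRNG whose state
# transition and output functions are small feed-forward networks with a
# non-differentiable modulo output layer, trained by parameter perturbation.
# Layer sizes, the modulus of 256, and the toy fitness are all assumptions.
import numpy as np

MODULUS = 256  # assumed output alphabet: one byte per step


def mod_activation(x):
    # Integer modulo: non-differentiable, so backpropagation cannot be used,
    # which is what motivates the gradient-free (genetic) training.
    return np.floor(x).astype(np.int64) % MODULUS


class TinyNet:
    """One-hidden-layer network ending in the modulo activation (illustrative)."""

    def __init__(self, n_in, n_hidden, n_out, rng):
        self.w1 = rng.normal(0.0, 1.0, (n_in, n_hidden))
        self.w2 = rng.normal(0.0, 1.0, (n_hidden, n_out))

    def __call__(self, x):
        h = np.tanh(x @ self.w1)
        return mod_activation(h @ self.w2 * MODULUS)

    def params(self):
        return [self.w1, self.w2]


class RecursivePRNG:
    """s_{t+1} = F(s_t), o_t = G(s_t): F is the state transition, G the output."""

    def __init__(self, state_dim=8, seed=0):
        rng = np.random.default_rng(seed)
        self.F = TinyNet(state_dim, 16, state_dim, rng)  # state transition net
        self.G = TinyNet(state_dim, 16, 1, rng)          # output function net
        self.state = rng.uniform(0.0, 1.0, state_dim)

    def reseed(self, seed_vector):
        self.state = np.asarray(seed_vector, dtype=float)

    def next_byte(self):
        out = int(self.G(self.state)[0])
        # Map the integer next state back into [0, 1) before feeding it forward.
        self.state = self.F(self.state) / MODULUS
        return out


def byte_diversity(prng, n=256):
    # Toy fitness: fraction of distinct byte values in a short output stream,
    # a crude stand-in for the adversarial discriminator / statistical tests.
    prng.reseed(np.full(prng.state.shape, 0.5))  # fixed seed for fair comparison
    stream = [prng.next_byte() for _ in range(n)]
    return len(set(stream)) / 256.0


def ga_step(prng, fitness, pop_size=20, sigma=0.1, rng=None):
    # One generation of a simple perturb-and-keep-the-best loop (an evolution-
    # strategy stand-in; the paper's genetic operators are not reproduced here).
    rng = rng or np.random.default_rng()
    params = prng.F.params() + prng.G.params()
    best = [p.copy() for p in params]
    best_fit = fitness(prng)
    for _ in range(pop_size):
        trial = [b + sigma * rng.normal(size=b.shape) for b in best]
        for p, t in zip(params, trial):
            p[...] = t        # write candidate weights into the live networks
        score = fitness(prng)
        if score > best_fit:
            best_fit, best = score, [t.copy() for t in trial]
    for p, b in zip(params, best):
        p[...] = b            # restore the best candidate found this generation
    return best_fit


if __name__ == "__main__":
    gen = RecursivePRNG(seed=42)
    for generation in range(5):
        print(f"generation {generation}: diversity = {ga_step(gen, byte_diversity):.3f}")
```

In the paper's setting, the fitness signal would come from the GAN discriminator (and ultimately from statistical test suites such as SP800-22) rather than the toy diversity measure used here, and the genetic algorithm would operate on a population of generators with proper selection, crossover, and mutation operators.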