Tags: Deep Neural Networks, Leaky ReLU, Reluplex Algorithm
Abstract:
In recent years, Deep Neural Networks (DNNs) have developed rapidly and have been widely adopted across many fields. However, as DNNs have demonstrated strong capabilities, their security problems have also been exposed, so formal guarantees on neural network outputs are needed. Before the Reluplex algorithm appeared, the verification of DNNs was a notoriously difficult problem. The Reluplex algorithm is designed specifically to verify DNNs with the ReLU activation function; it is an effective algorithm, but it cannot handle other activation functions. The ReLU activation function suffers from the “dead neuron” problem, which the Leaky ReLU activation function avoids, so it is necessary to verify DNNs based on the Leaky ReLU activation function. We therefore propose the Leaky-Reluplex algorithm, an extension of the Reluplex algorithm that can verify DNNs based on the Leaky ReLU activation function.
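As background for the distinction the abstract draws, the two activation functions can be sketched as follows. This is only an illustration of the piecewise definitions, not part of the Leaky-Reluplex algorithm itself; the negative slope alpha = 0.01 is a commonly used default and is assumed here.

```python
def relu(x):
    # ReLU: max(0, x). Negative inputs map to 0, so their gradient is 0,
    # which can leave neurons permanently inactive ("dead neuron" problem).
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: keeps a small slope alpha for negative inputs,
    # so the gradient never vanishes entirely.
    return x if x >= 0.0 else alpha * x

for x in (-2.0, 0.0, 3.0):
    print(x, relu(x), leaky_relu(x))
```

Because Leaky ReLU is still piecewise linear with exactly two linear pieces, it has the same case-splitting structure that Reluplex exploits for ReLU, which is what makes the extension natural.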