This paper proposes a bi-activation function to improve the performance of convolutional neural networks. The bi-activation function combines a positive activation function and a negative activation function so that both positive and negative information in the signal is reflected. Experiments on the MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100 datasets show that the bi-activation function outperforms the ReLU and ELU activation functions. Compared with existing activation functions, the bi-activation function learns bidirectional information more flexibly.
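As a rough illustration only: the abstract does not specify how the positive and negative branches are combined, so the sketch below assumes one plausible reading (similar in spirit to CReLU), in which a positive branch applies ReLU to the input, a negative branch applies ReLU to the negated input, and the two are concatenated channel-wise. The class name `BiActivation` and this exact formulation are assumptions, not the paper's confirmed method.

```python
import torch
import torch.nn as nn

class BiActivation(nn.Module):
    """Hypothetical bi-activation layer (assumed form).

    A positive branch keeps positive responses, a negative branch keeps
    the magnitudes of negative responses, and the two are concatenated
    along the channel dimension so subsequent layers see both directions
    of the signal.
    """

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pos = torch.relu(x)    # positive information
        neg = torch.relu(-x)   # magnitude of negative information
        # Concatenation doubles the channel count, so the next layer
        # must be configured for twice as many input channels.
        return torch.cat([pos, neg], dim=1)

# Example: a (8, 16, 32, 32) feature map becomes (8, 32, 32, 32).
out = BiActivation()(torch.randn(8, 16, 32, 32))
print(out.shape)  # torch.Size([8, 32, 32, 32])
```

An additive combination (e.g., summing a positive activation with a separately parameterized negative one) is an equally plausible reading; the concatenation variant is shown here only because it makes the "both directions" idea explicit in the tensor shape.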