This study evaluates the performance of the Mish activation function in a CNN-BiGRU model for intrusion detection, showing that Mish outperforms the widely used ReLU function across the evaluated datasets. The research highlights the critical role of activation function choice in the efficacy of deep learning models, particularly for identifying complex cyber threats. Using the ASNM-TUN, ASNM-CDX, and Hogzilla datasets, the findings offer practical guidance on activation function selection for improved cybersecurity applications.
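For context on the comparison above: Mish is commonly defined as f(x) = x · tanh(softplus(x)), a smooth, non-monotonic alternative to ReLU's hard cutoff at zero. The following minimal NumPy sketch (not drawn from the study itself; the function names are illustrative) contrasts the two activations:

```python
import numpy as np

def mish(x: np.ndarray) -> np.ndarray:
    # Mish: x * tanh(softplus(x)); smooth and non-monotonic,
    # retains small negative outputs instead of clipping them.
    return x * np.tanh(np.log1p(np.exp(x)))

def relu(x: np.ndarray) -> np.ndarray:
    # ReLU: max(0, x); zeroes all negative inputs.
    return np.maximum(0.0, x)

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(mish(x))  # negative inputs map to small negative values
print(relu(x))  # negative inputs map to exactly zero
```

Unlike ReLU, Mish passes a small gradient for negative inputs, which is one commonly cited reason it can improve training dynamics in deep models.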