
leaky-relu

Here are 32 public repositories matching this topic...

Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (gradient descent, AdaGrad, RMSProp, Adam). You can also choose among different loss functions: cross-entropy loss, hinge loss, and mean squared error (MSE). A minimal sketch of leaky ReLU follows this entry.

  • Updated Aug 15, 2022
  • Jupyter Notebook
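
The repository's code isn't shown here, but a minimal NumPy sketch of leaky ReLU and its derivative might look like the following; the slope alpha=0.01 is an assumed default, not taken from the project.

import numpy as np

def leaky_relu(x, alpha=0.01):
    # f(x) = x for x > 0, alpha * x otherwise; the small negative slope avoids dead units
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # derivative used in backprop: 1 for x > 0, alpha otherwise
    return np.where(x > 0, 1.0, alpha)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))       # negative inputs are scaled by alpha instead of zeroed
print(leaky_relu_grad(x))  # gradient stays nonzero (alpha) on the negative side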

This project predicts loan approval outcomes (Approved/Rejected) using a PyTorch neural network. It includes data preprocessing, a train/validation/test split, model training with BCEWithLogitsLoss, and inference with probability-based classification. A minimal sketch of this setup follows this entry.

  • Updated Sep 13, 2025
  • Python
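
To illustrate the described setup, here is a minimal, hypothetical PyTorch sketch of binary classification with BCEWithLogitsLoss and probability-based inference; the feature count (10) and hidden size (32) are illustrative assumptions, not the repository's actual architecture.

import torch
import torch.nn as nn

# Hypothetical architecture; layer sizes are assumptions.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.LeakyReLU(),      # default negative slope 0.01
    nn.Linear(32, 1),    # single logit; no final sigmoid in the model
)
loss_fn = nn.BCEWithLogitsLoss()  # applies sigmoid internally; numerically stable

x = torch.randn(4, 10)                      # toy batch of 4 samples
y = torch.tensor([[1.], [0.], [1.], [0.]])  # 1 = approved, 0 = rejected
loss = loss_fn(model(x), y)
loss.backward()

# Probability-based classification at inference time.
with torch.no_grad():
    probs = torch.sigmoid(model(x))   # convert logits to probabilities
    preds = (probs > 0.5).long()      # threshold at 0.5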

This project classifies SMS messages as spam or ham using a feedforward neural network in PyTorch with a bag-of-words representation. It includes train/validation/test splits, performance evaluation (accuracy, sensitivity, specificity, precision), and saving of the trained model and vectorizer for reuse at inference. A minimal sketch of the pipeline follows this entry.

  • Updated Sep 8, 2025
  • Python
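
As a rough sketch of the described pipeline (the toy data, layer sizes, and file names below are assumptions, not taken from the project), bag-of-words features can feed a small feedforward network, with the model and vectorizer saved together so inference reuses the same vocabulary:

import pickle
import torch
import torch.nn as nn
from sklearn.feature_extraction.text import CountVectorizer

texts = ["win a free prize now", "see you at lunch"]  # toy stand-in corpus
labels = torch.tensor([[1.], [0.]])                   # 1 = spam, 0 = ham

vectorizer = CountVectorizer()                        # bag-of-words counts
X = torch.tensor(vectorizer.fit_transform(texts).toarray(), dtype=torch.float32)

model = nn.Sequential(nn.Linear(X.shape[1], 16), nn.LeakyReLU(), nn.Linear(16, 1))
loss = nn.BCEWithLogitsLoss()(model(X), labels)
loss.backward()

# Persist both artifacts so inference reuses the exact training vocabulary.
torch.save(model.state_dict(), "spam_model.pt")
with open("vectorizer.pkl", "wb") as f:
    pickle.dump(vectorizer, f)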
