NumPy Neural Network Module

Dependencies: [NumPy](https://numpy.org/) · [SciPy](https://scipy.org/) · [Matplotlib](https://matplotlib.org/)

A neural network library built from scratch using only NumPy and SciPy. It provides convolutional, dense, and recurrent layers, with support for mixing them, along with a selection of commonly used activation and cost functions and popular optimization algorithms. I tried to make it easily extensible, allowing you to implement your own custom layers and optimizers.
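
To illustrate what "from scratch with NumPy" means here, the sketch below shows the core computation of a dense layer followed by a ReLU activation in plain NumPy. It is a conceptual illustration only, not the module's own API; all variable names are hypothetical.

```python
import numpy as np

# Conceptual sketch of a dense layer: a weighted sum of the inputs
# plus a bias, followed by a ReLU activation. Names are hypothetical.
rng = np.random.default_rng(0)

x = rng.standard_normal((4, 8))         # batch of 4 samples, 8 features each
W = rng.standard_normal((8, 16)) * 0.1  # weight matrix: 8 inputs -> 16 units
b = np.zeros(16)                        # bias vector

z = x @ W + b                           # affine transform (dense layer)
a = np.maximum(z, 0.0)                  # ReLU activation

print(a.shape)  # (4, 16)
```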

Features

  • Layer Types:
    • Convolutional Layers (also supports convolutions over inputs with four or more dimensions)
    • Dense (Fully Connected) Layers
    • Recurrent Layers (e.g., SimpleRNN, LSTM)
    • Utility (Pooling, Reshape, Flatten, Permutate) Layers (May raise errors)
  • Layer Mixing: Ability to combine different layer types to create complex architectures.
  • Activation Functions (a worked softmax and cross-entropy sketch appears after this list):
    • Sigmoid
    • ReLU
    • Tanh
    • Softmax
    • (and more...)
  • Cost Functions:
    • Mean Squared Error (MSE)
    • Binary Cross-Entropy
    • Categorical Cross-Entropy
    • (and more...)
  • Optimizers (an Adam update sketch appears after this list):
    • Adam
    • AdamW
    • AMSGrad
    • SGD
    • SGD with Nesterov Momentum
    • RMSprop
  • Extensibility:
    • Base classes for creating custom layer types.
    • Base classes for implementing custom optimization algorithms.
  • Dependencies: Relies only on NumPy, SciPy, and Numba; Matplotlib is needed only for running the test files.
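
As referenced in the activation list above, here is a minimal NumPy sketch of the softmax activation paired with the categorical cross-entropy cost. It implements the standard textbook formulas and does not call the module's own classes; the function names are hypothetical.

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # Mean negative log-likelihood of the true classes (y_true is one-hot).
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
targets = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])

probs = softmax(logits)
loss = categorical_cross_entropy(targets, probs)
print(probs.round(3), loss)
```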
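
The Adam entry in the optimizer list refers to the standard published update rule. The sketch below shows one Adam update step in plain NumPy on a toy problem; it is an illustration of the algorithm only, not the module's own optimizer class, and the helper name is hypothetical.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update step (standard formulation; hypothetical helper)."""
    m = beta1 * m + (1 - beta1) * grad      # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2   # second-moment (variance) estimate
    m_hat = m / (1 - beta1**t)              # bias-corrected moments
    v_hat = v / (1 - beta2**t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 201):
    grad = 2 * w
    w, m, v = adam_step(w, grad, m, v, t, lr=0.05)
print(w)  # components move toward zero
```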
