I built a Neural Network library in pure PHP with GPU support (OpenCL) - looking for feedback!
Posted by Artistic_Farmer8019@reddit | learnprogramming | 1 comment
I've been working on a Neural Network library written in PHP. It started as a learning project but has grown into a fairly feature-rich library that I wanted to share.
Key Features:
- GPU Acceleration: Supports OpenCL (via Rindow Math Matrix) for significant speedups (20x-50x on large matrices).
- Pure PHP Fallback: Works out-of-the-box without extensions if needed.
- Layers: Dense, Conv2D, Dropout, BatchNormalization, Flatten.
- Optimizers: Adam, AdamW, RMSProp, SGD (with momentum).
- Activations: ReLU, Sigmoid, Tanh, Softmax, LeakyReLU, ELU.
I'm looking for feedback on the code structure, especially regarding the backend detection logic and matrix operations. If anyone has experience optimizing math operations in PHP, I'd love to hear your thoughts.
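For anyone skimming before opening the repo, the backend-detection question is roughly this pattern: probe for the GPU-capable extension at startup and fall back to a pure-PHP implementation. A minimal sketch (the interface, class, and extension names here are hypothetical illustrations, not the library's actual API):

```php
<?php
// Hypothetical backend selector: illustrates the detection/fallback
// pattern, not the library's real classes.

interface MatrixBackend
{
    public function matmul(array $a, array $b): array;
}

final class PurePhpBackend implements MatrixBackend
{
    public function matmul(array $a, array $b): array
    {
        $n = count($a);       // rows of A
        $k = count($b);       // rows of B == cols of A
        $m = count($b[0]);    // cols of B
        $c = array_fill(0, $n, array_fill(0, $m, 0.0));
        // i-k-j loop order hoists $a[$i][$p] out of the inner loop and
        // walks $b and $c row-wise, which is noticeably faster in PHP
        // than the textbook i-j-k order.
        for ($i = 0; $i < $n; $i++) {
            for ($p = 0; $p < $k; $p++) {
                $aip = $a[$i][$p];
                for ($j = 0; $j < $m; $j++) {
                    $c[$i][$j] += $aip * $b[$p][$j];
                }
            }
        }
        return $c;
    }
}

function selectBackend(): MatrixBackend
{
    // extension_loaded() is standard PHP; the extension name below is
    // an assumption about the OpenCL-backed Rindow setup.
    if (extension_loaded('rindow_opencl')) {
        // return new OpenClBackend(); // hypothetical GPU-backed class
    }
    return new PurePhpBackend();
}
```

The practical upsides of this shape: the pure-PHP path is always available as a correctness oracle for the GPU path, and the detection lives in one function instead of being scattered through the layer code.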
Here is the code: https://github.com/GuilhermeBiancardi/Neural-Network-PHP
If you like it, give it a star!
Thanks!
Rain-And-Coffee@reddit
People generally don't care about my pet projects or yours.
Very rarely will someone sit and read an entire code base.
One way I found to get feedback is to teach people why my project is useful, e.g. by writing a blog post that introduces the project to someone unfamiliar with the domain.