394 convert misc nn classes to num power #395
base: 3.0
Conversation
Commits:
- …neural network layers.
- …hensive unit tests and fixed broken source file link in the documentation
- …d updated documentation with fixed source file link. Added `Parametric` interface to define parameterized layers.
- …ith `NumPower` utilities
- …ical stability during inference, and gradient computation logic
- …itional tests and updated shape handling
- …interface definition for output layers.
- …entation and unit tests
- …d/backward passes
- …ence/backward passes, unit tests, and documentation updates
- …ith `NumPower` utilities
- …rd/inference/backward passes, unit tests
- …ference/backward passes, unit tests
- …rward/inference/backward passes, unit tests
Pull request overview
This PR refactors various neural network classes to use the NumPower library. It introduces new implementations for neural network layers, networks, snapshots, and their corresponding test files. The changes also update documentation paths to reflect the new class structure organized into subdirectories.
Changes:
- Adds new neural network layer implementations (Dense, Activation, Binary, Continuous, Multiclass, Dropout, Noise, BatchNorm, PReLU, Swish, Placeholder1D) using NumPower
- Introduces Network and FeedForward classes with comprehensive test coverage
- Updates initializers to include an explicit `loc` parameter for NumPower compatibility
- Updates documentation paths to reflect the new directory structure
Reviewed changes
Copilot reviewed 51 out of 51 changed files in this pull request and generated 10 comments.
| File | Description |
|---|---|
| src/NeuralNet/Networks/Network.php | New base Network class implementation using NumPower |
| src/NeuralNet/FeedForwards/FeedForward.php | New FeedForward network extending Network |
| src/NeuralNet/Layers/*/*.php | New layer implementations (Dense, Activation, Binary, etc.) |
| src/NeuralNet/Snapshots/Snapshot.php | New snapshot class for network state management |
| tests/NeuralNet/*/*.php | Comprehensive test coverage for new implementations |
| src/NeuralNet/Initializers/*/*.php | Updated to include explicit loc parameter |
| docs/neural-network/hidden-layers/*.md | Documentation path updates |
| phpunit.xml | Memory limit increase for tests |
```php
use Stringable;

/**
 * Hidden
```
Copilot AI (Jan 28, 2026)
The docblock says "Hidden" but this is the Layer interface, not the Hidden interface. This should be corrected to say "Layer" instead of "Hidden".
Suggested change:

```diff
- * Hidden
+ * Layer
```
```diff
@@ -14,9 +16,8 @@
 #[CoversClass(GraphvizTest::class)]
```
Copilot AI (Jan 28, 2026)
The CoversClass attribute incorrectly references GraphvizTest::class instead of Graphviz::class. This should be changed to #[CoversClass(Graphviz::class)] to correctly indicate that this test covers the Graphviz class.
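Other comments in this review carry an explicit suggested change; an equivalent suggestion here would be:

```diff
-#[CoversClass(GraphvizTest::class)]
+#[CoversClass(Graphviz::class)]
```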
```php
/**
 * Return an array of hidden layers indexed left to right.
 *
 * @return list<Layers\Hidden>
```
Copilot AI (Jan 28, 2026)
The return type annotation uses Layers\Hidden which is incorrect. It should be list<Hidden> since the proper namespace is already imported with use Rubix\ML\NeuralNet\Layers\Base\Contracts\Hidden; at the top of the file.
Suggested change:

```diff
- * @return list<Layers\Hidden>
+ * @return list<Hidden>
```
```php
use NumPower;
use Rubix\ML\NeuralNet\Layers\Base\Contracts\Hidden;
use Rubix\ML\NeuralNet\Layers\Base\Contracts\Input;
use Rubix\ML\NeuralNet\Layers\Base\Contracts\Output;
```
Copilot AI (Jan 28, 2026)
The import use Rubix\ML\NeuralNet\Layers\Base\Contracts\Layer; is missing at the top of the file. This is needed because line 127 references Traversable<Layers\Layer> in a docblock. Either add the import and change the reference to Traversable<Layer>, or use the full namespace path in the docblock.
Suggested change:

```diff
 use Rubix\ML\NeuralNet\Layers\Base\Contracts\Output;
+use Rubix\ML\NeuralNet\Layers\Base\Contracts\Layer;
```
```diff
@@ -0,0 +1,192 @@
+<?php
+
+declare(strict_types = 1);
```
Copilot AI (Jan 28, 2026)
The spacing in the strict_types declaration is inconsistent with other test files in the codebase. The majority of test files use declare(strict_types=1); without spaces, but this file uses declare(strict_types = 1); with spaces. For consistency, it should match the predominant pattern in the codebase which is without spaces.
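As a concrete suggestion matching the predominant style the comment describes:

```diff
-declare(strict_types = 1);
+declare(strict_types=1);
```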
```php
}
$expected = NumPower::array([$expected]);

$batchSize = count($this->labels);
```
Copilot AI (Jan 28, 2026)
The variable $batchSize is calculated but never used (other than the erroneous call on line 176). This line should be removed.
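A minimal suggested change, assuming no other code in the test reads the variable:

```diff
-$batchSize = count($this->labels);
```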
```php
}

#[Test]
#[TestDox('Computes gradient directly given input, output, expected, and batch size')]
```
Copilot AI (Jan 28, 2026)
The TestDox attribute mentions "input, output, expected, and batch size" but the gradient method does not accept a batch size parameter. The description should be updated to remove the mention of "batch size" since it's not actually a parameter to the gradient method.
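One possible rewording (the exact phrasing is up to the author):

```diff
-#[TestDox('Computes gradient directly given input, output, expected, and batch size')]
+#[TestDox('Computes gradient directly given input, output, and expected')]
```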
```php
// Original formulation:
// derivative = (output / input) * (1 - output) + output
// Implemented using NumPower operations to avoid explicit ones matrix.
$term1 = NumPower::divide($output, $input);
```
Copilot AI (Jan 28, 2026)
The differentiate method performs division NumPower::divide($output, $input) which could cause division by zero if the input contains zero values. While NumPower might handle this, it would be safer to add a comment explaining the expected behavior or add explicit handling for this edge case.
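One way to guard the division, sketched here in plain scalar PHP rather than NumPower; the function names and the `EPSILON` constant are illustrative, not part of the PR:

```php
<?php

// Illustrative sketch of guarding the Swish derivative's division against
// zero inputs. Plain scalar PHP stands in for the NumPower element-wise ops.
const EPSILON = 1e-8;

function sigmoidScalar(float $x): float
{
    return 1.0 / (1.0 + exp(-$x));
}

function swishScalar(float $x): float
{
    return $x * sigmoidScalar($x);
}

function swishDerivative(float $input, float $output): float
{
    // Clamp the denominator away from zero so output / input cannot divide
    // by zero. Trade-off: at input == 0 this yields 0, while the analytic
    // derivative of Swish at 0 is 0.5, so a fuller fix would fall back to
    // the direct form sigmoid(x) + x * sigmoid(x) * (1 - sigmoid(x)).
    $safe = abs($input) > EPSILON ? $input : ($input >= 0.0 ? EPSILON : -EPSILON);

    return ($output / $safe) * (1.0 - $output) + $output;
}
```

For nonzero inputs the clamped form agrees with the analytic derivative; only the exact-zero case is approximated.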
```diff
@@ -0,0 +1,114 @@
+<?php
+
+declare(strict_types = 1);
```
Copilot AI (Jan 28, 2026)
The spacing in the strict_types declaration is inconsistent with other test files in the codebase. The majority of test files use declare(strict_types=1); without spaces, but this file uses declare(strict_types = 1); with spaces. For consistency, it should match the predominant pattern in the codebase which is without spaces.
|
|
```php
$batchSize = count($this->labels);

$gradient = $this->layer->gradient($input, $output, $expected, $batchSize);
```
Copilot AI (Jan 28, 2026)
The test calls the gradient method with 4 parameters ($input, $output, $expected, $batchSize), but the Binary::gradient method signature only accepts 3 parameters (NDArray $input, NDArray $output, NDArray $expected). The fourth parameter $batchSize should be removed from this test call since it's not used by the gradient method.
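A suggested change consistent with the three-parameter signature the comment describes:

```diff
-$gradient = $this->layer->gradient($input, $output, $expected, $batchSize);
+$gradient = $this->layer->gradient($input, $output, $expected);
```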
No description provided.