Commit 5255d64 (parent a67655f)

ML-390 Typo fixes

7 files changed: +11 −11 lines

docs/neural-network/optimizers/cyclical.md

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ $$
 \text{cycle} &= \left\lfloor 1 + \frac{t}{2\,\text{steps}} \right\rfloor \\
 x &= \left| \frac{t}{\text{steps}} - 2\,\text{cycle} + 1 \right| \\
 \text{scale} &= \text{decay}^{\,t} \\
-\eta_t &= \text{lower} + (\text{upper} - \text{lower})\,\max\bigl(0\,1 - x\bigr)\,\text{scale} \\
+\eta_t &= \text{lower} + (\text{upper} - \text{lower})\,\max\bigl(0,1 - x\bigr)\,\text{scale} \\
 \Delta\theta_t &= \eta_t\,g_t
 \end{aligned}
 $$
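The corrected schedule can be sketched as a plain function. This is a minimal illustration in Python (not the library's PHP implementation); the default values for `lower`, `upper`, `steps`, and `decay` are assumed example settings, not Rubix ML's defaults.

```python
import math

def cyclical_lr(t, lower=0.001, upper=0.006, steps=2000, decay=0.99994):
    """Triangular cyclical learning rate with exponential decay,
    following the corrected formula above."""
    cycle = math.floor(1 + t / (2 * steps))   # index of the current triangle wave
    x = abs(t / steps - 2 * cycle + 1)        # position within the triangle, in [0, 1]
    scale = decay ** t                        # exponential decay of the amplitude
    return lower + (upper - lower) * max(0, 1 - x) * scale
```

At the start and end of each cycle `x` reaches 1, so the rate collapses to `lower`; halfway through a cycle `x` reaches 0 and the rate approaches `upper`, shrunk by the decay term.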

docs/neural-network/optimizers/rms-prop.md

Lines changed: 5 additions & 5 deletions
@@ -15,11 +15,11 @@ $$
 $$
 
 where:
-- $g_t$ - is the current gradient,
-- $v_t$ - is the running average of squared gradients,
-- $\rho$ - is the averaging coefficient ($1 − decay$),
-- $\eta$ - is the learning rate ($rate$),
-- $\varepsilon$ - is a small constant to avoid division by zero (implemented by clipping $\sqrt{v_t}$ to $[ε, +∞)$).
+- $g_t$ is the current gradient,
+- $v_t$ is the running average of squared gradients,
+- $\rho$ is the averaging coefficient ($1 − decay$),
+- $\eta$ is the learning rate ($rate$),
+- $\varepsilon$ is a small constant to avoid division by zero (implemented by clipping $\sqrt{v_t}$ to $[ε, +∞)$).
 
 ## Parameters
 | # | Name | Default | Type | Description |
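The quantities listed in the doc above combine into a single update step. Here is a minimal sketch in Python/NumPy, for illustration only; the function name `rms_prop_step` and the default values are assumptions, not the library's API.

```python
import numpy as np

def rms_prop_step(theta, g, v, rate=0.001, decay=0.1, epsilon=1e-8):
    """One RMSProp update using the notation from the doc:
    rho = 1 - decay averages the squared gradients into v."""
    rho = 1.0 - decay
    v = rho * v + (1.0 - rho) * g ** 2                         # running average of g^2
    # clip sqrt(v) from below at epsilon to avoid division by zero
    theta = theta - rate * g / np.clip(np.sqrt(v), epsilon, None)
    return theta, v
```

Starting from `v = 0`, the first step divides the gradient by `sqrt((1 - rho) * g^2)`, so early updates are relatively large and shrink as the running average warms up.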

src/NeuralNet/Optimizers/Cyclical.php

Lines changed: 1 addition & 1 deletion
@@ -86,7 +86,7 @@ public function __construct(
 
 if ($lower > $upper) {
     throw new InvalidArgumentException('Lower bound cannot be'
-        . ' reater than the upper bound.');
+        . ' greater than the upper bound.');
 }
 
 if ($losses < 1) {

src/NeuralNet/Optimizers/Cyclical/Cyclical.php

Lines changed: 1 addition & 1 deletion
@@ -90,7 +90,7 @@ public function __construct(
 
 if ($lower > $upper) {
     throw new InvalidArgumentException(
-        'Lower bound cannot be reater than the upper bound.'
+        'Lower bound cannot be greater than the upper bound.'
     );
 }

tests/NeuralNet/Optimizers/AdaGrad/AdaGradTest.php

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 <?php
 
-declare(strict_types=1);
+declare(strict_types = 1);
 
 namespace Rubix\ML\Tests\NeuralNet\Optimizers\AdaGrad;
 

tests/NeuralNet/Optimizers/Adam/AdamTest.php

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 <?php
 
-declare(strict_types=1);
+declare(strict_types = 1);
 
 namespace Rubix\ML\Tests\NeuralNet\Optimizers\Adam;
 

tests/NeuralNet/Optimizers/Momentum/MomentumTest.php

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 <?php
 
-declare(strict_types=1);
+declare(strict_types = 1);
 
 namespace Rubix\ML\Tests\NeuralNet\Optimizers\Momentum;
 

0 commit comments