Passed · Pull Request — master (#156) · created by Tomáš at 07:08

MP::mul()   A

Complexity
  Conditions: 2
  Paths: 2

Size
  Total Lines: 9
  Code Lines: 5

Duplication
  Lines: 0
  Ratio: 0 %

Importance
  Changes: 0

Metric  Value
dl      0
loc     9
rs      9.6666
c       0
b       0
f       0
cc      2
eloc    5
nc      2
nop     2
<?php

declare(strict_types=1);

namespace Phpml\Helper\Optimizer;

use Closure;

/**
 * Conjugate Gradient method to solve a non-linear f(x) with respect to an unknown x.
 * See https://en.wikipedia.org/wiki/Nonlinear_conjugate_gradient_method
 *
 * The method applied below is explained in a practical manner in
 *  - http://web.cs.iastate.edu/~cs577/handouts/conjugate-gradient.pdf
 *
 * However, it is compliant with the general Conjugate Gradient method with the
 * Fletcher-Reeves update method. Note that f(x) is assumed to be one-dimensional
 * and one gradient is utilized for all dimensions in the given data.
 */
class ConjugateGradient extends GD
{
    public function runOptimization(array $samples, array $targets, Closure $gradientCb): array
    {
        $this->samples = $samples;
        $this->targets = $targets;
        $this->gradientCb = $gradientCb;
        $this->sampleCount = count($samples);
        $this->costValues = [];

        $d = MP::muls($this->gradient($this->theta), -1);

        for ($i = 0; $i < $this->maxIterations; ++$i) {
            // Obtain α that minimizes f(θ + α.d)
            $alpha = $this->getAlpha(array_sum($d));

            // θ(k+1) = θ(k) + α.d
            $thetaNew = $this->getNewTheta($alpha, $d);

            // β = ||∇f(x(k+1))||²  ∕  ||∇f(x(k))||²
0 ignored issues — Unused Code / Comprehensibility: 44% of this comment could be valid code. Did you maybe forget this after debugging?

Sometimes obsolete code just ends up commented out instead of removed. In this case it is better to remove the code once you have checked you do not need it. The code might also have been commented out for debugging purposes; in that case it is vital that someone uncomments it again, or your project may behave in very unexpected ways in production. This check looks for comments that seem to be mostly valid code and reports them.
            $beta = $this->getBeta($thetaNew);

            // d(k+1) = -∇f(x(k+1)) + β(k).d(k)
0 ignored issues — Unused Code / Comprehensibility: 40% of this comment could be valid code. Did you maybe forget this after debugging? (Same check as above.)
            $d = $this->getNewDirection($thetaNew, $beta, $d);

            // Save values for the next iteration
            $oldTheta = $this->theta;
            $this->costValues[] = $this->cost($thetaNew);

            $this->theta = $thetaNew;
            if ($this->enableEarlyStop && $this->earlyStop($oldTheta)) {
                break;
            }
        }

        $this->clear();

        return $this->theta;
    }

    /**
     * Executes the callback function for the problem and returns the
     * sum of the gradient for all samples & targets.
     */
    protected function gradient(array $theta): array
    {
        [, $gradient] = parent::gradient($theta);
0 ignored issues — Bug: The variable $gradient does not exist. Did you forget to declare it?

This check marks access to variables or properties that have not been declared yet. While PHP has no explicit notion of declaring a variable, accessing it before a value is assigned to it is most likely a bug.

        return $gradient;
    }

    /**
     * Returns the value of f(x) for the given solution.
     */
    protected function cost(array $theta): float
    {
        [$cost] = parent::gradient($theta);
0 ignored issues — Bug: The variable $cost does not exist. Did you forget to declare it? (Same check as above.)

Comprehensibility Bug: It seems like you call parent on a different method (gradient() instead of cost()). Are you sure this is correct? If so, you might want to change this to $this->gradient(). This check looks for a call to a parent method whose name is different than the method from which it is called. Consider the following code:

class Daddy
{
    protected function getFirstName()
    {
        return "Eidur";
    }

    protected function getSurName()
    {
        return "Gudjohnsen";
    }
}

class Son extends Daddy
{
    public function getFirstName()
    {
        return parent::getSurName();
    }
}

The getFirstName() method in Son calls the wrong method in the parent class.

        return array_sum($cost) / $this->sampleCount;
    }

    /**
     * Calculates the alpha that minimizes the function f(θ + α.d)
     * by performing a line search that does not rely upon the derivative.
     *
     * There are several alternatives for this function. For now, we
     * prefer a method inspired by the bisection method for its simplicity.
     * This algorithm attempts to find an optimum alpha value between 0.0001 and 0.01.
     *
     * The algorithm is as follows:
     *  a) Probe a small alpha  (0.0001) and calculate the cost function
     *  b) Probe a larger alpha (0.01) and calculate the cost function
     *     b-1) If the cost function decreases, continue enlarging alpha
     *     b-2) If the cost function increases, take the midpoint and try again
     */
    protected function getAlpha(float $d): float
    {
        $small = 0.0001 * $d;
        $large = 0.01 * $d;

        // Obtain θ + α.d for two initial values, x0 and x1
        $x0 = MP::adds($this->theta, $small);
        $x1 = MP::adds($this->theta, $large);

        $epsilon = 0.0001;
        $iteration = 0;
        do {
            $fx1 = $this->cost($x1);
            $fx0 = $this->cost($x0);

            // If the difference between the two values is small enough,
            // then break the loop
            if (abs($fx1 - $fx0) <= $epsilon) {
                break;
            }

            if ($fx1 < $fx0) {
                $x0 = $x1;
                $x1 = MP::adds($x1, 0.01); // Enlarge second
            } else {
                $x1 = MP::divs(MP::add($x1, $x0), 2.0); // Get to the midpoint
            }

            $error = $fx1 / $this->dimensions;
        } while ($error <= $epsilon || $iteration++ < 10);

        // Return α = θ / d
        if ($d == 0) {
            return $x1[0] - $this->theta[0];
        }

        return ($x1[0] - $this->theta[0]) / $d;
    }

    /**
     * Calculates a new set of solutions with the given alpha (for each θ(k)) and
     * gradient direction.
     *
     * θ(k+1) = θ(k) + α.d
     */
    protected function getNewTheta(float $alpha, array $d): array
    {
        $theta = $this->theta;

        for ($i = 0; $i < $this->dimensions + 1; ++$i) {
            if ($i === 0) {
                $theta[$i] += $alpha * array_sum($d);
            } else {
                $sum = 0.0;
                foreach ($this->samples as $si => $sample) {
                    $sum += $sample[$i - 1] * $d[$si] * $alpha;
                }

                $theta[$i] += $sum;
            }
        }

        return $theta;
    }

    /**
     * Calculates the new beta (β) for the given set of solutions by using
     * the Fletcher–Reeves method.
     *
     * β = ||∇f(x(k+1))||²  ∕  ||∇f(x(k))||²
     *
     * See:
     *  R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients", Comput. J. 7 (1964), 149–154.
     */
    protected function getBeta(array $newTheta): float
    {
        $dNew = array_sum($this->gradient($newTheta));
        $dOld = array_sum($this->gradient($this->theta)) + 1e-100;

        return $dNew ** 2 / $dOld ** 2;
    }

    /**
     * Calculates the new conjugate direction
     *
     * d(k+1) = -∇f(x(k+1)) + β(k).d(k)
     */
    protected function getNewDirection(array $theta, float $beta, array $d): array
    {
        $grad = $this->gradient($theta);

        return MP::add(MP::muls($grad, -1), MP::muls($d, $beta));
    }
}
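As a standalone illustration of the update rules this class implements (α from a line search, β from Fletcher–Reeves, d(k+1) = -∇f + β·d), here is a minimal sketch on a 1-D quadratic. It is not part of the library: the exact line search below assumes f'' = 2 and stands in for the probing search in getAlpha().

```php
<?php

// Sketch only: Fletcher–Reeves conjugate gradient on f(x) = (x - 3)^2.
$grad = fn (float $x): float => 2.0 * ($x - 3.0); // ∇f

$x = 0.0;
$g = $grad($x);
$d = -$g; // initial direction: steepest descent

for ($i = 0; $i < 10 && abs($g) > 1e-9; ++$i) {
    // Exact line search for this quadratic (f'' = 2): minimize f(x + α·d)
    $alpha = -($g * $d) / (2.0 * $d * $d);
    $x += $alpha * $d;

    $gNew = $grad($x);
    $beta = ($gNew ** 2) / ($g ** 2 + 1e-100); // β = ||∇f(k+1)||² / ||∇f(k)||²
    $d = -$gNew + $beta * $d;                  // d(k+1) = -∇f(k+1) + β·d(k)
    $g = $gNew;
}

echo $x, "\n"; // converges to the minimizer x = 3
```

With an exact line search on a quadratic, the method converges in a single step here; the class above reaches the same fixed point iteratively through its approximate search.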

/**
 * Handles element-wise vector operations between vector-vector
 * and vector-scalar variables
 */
class MP
0 ignored issues — Coding Style Compatibility: PSR1 recommends that each class should be in its own file to aid autoloaders. Having each class in a dedicated file usually plays nice with PSR autoloaders and is therefore a well-established practice. If you use other autoloaders, you might not want to follow this rule.
194
{
195
    /**
196
     * Element-wise <b>multiplication</b> of two vectors of the same size
197
     */
198
    public static function mul(array $m1, array $m2): array
199
    {
200
        $res = [];
201
        foreach ($m1 as $i => $val) {
202
            $res[] = $val * $m2[$i];
203
        }
204
205
        return $res;
206
    }
207
208
    /**
209
     * Element-wise <b>division</b> of two vectors of the same size
210
     */
211
    public static function div(array $m1, array $m2): array
212
    {
213
        $res = [];
214
        foreach ($m1 as $i => $val) {
215
            $res[] = $val / $m2[$i];
216
        }
217
218
        return $res;
219
    }
220
221
    /**
222
     * Element-wise <b>addition</b> of two vectors of the same size
223
     */
224
    public static function add(array $m1, array $m2, int $mag = 1): array
225
    {
226
        $res = [];
227
        foreach ($m1 as $i => $val) {
228
            $res[] = $val + $mag * $m2[$i];
229
        }
230
231
        return $res;
232
    }
233
234
    /**
235
     * Element-wise <b>subtraction</b> of two vectors of the same size
236
     */
237
    public static function sub(array $m1, array $m2): array
238
    {
239
        return self::add($m1, $m2, -1);
240
    }
241
242
    /**
243
     * Element-wise <b>multiplication</b> of a vector with a scalar
244
     */
245
    public static function muls(array $m1, float $m2): array
246
    {
247
        $res = [];
248
        foreach ($m1 as $val) {
249
            $res[] = $val * $m2;
250
        }
251
252
        return $res;
253
    }
254
255
    /**
256
     * Element-wise <b>division</b> of a vector with a scalar
257
     */
258
    public static function divs(array $m1, float $m2): array
259
    {
260
        $res = [];
261
        foreach ($m1 as $val) {
262
            $res[] = $val / ($m2 + 1e-32);
263
        }
264
265
        return $res;
266
    }
267
268
    /**
269
     * Element-wise <b>addition</b> of a vector with a scalar
270
     */
271
    public static function adds(array $m1, float $m2, int $mag = 1): array
272
    {
273
        $res = [];
274
        foreach ($m1 as $val) {
275
            $res[] = $val + $mag * $m2;
276
        }
277
278
        return $res;
279
    }
280
281
    /**
282
     * Element-wise <b>subtraction</b> of a vector with a scalar
283
     */
284
    public static function subs(array $m1, float $m2): array
285
    {
286
        return self::adds($m1, $m2, -1);
287
    }
288
}
289
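The bisection-inspired probing search described in the getAlpha() docblock can also be demonstrated in isolation. This is a sketch under the same assumptions as above (a known 1-D cost f(x) = (x - 3)^2), not the library's API: probe a small and a large step, keep enlarging while the cost drops, and fall back to the midpoint after overshooting.

```php
<?php

// Sketch only: probing line search on f(x) = (x - 3)^2.
$f = fn (float $x): float => ($x - 3.0) ** 2;

$x0 = 0.0001; // small probe
$x1 = 0.01;   // large probe
$epsilon = 0.0001;

for ($i = 0; $i < 1000; ++$i) {
    $fx0 = $f($x0);
    $fx1 = $f($x1);

    // Stop once the two probed costs are close enough
    if (abs($fx1 - $fx0) <= $epsilon) {
        break;
    }

    if ($fx1 < $fx0) {
        $x0 = $x1;
        $x1 += 0.01;             // cost still decreasing: enlarge
    } else {
        $x1 = ($x1 + $x0) / 2.0; // overshot: retreat to the midpoint
    }
}

echo $x1, "\n"; // ≈ 3, the minimizer
```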
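The element-wise semantics of the MP helpers can also be checked in isolation with array_map. This sketch is illustrative only and does not use the class above; it restates what MP::mul, MP::adds, and MP::sub compute with explicit loops.

```php
<?php

// Sketch only: array_map equivalents of the MP helper semantics.
$m1 = [1.0, 2.0, 3.0];
$m2 = [4.0, 5.0, 6.0];

$mul  = array_map(fn ($a, $b) => $a * $b, $m1, $m2); // like MP::mul  → [4.0, 10.0, 18.0]
$adds = array_map(fn ($a) => $a + 0.5, $m1);         // like MP::adds → [1.5, 2.5, 3.5]
$sub  = array_map(fn ($a, $b) => $a - $b, $m1, $m2); // like MP::sub  → [-3.0, -3.0, -3.0]
```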