| Metric | Value |
| --- | --- |
| Conditions | 12 |
| Total Lines | 64 |
| Lines | 0 |
| Ratio | 0 % |
| Changes | 2 |
| Bugs | 0 |
| Features | 0 |
Small methods make your code easier to understand, especially when combined with a good name. Moreover, if a method is small, finding a good name for it is usually much easier.
For example, if you find yourself adding comments to a method's body, that is usually a sign that the commented part should be extracted into a new method, with the comment serving as a starting point for the new method's name.
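As a minimal sketch of that idea (the `SalesReport` class and its fields are illustrative, not taken from the project), the comment that used to sit above a loop becomes the name of the extracted method:

```python
class SalesReport:
    """Illustrative only: Extract Method driven by an explanatory comment."""

    def __init__(self, orders):
        self.orders = orders

    # Before extraction, total() contained the commented block inline:
    #
    #     # sum up the order amounts
    #     total = 0
    #     for order in self.orders:
    #         total += order.amount
    #
    # After Extract Method, the comment has become the method name:
    def total(self):
        return self.sum_order_amounts()

    def sum_order_amounts(self):
        return sum(order.amount for order in self.orders)
```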
Commonly applied refactorings include:

- Extract Method (as described above)

If many parameters/temporary variables are present:

- Replace Temp with Query (sketched below)
- Introduce Parameter Object or Preserve Whole Object
- Replace Method with Method Object
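To illustrate the first of these, here is a minimal Replace Temp with Query sketch; the `Order` class is a generic example and has nothing to do with the optimizer code below:

```python
class Order:
    """Illustrative only: the base_price temporary becomes a query method."""

    def __init__(self, quantity, item_price):
        self.quantity = quantity
        self.item_price = item_price

    def base_price(self):
        # Formerly a temporary variable inside price(); now a named query.
        return self.quantity * self.item_price

    def price(self):
        discount = 0.05 if self.base_price() > 1000 else 0.0
        return self.base_price() * (1 - discount)
```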
Complex methods like GreyWolfOptimizer.move() are usually a sign that the surrounding class does a lot of different things. To break such a class down, we need to identify a cohesive component within it. A common way to find such a component is to look for fields and methods that share the same prefixes or suffixes.
Once you have identified the fields that belong together, you can apply the Extract Class refactoring. If the component makes sense as a subclass, Extract Subclass is also a candidate and is often faster to apply.
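Here is a minimal Extract Class sketch for the `Alpha_`/`Beta_`/`Delta_` field groups visible in the listing below; the `Leader` class, its names, and its simplified update rule are hypothetical and not part of the original code:

```python
class Leader:
    """Hypothetical extracted class: one leader's best score and position."""

    def __init__(self):
        self.score = float("inf")
        self.pos = None

    def update(self, fit, position):
        # Simplified rule: remember the position whenever it improves the score.
        # The original alpha/beta/delta ordering logic would stay in the optimizer.
        if fit < self.score:
            self.score = fit
            self.pos = position
```

The optimizer would then hold `self.alpha`, `self.beta` and `self.delta` as `Leader` instances instead of six parallel `*_score`/`*_pos` fields.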
```python
"""Grey wolf optimizer."""

# ... (file lines 2-61 are omitted in the report) ...

def move(self):
    self.init()

    while True:
        if self.evaluations == self.nFES:
            break

        # Evaluate every wolf and update the alpha, beta and delta leaders.
        for i in range(self.NP):
            self.Positions[i] = self.bounds(self.Positions[i])

            Fit = self.Fun(self.D, self.Positions[i])
            self.evaluations = self.evaluations + 1

            if Fit < self.Alpha_score:
                self.Alpha_score = Fit
                self.Alpha_pos = self.Positions[i]

            if (Fit > self.Alpha_score) and (Fit < self.Beta_score):
                self.Beta_score = Fit
                self.Beta_pos = self.Positions[i]

            if ((Fit > self.Alpha_score) and (Fit > self.Beta_score)
                    and (Fit < self.Delta_score)):
                self.Delta_score = Fit
                self.Delta_pos = self.Positions[i]

        # Coefficient a decreases linearly from 2 to 0 over the run.
        a = 2 - self.evaluations * (2 / self.nFES)

        # Move every wolf toward the three leaders, one dimension at a time.
        for i in range(self.NP):
            for j in range(self.D):
                r1 = random.random()
                r2 = random.random()

                A1 = 2 * a * r1 - a
                C1 = 2 * r2

                D_alpha = abs(C1 * self.Alpha_pos[j] - self.Positions[i][j])
                X1 = self.Alpha_pos[j] - A1 * D_alpha

                r1 = random.random()
                r2 = random.random()

                A2 = 2 * a * r1 - a
                C2 = 2 * r2

                D_beta = abs(C2 * self.Beta_pos[j] - self.Positions[i][j])
                X2 = self.Beta_pos[j] - A2 * D_beta

                r1 = random.random()
                r2 = random.random()

                A3 = 2 * a * r1 - a
                C3 = 2 * r2

                D_delta = abs(C3 * self.Delta_pos[j] - self.Positions[i][j])
                X3 = self.Delta_pos[j] - A3 * D_delta

                self.Positions[i][j] = (X1 + X2 + X3) / 3

    return self.Alpha_score
```
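To make the Extract Method advice concrete, here is one possible shape for move() after splitting out the two cohesive blocks. This is a hedged sketch, not the project's actual refactoring: the helper names `_update_leaders` and `_update_positions` are hypothetical, the leader update is condensed into an elif chain, and the loop condition replaces the explicit break.

```python
import random  # module-level import, as the listing's calls to random.random() imply

# Methods of GreyWolfOptimizer, shown unindented as in the listing above.

def move(self):
    self.init()
    while self.evaluations < self.nFES:
        self._update_leaders()
        # Coefficient a decreases linearly from 2 to 0 over the run.
        a = 2 - self.evaluations * (2 / self.nFES)
        self._update_positions(a)
    return self.Alpha_score


def _update_leaders(self):
    """Evaluate every wolf and refresh the alpha, beta and delta leaders."""
    for i in range(self.NP):
        self.Positions[i] = self.bounds(self.Positions[i])
        fit = self.Fun(self.D, self.Positions[i])
        self.evaluations += 1
        if fit < self.Alpha_score:
            self.Alpha_score, self.Alpha_pos = fit, self.Positions[i]
        elif fit < self.Beta_score:
            self.Beta_score, self.Beta_pos = fit, self.Positions[i]
        elif fit < self.Delta_score:
            self.Delta_score, self.Delta_pos = fit, self.Positions[i]


def _update_positions(self, a):
    """Move every wolf toward the three leaders, one dimension at a time."""
    for i in range(self.NP):
        for j in range(self.D):
            estimates = []
            for leader in (self.Alpha_pos, self.Beta_pos, self.Delta_pos):
                r1, r2 = random.random(), random.random()
                A, C = 2 * a * r1 - a, 2 * r2
                dist = abs(C * leader[j] - self.Positions[i][j])
                estimates.append(leader[j] - A * dist)
            self.Positions[i][j] = sum(estimates) / 3
```

With the two helpers in place, move() reads as a summary of the algorithm, which is exactly what the Extract Method advice above aims for.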