Passed
Pull Request — master (#233), by Grega, created 01:17

ArtificialBeeColonyAlgorithm.runIteration()   D

Complexity

Conditions 12

Size

Total Lines 54
Code Lines 32

Duplication

Lines 0
Ratio 0 %

Importance

Changes 0
Metric  Value
cc      12
eloc    32
nop     9
dl      0
loc     54
rs      4.8
c       0
b       0
f       0

How to fix: Long Method, Complexity, Many Parameters

Long Method

Small methods make your code easier to understand, in particular if combined with a good name. Besides, if your method is small, finding a good name is usually much easier.

For example, if you find yourself adding comments to a method's body, this is usually a good sign to extract the commented part to a new method, and use the comment as a starting point when coming up with a good name for this new method.

Commonly applied refactorings include:

* Extract Method
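
As a sketch only, the scout phase at the end of runIteration() (the argmax(Trial) block in the listing below) could be pulled into a helper; the name _scoutBeePhase is hypothetical and not part of NiaPy, and the sketch reuses argmax and SolutionABC from this module:

	def _scoutBeePhase(self, task, Foods, Trial, xb, fxb):
		r"""Replace the most exhausted food source once its trial counter reaches Limit (hypothetical Extract Method helper)."""
		mi = argmax(Trial)
		if Trial[mi] >= self.Limit:
			# Abandon the food source and let a scout generate a random replacement
			Foods[mi], Trial[mi] = SolutionABC(task=task, rnd=self.Rand), 0
			if Foods[mi].f < fxb: xb, fxb = Foods[mi].x.copy(), Foods[mi].f
		return Foods, Trial, xb, fxb

runIteration() would then end with Foods, Trial, xb, fxb = self._scoutBeePhase(task, Foods, Trial, xb, fxb), and the phase description becomes the method name.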

Complexity

Complex methods like NiaPy.algorithms.basic.abc.ArtificialBeeColonyAlgorithm.runIteration() often do a lot of different things. To break such a method and its class down, we need to identify a cohesive component within the class. A common approach to finding such a component is to look for fields/methods that share the same prefixes or suffixes.

Once you have determined the fields that belong together, you can apply the Extract Class refactoring. If the component makes sense as a sub-class, Extract Subclass is also a candidate, and is often faster.
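
In runIteration() the values that travel together are the food-source bookkeeping: Foods, Probs, Trial, FoodNumber and Limit. A minimal Extract Class sketch, assuming a hypothetical FoodSources class that is not part of NiaPy:

from numpy import argmax

class FoodSources:
	r"""Hypothetical component bundling the ABC food-source state."""
	def __init__(self, foods, probs, trial, limit):
		self.foods, self.probs, self.trial, self.limit = foods, probs, trial, limit

	def register_trial(self, i, improved):
		# Reset the counter when the food source improved, otherwise count a failed trial
		self.trial[i] = 0 if improved else self.trial[i] + 1

	def exhausted_index(self):
		# Index of the food source whose trial counter reached the limit, or None
		mi = argmax(self.trial)
		return mi if self.trial[mi] >= self.limit else None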

Many Parameters

Methods with many parameters are not only hard to understand, but their parameters also often become inconsistent when you need more or different data.

There are several approaches to avoid long parameter lists:

* Introduce Parameter Object
* Preserve Whole Object
* Replace Parameter with Method Call
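
For example, the per-iteration state that runIteration() receives as separate Probs and Trial arguments could be passed as one object; the ABCState name below is hypothetical:

from dataclasses import dataclass
from numpy import ndarray

@dataclass
class ABCState:
	r"""Hypothetical parameter object for the ABC per-iteration state."""
	Probs: ndarray  # selection probabilities of the food sources
	Trial: ndarray  # counters of unsuccessful improvement trials

# runIteration could then shrink to:
# def runIteration(self, task, Foods, fpop, xb, fxb, state, **dparams): ...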

# encoding=utf8
# pylint: disable=mixed-indentation, line-too-long, multiple-statements, attribute-defined-outside-init, logging-not-lazy, arguments-differ, bad-continuation
import copy
import logging

from numpy import asarray, full, argmax

from NiaPy.algorithms.algorithm import Algorithm, Individual, defaultIndividualInit

logging.basicConfig()
logger = logging.getLogger('NiaPy.algorithms.basic')
logger.setLevel('INFO')

__all__ = ['ArtificialBeeColonyAlgorithm']

class SolutionABC(Individual):
	r"""Representation of solution for Artificial Bee Colony Algorithm.

	Date:
		2018

	Author:
		Klemen Berkovič

	See Also:
		* :class:`NiaPy.algorithms.Individual`
	"""
	def __init__(self, **kargs):
		r"""Initialize individual.

		Args:
			kargs (Dict[str, Any]): Additional arguments.

		See Also:
			* :func:`NiaPy.algorithms.Individual.__init__`
		"""
		Individual.__init__(self, **kargs)

class ArtificialBeeColonyAlgorithm(Algorithm):
	r"""Implementation of Artificial Bee Colony algorithm.

	Algorithm:
		Artificial Bee Colony algorithm

	Date:
		2018

	Author:
		Uros Mlakar and Klemen Berkovič

	License:
		MIT

	Reference paper:
		Karaboga, D., and Bahriye B. "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm." Journal of global optimization 39.3 (2007): 459-471.

	Attributes:
		Name (List[str]): List containing strings that represent algorithm names
		Limit (Union[float, numpy.ndarray[float]]): Maximum number of unsuccessful trials before a food source is abandoned

	See Also:
		* :class:`NiaPy.algorithms.Algorithm`
	"""
	Name = ['ArtificialBeeColonyAlgorithm', 'ABC']

	@staticmethod
	def typeParameters():
		r"""Return functions for checking values of parameters.

		Returns:
			Dict[str, Callable]:
				* Limit (Callable[Union[float, numpy.ndarray[float]]]): Checks that the limit is a positive integer

		See Also:
			* :func:`NiaPy.algorithms.Algorithm.typeParameters`
		"""
		d = Algorithm.typeParameters()
		d.update({'Limit': lambda x: isinstance(x, int) and x > 0})
		return d

	def setParameters(self, NP=10, Limit=100, **ukwargs):
		r"""Set the parameters of Artificial Bee Colony Algorithm.

		Parameters:
			NP (Optional[int]): Population size
			Limit (Optional[Union[float, numpy.ndarray[float]]]): Maximum number of unsuccessful trials before a food source is abandoned
			**ukwargs (Dict[str, Any]): Additional arguments

		See Also:
			* :func:`NiaPy.algorithms.Algorithm.setParameters`
		"""
		Algorithm.setParameters(self, NP=NP, InitPopFunc=defaultIndividualInit, itype=SolutionABC, **ukwargs)
		self.FoodNumber, self.Limit = int(self.NP / 2), Limit

	def CalculateProbs(self, Foods, Probs):
		r"""Calculate the selection probabilities of the food sources.

		Parameters:
			Foods (numpy.ndarray): Current food sources (population)
			Probs (numpy.ndarray): Previous selection probabilities (overwritten)

		Returns:
			numpy.ndarray: Selection probabilities proportional to 1 / (f + 0.01), normalized to sum to 1
		"""
		Probs = [1.0 / (Foods[i].f + 0.01) for i in range(self.FoodNumber)]
		s = sum(Probs)
		Probs = [Probs[i] / s for i in range(self.FoodNumber)]
		return Probs

	def initPopulation(self, task):
		r"""Initialize the starting population.

		Parameters:
			task (Task): Optimization task

		Returns:
			Tuple[numpy.ndarray, numpy.ndarray[float], Dict[str, Any]]:
				1. New population
				2. New population fitness/function values
				3. Additional arguments:
					* Probs (numpy.ndarray): Selection probabilities of the food sources
					* Trial (numpy.ndarray): Counters of unsuccessful improvement trials

		See Also:
			* :func:`NiaPy.algorithms.Algorithm.initPopulation`
		"""
		Foods, fpop, _ = Algorithm.initPopulation(self, task)
		Probs, Trial = full(self.FoodNumber, 0.0), full(self.FoodNumber, 0.0)
		return Foods, fpop, {'Probs': Probs, 'Trial': Trial}

	def runIteration(self, task, Foods, fpop, xb, fxb, Probs, Trial, **dparams):
		r"""Core function of the algorithm.

		Parameters:
			task (Task): Optimization task
			Foods (numpy.ndarray): Current population
			fpop (numpy.ndarray[float]): Function/fitness values of current population
			xb (numpy.ndarray): Current best individual
			fxb (float): Current best individual fitness/function value
			Probs (numpy.ndarray): Selection probabilities of the food sources
			Trial (numpy.ndarray): Counters of unsuccessful improvement trials
			dparams (Dict[str, Any]): Additional parameters

		Returns:
			Tuple[numpy.ndarray, numpy.ndarray, numpy.ndarray, float, Dict[str, Any]]:
				1. New population
				2. New population fitness/function values
				3. New global best solution
				4. New global best fitness/objective value
				5. Additional arguments:
					* Probs (numpy.ndarray): Selection probabilities of the food sources
					* Trial (numpy.ndarray): Counters of unsuccessful improvement trials
		"""
		# Employed bees phase: perturb each food source along one random dimension
		for i in range(self.FoodNumber):
			newSolution = copy.deepcopy(Foods[i])
			param2change = int(self.rand() * task.D)
			neighbor = int(self.FoodNumber * self.rand())
			newSolution.x[param2change] = Foods[i].x[param2change] + (-1 + 2 * self.rand()) * (Foods[i].x[param2change] - Foods[neighbor].x[param2change])
			newSolution.evaluate(task, rnd=self.Rand)
			if newSolution.f < Foods[i].f:
				Foods[i], Trial[i] = newSolution, 0
				if newSolution.f < fxb: xb, fxb = newSolution.x.copy(), newSolution.f
			else: Trial[i] += 1
		# Onlooker bees phase: revisit food sources proportionally to their selection probabilities
		Probs, t, s = self.CalculateProbs(Foods, Probs), 0, 0
		while t < self.FoodNumber:
			if self.rand() < Probs[s]:
				t += 1
				Solution = copy.deepcopy(Foods[s])
				param2change = int(self.rand() * task.D)
				neighbor = int(self.FoodNumber * self.rand())
				while neighbor == s: neighbor = int(self.FoodNumber * self.rand())
				Solution.x[param2change] = Foods[s].x[param2change] + (-1 + 2 * self.rand()) * (Foods[s].x[param2change] - Foods[neighbor].x[param2change])
				Solution.evaluate(task, rnd=self.Rand)
				if Solution.f < Foods[s].f:
					Foods[s], Trial[s] = Solution, 0
					if Solution.f < fxb: xb, fxb = Solution.x.copy(), Solution.f
				else: Trial[s] += 1
			s += 1
			if s == self.FoodNumber: s = 0
		# Scout bee phase: abandon the most exhausted food source once its trial counter reaches Limit
		mi = argmax(Trial)
		if Trial[mi] >= self.Limit:
			Foods[mi], Trial[mi] = SolutionABC(task=task, rnd=self.Rand), 0
			if Foods[mi].f < fxb: xb, fxb = Foods[mi].x.copy(), Foods[mi].f
		return Foods, asarray([f.f for f in Foods]), xb, fxb, {'Probs': Probs, 'Trial': Trial}

# vim: tabstop=3 noexpandtab shiftwidth=3 softtabstop=3
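
For orientation, a minimal usage sketch; it assumes the StoppingTask and Sphere helpers of the NiaPy release this module ships with, and their import paths may differ between versions:

from NiaPy.algorithms.basic import ArtificialBeeColonyAlgorithm
from NiaPy.task import StoppingTask  # assumption: location of StoppingTask in this NiaPy version
from NiaPy.benchmarks import Sphere

# Minimize a 10-dimensional Sphere function within 10000 function evaluations
task = StoppingTask(D=10, nFES=10000, benchmark=Sphere())
algo = ArtificialBeeColonyAlgorithm(NP=20, Limit=100)
best_x, best_fitness = algo.run(task)
print(best_fitness)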