Passed
Push — master ( 588022...8a2a5a )
by Simon
01:36
created

hyperactive.opt.gfo._pattern_search   A

Complexity

Total Complexity 3

Size/Duplication

Total Lines 148
Duplicated Lines 97.3 %

Importance

Changes 0
Metric Value
wmc 3
eloc 52
dl 144
loc 148
rs 10
c 0
b 0
f 0

3 Methods

Rating   Name   Duplication   Size   Complexity  
A PatternSearch._get_gfo_class() 11 11 1
A PatternSearch.__init__() 27 27 1
A PatternSearch.get_test_params() 26 26 1

Duplicated Code

Duplicate code is one of the most pungent code smells. A commonly used rule of thumb is to restructure code once it is duplicated in three or more places.

Common duplication problems and their corresponding solutions are:
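For instance, one common fix when near-identical adapter classes repeat the same constructor boilerplate is to hoist the shared logic into a base class. The sketch below is hypothetical (the class and attribute names are illustrative, not hyperactive's actual API) and only demonstrates the refactoring pattern:

```python
# Hypothetical sketch: hoisting constructor boilerplate duplicated across
# adapter classes into a shared base, so each subclass only declares
# which parameters it adds.

class BaseAdapter:
    # constructor parameters shared by every adapter (illustrative names)
    _param_names = ["search_space", "n_iter"]

    def __init__(self, **kwargs):
        # store every declared parameter once, instead of repeating
        # one `self.x = x` assignment per attribute in each subclass
        for name in self._param_names:
            setattr(self, name, kwargs.get(name))


class PatternSearchAdapter(BaseAdapter):
    # the subclass only extends the shared parameter list
    _param_names = BaseAdapter._param_names + ["n_positions", "pattern_size"]


opt = PatternSearchAdapter(search_space={"C": [0.01, 0.1, 1]}, n_positions=3)
print(opt.n_positions)  # → 3
```

With this shape, the per-class duplication collapses to a single list of parameter names, which is the kind of restructuring the duplication warning above is asking for.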

from hyperactive.opt._adapters._gfo import _BaseGFOadapter


class PatternSearch(_BaseGFOadapter):
    """Pattern search optimizer.

    Parameters
    ----------
    search_space : dict[str, list]
        The search space to explore. A dictionary with parameter
        names as keys and a list or numpy array as values.
    initialize : dict[str, int]
        The method to generate initial positions. A dictionary with
        the following key literals and the corresponding value type:
        {"grid": int, "vertices": int, "random": int, "warm_start": list[dict]}
    constraints : list[callable]
        A list of constraints, where each constraint is a callable.
        The callable returns `True` or `False` depending on the input parameters.
    random_state : None, int
        If None, create a new random state. If int, create a new random state
        seeded with the value.
    rand_rest_p : float
        The probability of a random iteration during the search process.
    n_positions : int
        Number of positions that the pattern consists of.
    pattern_size : float
        The initial size of the pattern, as a percentage of the size of the
        search space in the corresponding dimension.
    reduction : float
        The factor that reduces the size of the pattern if no better position
        is found.
    n_iter : int, default=100
        The number of iterations to run the optimizer.
    verbose : bool, default=False
        If True, print the progress of the optimization process.
    experiment : BaseExperiment, optional
        The experiment to optimize parameters for.
        Optional, can be passed later via ``set_params``.

    Examples
    --------
    Basic usage of PatternSearch with a scikit-learn experiment:

    1. defining the experiment to optimize:
    >>> from hyperactive.experiment.integrations import SklearnCvExperiment
    >>> from sklearn.datasets import load_iris
    >>> from sklearn.svm import SVC
    >>>
    >>> X, y = load_iris(return_X_y=True)
    >>>
    >>> sklearn_exp = SklearnCvExperiment(
    ...     estimator=SVC(),
    ...     X=X,
    ...     y=y,
    ... )

    2. setting up the PatternSearch optimizer:
    >>> from hyperactive.opt import PatternSearch
    >>> import numpy as np
    >>>
    >>> config = {
    ...     "search_space": {
    ...         "C": [0.01, 0.1, 1, 10],
    ...         "gamma": [0.0001, 0.01, 0.1, 1, 10],
    ...     },
    ...     "n_iter": 100,
    ... }
    >>> optimizer = PatternSearch(experiment=sklearn_exp, **config)

    3. running the optimization:
    >>> best_params = optimizer.run()

    Best parameters can also be accessed via:
    >>> best_params = optimizer.best_params_
    """

    _tags = {
        "info:name": "Pattern Search",
        "info:local_vs_global": "local",
        "info:explore_vs_exploit": "explore",
        "info:compute": "middle",
    }

    def __init__(
        self,
        search_space=None,
        initialize=None,
        constraints=None,
        random_state=None,
        rand_rest_p=0.1,
        n_positions=4,
        pattern_size=0.25,
        reduction=0.9,
        n_iter=100,
        verbose=False,
        experiment=None,
    ):
        self.random_state = random_state
        self.rand_rest_p = rand_rest_p
        self.n_positions = n_positions
        self.pattern_size = pattern_size
        self.reduction = reduction
        self.search_space = search_space
        self.initialize = initialize
        self.constraints = constraints
        self.n_iter = n_iter
        self.experiment = experiment
        self.verbose = verbose

        super().__init__()

    def _get_gfo_class(self):
        """Get the GFO class to use.

        Returns
        -------
        class
            The GFO class to use. One of the concrete GFO classes.
        """
        from gradient_free_optimizers import PatternSearch

        return PatternSearch

    @classmethod
    def get_test_params(cls, parameter_set="default"):
        """Get the test parameters for the optimizer.

        Returns
        -------
        dict with str keys
            The test parameters dictionary.
        """
        import numpy as np

        params = super().get_test_params()
        experiment = params[0]["experiment"]
        more_params = {
            "experiment": experiment,
            "n_positions": 3,
            "pattern_size": 0.5,
            "reduction": 0.999,
            "search_space": {
                "C": [0.01, 0.1, 1, 10],
                "gamma": [0.0001, 0.01, 0.1, 1, 10],
            },
            "n_iter": 100,
        }
        params.append(more_params)
        return params
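Most of the 97.3% duplicated lines reported above come from the test-parameter dictionaries and docstring scaffolding repeated across the GFO adapter classes. One way to reduce that duplication, sketched here as a hypothetical refactor (the helper name and module-level constant are illustrative, not part of hyperactive), is to share the common test defaults and let each adapter supply only its own overrides:

```python
# Hypothetical refactor: the test search space repeated in every adapter's
# get_test_params could be hoisted into one shared module-level constant.

_SHARED_TEST_SEARCH_SPACE = {
    "C": [0.01, 0.1, 1, 10],
    "gamma": [0.0001, 0.01, 0.1, 1, 10],
}


def make_test_params(**overrides):
    """Build a test-parameter dict from shared defaults plus per-class overrides."""
    params = {"search_space": _SHARED_TEST_SEARCH_SPACE, "n_iter": 100}
    params.update(overrides)
    return params


# each adapter then declares only what is specific to it
pattern_search_params = make_test_params(
    n_positions=3, pattern_size=0.5, reduction=0.999
)
print(pattern_search_params["n_iter"])  # → 100
```

This keeps the behavior of `get_test_params` unchanged while shrinking each adapter's contribution to a few genuinely class-specific lines, which would directly lower the duplicated-line count flagged by the report.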