Passed: Push to master (588022...8a2a5a) by Simon, created 01:36

hyperactive.opt.gfo._lipschitz_optimization (rating: A)

Complexity

Total Complexity 3

Size/Duplication

Total Lines 151
Duplicated Lines 97.35 %

Importance

Changes 0
Metric Value
wmc 3
eloc 53
dl 147
loc 151
rs 10
c 0
b 0
f 0
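The headline duplication figure above is simply the duplicated-line count (dl) divided by the total line count (loc) reported in the metrics:

```python
dl, loc = 147, 151                      # duplicated lines, total lines (from the report)
duplication = round(dl / loc * 100, 2)  # percentage shown in the summary
print(duplication)                      # 97.35
```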

3 Methods

Rating   Name   Duplication   Size   Complexity  
A LipschitzOptimizer.__init__() 29 29 1
A LipschitzOptimizer._get_gfo_class() 11 11 1
A LipschitzOptimizer.get_test_params() 25 25 1

How to fix: Duplicated Code

Duplicate code is one of the most pungent code smells. A common rule of thumb is to restructure code once it is duplicated in three or more places.

Common duplication problems and their corresponding solutions are:
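For adapter classes like the one flagged below, where nearly every line is shared constructor boilerplate, the usual remedy is to hoist the common attribute assignments into the shared base class so each subclass keeps only what is specific to it. A minimal sketch with hypothetical names, not the actual hyperactive class hierarchy:

```python
class _BaseAdapter:
    """Holds the constructor boilerplate once, instead of in every subclass."""

    def __init__(self, search_space=None, n_iter=100, verbose=False):
        self.search_space = search_space
        self.n_iter = n_iter
        self.verbose = verbose


class LipschitzAdapter(_BaseAdapter):
    """Subclass declares only its own parameters and delegates the rest."""

    def __init__(self, search_space=None, n_iter=100, verbose=False,
                 max_sample_size=10_000_000):
        super().__init__(search_space=search_space, n_iter=n_iter, verbose=verbose)
        self.max_sample_size = max_sample_size


opt = LipschitzAdapter(n_iter=50)
```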

from hyperactive.opt._adapters._gfo import _BaseGFOadapter


class LipschitzOptimizer(_BaseGFOadapter):
    """Lipschitz optimizer.

    Parameters
    ----------
    search_space : dict[str, list]
        The search space to explore. A dictionary with parameter
        names as keys and a list or numpy array of candidate values as values.
    initialize : dict[str, int]
        The method to generate initial positions. A dictionary with
        the following key literals and the corresponding value type:
        {"grid": int, "vertices": int, "random": int, "warm_start": list[dict]}
    constraints : list[callable]
        A list of constraints, where each constraint is a callable.
        The callable returns `True` or `False` depending on the input parameters.
    random_state : None, int
        If None, create a new random state. If int, create a new random state
        seeded with the value.
    rand_rest_p : float
        The probability of a random iteration during the search process.
    warm_start_smbo
        The warm start for SMBO.
    max_sample_size : int
        The maximum number of points to sample.
    sampling : dict
        The sampling method to use.
    replacement : bool
        Whether to sample with replacement.
    n_iter : int, default=100
        The number of iterations to run the optimizer.
    verbose : bool, default=False
        If True, print the progress of the optimization process.
    experiment : BaseExperiment, optional
        The experiment to optimize parameters for.
        Optional, can be passed later via ``set_params``.

    Examples
    --------
    Basic usage of LipschitzOptimizer with a scikit-learn experiment:

    1. defining the experiment to optimize:
    >>> from hyperactive.experiment.integrations import SklearnCvExperiment
    >>> from sklearn.datasets import load_iris
    >>> from sklearn.svm import SVC
    >>>
    >>> X, y = load_iris(return_X_y=True)
    >>>
    >>> sklearn_exp = SklearnCvExperiment(
    ...     estimator=SVC(),
    ...     X=X,
    ...     y=y,
    ... )

    2. setting up the LipschitzOptimizer:
    >>> from hyperactive.opt import LipschitzOptimizer
    >>>
    >>> config = {
    ...     "search_space": {
    ...         "C": [0.01, 0.1, 1, 10],
    ...         "gamma": [0.0001, 0.01, 0.1, 1, 10],
    ...     },
    ...     "n_iter": 100,
    ... }
    >>> optimizer = LipschitzOptimizer(experiment=sklearn_exp, **config)

    3. running the optimization:
    >>> best_params = optimizer.run()

    Best parameters can also be accessed via:
    >>> best_params = optimizer.best_params_
    """

    _tags = {
        "info:name": "Lipschitz Optimization",
        "info:local_vs_global": "global",
        "info:explore_vs_exploit": "mixed",
        "info:compute": "high",
    }

    def __init__(
        self,
        search_space=None,
        initialize=None,
        constraints=None,
        random_state=None,
        rand_rest_p=0.1,
        warm_start_smbo=None,
        max_sample_size=10000000,
        sampling=None,
        replacement=True,
        n_iter=100,
        verbose=False,
        experiment=None,
    ):
        self.random_state = random_state
        self.rand_rest_p = rand_rest_p
        self.warm_start_smbo = warm_start_smbo
        self.max_sample_size = max_sample_size
        self.sampling = sampling
        self.replacement = replacement
        self.search_space = search_space
        self.initialize = initialize
        self.constraints = constraints
        self.n_iter = n_iter
        self.experiment = experiment
        self.verbose = verbose

        super().__init__()

    def _get_gfo_class(self):
        """Get the GFO class to use.

        Returns
        -------
        class
            The GFO class to use. One of the concrete GFO classes.
        """
        from gradient_free_optimizers import LipschitzOptimizer

        return LipschitzOptimizer

    @classmethod
    def get_test_params(cls, parameter_set="default"):
        """Get the test parameters for the optimizer.

        Returns
        -------
        dict with str keys
            The test parameters dictionary.
        """
        params = super().get_test_params()
        experiment = params[0]["experiment"]
        more_params = {
            "experiment": experiment,
            "max_sample_size": 1000,
            "replacement": True,
            "search_space": {
                "C": [0.01, 0.1, 1, 10],
                "gamma": [0.0001, 0.01, 0.1, 1, 10],
            },
            "n_iter": 100,
        }
        params.append(more_params)
        return params
151