Passed
Push — master (7ebd63...8bd9a3)
by Simon
06:11
created

BayesianOptimizer.__init__()   A

Complexity

Conditions 1

Size

Total Lines 30
Code Lines 29

Duplication

Lines 30
Ratio 100 %

Importance

Changes 0
Metric   Value
eloc     29
dl       30
loc      30
rs       9.184
c        0
b        0
f        0
cc       1
nop      13

How to fix

Many Parameters

Methods with many parameters are not only hard to understand; their parameters also tend to become inconsistent when you later need more or different data.

There are several approaches to avoid long parameter lists; a common one is to group related arguments into a parameter object, as sketched below.
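For illustration only, a minimal sketch of the parameter-object refactoring. The class and field names below are hypothetical and are not part of the analyzed project; they merely show how several arguments that always travel together can be bundled into one object.

# Hypothetical sketch of the "parameter object" refactoring; the names are
# illustrative and not taken from the analyzed code base.
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class SmboSettings:
    # Bundles the sampling-related arguments that always travel together.
    max_sample_size: int = 10_000_000
    sampling: Dict[str, int] = field(default_factory=lambda: {"random": 1_000_000})
    replacement: bool = True
    xi: float = 0.03


class SomeOptimizer:
    def __init__(self, search_space: Dict[str, list], smbo: Optional[SmboSettings] = None):
        # One parameter object replaces four separate keyword arguments.
        self.search_space = search_space
        self.smbo = smbo if smbo is not None else SmboSettings()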

# Author: Simon Blanke
# Email: [email protected]
# License: MIT License

from typing import List, Dict, Literal

from ..search import Search
from ..optimizers import BayesianOptimizer as _BayesianOptimizer
from ..optimizers.smb_opt.bayesian_optimization import gaussian_process


class BayesianOptimizer(_BayesianOptimizer, Search):
    """
    A class implementing the **bayesian optimizer** for the public API.
    Inheriting from the `Search`-class to get the `search`-method and from
    the `BayesianOptimizer`-backend to get the underlying algorithm.

    Parameters
    ----------
    search_space : dict[str, list]
        The search space to explore. A dictionary with parameter
        names as keys and a numpy array as values.
    initialize : dict[str, int]
        The method to generate initial positions. A dictionary with
        the following key literals and the corresponding value type:
        {"grid": int, "vertices": int, "random": int, "warm_start": list[dict]}
    constraints : list[callable]
        A list of constraints, where each constraint is a callable.
        The callable returns `True` or `False` dependent on the input parameters.
    random_state : None, int
        If None, create a new random state. If int, create a new random state
        seeded with the value.
    rand_rest_p : float
        The probability of a random iteration during the search process.
    warm_start_smbo
        The warm start for SMBO.
    max_sample_size : int
        The maximum number of points to sample.
    sampling : dict
        The sampling method to use.
    replacement : bool
        Whether to sample with replacement.
    gpr : dict
        The Gaussian Process Regressor to use.
    xi : float
        The exploration-exploitation trade-off parameter.
    """

    def __init__(
        self,
        search_space: Dict[str, list],
        initialize: Dict[
            Literal["grid", "vertices", "random", "warm_start"], int | List
        ] = {"grid": 4, "random": 2, "vertices": 4},
        constraints: List[callable] = [],
        random_state: int = None,
        rand_rest_p: float = 0,
        nth_process: int = None,
        warm_start_smbo=None,
        max_sample_size: int = 10000000,
        sampling: Dict[Literal["random"], int] = {"random": 1000000},
        replacement: bool = True,
        gpr=gaussian_process["gp_nonlinear"],
        xi: float = 0.03,
    ):
        super().__init__(
            search_space=search_space,
            initialize=initialize,
            constraints=constraints,
            random_state=random_state,
            rand_rest_p=rand_rest_p,
            nth_process=nth_process,
            warm_start_smbo=warm_start_smbo,
            max_sample_size=max_sample_size,
            sampling=sampling,
            replacement=replacement,
            gpr=gpr,
            xi=xi,
        )
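For context, a minimal usage sketch of this public-API class. It assumes the class is exposed by the gradient-free-optimizers package and that the inherited `search`-method accepts an objective function and an iteration budget, as described in that project's documentation; treat it as an assumption-laden illustration rather than verified project code.

# Hedged usage sketch: assumes the gradient-free-optimizers public API,
# where the objective receives a parameter dict and returns a score that
# the optimizer maximizes.
import numpy as np
from gradient_free_optimizers import BayesianOptimizer


def objective(para):
    # Simple 1-D parabola; its maximum lies at x = 0.
    return -(para["x"] ** 2)


search_space = {"x": np.arange(-10, 10, 0.1)}

opt = BayesianOptimizer(search_space)
opt.search(objective, n_iter=50)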