Passed: Push to master (9fe41f...17bc31) by Simon, created 01:26

multiple_scores   A

Complexity
  Total Complexity: 1

Size/Duplication
  Total Lines: 52
  Duplicated Lines: 0 %

Importance
  Changes: 0

Metric  Value
eloc    28
dl      0
loc     52
rs      10
c       0
b       0
f       0
wmc     1

1 Function

Rating  Name     Duplication  Size  Complexity
A       model()  0            18    1
import time
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.datasets import load_boston
from hyperactive import Hyperactive

# note: load_boston was removed in scikit-learn 1.2;
# fetch_california_housing is an alternative on newer versions
data = load_boston()
X, y = data.data, data.target

"""
Hyperactive cannot handle multi-objective optimization,
but we can achieve something similar with a workaround.
The following example searches for the highest cv-score and the lowest training time.
This is possible by creating a single objective/score from those two variables.
You can also return additional parameters to track the cv-score and training time separately.
"""

def model(opt):
    gbr = GradientBoostingRegressor(
        n_estimators=opt["n_estimators"],
        max_depth=opt["max_depth"],
        min_samples_split=opt["min_samples_split"],
    )

    c_time = time.time()
    scores = cross_val_score(gbr, X, y, cv=3)
    train_time = time.time() - c_time

    cv_score = scores.mean()

    # you can create a score that is a composition of the two objectives
    score = cv_score / train_time

    # instead of returning only the score, you can also return the score plus a dict
    return score, {"training_time": train_time, "cv_score": cv_score}

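The ratio used above is only one way to collapse the two objectives into a single score; a weighted sum is another common scalarization. A minimal sketch, where the `composite_score` helper and the `alpha` weight are illustrative and not part of Hyperactive:

```python
def composite_score(cv_score, train_time, alpha=0.1):
    # weighted-sum scalarization: reward a high cv-score,
    # subtract a penalty proportional to training time;
    # alpha sets the trade-off between the two objectives
    return cv_score - alpha * train_time
```

A larger `alpha` shifts the search toward faster models at the cost of accuracy, which can be easier to reason about than a ratio when the two quantities have very different scales.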
search_space = {
    "n_estimators": list(range(10, 150, 5)),
    "max_depth": list(range(2, 12)),
    "min_samples_split": list(range(2, 22)),
}

hyper = Hyperactive()
hyper.add_search(model, search_space, n_iter=20)
hyper.run()

# the variables from the dict are collected in the results
print("\n Results \n", hyper.results(model))
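Because the extra dict keys show up as their own columns in the results, the cv-score and training time can be inspected separately after the run. A sketch with a stand-in DataFrame (the column names mirror the dict returned by `model`; the numbers are made up for illustration):

```python
import pandas as pd

# stand-in for the results table: composite score plus the tracked columns
results = pd.DataFrame({
    "score":         [0.40, 0.55, 0.30],
    "cv_score":      [0.80, 0.88, 0.75],
    "training_time": [2.0, 1.6, 2.5],
})

# row with the best composite score; the extra columns explain the trade-off
best = results.sort_values("score", ascending=False).iloc[0]
print(best["cv_score"], best["training_time"])
```

This makes it easy to check whether a high composite score came from a genuinely good cv-score or merely from a very short training time.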