GitHub access token has become invalid

The GitHub access token used to retrieve details about this repository from GitHub is no longer valid. This may prevent certain types of inspections from running (in particular, everything related to pull requests).
Please ask an admin of your repository to renew the access token on this website.

Issues (4082)

Orange/regression/linear.py (58 issues)

import sklearn.linear_model as skl_linear_model

Issue: The import sklearn.linear_model could not be resolved.

This can be caused by one of the following:

1. Missing Dependencies

This error can indicate a configuration issue of Pylint. Make sure that your libraries are available by adding the necessary install commands (a repository-specific sketch follows after this list):

# .scrutinizer.yml
before_commands:
    - sudo pip install abc   # Python 2
    - sudo pip3 install abc  # Python 3

Tip: We currently do not run Pylint inside a virtualenv, so when installing your modules, make sure to use the command for the correct Python version.

2. Missing __init__.py files

This error can also result from missing __init__.py files in your module folders. Make sure that you place one such file in each sub-package folder.
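
For this repository the unresolved imports all come from scikit-learn, so the install commands would presumably target that package rather than the abc placeholder above. A minimal sketch, assuming the analysis environment uses Python 3 (the package list is an assumption based on the imports, not taken from the project's actual configuration):

# .scrutinizer.yml (sketch only)
before_commands:
    - sudo pip3 install numpy scipy scikit-learn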

import sklearn.pipeline as skl_pipeline

Issue: The import sklearn.pipeline could not be resolved (same possible causes as above).

import sklearn.preprocessing as skl_preprocessing

Issue: The import sklearn.preprocessing could not be resolved (same possible causes as above).

from Orange.regression import Learner, Model, SklLearner


__all__ = ["LinearRegressionLearner", "RidgeRegressionLearner",
           "LassoRegressionLearner", "SGDRegressionLearner",
           "ElasticNetLearner", "ElasticNetCVLearner",
           "PolynomialLearner"]


class LinearRegressionLearner(SklLearner):
    __wraps__ = skl_linear_model.LinearRegression
    name = 'linreg'

    def __init__(self, preprocessors=None):
        super().__init__(preprocessors=preprocessors)

Issue: Trailing whitespace on the blank line after __init__.

    def fit(self, X, Y, W):
        sk = skl_linear_model.LinearRegression()
        sk.fit(X, Y)
        return LinearModel(sk)


class RidgeRegressionLearner(SklLearner):
    __wraps__ = skl_linear_model.Ridge
    name = 'ridge'

    def __init__(self, alpha=1.0, fit_intercept=True,
                 normalize=False, copy_X=True, max_iter=None,
                 tol=0.001, solver='auto', preprocessors=None):
        super().__init__(preprocessors=preprocessors)
        self.params = vars()

Issues: The arguments alpha, fit_intercept, normalize, copy_X, max_iter, tol and solver seem to be unused.
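
These arguments only look unused to Pylint: the constructor never references them by name, but vars() captures the whole local namespace into self.params, which SklLearner presumably forwards to the wrapped scikit-learn estimator. A minimal sketch of the pattern (placeholder class, not Orange code):

class Sketch:
    def __init__(self, alpha=1.0, fit_intercept=True):
        # vars() with no argument returns the local namespace, so every
        # keyword argument ends up in self.params without being named again,
        # which is why Pylint reports the arguments as unused.
        self.params = vars()

s = Sketch(alpha=0.5)
print(sorted(k for k in s.params if k != 'self'))  # ['alpha', 'fit_intercept']

The same pattern explains the identical warnings on the Lasso, ElasticNet, ElasticNetCV and SGD learners below.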


class LassoRegressionLearner(SklLearner):
    __wraps__ = skl_linear_model.Lasso
    name = 'lasso'

    def __init__(self, alpha=1.0, fit_intercept=True, normalize=False,
                 precompute=False, copy_X=True, max_iter=1000,
                 tol=0.0001, warm_start=False, positive=False,
                 preprocessors=None):
        super().__init__(preprocessors=preprocessors)
        self.params = vars()

Issues: The arguments alpha, fit_intercept, normalize, precompute, copy_X, max_iter, tol, warm_start and positive seem to be unused.


class ElasticNetLearner(SklLearner):
    __wraps__ = skl_linear_model.ElasticNet
    name = 'elastic'

    def __init__(self, alpha=1.0, l1_ratio=0.5, fit_intercept=True,
                 normalize=False, precompute=False, max_iter=1000,
                 copy_X=True, tol=0.0001, warm_start=False, positive=False,
                 preprocessors=None):
        super().__init__(preprocessors=preprocessors)
        self.params = vars()

Issues: The arguments alpha, l1_ratio, fit_intercept, normalize, precompute, max_iter, copy_X, tol, warm_start and positive seem to be unused.


class ElasticNetCVLearner(SklLearner):
    __wraps__ = skl_linear_model.ElasticNetCV
    name = 'elasticCV'

    def __init__(self, l1_ratio=0.5, eps=0.001, n_alphas=100, alphas=None,
                 fit_intercept=True, normalize=False, precompute='auto',
                 max_iter=1000, tol=0.0001, cv=None, copy_X=True,
                 verbose=0, n_jobs=1, positive=False, preprocessors=None):
        super().__init__(preprocessors=preprocessors)
        self.params = vars()

Issues: The arguments l1_ratio, eps, n_alphas, alphas, fit_intercept, normalize, precompute, max_iter, tol, cv, copy_X, verbose, n_jobs and positive seem to be unused.


class SGDRegressionLearner(SklLearner):
    __wraps__ = skl_linear_model.SGDRegressor
    name = 'sgd'

    def __init__(self, loss='squared_loss', alpha=0.0001, epsilon=0.1,
                 eta0=0.01, l1_ratio=0.15, penalty='l2', power_t=0.25,
                 learning_rate='invscaling', n_iter=5, fit_intercept=True,
                 preprocessors=None):
        super().__init__(preprocessors=preprocessors)
        self.params = vars()

Issues: The arguments loss, alpha, epsilon, eta0, l1_ratio, penalty, power_t, learning_rate, n_iter and fit_intercept seem to be unused.

    def fit(self, X, Y, W):
        sk = self.__wraps__(**self.params)
        clf = skl_pipeline.Pipeline(
            [('scaler', skl_preprocessing.StandardScaler()), ('sgd', sk)])
        clf.fit(X, Y.ravel())
        return LinearModel(clf)


class PolynomialLearner(Learner):
    name = 'poly learner'

    def __init__(self, learner, degree=1, preprocessors=None):
        super().__init__(preprocessors=preprocessors)
        self.degree = degree
        self.learner = learner

Issue: Trailing whitespace on the blank line after __init__.

    def fit(self, X, Y, W):
        polyfeatures = skl_preprocessing.PolynomialFeatures(self.degree)
        X = polyfeatures.fit_transform(X)
        clf = self.learner
        if W is None or not self.supports_weights:
            model = clf.fit(X, Y, None)
        else:
            model = clf.fit(X, Y, sample_weight=W.reshape(-1))
        return PolynomialModel(model, polyfeatures)

Issue (Bug / Best Practice): The signature of PolynomialLearner.fit differs from the overridden fit method.

It is generally good practice to use signatures that are compatible with the Liskov substitution principle: it allows instances of the child class to be passed anywhere instances of the super-class or interface would be acceptable.
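
A sketch of a signature that would satisfy this check, assuming the base class declares the weight argument as optional (an assumption; the base Learner.fit is not shown in this report). The names below are placeholders, not Orange classes:

class BaseLearner:
    def fit(self, X, Y, W=None):
        raise NotImplementedError

class CompatibleLearner(BaseLearner):
    # Same parameter list and defaults as the base class, so an instance can be
    # used anywhere a BaseLearner is expected (Liskov substitution).
    def fit(self, X, Y, W=None):
        return X, Y, W

CompatibleLearner().fit([[1.0], [2.0]], [1.0, 2.0])  # callable without weights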


class LinearModel(Model):
    supports_multiclass = True

    def __init__(self, model):
        self.model = model

    def predict(self, X):
        vals = self.model.predict(X)
        if len(vals.shape) == 1:
            # Prevent IndexError for 1D array
            return vals
        elif vals.shape[1] == 1:
            return vals.ravel()
        else:
            return vals

    def __str__(self):
        return 'LinearModel {}'.format(self.model)

Issue: The __init__ method of the super-class ModelRegression is not called in LinearModel.__init__.

It is generally advisable to initialize the super-class by calling its __init__ method:

class SomeParent:
    def __init__(self):
        self.x = 1

class SomeChild(SomeParent):
    def __init__(self):
        # Initialize the super class
        SomeParent.__init__(self)
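
Applied to this file, a minimal sketch of the suggested fix, assuming the Model base class can be initialised without extra arguments (its actual signature is not part of this report):

from Orange.regression import Model

class LinearModel(Model):
    supports_multiclass = True

    def __init__(self, model):
        super().__init__()  # let Model/ModelRegression set up its own state first
        self.model = model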


class PolynomialModel(Model):
    supports_multiclass = True

    def __init__(self, model, polyfeatures):
        self.model = model
        self.polyfeatures = polyfeatures

    def predict(self, X):
        X = self.polyfeatures.fit_transform(X)
        return self.model.predict(X)

    def __str__(self):
        return 'PolynomialModel {}'.format(self.model)

Issue: The __init__ method of the super-class ModelRegression is not called in PolynomialModel.__init__ either (same issue and suggested fix as for LinearModel above).