
Commit 875d32f

DOC Fix versionadded/versionchanged for 0.24 (scikit-learn#18312)

Parent: 80c0a29

File tree

14 files changed: +55, -24 lines


doc/whats_new/v0.24.rst

Lines changed: 12 additions & 12 deletions
@@ -184,8 +184,8 @@ Changelog
 .......................

 - |Feature| :class:`ensemble.HistGradientBoostingRegressor` and
-  :class:`ensemble.HistGradientClassifier` now support `staged_predict`,
-  which allows monitoring of each stage.
+  :class:`ensemble.HistGradientBoostingClassifier` now support the
+  method `staged_predict`, which allows monitoring of each stage.
   :pr:`16985` by :user:`Hao Chun Chang <haochunchang>`.

 - |API|: The parameter ``n_classes_`` is now deprecated in
@@ -219,8 +219,9 @@ Changelog
   attribute name/path or a `callable` for extracting feature importance from
   the estimator. :pr:`15361` by :user:`Venkatachalam N <venkyyuvy>`.

-- |Enhancement| Added the option for the number of n_features_to_select to be
-  given as a float representing the percentage of features to select.
+- |Enhancement| :class:`feature_selection.RFE` supports the option for the
+  number of `n_features_to_select` to be given as a float representing the
+  percentage of features to select.
   :pr:`17090` by :user:`Lisa Schwetlick <lschwetlick>` and
   :user:`Marija Vlajic Wheeler <marijavlajic>`.

@@ -234,7 +235,7 @@ Changelog
 ...............................

 - |Enhancement| A new method
-  :class:`gaussian_process.Kernel._check_bounds_params` is called after
+  :meth:`gaussian_process.Kernel._check_bounds_params` is called after
   fitting a Gaussian Process and raises a ``ConvergenceWarning`` if the bounds
   of the hyperparameters are too tight.
   :issue:`12638` by :user:`Sylvain Lannuzel <SylvainLan>`.
@@ -253,10 +254,9 @@ Changelog
   :pr:`17526` by :user:`Ayako YAGI <yagi-3>` and
   :user:`Juan Carlos Alfaro Jiménez <alfaro96>`.

-- |Feature| :class:`impute.SimpleImputer` now supports ``inverse_transform``
-  functionality to revert imputed data to original when instantiated
-  with `add_indicator=True`.
-  :pr:`17612` by :user:`Srimukh Sripada <d3b0unce>`.
+- |Feature| Added method :meth:`impute.SimpleImputer.inverse_transform` to
+  revert imputed data to original when instantiated with
+  ``add_indicator=True``. :pr:`17612` by :user:`Srimukh Sripada <d3b0unce>`.

 - |Fix| :class:`impute.IterativeImputer` will not attempt to set the
   estimator's `random_state` attribute, allowing to use it with more external classes.
@@ -284,7 +284,7 @@ Changelog
   :pr:`16289` by :user:`Masashi Kishimoto <kishimoto-banana>` and
   :user:`Olivier Grisel <ogrisel>`.

-- |Enhancement| :class:`isotonic.IsotonicRegression` now accepts 2darray with
+- |Enhancement| :class:`isotonic.IsotonicRegression` now accepts 2d array with
   1 feature as input array. :pr:`17379` by :user:`Jiaxiang <fujiaxiang>`.

 :mod:`sklearn.kernel_approximation`
@@ -335,8 +335,8 @@ Changelog
   :pr:`10591` by :user:`Jeremy Karnowski <jkarnows>` and
   :user:`Daniel Mohns <dmohns>`.

-- |Feature| Added :func:`metrics.plot_det_curve` and :class:`DetCurveDisplay`
-  to ease the plot of DET curves.
+- |Feature| Added :func:`metrics.plot_det_curve` and
+  :class:`metrics.DetCurveDisplay` to ease the plot of DET curves.
   :pr:`18176` by :user:`Guillaume Lemaitre <glemaitre>`.

 - |Feature| Added :func:`metrics.mean_absolute_percentage_error` metric and

sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py

Lines changed: 4 additions & 0 deletions
@@ -968,6 +968,8 @@ def staged_predict(self, X):
         This method allows monitoring (i.e. determine error on testing set)
         after each stage.

+        .. versionadded:: 0.24
+
         Parameters
         ----------
         X : array-like of shape (n_samples, n_features)
@@ -1193,6 +1195,8 @@ def staged_predict(self, X):
         This method allows monitoring (i.e. determine error on testing set)
         after each stage.

+        .. versionadded:: 0.24
+
         Parameters
         ----------
         X : array-like of shape (n_samples, n_features)

sklearn/feature_selection/_rfe.py

Lines changed: 3 additions & 0 deletions
@@ -64,6 +64,9 @@ class RFE(SelectorMixin, MetaEstimatorMixin, BaseEstimator):
         to select. If float between 0 and 1, it is the fraction of features to
         select.

+        .. versionchanged:: 0.24
+           Added float values for fractions.
+
     step : int or float, default=1
         If greater than or equal to 1, then ``step`` corresponds to the
         (integer) number of features to remove at each iteration.
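Passing a float `n_features_to_select`, as documented above, keeps that fraction of the input features. A small sketch (estimator and dataset are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=100, n_features=10, random_state=0)

# A float in (0, 1) is the fraction of features to keep:
# 0.3 of 10 features -> 3 features survive the recursive elimination.
selector = RFE(LogisticRegression(max_iter=1000),
               n_features_to_select=0.3).fit(X, y)
n_selected = selector.support_.sum()
```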

sklearn/impute/_base.py

Lines changed: 2 additions & 0 deletions
@@ -501,6 +501,8 @@ def inverse_transform(self, X):
         indicator, and the imputation done at ``transform`` time won't be
         inverted.

+        .. versionadded:: 0.24
+
         Parameters
         ----------
         X : array-like of shape \
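`inverse_transform` relies on the missing-value indicator columns to know which entries were imputed, so the imputer must be built with `add_indicator=True`. A minimal sketch with illustrative data:

```python
import numpy as np
from sklearn.impute import SimpleImputer

X = np.array([[1.0, 2.0],
              [np.nan, 3.0],
              [7.0, np.nan]])

# add_indicator=True appends indicator columns that inverse_transform needs.
imputer = SimpleImputer(strategy="mean", add_indicator=True)
X_imputed = imputer.fit_transform(X)            # imputed values + indicators
X_restored = imputer.inverse_transform(X_imputed)  # NaNs put back in place
```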

sklearn/inspection/_permutation_importance.py

Lines changed: 2 additions & 0 deletions
@@ -98,6 +98,8 @@ def permutation_importance(estimator, X, y, *, scoring=None, n_repeats=5,
     sample_weight : array-like of shape (n_samples,), default=None
         Sample weights used in scoring.

+        .. versionadded:: 0.24
+
     Returns
     -------
     result : :class:`~sklearn.utils.Bunch`
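The new `sample_weight` parameter weights each sample in the scoring step of the permutation procedure. A quick sketch with uniform (hence no-op) weights, purely to show the call shape:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LinearRegression

X, y = make_regression(n_samples=100, n_features=3, random_state=0)
model = LinearRegression().fit(X, y)

# Uniform weights here; any per-sample weighting used at scoring time works.
weights = np.ones(len(y))
result = permutation_importance(model, X, y, n_repeats=5,
                                sample_weight=weights, random_state=0)
```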

sklearn/isotonic.py

Lines changed: 6 additions & 0 deletions
@@ -302,6 +302,9 @@ def fit(self, X, y, sample_weight=None):
         X : array-like of shape (n_samples,) or (n_samples, 1)
             Training data.

+            .. versionchanged:: 0.24
+               Also accepts 2d array with 1 feature.
+
         y : array-like of shape (n_samples,)
             Training target.

@@ -346,6 +349,9 @@ def transform(self, T):
         T : array-like of shape (n_samples,) or (n_samples, 1)
             Data to transform.

+            .. versionchanged:: 0.24
+               Also accepts 2d array with 1 feature.
+
         Returns
         -------
         y_pred : ndarray of shape (n_samples,)
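Accepting a `(n_samples, 1)` column vector makes `IsotonicRegression` easier to drop into pipelines that pass 2d feature matrices. A minimal sketch with illustrative data:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

x = np.arange(10.0)
y = x + np.random.RandomState(0).normal(size=10)

# A (n_samples, 1) column is now accepted in addition to a 1d array.
X_2d = x.reshape(-1, 1)
iso = IsotonicRegression().fit(X_2d, y)
y_pred = iso.transform(X_2d)   # still returns a 1d array
```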

sklearn/kernel_approximation.py

Lines changed: 2 additions & 0 deletions
@@ -39,6 +39,8 @@ class PolynomialCountSketch(BaseEstimator, TransformerMixin):
     vector with itself using Fast Fourier Transforms (FFT). Read more in the
     :ref:`User Guide <polynomial_kernel_approx>`.

+    .. versionadded:: 0.24
+
     Parameters
     ----------
     gamma : float, default=1.0
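`PolynomialCountSketch` (new in 0.24) maps inputs to a randomized feature space whose inner products approximate a polynomial kernel. A small sketch with illustrative sizes:

```python
import numpy as np
from sklearn.kernel_approximation import PolynomialCountSketch

rng = np.random.RandomState(0)
X = rng.rand(50, 4)

# TensorSketch approximation of a degree-2 polynomial kernel,
# projecting 4 input features onto 100 random features.
ps = PolynomialCountSketch(degree=2, n_components=100, random_state=0)
X_features = ps.fit_transform(X)
```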

sklearn/linear_model/_ridge.py

Lines changed: 2 additions & 0 deletions
@@ -1749,6 +1749,8 @@ class RidgeCV(MultiOutputMixin, RegressorMixin, _BaseRidgeCV):
         fitting, the `alpha_` attribute will contain a value for each target.
         When set to `False`, a single alpha is used for all targets.

+        .. versionadded:: 0.24
+
     Attributes
     ----------
     cv_values_ : ndarray of shape (n_samples, n_alphas) or \
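With `alpha_per_target=True`, `RidgeCV` selects a separate regularization strength for each output of a multi-target problem, so `alpha_` becomes an array of length `n_targets`. A sketch with an illustrative two-target dataset:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import RidgeCV

X, y = make_regression(n_samples=100, n_targets=2, random_state=0)

# One alpha is chosen per target instead of a single shared alpha.
model = RidgeCV(alphas=[0.1, 1.0, 10.0], alpha_per_target=True).fit(X, y)
```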

sklearn/metrics/_regression.py

Lines changed: 10 additions & 12 deletions
@@ -52,7 +52,7 @@


 def _check_reg_targets(y_true, y_pred, multioutput, dtype="numeric"):
-    """Check that y_true and y_pred belong to the same regression task
+    """Check that y_true and y_pred belong to the same regression task.

     Parameters
     ----------
@@ -81,9 +81,9 @@ def _check_reg_targets(y_true, y_pred, multioutput, dtype="numeric"):
         Custom output weights if ``multioutput`` is array-like or
         just the corresponding argument if ``multioutput`` is a
         correct keyword.
+
     dtype: str or list, default="numeric"
         the dtype argument passed to check_array
-
     """
     check_consistent_length(y_true, y_pred)
     y_true = check_array(y_true, ensure_2d=False, dtype=dtype)
@@ -126,7 +126,7 @@ def _check_reg_targets(y_true, y_pred, multioutput, dtype="numeric"):
 def mean_absolute_error(y_true, y_pred, *,
                         sample_weight=None,
                         multioutput='uniform_average'):
-    """Mean absolute error regression loss
+    """Mean absolute error regression loss.

     Read more in the :ref:`User Guide <mean_absolute_error>`.

@@ -197,12 +197,14 @@ def mean_absolute_error(y_true, y_pred, *,
 def mean_absolute_percentage_error(y_true, y_pred,
                                    sample_weight=None,
                                    multioutput='uniform_average'):
-    """Mean absolute percentage error regression loss
+    """Mean absolute percentage error regression loss.

     Note here that we do not represent the output as a percentage in range
     [0, 100]. Instead, we represent it in range [0, 1/eps]. Read more in the
     :ref:`User Guide <mean_absolute_percentage_error>`.

+    .. versionadded:: 0.24
+
     Parameters
     ----------
     y_true : array-like of shape (n_samples,) or (n_samples, n_outputs)
@@ -273,7 +275,7 @@ def mean_absolute_percentage_error(y_true, y_pred,
 def mean_squared_error(y_true, y_pred, *,
                        sample_weight=None,
                        multioutput='uniform_average', squared=True):
-    """Mean squared error regression loss
+    """Mean squared error regression loss.

     Read more in the :ref:`User Guide <mean_squared_error>`.
@@ -329,7 +331,6 @@ def mean_squared_error(y_true, y_pred, *,
     array([0.41666667, 1. ])
     >>> mean_squared_error(y_true, y_pred, multioutput=[0.3, 0.7])
     0.825...
-
     """
     y_type, y_true, y_pred, multioutput = _check_reg_targets(
         y_true, y_pred, multioutput)
@@ -354,7 +355,7 @@ def mean_squared_error(y_true, y_pred, *,
 def mean_squared_log_error(y_true, y_pred, *,
                            sample_weight=None,
                            multioutput='uniform_average'):
-    """Mean squared logarithmic error regression loss
+    """Mean squared logarithmic error regression loss.

     Read more in the :ref:`User Guide <mean_squared_log_error>`.
@@ -403,7 +404,6 @@ def mean_squared_log_error(y_true, y_pred, *,
     array([0.00462428, 0.08377444])
     >>> mean_squared_log_error(y_true, y_pred, multioutput=[0.3, 0.7])
     0.060...
-
     """
     y_type, y_true, y_pred, multioutput = _check_reg_targets(
         y_true, y_pred, multioutput)
@@ -421,7 +421,7 @@ def mean_squared_log_error(y_true, y_pred, *,
 @_deprecate_positional_args
 def median_absolute_error(y_true, y_pred, *, multioutput='uniform_average',
                           sample_weight=None):
-    """Median absolute error regression loss
+    """Median absolute error regression loss.

     Median absolute error output is non-negative floating point. The best value
     is 0.0. Read more in the :ref:`User Guide <median_absolute_error>`.
@@ -473,7 +473,6 @@ def median_absolute_error(y_true, y_pred, *, multioutput='uniform_average',
     array([0.5, 1. ])
     >>> median_absolute_error(y_true, y_pred, multioutput=[0.3, 0.7])
     0.85
-
     """
     y_type, y_true, y_pred, multioutput = _check_reg_targets(
         y_true, y_pred, multioutput)
@@ -497,7 +496,7 @@ def median_absolute_error(y_true, y_pred, *, multioutput='uniform_average',
 def explained_variance_score(y_true, y_pred, *,
                              sample_weight=None,
                              multioutput='uniform_average'):
-    """Explained variance regression score function
+    """Explained variance regression score function.

     Best possible score is 1.0, lower values are worse.
@@ -549,7 +548,6 @@ def explained_variance_score(y_true, y_pred, *,
     >>> y_pred = [[0, 2], [-1, 2], [8, -5]]
     >>> explained_variance_score(y_true, y_pred, multioutput='uniform_average')
     0.983...
-
     """
     y_type, y_true, y_pred, multioutput = _check_reg_targets(
         y_true, y_pred, multioutput)
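Of the metrics touched in this file, `mean_absolute_percentage_error` is the one new in 0.24. As its docstring notes, the result is a fraction, not a value in [0, 100]. A quick sketch with illustrative numbers:

```python
from sklearn.metrics import mean_absolute_percentage_error

y_true = [100.0, 200.0, 400.0]
y_pred = [110.0, 180.0, 400.0]

# MAPE = mean(|y_true - y_pred| / |y_true|)
#      = mean(0.1, 0.1, 0.0) = 0.0666..., i.e. about 6.7%, not 6.7.
mape = mean_absolute_percentage_error(y_true, y_pred)
```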

sklearn/model_selection/_search.py

Lines changed: 2 additions & 0 deletions
@@ -482,6 +482,8 @@ def score_samples(self, X):
         Only available if ``refit=True`` and the underlying estimator supports
         ``score_samples``.

+        .. versionadded:: 0.24
+
         Parameters
         ----------
         X : iterable
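The search object forwards `score_samples` to the refitted best estimator, so it works with any estimator exposing that method. A sketch using `KernelDensity` as one such estimator (the data and bandwidth grid are illustrative):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KernelDensity

rng = np.random.RandomState(0)
X = rng.normal(size=(100, 1))

# With refit=True (the default), score_samples is delegated to the
# best KernelDensity found by the grid search.
search = GridSearchCV(KernelDensity(), {"bandwidth": [0.1, 0.5, 1.0]}).fit(X)
log_density = search.score_samples(X)   # per-sample log-density
```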
