
Commit b87a2f2

What's new: group entries by topic
1 parent 74861d0


doc/whats_new.rst

Lines changed: 13 additions & 14 deletions
@@ -20,23 +20,14 @@ Changelog
   :class:`ensemble.BaggingRegressor` meta-estimators for ensembling
   any kind of base estimator. See the :ref:`Bagging <bagging>` section of
   the user guide for details and examples. By `Gilles Louppe`_.
-
-- Speed improvement of the :mod:`sklearn.ensemble.gradient_boosting` module.
-  By `Gilles Louppe`_ and `Peter Prettenhofer`_.
 
-- Memory improvements of extra trees and random forest by
-  `Arnaud Joly`_.
+- Memory improvements of decision trees, by `Arnaud Joly`_.
 
-- Reduce memory usage and overhead when fitting and predicting with forests
-  of randomized trees in parallel with ``n_jobs != 1`` by leveraging new
-  threading backend of joblib 0.8 and releasing the GIL in the tree fitting
-  Cython code. By `Olivier Grisel`_ and `Gilles Louppe`_.
-
 - Decision trees can now be built in best-first manner by using ``max_leaf_nodes``
   as the stopping criteria. Refactored the tree code to use either a
   stack or a priority queue for tree building.
   By `Peter Prettenhofer`_ and `Gilles Louppe`_.
-
+
 - Decision trees can now be fitted on fortran- and c-style arrays, and
   non-continuous arrays without the need to make a copy.
   If the input array has a different dtype than ``np.float32``, a fortran-
@@ -51,13 +42,24 @@ Changelog
 - Changed the internal storage of decision trees to use a struct array.
   This fixed some small bugs, while improving code and providing a small
   speed gain. By `Joel Nothman`_.
+
+- Reduce memory usage and overhead when fitting and predicting with forests
+  of randomized trees in parallel with ``n_jobs != 1`` by leveraging new
+  threading backend of joblib 0.8 and releasing the GIL in the tree fitting
+  Cython code. By `Olivier Grisel`_ and `Gilles Louppe`_.
 
+- Speed improvement of the :mod:`sklearn.ensemble.gradient_boosting` module.
+  By `Gilles Louppe`_ and `Peter Prettenhofer`_.
+
 - Various enhancements to the :mod:`sklearn.ensemble.gradient_boosting`
   module: a ``warm_start`` argument to fit additional trees,
   a ``max_leaf_nodes`` argument to fit GBM style trees,
   a ``monitor`` fit argument to inspect the estimator during training, and
   refactoring of the verbose code. By `Peter Prettenhofer`_.
 
+- Fixed bug in :class:`gradient_boosting.GradientBoostingRegressor` with
+  ``loss='huber'``: ``gamma`` might have not been initialized.
+
 - Fixed feature importances as computed with a forest of randomized trees
   when fit with ``sample_weight != None`` and/or with ``bootstrap=True``.
   By `Gilles Louppe`_.
@@ -110,9 +112,6 @@ Changelog
   and predictive power.
   By `Eustache Diemert`_.
 
-- Fixed bug in :class:`gradient_boosting.GradientBoostingRegressor` with
-  ``loss='huber'``: ``gamma`` might have not been initialized.
-
 - :class:`dummy.DummyClassifier` can now be used to predict a constant
   output value. By `Manoj Kumar`_.

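The ``max_leaf_nodes`` entry moved by this diff describes best-first tree growth. A minimal sketch of how that option is used, assuming a current scikit-learn install and a synthetic dataset (both illustrative, not part of the commit):

# Best-first tree building: with max_leaf_nodes set, nodes are expanded in
# order of impurity reduction until the leaf budget is exhausted.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

tree = DecisionTreeClassifier(max_leaf_nodes=8, random_state=0).fit(X, y)
print(tree.get_n_leaves())  # no more than 8 leaves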
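The relocated entry about fitting and predicting forests in parallel with ``n_jobs != 1`` is exercised simply by passing ``n_jobs``; the joblib threading backend and GIL release mentioned in the entry are internal details. A minimal sketch, assuming a synthetic regression dataset:

# Fit and predict a forest of randomized trees in parallel.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=1000, n_features=20, random_state=0)

# n_jobs=-1 uses all available cores for both fit() and predict().
forest = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
forest.fit(X, y)
predictions = forest.predict(X)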
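The gradient-boosting entry kept as context lists ``warm_start``, ``max_leaf_nodes`` and a ``monitor`` fit argument. A minimal sketch of the first and last of these, with an illustrative dataset and monitor callback that are assumptions, not taken from the commit:

from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=500, n_features=10, random_state=0)

def monitor(i, est, locals_):
    # Called after each boosting stage; returning True stops training early.
    if i % 10 == 0:
        print(f"stage {i}: train score {est.train_score_[i]:.3f}")
    return False

gbr = GradientBoostingRegressor(n_estimators=50, warm_start=True, random_state=0)
gbr.fit(X, y, monitor=monitor)

# With warm_start=True, raising n_estimators and refitting adds 50 more
# stages to the existing model instead of training from scratch.
gbr.set_params(n_estimators=100)
gbr.fit(X, y)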
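The ``dummy.DummyClassifier`` context line refers to predicting a fixed, user-chosen class regardless of the input. A minimal sketch with toy data (illustrative only):

import numpy as np
from sklearn.dummy import DummyClassifier

X = np.zeros((6, 1))              # features are ignored by DummyClassifier
y = np.array([0, 1, 1, 0, 1, 0])

# strategy="constant" always predicts the class given by `constant`,
# which must be one of the labels seen in y.
clf = DummyClassifier(strategy="constant", constant=1).fit(X, y)
print(clf.predict(np.zeros((3, 1))))  # -> [1 1 1]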