
Commit 646450b

josephsalmon authored and jnothman committed
DOC Update linear_model.rst : min should be \min (scikit-learn#11374)
1 parent 7f90f3c commit 646450b

File tree: 1 file changed (+9, -9 lines)

doc/modules/linear_model.rst

Lines changed: 9 additions & 9 deletions
@@ -31,7 +31,7 @@ of squares between the observed responses in the dataset, and the
 responses predicted by the linear approximation. Mathematically it
 solves a problem of the form:
 
-.. math:: \underset{w}{min\,} {|| X w - y||_2}^2
+.. math:: \underset{w}{\min\,} {|| X w - y||_2}^2
 
 .. figure:: ../auto_examples/linear_model/images/sphx_glr_plot_ols_001.png
    :target: ../auto_examples/linear_model/plot_ols.html
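In scikit-learn this objective is minimized by :class:`LinearRegression`. A minimal usage sketch (assuming NumPy and scikit-learn are installed):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Exactly linear data: y = 1 + 2*x, so the OLS objective
# ||X w - y||_2^2 can be driven to zero.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = 1.0 + 2.0 * X.ravel()

reg = LinearRegression().fit(X, y)
print(reg.intercept_, reg.coef_)  # approximately 1.0 and [2.0]
```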
@@ -83,7 +83,7 @@ of squares,
 
 .. math::
 
-   \underset{w}{min\,} {{|| X w - y||_2}^2 + \alpha {||w||_2}^2}
+   \underset{w}{\min\,} {{|| X w - y||_2}^2 + \alpha {||w||_2}^2}
 
 
 Here, :math:`\alpha \geq 0` is a complexity parameter that controls the amount
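This penalized objective is the one :class:`Ridge` minimizes; its `alpha` parameter is the :math:`\alpha` above. A small sketch showing the shrinkage effect:

```python
import numpy as np
from sklearn.linear_model import Ridge

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = 1.0 + 2.0 * X.ravel()

# alpha is the complexity parameter in the objective above;
# larger alpha means stronger shrinkage of w toward zero.
weak = Ridge(alpha=0.01).fit(X, y)
strong = Ridge(alpha=100.0).fit(X, y)
print(weak.coef_[0], strong.coef_[0])
```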
@@ -170,7 +170,7 @@ weights (see
 Mathematically, it consists of a linear model trained with :math:`\ell_1` prior
 as regularizer. The objective function to minimize is:
 
-.. math:: \underset{w}{min\,} { \frac{1}{2n_{samples}} ||X w - y||_2 ^ 2 + \alpha ||w||_1}
+.. math:: \underset{w}{\min\,} { \frac{1}{2n_{samples}} ||X w - y||_2 ^ 2 + \alpha ||w||_1}
 
 The lasso estimate thus solves the minimization of the
 least-squares penalty with :math:`\alpha ||w||_1` added, where
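The :math:`\ell_1` penalty makes :class:`Lasso` produce sparse coefficients. A minimal sketch (synthetic data chosen for illustration):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
X = rng.randn(50, 5)
y = 3.0 * X[:, 0]  # only the first feature is informative

# The alpha * ||w||_1 penalty drives uninformative
# coefficients exactly to zero (sparsity).
lasso = Lasso(alpha=0.1).fit(X, y)
print(lasso.coef_)
```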
@@ -319,7 +319,7 @@ Mathematically, it consists of a linear model trained with a mixed
 :math:`\ell_1` :math:`\ell_2` prior as regularizer.
 The objective function to minimize is:
 
-.. math:: \underset{w}{min\,} { \frac{1}{2n_{samples}} ||X W - Y||_{Fro} ^ 2 + \alpha ||W||_{21}}
+.. math:: \underset{w}{\min\,} { \frac{1}{2n_{samples}} ||X W - Y||_{Fro} ^ 2 + \alpha ||W||_{21}}
 
 where :math:`Fro` indicates the Frobenius norm:
 
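This mixed-norm objective belongs to :class:`MultiTaskLasso`, which selects the same features for all tasks. A small illustrative sketch:

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.RandomState(0)
X = rng.randn(60, 4)
W_true = np.zeros((4, 2))
W_true[0] = [2.0, -3.0]   # both tasks use only feature 0
Y = X @ W_true

# The ||W||_{21} penalty zeroes entire rows of W, so the
# same features are selected across tasks.
mtl = MultiTaskLasso(alpha=0.1).fit(X, Y)
print(mtl.coef_.shape)  # (n_tasks, n_features) = (2, 4)
```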
@@ -355,7 +355,7 @@ The objective function to minimize is in this case
 
 .. math::
 
-   \underset{w}{min\,} { \frac{1}{2n_{samples}} ||X w - y||_2 ^ 2 + \alpha \rho ||w||_1 +
+   \underset{w}{\min\,} { \frac{1}{2n_{samples}} ||X w - y||_2 ^ 2 + \alpha \rho ||w||_1 +
    \frac{\alpha(1-\rho)}{2} ||w||_2 ^ 2}
 
 
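This is the :class:`ElasticNet` objective; its `l1_ratio` parameter plays the role of :math:`\rho` (1.0 is pure lasso, 0.0 pure ridge). A minimal sketch:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.RandomState(0)
X = rng.randn(50, 3)
y = 3.0 * X[:, 0]

# l1_ratio corresponds to \rho in the objective above; alpha
# scales the whole penalty.
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
print(enet.coef_)
```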
@@ -402,7 +402,7 @@ The objective function to minimize is:
 
 .. math::
 
-   \underset{W}{min\,} { \frac{1}{2n_{samples}} ||X W - Y||_{Fro}^2 + \alpha \rho ||W||_{2 1} +
+   \underset{W}{\min\,} { \frac{1}{2n_{samples}} ||X W - Y||_{Fro}^2 + \alpha \rho ||W||_{2 1} +
    \frac{\alpha(1-\rho)}{2} ||W||_{Fro}^2}
 
 The implementation in the class :class:`MultiTaskElasticNet` uses coordinate descent as
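A small sketch of :class:`MultiTaskElasticNet`, whose `l1_ratio` is the mixing parameter :math:`\rho` of the objective above:

```python
import numpy as np
from sklearn.linear_model import MultiTaskElasticNet

rng = np.random.RandomState(0)
X = rng.randn(60, 4)
W_true = np.zeros((4, 2))
W_true[0] = [2.0, -3.0]  # shared sparsity pattern across tasks
Y = X @ W_true

# l1_ratio is \rho above; the row-wise ||W||_{21} part enforces
# joint feature selection, the Frobenius part adds ridge shrinkage.
mten = MultiTaskElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, Y)
print(mten.coef_.shape)  # (2, 4)
```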
@@ -734,12 +734,12 @@ regularization.
 As an optimization problem, binary class L2 penalized logistic regression
 minimizes the following cost function:
 
-.. math:: \underset{w, c}{min\,} \frac{1}{2}w^T w + C \sum_{i=1}^n \log(\exp(- y_i (X_i^T w + c)) + 1) .
+.. math:: \underset{w, c}{\min\,} \frac{1}{2}w^T w + C \sum_{i=1}^n \log(\exp(- y_i (X_i^T w + c)) + 1) .
 
 Similarly, L1 regularized logistic regression solves the following
 optimization problem
 
-.. math:: \underset{w, c}{min\,} \|w\|_1 + C \sum_{i=1}^n \log(\exp(- y_i (X_i^T w + c)) + 1).
+.. math:: \underset{w, c}{\min\,} \|w\|_1 + C \sum_{i=1}^n \log(\exp(- y_i (X_i^T w + c)) + 1).
 
 Note that, in this notation, it's assumed that the observation :math:`y_i` takes values in the set
 :math:`{-1, 1}` at trial :math:`i`.
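These two objectives correspond to :class:`LogisticRegression` with `penalty="l2"` and `penalty="l1"` respectively, with `C` the inverse regularization strength in both. A minimal sketch:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# scikit-learn accepts any two class labels; internally the loss
# uses the {-1, 1} convention of the objectives above.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([0, 0, 1, 1])

# penalty="l2" gives the first objective; penalty="l1" (with a
# compatible solver such as "liblinear") gives the second.
clf = LogisticRegression(penalty="l2", C=1.0).fit(X, y)
print(clf.predict([[-3.0], [3.0]]))
```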
@@ -1137,7 +1137,7 @@ The loss function that :class:`HuberRegressor` minimizes is given by
 
 .. math::
 
-   \underset{w, \sigma}{min\,} {\sum_{i=1}^n\left(\sigma + H_m\left(\frac{X_{i}w - y_{i}}{\sigma}\right)\sigma\right) + \alpha {||w||_2}^2}
+   \underset{w, \sigma}{\min\,} {\sum_{i=1}^n\left(\sigma + H_m\left(\frac{X_{i}w - y_{i}}{\sigma}\right)\sigma\right) + \alpha {||w||_2}^2}
 
 where
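Because the Huber loss :math:`H_m` grows only linearly for large residuals, :class:`HuberRegressor` is robust to outliers. A small sketch with synthetic data:

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

rng = np.random.RandomState(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = 2.0 * X.ravel()
y[:3] += 50.0  # a few gross outliers

# H_m downweights the large residuals, so the fitted slope
# stays close to the true value 2.0 despite the outliers.
huber = HuberRegressor(alpha=0.0001).fit(X, y)
print(huber.coef_)
```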