
Commit b13acb1

hlin117 authored and jnothman committed
and related minor documentation fixes
1 parent 204472a commit b13acb1

2 files changed: +13 -6 lines changed

doc/modules/classes.rst

Lines changed: 2 additions & 1 deletion
@@ -803,6 +803,7 @@ details.
    metrics.average_precision_score
    metrics.brier_score_loss
    metrics.classification_report
+   metrics.cohen_kappa_score
    metrics.confusion_matrix
    metrics.f1_score
    metrics.fbeta_score
@@ -928,7 +929,7 @@ See the :ref:`metrics` section of the user guide for further details.
    metrics.pairwise.paired_manhattan_distances
    metrics.pairwise.paired_cosine_distances
    metrics.pairwise.paired_distances
-
+

.. _mixture_ref:

sklearn/metrics/classification.py

Lines changed: 11 additions & 5 deletions
@@ -271,7 +271,7 @@ def confusion_matrix(y_true, y_pred, labels=None, sample_weight=None):
def cohen_kappa_score(y1, y2, labels=None, weights=None):
    """Cohen's kappa: a statistic that measures inter-annotator agreement.

-    This function computes Cohen's kappa [1], a score that expresses the level
+    This function computes Cohen's kappa [1]_, a score that expresses the level
    of agreement between two annotators on a classification problem. It is
    defined as

@@ -282,7 +282,9 @@ def cohen_kappa_score(y1, y2, labels=None, weights=None):
    assigned to any sample (the observed agreement ratio), and :math:`p_e` is
    the expected agreement when both annotators assign labels randomly.
    :math:`p_e` is estimated using a per-annotator empirical prior over the
-    class labels [2].
+    class labels [2]_.
+
+    Read more in the :ref:`User Guide <cohen_kappa>`.

    Parameters
    ----------
@@ -313,8 +315,11 @@ class labels [2].
    .. [1] J. Cohen (1960). "A coefficient of agreement for nominal scales".
           Educational and Psychological Measurement 20(1):37-46.
           doi:10.1177/001316446002000104.
-    .. [2] R. Artstein and M. Poesio (2008). "Inter-coder agreement for
-           computational linguistics". Computational Linguistic 34(4):555-596.
+    .. [2] `R. Artstein and M. Poesio (2008). "Inter-coder agreement for
+           computational linguistics". Computational Linguistics 34(4):555-596.
+           <http://www.mitpressjournals.org/doi/abs/10.1162/coli.07-034-R2#.V0J1MJMrIWo>`_
+    .. [3] `Wikipedia entry for the Cohen's kappa.
+           <https://en.wikipedia.org/wiki/Cohen%27s_kappa>`_
    """
    confusion = confusion_matrix(y1, y2, labels=labels)
    n_classes = confusion.shape[0]
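
The docstring above defines :math:`\kappa` as :math:`(p_o - p_e) / (1 - p_e)`, with :math:`p_o` the observed agreement ratio and :math:`p_e` the chance agreement estimated from the annotators' empirical label priors. As a quick illustration of that definition (not part of this commit), here is a minimal sketch with made-up annotator labels that computes the unweighted statistic directly from the confusion matrix and compares it against metrics.cohen_kappa_score:

    import numpy as np
    from sklearn.metrics import cohen_kappa_score, confusion_matrix

    # Made-up labels from two annotators on the same eight samples.
    y1 = [0, 1, 1, 2, 2, 2, 0, 1]
    y2 = [0, 1, 2, 2, 2, 1, 0, 1]

    confusion = confusion_matrix(y1, y2)
    n = confusion.sum()
    p_o = np.trace(confusion) / n                                     # observed agreement
    p_e = (confusion.sum(axis=1) @ confusion.sum(axis=0)) / n ** 2    # chance agreement
    kappa_manual = (p_o - p_e) / (1 - p_e)

    print(kappa_manual)                # ~0.62 for these labels
    print(cohen_kappa_score(y1, y2))   # should match the manual value

The unweighted score only depends on the diagonal and the marginals of the confusion matrix, which is why the hand computation and the library call agree.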
@@ -1831,7 +1836,8 @@ def brier_score_loss(y_true, y_prob, sample_weight=None, pos_label=None):

    References
    ----------
-    https://en.wikipedia.org/wiki/Brier_score
+    .. [1] `Wikipedia entry for the Brier score.
+           <https://en.wikipedia.org/wiki/Brier_score>`_
    """
    y_true = column_or_1d(y_true)
    y_prob = column_or_1d(y_prob)
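
For the brier_score_loss reference fixed above: the Brier score for binary outcomes is the mean squared difference between the predicted probability of the positive class and the actual outcome. A minimal usage sketch with made-up data, assuming the positive class is labelled 1:

    import numpy as np
    from sklearn.metrics import brier_score_loss

    # Made-up binary outcomes and predicted probabilities of the positive class.
    y_true = np.array([0, 1, 1, 0])
    y_prob = np.array([0.1, 0.9, 0.8, 0.3])

    manual = np.mean((y_prob - y_true) ** 2)   # (0.01 + 0.01 + 0.04 + 0.09) / 4 = 0.0375
    print(manual)
    print(brier_score_loss(y_true, y_prob))    # should match the manual value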

0 commit comments
