---
title: Confusion Matrix
emoji: 📉
colorFrom: yellow
colorTo: green
sdk: gradio
sdk_version: 3.17.0
app_file: app.py
pinned: false
tags:
- evaluate
- metric

description: >-
  The confusion matrix evaluates classification accuracy: entry C(i, j) is
  the number of observations known to be in group i and predicted to be in
  group j. In binary classification, C(0, 0) counts true negatives,
  C(0, 1) false positives, C(1, 0) false negatives, and C(1, 1) true
  positives.
---

# Metric Card for Confusion Matrix


## Metric Description

Compute the confusion matrix to evaluate the accuracy of a classification.
By definition, a confusion matrix $C$ is such that $C_{i,j}$ is equal to
the number of observations known to be in group $i$ and predicted to be in
group $j$.

Thus, in binary classification, the count of true negatives is $C_{0,0}$,
false negatives is $C_{1,0}$, true positives is $C_{1,1}$, and false
positives is $C_{0,1}$.
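
To make the indexing convention concrete, here is a minimal from-scratch sketch (illustrative only; `confusion_matrix_from_scratch` is a hypothetical helper, not part of this metric):

```python
# C[i][j] counts samples whose true label is the i-th label (row) and whose
# predicted label is the j-th label (column), with labels in sorted order.
def confusion_matrix_from_scratch(references, predictions):
    labels = sorted(set(references) | set(predictions))
    index = {label: k for k, label in enumerate(labels)}
    matrix = [[0] * len(labels) for _ in labels]
    for ref, pred in zip(references, predictions):
        matrix[index[ref]][index[pred]] += 1
    return matrix

print(confusion_matrix_from_scratch([0, 1, 0, 1], [1, 1, 1, 0]))
# [[0, 2], [1, 1]]  ->  tn=0, fp=2, fn=1, tp=1
```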


## How to Use

At minimum, this metric requires predictions and references as inputs.

```python
>>> import evaluate
>>> cfm_metric = evaluate.load("confusion_matrix")
>>> results = cfm_metric.compute(references=[1, 2, 3, 2, 1, 1, 0, 2], predictions=[1, 0, 3, 2, 2, 1, 0, 3])
>>> print(results)
{'confusion_matrix': [[1, 0, 0, 0], [0, 2, 1, 0], [1, 0, 1, 1], [0, 0, 0, 1]]}
```


### Inputs
- **predictions** (`list` of `int`): Predicted labels.
- **references** (`list` of `int`): Ground truth labels.
- **normalize** (`str`, *optional*): One of `"true"`, `"pred"`, or `"all"`. Normalizes the confusion matrix over the true labels (rows), the predicted labels (columns), or the whole population. If `None` (the default), the confusion matrix is not normalized. See the sketch after this list.
- **sample_weight** (`list` of `float`, *optional*): Sample weights. Defaults to `None`.
- **labels** (`list` of `int`, *optional*): Labels used to index the matrix. May be used to reorder the labels or to select a subset of them. If `None` (the default), the labels that appear at least once in `references` or `predictions` are used in sorted order. Also covered in the sketch after this list.
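
A sketch of the `normalize` and `labels` options, assuming the metric forwards these arguments to `sklearn.metrics.confusion_matrix` as documented above (exact output formatting may differ):

```python
>>> import evaluate
>>> cfm_metric = evaluate.load("confusion_matrix")
>>> # normalize="true": each row is divided by its row sum.
>>> results = cfm_metric.compute(references=[0, 0, 1, 1], predictions=[0, 1, 1, 1], normalize="true")
>>> print(results)
{'confusion_matrix': [[0.5, 0.5], [0.0, 1.0]]}
>>> # labels=[1, 2]: restrict the matrix to the given labels, in the given order.
>>> results = cfm_metric.compute(references=[0, 1, 2, 2], predictions=[0, 2, 2, 1], labels=[1, 2])
>>> print(results)
{'confusion_matrix': [[0, 1], [1, 1]]}
```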

### Output Values
- **confusion_matrix** (`list` of `list` of `int` or `float`): The confusion matrix. Without normalization, each entry is a non-negative count, and all entries sum to the number of examples; with `normalize` set, each entry is a fraction between 0.0 and 1.0. Higher values on the main diagonal correspond to more accurate predictions.

Output Example(s):
```python
{'confusion_matrix': [[1, 0, 0, 0], [0, 2, 1, 0], [1, 0, 1, 1], [0, 0, 0, 1]]}
```
This metric outputs a dictionary containing the confusion matrix.
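
Since the unnormalized matrix contains raw counts, summary statistics can be recovered from it directly. A minimal sketch, assuming `results` is the dictionary returned by `compute()` in the example above:

```python
matrix = results["confusion_matrix"]
correct = sum(matrix[i][i] for i in range(len(matrix)))  # diagonal entries are correct predictions
total = sum(sum(row) for row in matrix)                  # all entries sum to the number of examples
accuracy = correct / total                               # 5 / 8 = 0.625 for the matrix above
```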

### Examples

These examples call `sklearn.metrics.confusion_matrix` directly:

```python
>>> from sklearn.metrics import confusion_matrix
>>> y_true = [2, 0, 2, 2, 0, 1]
>>> y_pred = [0, 0, 2, 2, 0, 2]
>>> confusion_matrix(y_true, y_pred)
array([[2, 0, 0],
       [0, 0, 1],
       [1, 0, 2]])

>>> y_true = ["cat", "ant", "cat", "cat", "ant", "bird"]
>>> y_pred = ["ant", "ant", "cat", "cat", "ant", "cat"]
>>> confusion_matrix(y_true, y_pred, labels=["ant", "bird", "cat"])
array([[2, 0, 0],
       [0, 0, 1],
       [1, 0, 2]])
```

In the binary case, the counts of true negatives, false positives, false negatives, and true positives can be extracted with `ravel()`:

```python
>>> tn, fp, fn, tp = confusion_matrix([0, 1, 0, 1], [1, 1, 1, 0]).ravel()
>>> (tn, fp, fn, tp)
(0, 2, 1, 1)
```
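
From these counts, related quantities such as precision and recall follow directly; a minimal sketch using the values above (these formulas are standard and not specific to this metric):

```python
>>> tn, fp, fn, tp = 0, 2, 1, 1
>>> precision = tp / (tp + fp)  # fraction of positive predictions that are correct
>>> recall = tp / (tp + fn)     # fraction of actual positives that are found
>>> (round(precision, 3), recall)
(0.333, 0.5)
```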

## Citation(s)
```bibtex
@article{scikit-learn,
  title={Scikit-learn: Machine Learning in {P}ython},
  author={Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V.
         and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P.
         and Weiss, R. and Dubourg, V. and Vanderplas, J. and Passos, A. and
         Cournapeau, D. and Brucher, M. and Perrot, M. and Duchesnay, E.},
  journal={Journal of Machine Learning Research},
  volume={12},
  pages={2825--2830},
  year={2011}
}
```
## Further References
- [Wikipedia entry for the confusion matrix](https://en.wikipedia.org/wiki/Confusion_matrix) (note that Wikipedia and other references may use a different convention for axes)