Recompense committed
Commit 425b361 · verified · 1 Parent(s): ce77842

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -69,7 +69,7 @@ The primary evaluation metric used is Accuracy. A confusion matrix was also gene
  * **Accuracy:** The proportion of correctly classified images out of the total number of images evaluated.
 
  $$
- \text{Accuracy} = \frac{\text{Number of correct predictions}}{\text{Total number of predictions}}
+ ACCURACY = \frac{\text{Number of correct predictions}}{\text{Total number of predictions}}
  $$
 
  * **Confusion Matrix:** A table that visualizes the performance of a classification model. Each row represents the instances in an actual class, while each column represents the instances in a predicted class.
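For context, a minimal sketch of how these two metrics could be computed for an image classifier, assuming scikit-learn is available; the label arrays below are purely illustrative and are not taken from this repository:

```python
# Sketch only: Accuracy and confusion matrix with scikit-learn.
# y_true / y_pred are hypothetical class indices, not real evaluation data.
from sklearn.metrics import accuracy_score, confusion_matrix

y_true = [0, 1, 2, 2, 1, 0]   # ground-truth labels (illustrative)
y_pred = [0, 2, 2, 2, 1, 0]   # model predictions (illustrative)

# Accuracy = number of correct predictions / total number of predictions
acc = accuracy_score(y_true, y_pred)

# Rows correspond to actual classes, columns to predicted classes
cm = confusion_matrix(y_true, y_pred)

print(f"Accuracy: {acc:.3f}")
print(cm)
```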