Update README.md
README.md CHANGED
@@ -67,8 +67,11 @@ Evaluation was performed on the overall validation dataset. Further analysis cou
 
 The primary evaluation metric used is Accuracy. A confusion matrix was also generated to visualize per-class performance.
 
-* **Accuracy:** The proportion of correctly classified images out of the total number of images evaluated.
-
+* **Accuracy:** The proportion of correctly classified images out of the total number of images evaluated.
+$$
+\text{Accuracy} = \frac{\text{Number of correct predictions}}{\text{Total number of predictions}}
+$$
+
 
 * **Confusion Matrix:** A table that visualizes the performance of a classification model. Each row represents the instances in an actual class, while each column represents the instances in a predicted class.
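The two metrics added in this diff can be sketched in code. The following is a minimal, self-contained example (not from the project itself — the label values and class count are hypothetical) showing how accuracy and a confusion matrix are computed from true and predicted class indices:

```python
def accuracy(y_true, y_pred):
    # Accuracy = number of correct predictions / total number of predictions
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def confusion_matrix(y_true, y_pred, num_classes):
    # Each row is an actual class, each column a predicted class.
    m = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

# Hypothetical labels for six images across three classes (0, 1, 2).
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]

print(accuracy(y_true, y_pred))            # 4 of 6 correct -> 0.666...
print(confusion_matrix(y_true, y_pred, 3))
```

In practice a library routine such as scikit-learn's `confusion_matrix` would typically be used; the hand-rolled version above just makes the row/column convention from the README explicit.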