Metric | Formulation | Definition |
---|---|---|
Accuracy | \(\frac{TP+TN}{TP+TN+FP+FN}\) | Accuracy is the proportion of correct predictions among all predictions made by the algorithm. |
Precision | \(\frac{TP}{TP+FP}\) | Precision is the proportion of predicted positive cases that are actually positive. |
Recall | \(\frac{TP}{TP+FN}\) | Recall is the number of true positives divided by the number of all actual positive samples (including those that should have been predicted positive). |
F1-score | \(2\times\frac{Precision\times Recall}{Precision+Recall}=\frac{TP}{TP+\frac{1}{2}(FP+FN)}\) | The F1-score is the harmonic mean of precision and recall. |
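The four metrics in the table can be sketched directly from the confusion-matrix counts. The snippet below is a minimal illustration; the counts (TP=8, TN=5, FP=2, FN=1) are made up for the example and do not come from the source.

```python
# Classification metrics computed from confusion-matrix counts,
# following the formulas in the table above.

def accuracy(tp, tn, fp, fn):
    """Proportion of correct predictions among all predictions."""
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp, fp):
    """Proportion of predicted positives that are truly positive."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Proportion of actual positives that were detected."""
    return tp / (tp + fn)

def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall: TP / (TP + (FP + FN)/2)."""
    return tp / (tp + 0.5 * (fp + fn))

# Illustrative counts (not from the source): TP=8, TN=5, FP=2, FN=1
tp, tn, fp, fn = 8, 5, 2, 1
print(accuracy(tp, tn, fp, fn))  # 0.8125
print(precision(tp, fp))         # 0.8
print(recall(tp, fn))            # 0.888...
print(f1_score(tp, fp, fn))      # 0.842...
```

Note that the two F1 expressions in the table agree: substituting the precision and recall fractions into the harmonic mean and simplifying yields \(TP/(TP+\frac{1}{2}(FP+FN))\).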