Interpreting cross validation

User: "yerisderanak"
New Altair Community Member

Hi guys!
I'm a total beginner, so please bear with me.
I have a process set up, with a Cross Validation at the end. Inside of it I have Deep Learning, Apply Model and Performance operators. So far so good, and after I run it (4h later :D) I get a Confusion Matrix and an Accuracy. And here is my question:
So I have accuracy: 35.42% +/- 47.83% (mikro: 35.42%)
Is the accuracy the average accuracy of all the models trained, right?
And is the +/- 47.83% the variance?
And for the confusion matrix, is it from the last model trained, or is it some kind of summary of all the runs?
To be exact, I use k-fold cross validation, so maybe my understanding of that process is wrong.
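In code terms, I think what I'm asking is something like the sketch below (a rough scikit-learn stand-in for the RapidMiner operators, not my actual process), where each fold gives its own accuracy and the confusion matrices could be summed:

```python
# Minimal sketch (scikit-learn as a stand-in for the RapidMiner operators) of what
# k-fold cross validation reports: one accuracy per fold, their mean, their
# standard deviation (the "+/-" part), and a confusion matrix summed over folds.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

X, y = load_iris(return_X_y=True)          # toy dataset with 3 classes
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)

fold_accuracies = []
summed_confusion = np.zeros((3, 3), dtype=int)

for train_idx, test_idx in kfold.split(X, y):
    model = MLPClassifier(max_iter=1000).fit(X[train_idx], y[train_idx])
    preds = model.predict(X[test_idx])
    fold_accuracies.append(accuracy_score(y[test_idx], preds))
    summed_confusion += confusion_matrix(y[test_idx], preds, labels=[0, 1, 2])

# Mean and standard deviation across the k folds -- the "accuracy +/- x%" line.
print(f"accuracy: {np.mean(fold_accuracies):.2%} +/- {np.std(fold_accuracies):.2%}")
# One confusion matrix built from all folds' test predictions, not just the last fold.
print(summed_confusion)
```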

Thx in advance and sorry for the noob question!
