Arif

12:14:08 pm on February 21, 2010
What are cross-validation and bootstrapping?
Cross-validation and bootstrapping are both methods for estimating generalization error based on “resampling”.
In k-fold cross-validation, you divide the data into k subsets of (approximately) equal size. You train the net k times, each time leaving one of the subsets out of training and using only that omitted subset to compute whatever error criterion interests you. If k equals the sample size, this is called “leave-one-out” cross-validation.
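The procedure above can be sketched in a few lines of Python. This is a minimal illustration using only the standard library, not any particular package's API; the `train`/`predict`/`error` functions are placeholder names for whatever model and error criterion you care about (here a trivial mean predictor with squared error stands in for a net):

```python
import random

def k_fold_cv(xs, ys, k, train, predict, error):
    """Estimate generalization error by k-fold cross-validation."""
    idx = list(range(len(xs)))
    random.shuffle(idx)
    # Split the shuffled indices into k subsets of (approximately) equal size.
    folds = [idx[i::k] for i in range(k)]
    errors = []
    for held_out in folds:
        held = set(held_out)
        train_idx = [i for i in idx if i not in held]
        # Train on everything except the held-out fold...
        model = train([xs[i] for i in train_idx], [ys[i] for i in train_idx])
        # ...and evaluate only on the omitted subset.
        preds = [predict(model, xs[i]) for i in held_out]
        errors.append(error(preds, [ys[i] for i in held_out]))
    # Average the k per-fold error estimates.
    return sum(errors) / k

# Placeholder "model": predict every y by the training-set mean.
train = lambda xs, ys: sum(ys) / len(ys)
predict = lambda mean, x: mean
mse = lambda ps, ts: sum((p - t) ** 2 for p, t in zip(ps, ts)) / len(ps)

xs = list(range(20))
ys = [2.0 * x + random.gauss(0, 1) for x in xs]
cv_error = k_fold_cv(xs, ys, 5, train, predict, mse)
```

Setting `k = len(xs)` in the same call gives the leave-one-out estimate, since each fold then contains exactly one observation.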