Computer science research log in semi-microblogging style


  • 12:14:08 pm on February 21, 2010

    What are cross-validation and bootstrapping?

    Cross-validation and bootstrapping are both methods for estimating generalization error based on “resampling”.

    In k-fold cross-validation, you divide the data into k subsets of (approximately) equal size. You train the net k times, each time leaving one of the subsets out of training, and using only that omitted subset to compute whatever error criterion interests you. Averaging the k error estimates gives the cross-validated estimate of generalization error. If k equals the sample size, this is called “leave-one-out” cross-validation.
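    The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: `train_fn` and `error_fn` are hypothetical callables supplied by the caller (e.g. a net-training routine and a loss computed on held-out data).

    ```python
    import random

    def k_fold_cross_validation(data, k, train_fn, error_fn, seed=0):
        """Estimate generalization error by k-fold cross-validation.

        train_fn(train_set) -> model
        error_fn(model, test_set) -> float
        Both are placeholders for whatever model and error criterion you use.
        """
        indices = list(range(len(data)))
        random.Random(seed).shuffle(indices)
        # Split the shuffled indices into k (approximately) equal folds.
        folds = [indices[i::k] for i in range(k)]
        errors = []
        for i in range(k):
            held_out = set(folds[i])
            train_set = [data[j] for j in indices if j not in held_out]
            test_set = [data[j] for j in folds[i]]
            model = train_fn(train_set)               # train on the other k-1 folds
            errors.append(error_fn(model, test_set))  # evaluate on the omitted fold
        return sum(errors) / k
    ```

    Setting `k = len(data)` makes each fold a single point, which is exactly the leave-one-out case.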


