<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>thesIt &#187; faq</title>
	<atom:link href="http://lakm.us/thesit/tag/faq/feed/" rel="self" type="application/rss+xml" />
	<link>http://lakm.us/thesit</link>
	<description>a computer science research log in a semi-microblogging style</description>
	<lastBuildDate>Tue, 24 Aug 2010 21:34:55 +0000</lastBuildDate>
	<generator>http://wordpress.org/?v=2.9</generator>
	<language>en</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
			<item>
		<title>What are cross-validation and bootstrapp &#8230;</title>
		<link>http://lakm.us/thesit/257/what-are-cross-validation-and-bootstrapp/</link>
		<comments>http://lakm.us/thesit/257/what-are-cross-validation-and-bootstrapp/#comments</comments>
		<pubDate>Sun, 21 Feb 2010 12:14:08 +0000</pubDate>
		<dc:creator>Arif</dc:creator>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[cross-validation]]></category>
		<category><![CDATA[faq]]></category>
		<category><![CDATA[neural network]]></category>

		<guid isPermaLink="false">http://xp-racy.lan/s2/?p=257</guid>
		<description><![CDATA[What are cross-validation and bootstrapping?
Cross-validation and bootstrapping are both methods for estimating generalization error based on &#8220;resampling&#8221;.
In k-fold cross-validation, you divide the data into k subsets of (approximately) equal size. You train the net k times, each time leaving out one of the subsets from training, but using only the omitted subset to compute whatever [...]]]></description>
			<content:encoded><![CDATA[<p><a href="http://www.faqs.org/faqs/ai-faq/neural-nets/part3/section-12.html">What are cross-validation and bootstrapping?</a></p>
<p>Cross-validation and bootstrapping are both methods for estimating generalization error based on &#8220;resampling&#8221;.</p>
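<p>As a quick illustration of the resampling idea (a minimal sketch added for this post, not part of the original FAQ text; it uses only the Python standard library), the fold splitting behind k-fold cross-validation can be written as:</p>

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) pairs for k-fold cross-validation.

    The sample indices are cut into k contiguous, roughly equal-sized folds;
    each fold in turn serves as the held-out test set while the remaining
    indices form the training set.
    """
    indices = list(range(n_samples))
    fold_size, remainder = divmod(n_samples, k)
    start = 0
    for fold in range(k):
        # Spread any remainder over the first few folds so sizes differ by at most 1.
        stop = start + fold_size + (1 if fold < remainder else 0)
        test = indices[start:stop]
        train = indices[:start] + indices[stop:]
        yield train, test
        start = stop

# With k equal to the sample size this degenerates to leave-one-out CV:
for train, test in k_fold_splits(4, 4):
    print(test)  # each held-out set is a single sample
```

<p>In practice you would train the model on each <code>train</code> index set and accumulate your chosen error criterion over the matching <code>test</code> set.</p>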
<p>In k-fold cross-validation, you divide the data into k subsets of (approximately) equal size. You train the net k times, each time leaving out one of the subsets from training, but using only the omitted subset to compute whatever error criterion interests you. If k equals the sample size, this is called &#8220;leave-one-out&#8221; cross-validation.</p>]]></content:encoded>
			<wfw:commentRss>http://lakm.us/thesit/257/what-are-cross-validation-and-bootstrapp/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
