Whenever the topic of sex education and children comes up, there's an inevitable outcry from parents, politicians, and religious figures who think that (a) it should be taught at home, (b) the topics being taught are "inappropriate," or (c) teachers will do it wrong. None of which, frankly, speaks to the reality of what's happening with kids right now. There's a reason people joke about kids playing "doctor" -- kids are just as curious about their bodies, and the feelings they get from them, as adults are. They just don't have the knowledge to guide them along the way. So hey, wouldn't it be great if they could get that knowledge someplace safe and educational -- like, say, school?