What do you think? Should sex education be taught in schools? And please, for the love of God (HM, this is for you in particular), I just want straight talk. This isn't a religious discussion, and it's not for a religious class. I've already had a similar discussion with some of my very religious friends (well, actually I just sat back and watched them debate each other to the finish). I just want to know what you think, without damning those with opposing views to hell. Please, and thank you.