In the school I went to, boys and girls were separated for the sex-ed portion of health class. Girls learned only about their own bodies, boys learned only about theirs, and there was only the barest mention of putting the two together and what happens if you do. The result was boys not knowing the things you mentioned, but I also remember one of my female friends seeing a picture of a guy with an erect penis for the first time and thinking the ‘poor guy’ had some sort of deformity. (I’m not kidding.) Granted, the women generally knew more than the men, since there are a lot of dicks in pop culture, while ‘vagina’ is still a word only uttered under one’s breath in polite circles.