Mid-week question for all y’all: Where did you get most of your sex education?
Perhaps more importantly, where do you think people should get sex ed from?
Should there be more or less talk about it?
It’s always a hotly debated topic at school board meetings and among religious and secular folks alike. I’m curious what my readers here think.