83% of Americans identify themselves as "Christians."
That means they believe in a god who, when gendered at all, is usually identified as male. They also believe in his son, his living form, who once walked the Earth.
And what is the overriding theme of this religion? Everything will be OK as long as you trust god, do as he says, and give yourself in worship to him. What happens if you don't? Well, then you are punished with a violent, torturous eternity in hell.
Do as I say, or I will HURT YOU. This is what we believe, with no evidence that any of it exists.
But we're all flipped out that men treat women this way. Where could they possibly have learned the notion that it's OK for a man to say, "Do what I want, or I will HURT YOU"?
Ladies... you follow this religion, too. You abide by it. You approve of the message.
Why do we, as a society, expect better of flawed humans when 83% of them believe in a "perfect" deity that acts the same damn way?
Maybe, if we want to change how people behave, we need to get past all the crap they've been taught to believe.