black_kat_meow
hihiwhywhy
You obviously don't know much about your religion. Christian literature still proclaims man to be the head of the house, whose final decision the wife should always defer to. Christians believe that the most important role a woman has is caring for the children and the family. You need to understand that the context at the time was one in which women were considered 'inferior'.
The context today is that females and males are equal, with women having more rights in some areas and men still having more in others (e.g. wage differentials between male and female managers).
My bf's family is Christian; they have large volumes of both moderate and evangelical literature in their house, which I've read. It all proclaims these messages.
They like to call it "being equal, but not the same." Apparently the male is still the natural leader who should be deferred to.
So, no.
Study your religion a bit more closely in relation to women's rights (not that you'd care anyway, being a male).