I can remember a time when Black women viewed certain sexual acts as taboo and often made comments such as, "Oh! Uh-uh. I don't do that. That's that white girl stuff." With our culture being deeply rooted in religion, some even stated that they'd been told certain acts were simply "unholy."

Yet, in recent years, there appears to have been an awakening among Black women. We're now more apt to openly admit to practices we would have previously denied. We even go so far as to advise other women on how to get more pleasure from the things they once did merely to please their partners. What has caused such a change of attitude? Is it a matter of growing tired of "most eligible bachelors" seeking interracial relationships? Is it "learning to like," or adapting to, the desires of our partners? Or was the initial hesitation simply a fear of the unknown?


The more progressive Christian would argue that Hebrews 13:4 deems whatever happens between a husband and wife in their bed acceptable, as long as it does not involve sin, such as including additional partners, animals, etc. This ideology directly contradicts traditional teachings that certain acts are unholy.

So, what has been the main contributor to this newfound liberation? Is it a Black women's lib movement, or are we simply satisfying curiosities and seeking new excitement? (You know, many men don't know that we get bored with monotony, too! Kiss me here, lick me there, bam bam bam. Done. The same way, every time, equals boredom.) Ladies, I need to hear from you. Are we embracing our inner white girl, or are we just FINALLY admitting to what we've always done?
