Ten Lies Pop Culture Teaches About Sex

Originally posted on CovenantEyes.com

“Right or wrong, our culture teaches us about sex. The media conducts sex education all the time.” –Stephen Witmer

Pop culture is a powerful force in society, as it colors our views of social issues, people, relationships, and sexual activity. What we see, read, and hear through the media influences our worldview on these important matters. We need an internal filter, moral compass, and high esteem of God’s word if we are to live rightly and thrive in our relational and sexual lives.

Sex is a topic we would all do well to treat with care, being intentional not to swallow whole the ideas put forth by the culture. A secular view of sexuality abounds with lies, and the way most entertainment presents sexuality cannot do justice to such a profound and meaningful act. Here are ten lies pop culture teaches about sex.

Read the full article at babypinkroses.blogspot.com
