I'm curious what everyone here thinks: is America becoming more sexual because of popular media (TV shows, music videos, books, etc.), or are Americans simply becoming more open-minded about sex thanks to society's increasingly "live and let live" attitude? In other words, maybe risque media isn't MAKING people raunchy; it's just a sign that we've grown more accepting of other people's sexuality.
What do you think: is media making us more sexual, or are we more open-minded and entertainment just reflects that? Maybe both?