With politics so dominant in the media lately, I can't help but think about our sexual freedoms. Many states still have laws prohibiting acts like anal and oral sex. I personally think that any consensual act between adults should be legal and part of our right to pursue happiness. So the statement I want to propose is this: whoever wins the nomination and the presidential election should fight for our sexual freedoms and change the current laws once they are in the White House.
True or False
If false, why should things stay the same?