A lot of women I know, including myself, find that bras actually hurt us rather than help us. For the longest time I was told they were necessary to support breasts and improve posture in the spine and shoulders. My chiropractor said he was part of a study a few years ago (not as a participant, but monitoring it), and they found that women who didn't wear bras actually had fewer back problems.
Now, this isn't the reason I don't want to wear one. My reason is that I feel more comfortable without one. But for some reason, some people seem to have a huge issue with seeing someone's nipples get hard under their shirt. Most men find it sexy, including my hubby. But two people at the grocery store told me I should wear a bra and stop showing off my goods underneath my LOOSE t-shirt. I mean, it's not like I was wearing a tank top. And even if I was, who cares?
But anyway, enough of me. What do you all think? Voting is not private, but please explain your reasoning.