Before we dive into who's suppressing your nookie time online, a note on terminology: “censorship” is a word that has grown a surprising number of meanings. The technical definition, and the one used by the American Library Association, is: “The removal of material from open access by government authority.”
However, people also often use the term “censorship” in private-business situations, such as when a blog does not publish derogatory comments, or Apple's App Store does not offer a sexually explicit application for sale. There are many gray areas between these two definitions when it comes to technical censorship, and while this article focuses largely on government-induced censorship, the other types are well worth investigating.
How Did We Get Here?
Earlier this year, the European Council floated the concept of a “Great Firewall of Europe” to prevent child-abuse images from passing into European Union countries. Like many censorship attempts, a plan that sounds impossible to object to at first glance turns out to be a maelstrom of poor design, reaching much further than its intended scope. Even a prominent German organization of child-abuse survivors opposed the firewall on the grounds that “blocking just hides the problem and actually lowers police incentive to become active.” I would add that there is no global consensus on what exactly constitutes “child-abuse imagery.” The age of consent isn't even the same between the U.S. and the U.K., let alone the rest of the globe.
The entire question of what counts as “imagery” remains hotly contested. Does it include anime? Adult actors pretending to be underage? Written works? Sketches? (How do you decide the age of sketched people, anyway? More on that later.) While questioning whether real depictions of sexual acts should be treated differently from artistic ones may seem like splitting hairs, it's at the very heart of the debate over technology-based censorship.
In the early days of cartoons, animated icon Betty Boop was particularly explicit for the times. In the early 1930s, she was depicted dancing topless in a hula skirt and strategically-drawn lei, tied up by a would-be rapist, and stroked inappropriately by an employer who implies her job is on the line if she doesn't put out. Because Betty Boop was animated, many liberties could be taken far beyond what would have been accepted in art forms depicting real people—for a while.
Public outcry about immorality in film resulted in the creation of what is now the Motion Picture Association of America (MPAA) and its “Production Code,” which restricted what could be explored in the U.S. film industry. From 1930 to 1968, a film was not allowed to “lower the moral standards of those who see it,” depict “suggestive dances,” or positively portray sexual relationships outside of marriage. Interestingly, this was not governmental censorship, but a measure created and enforced by Hollywood studios in hopes that self-policing would prevent governmental regulation. (It should be noted, though, that governmental interference was happening in other ways. A Supreme Court decision holding that motion pictures were not protected by the First Amendment was not overturned until 1952.)
By 1968, many studios outside the Hollywood bigwigs ruled by the Production Code were releasing, and profiting from, films that would never have earned the MPAA's blessing, and an influx of foreign films also made enforcement difficult. Thus, the Code was abolished, and the familiar movie-ratings system began. While this may seem somewhat distant and irrelevant, our history with technology and censorship directly affects today's issues of sex censorship on the web. Just as the MPAA struggled to regulate a vast, international industry not always willing to bow to its specific concerns, so too do Internet censors face the question of how to clamp down on huge amounts of content produced around the world. Very few types of content are considered illegal globally, but with global access, how are national standards upheld?
Do You Know It When You See It?
The question of a private international entity's responsibility and/or right to censor content is still a live one. A few years ago, the popular social networking service LiveJournal suffered through a difficult period of self-censorship, ostensibly brought on by pressure from conservative activist groups concerned by some of the sexually explicit content hosted on LJ pages. While LiveJournal was well within its rights to ban users, delete communities, and even remove specific unwanted “interests” from user pages to prevent people from finding others with similar tastes, the enormous user backlash against such swift and far-reaching actions did its own damage to the company's name.
LiveJournal took particularly harsh actions against one artist due to a perception that a sketch depicted underage characters. What would be almost impossible for a company to know, however, is that while this particular drawing depicted characters from Harry Potter, who are indeed underage in some of the books and films, it's very common in Harry Potter fandom to “age up” the characters before placing them in sexual situations in fanfiction or artwork.
How is a private company supposed to know the inner workings of fandoms? Then again, if a private company isn't doing its homework, is it appropriate to take extreme actions? Moral panic is always in the eye of the beholder.
Where Do We Go From Here?
The argument over net neutrality is a frequent anxiety-inducer for content providers on the blue side of things. Some broadband providers seek to create a “tiered” Internet service, speeding up traffic to some websites and slowing traffic to others depending upon the fees paid to them by content providers (and, presumably, end users like you). Thus, for example, if Verizon and CNN were to strike a partnership, Verizon customers might find CNN loading instantaneously while MSNBC languishes, in contrast to the equal speed and access you find today under net neutrality.
The sexual implications seem obvious: few providers of online sexual content could afford whatever additional fees the big ISPs angle for, and the explosion of sexual freedom online could quickly find itself trammeled if users discover that accessing sex sites suddenly triggers flashbacks to 1994 and dial-up modems.
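To make the “tiered” idea from the two paragraphs above concrete, here is a minimal Python sketch of the incentive structure: sites whose owners pay the ISP land in a fast lane, everyone else gets throttled. The site names, speeds, and fee arrangement are entirely hypothetical and exist only for illustration; no real ISP's configuration is being described.

```python
# Hypothetical model of a tiered network: paying partners get a fast lane,
# everyone else gets a slow one. All names and numbers are invented.

FAST_LANE_MBPS = 100  # bandwidth granted to partners who pay the (hypothetical) fee
SLOW_LANE_MBPS = 1    # bandwidth granted to everyone else

# Hypothetical list of content providers that have paid this ISP.
paying_partners = {"cnn.example"}

def delivery_speed(site: str) -> int:
    """Return the bandwidth (in Mbps) the ISP grants traffic from `site`."""
    return FAST_LANE_MBPS if site in paying_partners else SLOW_LANE_MBPS

def seconds_to_load(site: str, page_size_megabits: float) -> float:
    """How long a page of the given size takes to arrive at that speed."""
    return page_size_megabits / delivery_speed(site)

if __name__ == "__main__":
    for site in ("cnn.example", "msnbc.example", "small-adult-site.example"):
        print(f"{site:30s} {seconds_to_load(site, 40):6.1f} s to load a 40 Mb page")
```

The dial-up-era slowdown described above falls straight out of that one fee check: the same page that arrives in a fraction of a second for a paying partner takes the better part of a minute for a site that can't afford the toll.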
Outside of the net neutrality debate, other moves by ISPs to control what traffic reaches their customers are sometimes worrisome. AT&T recently suggested that the U.S. government create a national blacklist of websites which all Internet service providers would be required to block. The question of whether private companies (which, while private, handle a large percentage of public Internet traffic) should be able to strong-arm governments into censorship is a complex one.
AT&T has already drawn a certain line in the sand by partnering with Apple for the iPhone, which takes great and obvious pains to restrict adult content, stopping just shy of actually blacklisting websites. However, considering how much Internet traffic is related in some way to sexual material, ISPs actually have a business interest in continuing to provide speedy, reliable access to porn.
If censorship in general is complex, tech censorship is a box of yarn after two dozen kittens have had their way with it. The conflicts of interest from various high-powered parties suggest that the yarn won't get straightened out anytime soon, but in the meantime, the Electronic Frontier Foundation addresses freedom of speech issues in emerging technology, and the National Coalition for Sexual Freedom works toward equal rights for those involved in non-mainstream sexualities.