I apologize for the length of this, but I think it's too important for a lj-cut.
Something I think every human being forgets sometimes is that what looks like a clear and important distinction to a member of a particular group is invisible to those outside that group. I'm generalizing here; it's not just one specific incident. But I can see it happening here, and even though I had and have nothing at all to do with this and never will again (and even before I left wasn't involved in this issue), I hope I can maybe re-align the camera a bit and make people look at things from a slightly different angle.
In the eyes of the United States government, "child pornography" is defined as -- elisions and boldface mine -- "a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting, that [...] depicts an image that is, or appears to be, of a minor engaging in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex [...] It is not a required element of any offense under this section that the minor depicted actually exist."
In the eyes of the law, in the eyes of law enforcement, in the eyes of the people who make the legislation, a drawing of a sixty-year-old Character A giving a blowjob to a twelve-year-old Character B is identical to a picture or a video of a sixty-year-old Person A giving a blowjob to a twelve-year-old Person B. This holds true whether Characters A and B are Dumbledore and Harry Potter or "that creepy guy who lives down the road" and "his next-door neighbor he fantasizes about".
I think this is the part that fandom misses, a lot. Fandom as a whole is a very anything-goes, libertarian society, which is part of what I think is so awesome about it. Fandom (and yeah, I know I'm generalizing here) generally believes that the solution to speech an individual finds "wrong" or "distasteful" is more speech, and fandom believes in frequent discourse and ongoing conversations.
Every few months or so, the topic gets kickstarted again, and it'll make its rounds: what kind of obligation does fandom-at-large have to produce social pressure against things that are unacceptable? Who defines unacceptable? And there are two major camps that I see every time that discussion starts up: one that says that "fandom" (scare quotes included because fandom is hardly a unified beast) should consider it our responsibility to self-police lest someone from outside come in and take care of it, and one camp that says, again, that the solution to "bad" speech is more speech. The second viewpoint is usually the more common.
Social media -- and despite being firmly in the Web 1.0 camp, LiveJournal is a social media site even if it's not only (or even primarily) a social networking site -- has come under considerable fire from both the court of public opinion and various state legislatures in the past year. It's a false panic, and anyone with any degree of familiarity with social media can explain to you why it's a false panic; the "we must protect the CHYLDRYN!" refrain is repeated by people who pass stories back and forth over and over again, with the story getting more and more conflated and inflated every round. (Nothing new under the sun, from Salem to McCarthy to now.) There isn't a child molester lurking under every digital rock any more than in the 1980s there were Satanic ritual abuse cults lurking in every shadow. Nobody, nobody who runs a social media community believes that there is.
In December of 2006, two US senators proposed a bill to strengthen anti-child-pornography laws, particularly aimed at social networking services. In March, the Connecticut legislature proposed a bill that would require any "social network" to verify the age and real-world identity of every single one of its users, and restrict access to anyone under 18 without parental permission. In May, eight state Attorneys General requested that MySpace turn over records of any and all registered sex offenders who had accounts on the site.
Will any of these bills pass? Who can say? Will these bills be challenged in court if they're enacted into law? Absolutely, and this is why everybody who believes that the solution to bad speech is more speech should be giving money to the EFF and the ACLU. The point is that it is an issue that is very much on the public radar, and there have been a number of proposed laws and a number of court decisions that seek -- deliberately and specifically -- to nibble away at all of the reasons for striking down prior legislative attempts such as the Communications Decency Act.
It sucks. It sucks. Everyone I have ever spoken to who works in the social media industry -- and I've spoken on these topics to a lot of people who work in the social media industry, from major players to tiny startups -- gets visibly, vehemently, incandescently, and often scatologically furious any time the topic comes up, because this legislation hits every single one of them where it hurts: free expression.
And every time I've participated in one of those conversations, there wind up being two camps: the camp that says that companies should consider it their responsibility to self-police, lest someone from outside come in to take care of it, and the camp that says that the solution to bad speech is more speech. And, again, the second viewpoint is more common.
I don't have any inside knowledge on how this plays into any particular policy decision or specific case on LiveJournal anymore; I recused myself from anything even vaguely touching on this issue long before I even gave notice. But I think that what fandom forgets sometimes -- because we know how we use language, we know what we intend when we write or draw or create something, we know that these manifestations of our creativity are not at all intended to promote an atmosphere where the sexual abuse of children is encouraged and normalized -- is that an outsider, without that cultural background and shared vocabulary, doesn't know these things.
And I'm not talking about the owners of social media sites or the people who handle reports of a policy violation on social media sites. I'm talking about Alberto Gonzales and John McCain; I'm talking about the attorneys general of Connecticut, Georgia, Idaho, Mississippi, New Hampshire, North Carolina, Ohio and Pennsylvania; I'm talking about all of the names listed here. I'm talking about the FBI and the NCMEC, about Detective Joe who became your town police's "computer crimes" forensics guy because he knew how to run disk recovery software, about Mrs. Grundy in Lincoln, Nebraska who's simply scandalized every time she catches an episode of "To Catch A Predator".
Every person I've ever spoken to who works in the social media industry lives in mortal fear that these guys are going to manage to get through legislation that does get rid of all the things that the ACLU used in ACLU vs. Reno to get the CDA overturned -- because they've come damn close to being able to. Everybody who works in the social media industry -- hell, anyone who has a website based out of the area covered by the 9th circuit court, which includes the Bay Area -- is shitting themselves over Fair Housing Council v. Roommates.com, which sets a fucking awful precedent about Section 230 immunity for ISPs and came out of the 9th circuit court, traditionally the most liberal.
And every person I've spoken to who works in the social media industry is starting to realize that right now, in this climate, in this world, the answer is going to be for sites to self-police -- to set the least restrictive policies they can while still being able to say that they are making a good-faith effort -- before something found on their service winds up being used as an excuse for legislators to produce over-broad laws that strike down anything that could even remotely be considered "dangerous". Because the owners of these sites are looking at the long run, at the big picture, and they know that law enforcement and legislators would prefer to clean house for them rather than letting them do it themselves.
What does this mean for fandom? Setting aside any question of whether it's "right" or "wrong" to produce sexually explicit material (textual or visual) about someone under the age of 18 or 16 or 14 or what-have-you, setting aside any question about what the laws actually say and whether or not the laws are fucking insane -- all of which I have seen debated in the past few months, and none of which I want to get into, discuss, argue, or explain -- and setting aside all of the conspiracy theories about and fury (justified or inflated) over LJ's actions, which I am never, ever, ever touching again so help me God, the fact of the matter is that right now, in today's climate, explicit porn with kids in it is a major risk. Not only to the service you're posting it on, not only to your account on a specific service, but to you. (And the bad news is, you don't even have to write it, draw it, or deliberately seek it out, as Julie Amero found out.)
Text is safer; text will always be safer. There's a long-running tradition in case law supporting the "literary merit" (as part of the "serious literary, artistic, political, or scientific value" of the Miller test for obscenity) of written material. Art? Well, I'm not an artist, fan- or otherwise, but if I were, in this current legal and political climate, you would not catch me producing sexually-explicit material where the people depicted could even conceivably look under 18 if you squinted -- even if it never left my hard drive.
I believe that the free expression of ideas -- even ideas with which we disagree, even ideas which are not popular, even ideas that someone or anyone might find abhorrent -- is the greatest thing any artist can uphold. But I also believe that the last thing fandom can afford, in this current legal and political climate, is to gain a reputation with the people who make and enforce these laws as "those people who promote child abuse and call it fiction or fantasy". And again, I'm not talking about on LJ; I'm talking as a whole.
We know that what we create has nothing to do with child abuse. But step back and take a look at the situation from the perspective of someone who doesn't know that, someone who's never heard of fandom before, and I hope you'll start to see why it's such a problematic issue. Law enforcement doesn't differentiate between an image of an actual minor engaged in sexual activity and, say, a photomanip of a sixteen-year-old Daniel Radcliffe's head pastede on yay onto the body of a fully legal porn star participating in same. Law enforcement doesn't differentiate between an image of an actual minor engaged in sexual activity and a painting of that exact same activity. Even if the people are fictional, even if the situation never happened, even if it's set a long time ago in a galaxy far far away and the characters are really five-hundred-year-old vampires in the bodies of fourteen-year-olds.
Be angry at LJ, if you're angry; there's certainly reason to be. But even having no clue about this specific incident, I know that there's far less reason for that anger than the rumor mill might have it. Be more angry at the society and the legal environment that makes a site have to adopt policies like these if it wants to stay alive -- and recognize that the people who are making these decisions for any social media site are desperately trying to keep their heads and their users' heads above water before the government tide comes sweeping in.
(I despise doing a hit-and-run, but I am very, very sorry to say that I am so far past the end of my rope with fury at everyone and everything involved in these issues that even the most innocuous comment causes a nuclear meltdown; comments are, therefore, disabled, and I will not be reading any post on my FL that has to do with the topic. If you'd like to link, you may without asking; link here.)
[ETA: Best Beloved points out that most people aren't necessarily familiar with the NCMEC, the National Center for Missing and Exploited Children; they are the quasi-governmental agency designated as recipients, through their Cyber Tipline, for reports of online child pornography as required by 42 USC 13032.]