Computer-generated child pornography has an interesting legal history in the United States. A 1996 law made it illegal to possess or trade images that appear to be children engaged in sex, even though no child was involved in making them. But in 2002, the U.S. Supreme Court overturned that law, holding that it criminalized protected free speech. In response, Congress passed a 2003 law that (in relevant part) criminalized “simulated” or “virtual” child pornography as long as it was “obscene,” that is, lacking serious literary, artistic, political, or scientific value. As a result, Americans can be convicted of child pornography crimes involving drawings of children engaged in sex, computer-generated images, or adults intended to look like children, but only if the images are judged obscene. That fact-specific standard came into play in Shoemaker v. Taylor, an appeal of the denial of a habeas corpus petition in California.
Stephen Shoemaker was convicted of eight counts of child pornography possession under California state law, after police found the images among many, many more legal adult pornography images. He contended at trial that two of the images were initially innocent and had been digitally altered to be pornographic; the other six, he said, were innocent nudes. He was sentenced to 90 days in custody, lifelong sex offender registration, a fine, probation, and a year of sexual compulsiveness classes. His state appeals exhausted, he filed a habeas corpus petition in the federal district court for the Central District of California. He argued that the jury erred in finding that any of the images were prohibited child porn; that the judge erred when instructing the jury and permitting the prosecution to argue that the jury could consider the context of the pictures; and that his conviction was not supported by the evidence. The district court denied his petition.
The Ninth U.S. Circuit Court of Appeals affirmed. It first ruled that the six nude photographs were not innocent, after reviewing the actual images and applying the Dost factors to test for lasciviousness. It went on to rule that the allegedly morphed images, regardless of whether they were truly morphed, are also not protected speech. Although the Supreme Court has not decided whether images of real children that have been manipulated to look pornographic are protected, the Ninth reasoned that because such images involve real children, the concerns of New York v. Ferber are in play: children can be harmed by the circulation of a permanent record of their exploitation. That’s true even though no actual sexual abuse may have taken place, the court said, and the Second, Sixth and Eighth Circuits have held likewise. The court also agreed that allowing the prosecution’s context argument was error, but held the error harmless.
I would apply Ferber differently from the way the court did here. That case held that child pornography is not protected speech, as adult pornography is, because making it requires that a child be sexually exploited. That’s the harm to society created by child pornography. In a picture of a child that’s digitally altered, it’s not clear that the child is sexually exploited—certainly not in the same way, and possibly not at all. The reputation harm cited by the Ninth may not be a serious concern; there’s a limited audience for these images and the child took no part in making the image. But as a cyber crime attorney, I doubt that these images are disappearing—so courts may have more opportunities to consider this.