Posted on October 26, 2013
Here’s what Marc Randazza said to an academic about her proposed revenge-porn statute:
While you’re sitting on your ass “teaching people how to think like a lawyer,” I’m actually out front on this issue, *litigating* these kinds of cases.
I think your law is fucking idiotic. Absolutely. Fucking. Idiotic.
Nothing but the academic circle jerk and a few vote-starved legislators could possibly consider *criminalizing* the publication of photographs to be tolerable. So go write another law review article about something else you have no first-hand experience about, and leave the legal work to the big boys and girls.
I’m neither part of the academic circle-jerk nor a vote-starved legislator, and I think that criminalizing some publication of photographs is tolerable (as some publication—obscenity, child porn—is already criminalized) as long as it doesn’t narrow First Amendment protections. ((I hope Marco, who fights revenge-porn publishers in civil court, will tell me why I’m wrong.))
Single-issue advocates usually write bad laws. So we can probably do better than the true believers at drafting a criminal revenge-porn statute that might pass First Amendment muster. Our advantage is the ability to look at both sides of the issue and meet the arguments against constitutionality. Not having our chances of tenure dependent on our success, we are not paralyzed by fear of failure. Not paralyzed by fear of failure, we can avoid the activists’ six major mistakes: overstating the case, misstating the law, making handwaving generalizations, demonizing disagreement, using false analogies, and lying.
Let’s take a crack at it, shall we?
The evil that we’re trying to eliminate is people posting nude or explicit images of former lovers online without the former lovers’ consent. At the very least, when Lucrezia shares a nude selfie with Giovanni, we want Giovanni to risk conviction if he posts the image on a revenge porn website with Lucrezia’s address and phone number to humiliate Lucrezia.
The First Amendment problem we face is that “posting nude or explicit images of former lovers online” is speech; a statute focused on such posting is a content-based regulation of speech; content-based regulations of speech are presumed to be invalid (that is, speech is presumed to be protected); and the Supreme Court in U.S. v. Stevens expressly rejected a balancing test for content-based criminal laws, instead applying a categorical test.
While UH law prof Josh Blackman has said that “Invariably, the court will balance interests in First Amendment jurisprudence” and UCLA law prof Eugene Volokh has suggested that the current definition of obscenity might be expanded to encompass revenge porn, we want our statute to be constitutional here and now, rather than in some speculative world in which the Supreme Court retreats from U.S. v. Stevens or rewrites the test for obscenity.
A scattershot approach will not work. The categories of unprotected speech that the Supreme Court has recognized are narrowly drawn. If a criminal statute arguably forbids both fighting words and obscenity, then it likely forbids a great deal of speech that is neither, and therefore fails constitutional muster. We want our statute to cover only speech that fits in one of the already-recognized categories of unprotected speech. Let’s pick a category, and go to work.
I pick obscenity. While the proposed statute that I was analyzing here would not survive a First Amendment challenge, and its author’s justifications for it are undeveloped and petulant, my analysis of the idea that sexual or nude images published nonconsensually could ipso facto be obscenity was incomplete and, I suspect, ultimately wrong. At the heart of obscenity are community standards, and a community might well find a particular revenge-porn publication obscene.
For a work to be obscene, it must: appeal to prurient interests; depict sexual conduct in a patently offensive way; and lack serious value. The test refers to “a work,” so you might assume that the test for obscenity relates to inherent qualities of the work. But what is obscene when distributed to children is not necessarily obscene when distributed to adults; what is obscene in Ogden is not necessarily obscene in San Francisco; and as the Court said in 1996 in Denver Area Educational Telecommunications Consortium, Inc. v. F.C.C.:
[W]hat is “patently offensive” depends in part on context (the kind of program on which it appears), degree (not “an occasional expletive”), and time of broadcast (a “pig” is offensive in “the parlor” but not the “barnyard”).
Since obscenity is context-sensitive, an image that is not obscene when Lucrezia publishes it to Giovanni might well be part of an obscene publication when Giovanni distributes it in a different context.
For example, publishing Lucrezia’s selfie next to her employer’s name and phone number might be more offensive to the community ((Which community?)) than just publishing the picture but not identifying her.
It might be argued that if Giovanni’s republication of Lucrezia’s image is obscene, current obscenity law (for example, Texas Penal Code Section 43.23) already forbids it. But if we’re attacking revenge porn as the particular evil that it is, we might not want to rely simply on obscenity laws. Since Lucrezia’s image cannot be presumed to be inherently obscene, we want the jury to consider at least the lack of consent, how Giovanni distributed it, and perhaps even why Giovanni distributed it in deciding whether it is patently offensive.
If we want our statute upheld as a constitutional restriction on obscenity, we can’t simply stamp our feet and declare:
Disclosing pictures and videos that expose an individual’s genitals or reveal an individual engaging in a sexual act without that individual’s consent easily qualifies as a “patently offensive representation” of sexual conduct. Such material moreover offers no “serious literary, artistic, political, or scientific value.”
Whether the publication is patently offensive, and whether it has serious value, are questions that will have to be left to the jury if our statute is to be upheld on obscenity grounds.
A challenge to our statute will be an “as written” challenge, so the appellate courts will not be outraged by a record of Giovanni’s bad acts and the harm he caused Lucrezia. To meet that challenge, we have to define the crime so that there is little chance that someone whose distribution of images was not obscene will be convicted.
Here we necessarily run into community standards: while the whole idea of revenge porn is offensive to us, a jury of twelve might not find a particular publication patently offensive. This is a risk that we have to take—for our statute to be upheld under anything resembling current obscenity law, we have to be willing to bow to the standards of the community, which means making the image’s violation of those standards an element of the offense.
So our proposed statute might have a basic framework something like this:
[The definition of sexual conduct in (A) and (B), I’ve lifted from Texas’s obscenity statute. It could be better written, but that’s not necessary for our purposes.]

A person who intentionally distributes a photograph of another without the other’s express consent commits an offense if:

- (1) The average person, applying contemporary community standards, would find, taking into account the manner of its distribution and the lack of consent, that taken as a whole the image appeals to the prurient interest in sex;
- (2) Taking into account the manner of its distribution and the lack of consent, the image depicts or describes:
  - (A) Patently offensive representations or descriptions of ultimate sexual acts, normal or perverted, actual or simulated, including sexual intercourse, sodomy, and sexual bestiality; or
  - (B) Patently offensive representations or descriptions of masturbation, excretory functions, sadism, masochism, lewd exhibition of the genitals, the male or female genitals in a state of sexual stimulation or arousal, covered male genitals in a discernibly turgid state, or a device designed and marketed as useful primarily for stimulation of the human genital organs; and
- (3) Taking into account the manner of its distribution and the lack of consent, the image, taken as a whole, lacks serious literary, artistic, political, and scientific value.
Nathaniel Burney may well have something to say about the required culpable mental states—intent for the posting, but strict liability for the lack of express consent.
In the First Amendment arena, we may run into the problem of the statute overturned by the Court in R.A.V. v. City of St. Paul: the State is not permitted to select, based on its content, some unprotected speech to forbid and some to permit. We could resolve this by removing the non-obscenity content criterion, that is, by forbidding the obscenely nonconsensual distribution of any material, rather than only of an image of another.
The lack of consent would, I think, be a manner-and-means restriction, rather than a content restriction, but the problem, if the statute is not content-based, is whose consent? We could remove the consent element, but then what we would have is an obscenity statute, with “manner of distribution” specified as a factor in the statute. I don’t see any obvious problems with this—a legislature could direct juries to take into account particular details of the context when deciding whether a work is patently offensive, provided that it left the decision up to the jury.
But if lack of consent is an important element of our criminalization of revenge porn—and it is, for we are trying to protect Lucrezia from Giovanni, not to protect Lucrezia from herself—we may have a “patently” problem.
In English “patently” means “clearly; without doubt,” but in law “patently” (pronounced pay’-tent-ly) means “appearing on its face,” the opposite of “latently.”
If “patently” in “patently offensive” has its common meaning, then the lack of consent (which is not necessarily shown on the face of the publication—Giovanni might even claim when distributing the image that Lucrezia asked him to share it) may be considered by the jury in determining whether the publication is patently offensive.
But if the “patently” in “patently offensive” has its legal meaning, then unless Lucrezia’s lack of consent appears on the face of the publication it is not a factor that a jury should consider in deciding whether the publication is obscene.
The Supreme Court hasn’t given any explicit guidance on which meaning “patently” has. On the one hand, the Supreme Court, being crowded with law geeks, generally uses terms in their legal sense; on the other, the Supreme Court has approved laws that allow juries of laypeople to decide what is “patently offensive” without defining “patently.”
The Supreme Court has described the thing that must be patently offensive as “a work” (rather than “an act of publication”), but it has made it clear that circumstances extraneous to the work (context and time of broadcast) are relevant to the determination.
So this statute has the advantage over others proposed of fitting into the current framework of First Amendment law. An appellate court finding it constitutional might be misguessing what the Supreme Court means by “patently,” but it wouldn’t be discovering a new category of unprotected speech, nor even expanding a currently recognized category.
The downside, from the eliminate-revenge-porn perspective, is that treating revenge porn as obscenity requires that the State prove much that is non-trivial to prove, and gives Giovanni lots of room to defend himself—for example, he could bring in an expert to explain to the jury why his republication of Lucrezia’s image has serious artistic value.
That the State’s burden would be non-trivial is an upside from the defend-free-speech perspective. The harder we make it for Giovanni to be convicted, the more likely it is that our statute will pass muster. We cannot eliminate the “no serious value” element (for example) without rewriting obscenity law, and it is a premise of this post that we want our statute to be constitutional under the current First Amendment regime.
It may not be possible to convict everyone who is caught republishing paramours’ sex pictures without their consent (never mind the difficulty in catching everyone who does so). A constitutional revenge-porn statute, even if it doesn’t make conviction of every publisher inevitable, will dissuade some people from publishing revenge porn; an unconstitutional revenge-porn statute, on the other hand (like those statutes passed in California and New Jersey), will be a joke, setting back the fight against revenge porn by five or ten years. If we want our statute to be constitutional, we may have to face the fact that it is not possible both to eliminate revenge porn and to defend free speech; we may have to settle for discouraging and disrupting revenge porn rather than eliminating it.
Unlike some who have proposed revenge-porn statutes, I welcome dissent. My self-image doesn’t depend on my being right. I love to be shown that I’m wrong, even publicly, because then I can stop being wrong. So, please, tell me how I’m wrong.
19 Comments
The difference, of course, is that some of the proponents of these laws want to treat revenge porn as categorically obscene as opposed to requiring proof that a particular instance is obscene.
The attempt to create a new category is what the Supremes rejected in Stevens. Could a court elect to create a new category? Anything is possible, but Stevens does not encourage it.
By the way, this still makes you a misogynist who craves women’s suffering, I’m sure.
I think proponents of other revenge-porn statutes want to declare revenge porn obscene as a matter of law, which wouldn’t be the creation of a new category so much as a radical expansion of obscenity, defining the relevant community as the state legislature.
I think it might be educational to try my hand at a statute that treats revenge porn as “fighting words” rather than obscenity. Within the limits of current law, though, I think this is about the best we’re going to do.
How have the courts handled community standards in the internet era? Could an argument be successfully made that for purposes of online content, the relevant community is the group of users that visit the website where the material is posted? Since the visitors to a revenge porn website probably wouldn’t find revenge porn to be patently offensive, images posted to that site would be protected speech. But the same images could be patently offensive if they were posted on MySpace or Facebook or whatever the kids are using these days.
Could the terms of use for a website help define what material is offensive and what isn’t? If in order to access a revenge porn website a visitor had to click a button verifying that the user is okay with revenge porn, would that prevent the images from being deemed patently offensive?
It’s a good question, and it has been for years.
If “the relevant community” is “those who will see it on the Internet,” then arguably there is no obscenity on the Internet, since people can select what they see on the Internet and avoid things that are patently offensive.
Then arguably there’s no obscenity in most of the real world either, as long as we’re not talking about something like pornographic billboards. The community of people who will enter the XXX Adult Video XXX store probably doesn’t find its contents offensive either.
To me, (an evil minded person bent on breaking this law should it ever come into existence), the weak link in your statute is this part here: “A person who intentionally distributes a photograph of another without the other’s express consent commits an offense…”
So all any revenge-minded ex has to do is store the photos on the cloud with a really weak password like 123ABC and then claim that hackers must have got them. The cops can’t prove any different, and it’s not like they are going to throw hundreds of hours of computer forensics into trying to disprove the claim. Besides, without getting the IP records from the revenge porn site (which isn’t gonna happen), there is no way to prove who uploaded what and when (assuming TOR wasn’t used)! Or you can just say your cell phone got stolen or hacked. It’s a very plausible claim to make. After all, Christopher Chaney of Jacksonville, Florida (the guy who brought to the world the glorious nudes of Scarlett Johansson and many other celebs) got his pics from cell phone hacks and cloud hacks. He ended up getting 10 yrs. for his trouble.
But truly, the weak link in your statute involves the distribution part of it. High school kids passing around pics of their girlfriends aren’t going to care that there is a law. And even if they get busted, the pictures are already online, forever, so the damage is done. So you are not going to solve the problem that you are really wanting to solve. It’s like passing a law that makes it illegal to smoke dope, but not illegal to traffic and trade in it, and thinking that you’re going to make the dope all go away.
How about coming up with a law that would make it illegal for such websites to exist to begin with? But that cannot be done.
Any constitutional statute will be easy for people of at least moderate intelligence to wire around. The difficulty of enforcing a criminal law may be an argument against the law, or may be an argument in favor of harsher punishment for those few who will be convicted.
Why? Why deal with it as a crime? Just because Franks wants to do so doesn’t make it a good idea. Just because we don’t like revenge porn (as we understand that to be, which notably nobody’s law actually mentions or defines) doesn’t mean the proper policy to address it is criminal.
By engaging in this exercise, are you suggesting that you think it should be a crime? If not, then why? If so, then there really needs to be a different discussion. So the gal who posts a selfie of her cheating boyfriend in a moment of outrage and anger requires jail? We may need a better takedown regimen, but not another crime.
In any event, as I told Nathan, if you can’t pass the Anthony Weiner test, then it’s dead in the water. Does this pass? It doesn’t appear to. No sense getting into nitty gritty until it can withstand scrutiny on the most superficial of levels. Unless you believe that the woman who revealed Weiner’s selfies committed a crime, then the law fails.
I would have a variety of other issues with your law, ranging from adequacy of notice of the conduct in issue to overbreadth and vagueness, but because I do not agree that it’s appropriate to address this as a crime, I’m disinclined to spend my time going there.
Wow, shg, that’s a good point! Just because we don’t like something, does that mean there ought to be a law against it? But I strongly suspect that the anti-men crowd and the nanny state PC liberals will keep pushing for this.
Scott, whether revenge porn should be a crime is deserving of discussion, rather than Franksian fiat. That Fredd so enthusiastically agrees with you should be cause for concern.
My proposed statute is essentially Texas’s obscenity statute, which already passes constitutional muster; revenge porn could be prosecuted under that statute, and the State could argue to the trial court that “patently” means “clearly.”
The Weiner’s Wiener test is satisfied because republication of Weiner’s dick pics had serious political value, and because the community of people who would be offended by a slimy politician being outed is vanishingly small.
I agree with Mr. Greenfield. As a community we should look beyond the Penal Code to find solutions to these and other problems. It’s bad enough that victims of revenge porn are subject to public humiliation; they shouldn’t have to pay the bills related to prosecuting and punishing the offenders, too.
From what I’ve gathered reading up on revenge porn, it is often part of a scheme in which the women can pay money, either to the revenge porn site or to a third party (usually only pretending to be) unaffiliated with the revenge porn site. That’s already very close to extortion, if it isn’t extortion. What about targeting that part of the scheme? No reputable businesses will want to advertise on revenge porn sites, due to social stigma. If you criminalize attempts to charge money for the removal of pictures, would the operators of revenge porn sites still have a viable business model? Criminalizing the demand for money rather than the publication of the pictures would avoid the First Amendment concerns, wouldn’t it? Combined with social pressure to inhibit ad revenue, how attractive would the revenge porn business be to the kind of scumbags who would be interested in it? Perhaps I’m missing something here. If so, I welcome correction.
This is a very good idea, Michael. While I disagree with the criminalization aspect of it, why not public shaming of the advertisers and the payment processors (like PayPal)? You make it uncomfortable to do business, the money won’t flow as easily, and then there will be less incentive to run a revenge porn website. While it might be fun and cute to run the site, after a while the “thrill” of being able to stick it to a bunch of random, unknown women will get pretty old once that first service bill comes in. Nobody runs that shit as a hobby. They do it for the money. You make it uncomfortable for PayPal, and they’ll go away.
The beauty is that no new criminal law is needed. Demanding money from women depicted on a revenge porn site is fraud if done under the guise of an independent “reputation management” service that is in reality affiliated with the revenge porn site.
[…] Mark Bennett offers a “revenge porn” statute that might pass constitutional muster. […]
First of all, let’s look at the basics. The Bill of Rights and the Constitution were not designed to protect the group; they were designed to protect individual rights. Those First Amendment rights are granted to the individuals, not the government, businesses, or corporations. There are already laws in place allowing people who have been shamed to file civil lawsuits for libel, slander, and defamation of character, not to mention a false-light lawsuit. And they would be able to be filed against the publisher (i.e., the website owner) as well as the person who posted the information without the consent of the person in the photograph. Using someone’s photograph or representation of them without their express consent is already against the law in most states; try to use John Wayne’s photograph to sell a product without the express consent of his estate. And since these websites make money from these photographs, there should be statutes in place to take care of that. And if not, that’s where the laws should be written: to protect the individual’s right not to have photographs published that they have not consented to. Yes, I know the media will be screaming that this type of law violates their First Amendment rights, but the fact is that First Amendment rights are there to protect the individual, not the media.
So there are already laws in place to give satisfaction to victims of revenge websites. The idea that the website and the person who posted the information can be sued for everything they own should be enough to deter anyone from even putting up one of these websites.
I guess the bottom line here is that this is another sex-charged law in the making, where the sex should be taken out of it and perhaps a law passed to protect everyone from having unwanted pictures of themselves posted on the web. I can hear the media sources screaming right now.
[…] websites liable. She will argue that does nothing to undermine Section 230. A more reasoned and thoughtful look at the issue, however, shows how this effort is fraught with dangerous consequences and […]