When Are Sexually-Themed Memes “Harmful to Minors”?–State v. Chapman

Chapman was a high school teacher. He texted a student multiple sexually-themed memes and remarks. The court details 11 items; this one is representative: “A picture of a cartoon character with white liquid on its stomach and the caption ‘When he taking his sweet ass time getting the cum rag'” (maybe this image?). The state prosecuted Chapman for felony distribution of “harmful to minors” material to a minor. Chapman challenged the materials’ status as “harmful to minors.” The lower court held that they “probably” were harmful to minors. On appeal, the 3-judge panel affirms but issues three opinions, a good indicator of a deeply divided court.

The statute defines “harmful to minors” material using the standard Miller test for obscenity, modified for kids (roughly following the statute upheld in Ginsberg v. New York):

(1) it describes or represents, in any form, nudity, sexual conduct, sexual excitement, or sado-masochistic abuse;
(2) considered as a whole, it appeals to the prurient interest in sex of minors;
(3) it is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable matter for minors; and
(4) considered as a whole, it lacks serious literary, artistic, political, or scientific value for minors.

Regarding the first requirement, the memes didn’t depict any nudity, but they textually described sexual conduct. The majority says that’s enough.

Regarding the second requirement, Chapman said the memes were just humorous. The majority responded: “the memes all suggest or use explicit language to refer to sexual activities or sexual situations in crude, vulgar, and degrading terms.”

Regarding the third requirement, the majority says it’s irrelevant if minors have widespread access to similarly sexually-themed memes. The test is what the adult community as a whole deems suitable for minors. The court says this is a fact question to address at trial: “the parties should provide evidence of whether and in what way the internet and social media platforms have altered the community standard.” The court also calls on the legislature to consider this issue:

given the pervasive nature of the internet and social media in today’s society, especially with teens, we think it not only prudent but also necessary for our statutory scheme to reflect the existence and use of these platforms. The current version of the statute was enacted in 1983—over two decades before many popular websites and social media platforms were even created. (For example, Facebook was launched in 2004, YouTube in 2005, Twitter in 2006, Instagram in 2010, and Snapchat in 2011). It would be valuable for the General Assembly to examine the operation of this statute and give any additional guidance that would recognize the impact of the vast expansion of internet communication in the years since this statute was enacted.

Regarding the fourth requirement, the majority says it wasn’t an abuse of discretion for the lower court to find the memes had no redeeming value.

The majority says Chapman waived the obvious First Amendment problems raised by the court’s interpretations.

A concurring judge blames the Internet for society’s ills:

“[T]he prevailing standards” of the values in our communities and in our society at large have been deeply coarsened and diminished by the Internet since this criterion became law in 1983. That such coarsening has extended to teenagers, such as the alleged victim in this case, is undeniable. Just ask any high school teacher about students’ language in the hallways during passing periods. A search of teenagers’ cellphones for content not “suitable for minors” would easily and sadly confirm this as well…

The ubiquity of smartphones for teenagers, together with the instant availability and almost completely uncensored nature of content on the Internet have been at the heart of this coarsening of values for minors, and indeed for us all. One need look no further than to websites or blogs that are just one click away, and especially to social media applications such as Facebook, Twitter, Instagram, SnapChat and TikTok, all of which provide addictive “free” features and essentially uncensored and unmoderated information of all types to the user.

Ugh, so much empirically dubious techlash. I wonder how the judge felt about the TV show “Married…With Children,” which debuted in 1987 and predates the mainstream Internet. The concurring judge also notes the impracticality of the legal test:

What are we to make of the fact that minors exchange matter not “suitable for minors” every day, especially teenagers in high school? Should that fact be considered in defining “the prevailing standards of the adult community with respect to what is matter suitable for minors?” What if a classmate, sibling or cousin who is just over the age of 18 shares “matter [not] suitable for minors” with a minor friend or relative who is just under age 18? This happens every day during students’ senior year of high school, as some students reach the age of majority while in high school while others do not. How should such and similar conduct inform the determination of “the prevailing standards?”…

“[T]he prevailing standards …” must be defined as what is factual, rather than what is aspirational. Once this is done, it will be clear that “the prevailing standards in the adult community with respect to what is suitable matter for minors” in 2022 are not the same as they were in 1983.

A dissenting judge points out that moral panics are common throughout history, including parents’ freakouts about Elvis’ hips and Dungeons and Dragons. Furthermore, the Internet has rearranged how we define “communities” for purposes of measuring community standards. Thus, “in 2022, the ‘adult community as a whole’ standard should include consideration of the internet community.” Finally, she criticizes the majority’s punt to the legislature:

the majority wishes to pretend the influx of material regularly shared amongst modern youth has not shifted the way we should view what is suitable for minors unless and until the legislature reconsiders the statute. But I cannot ignore these sweeping cultural changes. The sexually suggestive memes at issue are almost certainly in poor taste and I do not support the sharing of them with a seventeen-year-old. Nonetheless, I cannot find this material patently offensive to prevailing standards in the adult community with respect to what is harmful to a teenager on the cusp of adulthood in 2022.

Implications

Chapman’s behavior is indefensible. There are few circumstances where it’s appropriate for an adult to send memes of this nature to minors. The power imbalance between a teacher and a student makes this interaction even creepier, and his messages would be solid justification to remove him from the classroom.

Given the impropriety of Chapman’s activity, it’s tempting to assume that Chapman was grooming the student for sex. Groomers often engage in similar sexual “banter” before escalating their behavior. However, the state didn’t prosecute Chapman for grooming. Instead, it prosecuted him for sending bawdy memes to a minor. That’s what makes this case so troubling. Especially when no actual nudity or sexual imagery was displayed, holding that sexually-themed conversations between an adult and a minor are a felony sounds like a First Amendment violation.

The concurring judge emphasizes that many high schoolers have sent similar, or worse, material to each other. According to this ruling, all of that ordinary teen activity may also be criminal. In that sense, this case raises the same legal dilemma posed by teens sexting each other. Nominally, those images/videos are CSAM, so both sender and recipient can be prosecuted. Yet, if we interpret CSAM laws that broadly, authorities would literally have to lock up entire high schools. That can’t be the right result.

This case also reminded me of an incident involving Judge Kozinski from a dozen years ago. He stored sexually-themed memes on a private server and shared them with his contacts. This ultimately proved to be just a subset of his broader problematic behaviors; but before we understood the full scope of his misbehavior, there was a split of opinion about how to evaluate the material on Judge Kozinski’s server. On the one hand, it was clearly inappropriate for a high-profile federal judge. On the other hand, most of us had, at one point or another, shared similar material with our friends. Should Judge Kozinski be held to a higher standard than we held ourselves to?

Finally, this case highlights the problems with trying to determine “community standards” for the appropriateness of content. That standard might have made sense in an era when many conversations were local and only (sanitized) mass media reached across disparate geographic regions. The Internet has negated these implicit assumptions. Now that we’re all members of multiple virtual communities that cut across geographic borders, how do we even define a “community” or measure its standards of propriety?

Case citation: State v. Chapman, 2022 WL 852915 (Ind. Ct. App. March 23, 2022)