
So to Speak Podcast Transcript: A warning label on social media?


Note: This is an unedited rush transcript. Please check any quotations against the audio recording.

Nico Perrino: All right. Welcome back to So to Speak, the Free Speech Podcast, where every other week, we take an uncensored look at the world of free expression through personal stories and candid conversations. I am, as always, your host, Nico Perrino. There is currently a movement afoot to restrict young people's access to certain online content, namely pornography and social media. Critics of porn and social media argue that they can be harmful to minors. And states across the country are taking up the cause, considering laws that would impose age verification, curfews, parental opt-ins, and other restrictions on this content.

Much of this movement is being supercharged by NYU Professor Jonathan Haidt's No. 1 New York Times bestselling book, The Anxious Generation, which argues that social media, in particular, is rewiring young people's brains and damaging their mental health. Meanwhile, critics of the critics argue that the evidence of harm isn't so conclusive. And what's more, many of the proposed restrictions violate core civil liberties, such as privacy and free speech. So, who's right? Joining us to debate and discuss the issue is Clare Morell. She is a senior policy analyst at the Ethics and Public Policy Center.

She is currently working on a book due out next year titled The Tech Exit: A Manifesto for Freeing Our Kids. Clare, welcome to the show.

Clare Morell: Thank you so much for having me.

Nico Perrino: Also joining us in studio is Ari Cohn. He is a First Amendment lawyer who currently serves as free speech counsel at TechFreedom, a technology think tank. Ari, welcome to the show.

Ari Cohn: Great to be here.

Nico Perrino: All right, folks. So, yesterday, I get into the office, I open up my e-mail, and I see this op-ed in the New York Times from the United States Surgeon General. The headline is Why I'm Calling for a Warning Label on Social Media Platforms. And the crux of it is down here, I think in the third paragraph where he writes, “It is time to require a surgeon general's warning label on social media platforms,” stating that social media is associated with significant mental health harms for adolescents. Clare, I want to ask you what are the harms of social media? I know you believe that they exist, so what is leading you to that conclusion?

Clare Morell: Wow, that is a huge question. That's like a 400-page book from Jonathan Haidt. But just to distill a couple of the things that I think are the most pressing: obviously, the teen mental health crisis. The rates of depression, anxiety, and self-harm among tweens and teens today are extremely concerning. The suicide rate. So, that is all well documented in Jonathan Haidt's research. But beyond that, there is more and more brain research coming out on the ways that these technologies, particularly the effects of dopamine being released in children's brains when they are still developing, are extremely harmful.

Others, I think, have even used the word rewiring. But we are seeing from functional MRI and imaging research that there are actual changes happening in the physical brain from the use of these technologies.

Nico Perrino: So, dopamine is that thing that our brain gets when we get excited and we usually want more dopamine, right?

Clare Morell: That's right. That's right. It's a neurotransmitter. And studies have shown that social media has effects on the brain similar to cocaine and other drugs, tobacco, alcohol, basically creating a craving for more. And so, that's part of the danger with the design of these platforms, and the reason that my book will argue that a lot of the current strategies parents have been employing, like screen time limits and parental controls, aren't sufficient for the level of harms, because even a screen time limit can't stop the fact that the child is always going to crave more.

The limit is never going to be enough. And the other big point I will mention about the harms is that they're not just individual-level harms. Jonathan Haidt makes this point really clearly: there are these kind of network effects or cohort effects. These technologies have actually changed the entire social environment for kids today, such that even if a kid is not on social media as much as their peers, maybe they're only on 15 minutes and their peer is on two hours, the nature of the social relationships, all being mediated through these platforms, can still cause anxiety, loneliness, depression, this constant comparison, fear of missing out, even if they're not on the platforms that much in a day.

And I hope we'll get into why these collective solutions are necessary, that leaving it up to parents on their own really isn't enough for the level of harms. So, anyway, getting back to your question on harms, though: it's brain-level harms by the nature of the design, the dopamine effect. It's the mental health harms, and it's also physical harms. More eye research is coming out about how damaging it is for kids to look at screens all day, and the kind of compulsions that social media creates to check screens are not healthy for kids' physical bodies. And then, relationships and friendships.

I think we're seeing a lot of harms on how kids are interacting with their peers and also their family members just because of the kind of constant craving that these social platforms create. So, I think the harms are on a myriad of levels. And then, I hope we'll also get into some of the content harms as well. The things that kids are seeing on these platforms are truly horrifying. And they're also opening kids up to tons of access from predators, from strangers. And parents really can't see into this. They're not able to see into what their kids are doing, who they're interacting with online.

And so, it's really a twofold problem: social media both acts as a substance, like a drug, on the brain, and it's also a place that people spend time, and it's a dangerous and unsafe space for children, full of violent pornography, beheading videos, dangerous TikTok challenges. I could go on about the content. I hope we'll talk about that some more. But yeah, so it's also a dangerous space for children to be spending their time.

Nico Perrino: So, Ari, do you accept these premises?

Ari Cohn: I do not. I'll modify that. I think there are truths in there. I do not think they paint the whole picture and I do not think they are as extensive as claimed.

Nico Perrino: So, what are the truths that you see?

Ari Cohn: There is really bad content online. Nobody can deny that. You have to wonder then why there are so many states trying to regulate platforms’ ability to remove some of that terrible content. But that's maybe a different story.

Nico Perrino: You're referencing the NetChoice case in Texas, I'm assuming.

Ari Cohn: Among others, you know, among others. But there's harmful content online. There's also an immensely greater amount of really good, useful, educational, helpful, constructive content online. And we shouldn't lose sight of that fact. Social media is a place where kids learn more about the world, more about each other, more about people they would never have the opportunity to come in contact with if not for the internet. It is a place where they can find community. It's a place where they can learn things. There is a lot of good that comes from social media, and to focus on the bad is to tell a very small percentage of the story.

Nico Perrino: Yeah, but even too much of a good thing can be a bad thing.

Ari Cohn: That’s true but that’s the case –

Nico Perrino: And I think that's the argument here: kids are spending so much time online that, as Jonathan Haidt says in the subtitle of his book, The Anxious Generation, it's rewiring kids' brains to their detriment.

Clare Morell: Can I respond briefly on the benefits? I will just say I think that often, the benefits of "social media" are actually conflated with the benefits of the internet as a whole. So, I'm not against kids going online for education. You can use Google. I mean, a lot of these laws aren't even talking about YouTube, where kids could go look up videos of things. But this is talking about how the form of social media, TikTok, Snapchat, Instagram, as you were just stating, is rewiring kids' brains. It's not a helpful form of communication. And some of those benefits that you're talking about, Ari, can be found in other forms on the internet. They don't need to come through social media.

Ari Cohn: But I do think it is a beneficial form of communication, and the same things have been said about everything since the printing press. People were worried when the written word happened that men's minds were going to turn to mush because they no longer had to remember things from the oral tradition. And when radio came out, people were worried that it was melting the brains of America's youth. When TV came out, people were terrified that we were all going to turn into couch potatoes. And here's the thing.

Nico Perrino: There were comic book hearings in the 1950s.

Ari Cohn: Yes. Every new form of media has faced this same accusation. And here's what I like to remind people. For example, when my parents were kids, TVs were pretty new to households. And they were going through the same kind of freak-out. By the time I was born, they weren't necessarily as freaked out. Why? Because as new media gets introduced, we find a way to integrate it into our lives, and we figure it out eventually. It might take a little bit of time. There might be some rough spots, but we always do adapt and it's never as –

Nico Perrino: But at what cost I think is what Jonathan and Clare would say.

Ari Cohn: But I don't think the cost has been proven. And I take the correlational studies at face value. I don't think John is making up what he's saying at all. And I think they're actually useful for parents when they're deciding what they want to expose their kids to, what they want to allow their kids to do. Frankly, I think that a parent who says, "I don't want my kid on social media," I think that is an entirely defensible position for a parent to take. And I would not dare to criticize another parent's choice for what they think is best for their kid, because I think every parent knows their kid better than anyone else knows their kid.

And I think that's really important. However, where I do start to have an issue is when we start setting blanket rules and introducing laws that restrict the internet and social media, not just for all kids, but potentially sometimes for adults as well. That's when I think we need to take a step back and say what faith are we really putting in all of this contested scientific information?

Nico Perrino: Well, and we're going to get to those laws in a second. But before we do, I want to go back over to Clare. Ari's trying to position social media as just another technology in a litany of technologies that at one point there was a moral panic over, but that now we have sort of learned to live with. Do you agree with Ari there?

Clare Morell: No. I think there are really important distinctions to make here. First of all, TV is very different from social media. I also don't think TV is that great for kids; there are a lot of studies about how bad TV is. So, that's the first point. But the second point is that a TV sits in your house. A smartphone travels with children on their person all the time. The access to the social media apps is like 24/7 and constant. Also, parents have a lot more oversight over TV. They can walk by and understand what their kids are watching. They can have more control over that.

The online world is really hidden to parents. And so, I hear Ari's point on parents have the right to control or decide if they want their child on social media or not. But the reality is that parents actually don't have effective control of that.

Nico Perrino: I do want to talk about those later, yeah.

Clare Morell: And they don't have oversight into what their kids are seeing or doing. The apps don't even allow third party software to have that kind of access. And so, to put it all on parents really isn't actually fair to parents. It's extremely difficult to oversee. And I do think that we have to understand that the nature of the technology is very different. Again, the brain research and the studies are showing that the way that they have perfectly crafted these technologies with their recommendation algorithms, with infinite scroll, with autoplay, with constant notifications, it's very difficult for kids to ever escape that type of medium.

TV wasn't pinging you with notifications in your pocket all the time, or autoplaying like the streaming services these days or YouTube. You can't just disconnect from it. It's going to constantly be dumping content into kids' feeds, and that's the change with recommendation algorithms; it's different even from original social media, where once you got to the end of your news feed, in reverse chronological order, there really wasn't any more content to see. Kids would get bored. There were natural limits. And I think the technology has pushed it closer and closer to infiniteness; the limitless aspects of it are really dangerous.

There are no natural cutoff points for kids. And so, these platforms become extremely addictive, or, even a bar lower than addiction, they induce very compulsive kinds of behavior.

Ari Cohn: Well, I want to push back just on a couple of really quick things. It's true that phones do travel more than TVs, but TVs aren't just in the home either. And this dovetails with the parental oversight and control over TVs. Just a quick example: when I was a kid, my parents did not let me watch The Simpsons. They thought that was a corruptive [inaudible - crosstalk] [00:13:33]. I'm still mad at you about that, Mom and Dad. But what did I do when I wanted to watch it? I went to a friend's house whose parents didn't have that rule.

I didn't have a video game console as a kid. What did I do? Well, people who are young these days won't remember this, but at Blockbuster, they used to have game consoles set up in the store so that while you were browsing, you could like play video games. And me and my friends would walk over to Blockbuster and play video games for a little while. It was certainly possible to do things that got around the parental oversight back then.

Nico Perrino: We are going to get to laws which mandate certain restrictions for minors. But you might be making Clare's point there: because they're so easy to get around, parents need help from the government.

Ari Cohn: You might think so, but I'll tell you later why that's wrong. Well, just really quickly, two quick things. One, on autoplay: with TVs, when you have a channel on, the next show comes on. You might say, well, there's the dopamine of the notifications, but that's the same thing when a TV show has a cliffhanger and you wanna see what happens in the next episode. That's also a dopamine hit. There are scientific studies showing that.

Nico Perrino: That is true.

Ari Cohn: And choose your own adventure books when you wanna pick what the next step in the story is. That has also been scientifically proven to be a dopamine hit.

Nico Perrino: But you don't get the instant gratification when –

Ari Cohn: To regulate dopamine hits is to regulate having fun and feeling good. I don't think we want to do that.

Clare Morell: I think the amount is probably where we could agree that there's a difference.

Ari Cohn: But before we get off of this. I think –

Nico Perrino: You guys are never going to let me get off of this.

Ari Cohn: No, absolutely not. One quick comment on the surgeon general thing and this is maybe just a really high level point. I understand the attractiveness of the rhetorical device of comparing big tech to big tobacco. It allows people to –

Nico Perrino: Well, he doesn't just compare big tech to big tobacco. He compares it to door plugs on Boeing airplanes falling out of the sky, listeria contamination. I think I'm helping you make your point here: one of these things is not like the other. At least that's how it appears on the surface.

Ari Cohn: But the label, Nico, is very clearly modeled on cigarette labeling. And it's a useful rhetorical device because it allows you to trade on the anger of a previous generation. And nobody can say that tobacco companies did nothing wrong. It's really easy to just redirect that anger rather than build it up from scratch. It's a useful rhetorical device. But I want people making that comparison to think to themselves: are you using that as a rhetorical device? And is that intellectually honest? Or do you really not understand the difference between words and a product that, literally, when used as directed, is physically deadly?

And if you can't understand the difference between those two things, I don't know what to tell you. But it's a really troubling comparison that just seems a little bit dishonest and maybe intentionally so.

Clare Morell: Can I just push back on that a little bit? Going back to your original question to me on the harms, I don't know if I stated this clearly enough. The explosion in the rates of depression and anxiety and suicides among tweens and teenagers, and self-harm injuries, ER hospital beds are being filled with girls coming in from self-harm injuries. We have never seen this before. We didn't see this after TV. Clearly something is different and wrong.

And you can trace that, and Jonathan Haidt and Jean Twenge have documented this: the explosion corresponds to the time that the iPhone came out and became ubiquitous among teens, when social media really exploded and the like button was invented, and the whole nature of it really changed into this constant, infinite-comparison social ecosystem that's just inherently harmful to kids, and the fact that they can never escape it. There were bullying and gossip dynamics from time immemorial for kids in school. And that's not new.

But what's new is that now that follows them home 24/7, and there's just no escaping the kind of social environment that this has created. And so, I will just say, too, the academic outcomes among kids are just plummeting. Reading scores are now back to where they were in the 1970s, the lowest they have been since then. And so, it's like, what has changed? And it's really just the screen technologies that have become so much a part of childhood, which Jonathan Haidt points to in his book, and other researchers as well.

And so, I do just want to emphasize that what we're seeing is unprecedented and it's urgent. And I understand why the surgeon general is making the point that he is, which is that the rates of the teen mental health crisis, these plummeting academic outcomes, children are being harmed by this. And it's time to say enough is enough. Now, I don't think warning labels are going to cut it. So, I would push back on the surgeon general in that respect. And I would say, as a precondition to warning labels, you actually have to have meaningful parental involvement in the process of getting on social media, which there isn't.

So, I'm not sure a warning label is really going to help if parents still don't have effective control over that. But I hope we can discuss those solutions more.

Ari Cohn: It sounds like the problem is not necessarily social media, but being too connected with each other, which I'm not sure is really a remediable problem. But I feel like you'd probably have the same situation if you put all the kids in the world in one high school. That's basically what I'm hearing.

Nico Perrino: Well, that's a comparison that I've heard from Bob Corn-Revere, the attorney at [inaudible] [00:19:14]. He's like, social media is essentially high school supercharged, right? You get more gossip, you get more bullying, you get more connectedness.

Ari Cohn: But I think the more connectedness is the thing that we learn to adapt to, and the thing we should embrace and learn how to harness instead of being afraid of. And there are people who disagree with me. To be clear, there are certainly people who will argue, entirely straight-faced, that it is bad for us to be so connected. And that is certainly an opinion that one might have.

Clare Morell: But you have to distinguish between being connected in person versus online. And I think part of the problem is that, with the online nature of these experiences, kids are projecting out these perfect versions of themselves. And it's unhealthy, too, because even the nature of the platform is that you hold yourself out, you post your picture, you post your post for the judgment of peers. And that is not how we interact on a daily basis. We get to know each other; there are authentic conversations.

Nico Perrino: Boy, we went to different high schools.

Clare Morell: You understand what I'm saying though? And everything has been converted into these metrics. How many likes did you get? How many people reposted? Whereas it's not about having conversations or deeper relationships. And so, I really feel for teens and tweens today that these technologies have changed the nature of their social interaction. So, I don't think it's just connectedness. But it's the nature of how these platforms have turned everything into a metric of comparison.

Ari Cohn: I'll agree with you on one thing: this is the generation where the adaptation is at the peak of its rough point. I think you're probably right about that. I think that this is the top of the hill when it comes to how we integrate this technology into our lives. I think it will get better sooner rather than later.

Clare Morell: I think the adaptation is that people are opting out though. Gen Z is increasingly trying to get off of these things. So, I don't know if we're adapting. I think we're recognizing that this isn't part of our human nature.

Ari Cohn: But I think that is part of adapting, figuring out how much to integrate it into our lives. And this is not to say that I agree that all these problems are actually caused by social media. But to the extent that they are, I do think that we're probably right now at the peak of the curve of a new medium, before it kind of settles into its place in our lives. I'd say we're probably at the beginning of that.

Nico Perrino: So, you said you don't believe that this is the cause, social media is the cause of all these problems.

Ari Cohn: It’s not the sole cause.

Nico Perrino: No. Well, sure. I mean, some people would argue it is.

Clare Morell: I'd add in combined with smartphones.

Nico Perrino: Some people have argued on the left that young people are living with the climate crisis. And that's the cause of their anxiety and depression. Or people on the right say they've been stewed in this culture of identity politics, and that's contributing to some of their mental anguish.

Ari Cohn: Life is complicated. Things are tough. And life gets more complicated. And there are so many different variables. All of those things probably do contribute at some point. The amount of comparison, even more pervasive than the normal in-person kind, probably does have some effect. I don't want to discount the possibility that that is a contributing factor. But my point is that there's so much crap in this world that it's impossible to disaggregate it.

Clare Morell: I'll just say anecdotally. Let's just step back a minute. I know we're wading into all the science and all these scientific things. But I also think we're all human. We have eyes and ears and we can look and observe what's happening in the world. When you go to a high school and you see every kid hunched over a screen not talking to each other in the hallways, not talking to their teachers, they're all glued to their screens in these little silos, I think all of us would walk in and be like, whoa, something is wrong here. This is a high school.

Nico Perrino: Yeah, they're all going to get scoliosis.

Clare Morell: Oh, sure. But why are the kids not talking to each other? I talked to a college professor. She said it's very strange. It's very eerie. It feels very inhumane when she walks into her classroom to greet her students, and no one looks up. No one says hello. Everyone is staring at a screen. And she'll ask the group a question and no one will want to answer first. Everyone has this, I don't even know what it is, no one wants to put themselves out there. No one wants to give an answer because it's created this strange environment where you feel like everyone is always evaluating everything that you're going to say or do.

It could get posted online. And so, it's created very strange human interactions. So, I hear you about is it the climate crisis, is that these other things? But I think anecdotally, you observe everyone's hunched over their phones instead of talking to each other in person. We're not interacting in human ways. Maybe it has something to do with the screens that are tethered to us 24/7.

Nico Perrino: Last point then, I want to move on to the next topic. We will probably come back to this because insofar as harms need to be considered in the First Amendment analysis, I want to look at that.

Ari Cohn: I do remember the not wanting to answer first part. I think that part has always been part of childhood. But I will say there was something special about a teacher walking into the room to a banana-pants zoo of kids screaming and yelling. And you know what, if we have lost that, I do think it is kind of a tragedy. That being said, I know there's been some movement towards banning phones in schools. And I think that is actually particularly reasonable and geared towards really positive changes in terms of social interactions and learning. So, that's one thing that I actually tend to agree with people on: maybe kids shouldn't be staring at their phones all day while in school.

That's probably not great.

Nico Perrino: Ari, move your microphone just a little bit from your mouth. Let's talk now about how access to the internet is currently restricted for minors. And let's focus on what the government or government institutions require for restricting access for minors. So, if you are at a public school, the government can presumably, consistent with the First Amendment and other requirements, ban phones in the school. Is that correct, Ari?

Ari Cohn: I would say, yeah, most likely.

Nico Perrino: Public schools. In what other ways is access to the internet currently restricted, focusing maybe on social media and pornography, because a lot of the tools used to restrict access to this content are the same for both?

Ari Cohn: Well, it depends, I guess, on what age range. So, for kids under 13, the Children's Online Privacy Protection Act, COPPA, mandates that if platforms or websites or any online service is going to collect information from a child, there's a whole rigmarole of parental consent that you need to get, verifiable parental consent. And it's difficult enough that most online services that are not geared directly towards children don't allow people under 13 to create an account.

Nico Perrino: What does that mean to be verifiably a minor under 13 in this case? How do they verify?

Clare Morell: They don't have to verify.

Ari Cohn: So, they don't have to verify under COPPA unless the site is directed at children.

Nico Perrino: Does it define how you have to verify or is it pretty loose in the joints?

Clare Morell: COPPA is all about data collection. It came out in 1998, pre-social media. And it says if you're a website, you can't collect data on children under 13 without parental consent. I think the only thing they have to verify is parental consent. There's no actual verification of the age. And so, I'll just push back and say the de facto age for social media has been set to 13 because, of course, these companies' business model is collecting data on minors. And so, they don't want to violate COPPA.

Ari Cohn: It's collecting data on everyone.

Clare Morell: Sure, everyone. Let me clarify. Everyone. Thank you, Ari. But without age verification, there's no enforcement. We know 8 to 12 year olds are all over social media, and that's because it's extremely easy. All they have to do is enter a birthdate, very easy to falsify, and check a box to agree to the terms of service.

Nico Perrino: Like when I go to a website that sells alcohol, I put in my age, right? That's all they have to do, essentially.

Clare Morell: Yes. There's no actual process by which the website verifies that you are in fact over the age of 13. And that's my whole point about leaving it up to parents. We haven't left it up to the parents. The parents are not involved at all in this process, which is the problem. If you want to say parents should be the ones who decide whether or not their kids get on social media then, to me, age verification and parental consent are necessary preconditions.

Ari Cohn: So, one technical correction is when you have a site that is directed to minors. I'm thinking like Club Penguin or like the Magicschoolbus.com.

Clare Morell: But what’s their verification process?

Ari Cohn: You are presumed to have knowledge that your users are likely to be minors and, therefore, you have to get parental consent.

Clare Morell: Oh, that’s good.

Ari Cohn: But from [inaudible - crosstalk] [00:28:23] –

Nico Perrino: The problem is not Instagram. I think that’s really what we’re talking about here.

Ari Cohn: Or ESPN.com. Kids are interested in sports. Adults are interested in sports. Because it’s a mixed audience site, you don’t have the presumed knowledge, the implied knowledge.

Clare Morell: This is going to get very legally technical.

Nico Perrino: I’ll try and break it down.

Clare Morell: The knowledge standard in COPPA is actual knowledge, which is like the highest standard of legal liability. So, if you go look and ask, "Well, has there been COPPA enforcement against Instagram, since they're letting all these 8 to 12 year olds on their platform?" The answer is no. And now, there has been a state attorney general lawsuit brought against Meta, with like 33 state attorneys general on this one federal complaint. And they're actually trying to enforce COPPA against Meta, to say you know that these kids are on your platform. But what I'm saying is that standards –

Nico Perrino: And Meta is the parent company for Facebook and Instagram.

Clare Morell: But it's very difficult to prove. And so, COPPA has been largely toothless in that it has not been enforced.

Ari Cohn: YouTube wouldn't say so. YouTube got socked big time under COPPA.

Clare Morell: But they should. And my point is that it's been very difficult to enforce COPPA and it's not effective. And it's not a social media age restriction. It's fundamentally getting at data collection. And so, we need better solutions.

Ari Cohn: The reason COPPA has survived is likely because it is a little bit more challenging to enforce because it really only severely impacts the sites that are directed at minors, which is a really, really tiny portion of the internet. But anything that goes beyond that to mixed audience or sites geared closely towards adults, you start running into much bigger First Amendment problems.

Nico Perrino: Let’s level set here. So, if the website is geared toward people 13 and under or just under 13?

Ari Cohn: Under 13.

Nico Perrino: So, this is the Magic School Bus website or something very clearly –

Ari Cohn: I don’t know if that exists by the way.

Nico Perrino: I don’t either.

Ari Cohn: I was making that up.

Nico Perrino: I have two very young children. They watch Blippi and CoComelon.

Ari Cohn: I would probably go to the Magicschoolbus.com if it existed right now.

Nico Perrino: That's the stuff that was popular when we were kids. So, if they are under 13 then, they need to get parental consent, verifiable, presumably somehow. But if we're just talking about Meta, we're talking about Instagram, Facebook, or we're talking about TikTok or Twitter. Because those websites are geared toward a general audience, they don't have the same restrictions on them and the collection of data and the consent that's needed in order for people under 13 to access those websites. Is that correct?

Ari Cohn: Yeah. If they know their users are under 13, they have to comply with it. But they aren't presumed to know. So, if they ask what the user's age is and the user lies and says, “I'm 14,” the platform doesn't really have a way to know that they're not actually.

Clare Morell: And I'll just say that the problem this has created is that these companies are in a race to the bottom because they want to attract younger users. And it's obvious in a lot of the features. So, they might not be explicitly geared towards kids, but they are trying to recruit kids to their platforms because they know that the younger they can hook a user on their particular platform, the longer they can profit off of them over the course of their lifetime. And I think what we forget is that the business model of these companies is ad revenue. And the more time and attention and data they can extract from us, the users, the more that helps their profits.

And they want kids on these platforms. The evidence that has come out through this state AG lawsuit of internal communications among employees at Meta saying we got to get them while they're young, the youngest users are the best users. They even did a calculation to see how valuable a 13-year-old child is to them over the course of their lifetime. And they found it was $270.00. And so, they're commodifying our kids. And so, yes, COPPA has been toothless because it's not actually being enforced against these companies.

And so, you could say, “Oh, well, they don't have actual knowledge,” but they're actually trying to go after our kids. And their algorithms know everything about a person from the data they collect. So, the fact that they don't know that there are 8 to 12 year olds all over their platform is just a lie.

Nico Perrino: So, you would argue that age verification should be expanded beyond COPPA and should go up to what age? Or pretty much it would have to be everywhere.

Clare Morell: I would argue 18. That's our legal age of adulthood in this country. And so, I believe that to get onto social media, you should have to verify your age. And if you're under 18, then you need to have a parent’s consent to get on or a parent involved in that process. And, again, I would personally like to see Congress just age restrict social media out of childhood entirely the way that we have for tobacco or alcohol or other dangerous substances. Or if you want to treat it like a place, we don't let kids under 18 wander into casinos or strip clubs, which I argue that the online environment of these platforms is akin to that.

But all that to say, in the meantime, states don't have to wait on Congress to do that. They can effectively put age restrictions in place using contract law to say that kids are agreeing to a whole host of terms of service and conditions without any parental involvement. And we, generally, in contract law do not treat contracts with minors as enforceable without parental consent.

And so, I'm just trying to say that states have an ability under the law to put age verification in place on these platforms, require it, and require, if it's a minor, parental consent, parental involvement in the child creating this account, because I think often, we forget that this isn't just like a one-time interaction. These children are entering into an ongoing contractual relationship with these companies. And if they're agreeing to this whole host of terms of service and conditions, parents need to be a part of that. So, that's my big push: if you want to leave it up to parents, they actually need to be effectively part of the process.

Nico Perrino: And to verify age, to ensure that someone is under 18, your framework would require verifying the age of all internet users. So, you and I –

Clare Morell: Correct. Absolutely. And I know that this is where I'm going to get pushed back because they're like, “We don't want to infringe on adult free speech,” and it doesn't have to. And I will say if we get into technological conversation, there are means of doing age verification now. We have the technological means to do this in a way that ensures user privacy that is easy for adults to do but that effectively keeps kids off of the platforms.

Ari Cohn: So, I'm going to push back just really quickly on the contract thing because really, even if that is true, the only real consequence there is that the contract is voidable at the option of the minor. And that's fine. But what does that get you?

Nico Perrino: Minors enter into contracts, presumably, all the time. It's a question of whether it's enforceable, right? Because if you enter like a Target or something, there's certain rules associated with that. You go to any website –

Clare Morell: There have been cases that show, too, though, that even if it's not an enforceable contract with a minor, parents can be held responsible for benefits received under that contract. And so, you have to have parental involvement. And we have this in liability waivers, even to get a tattoo, which, obviously, could be considered a form of expression the way social media could be. Parents have to be involved in that process. So, I think there is a lot of precedent for saying parents should be involved in the process of these children entering into terms of service with these huge companies.

Nico Perrino: All right. I want you to address the age verification question broadly. But to Clare's point, I just want to frame it squarely, there are certain things that minors can't do without parental consent, right? Can't go into a strip club probably. Can't see an R rated movie, although that might be enforced by the movie theaters and not necessarily the government.

Ari Cohn: It's absolutely not a government restriction. It's the whole reason the MPAA rating system exists.

Nico Perrino: You can't get a tattoo. Form of expression, right? So, there are things that in the real world, online is not the real world, but the brick and mortar world, that minors can't do without consent often involving expression to the extent you consider strip clubs or movie theaters or tattoos expression as well. So, how do you think about this as a First Amendment lawyer and scholar?

Ari Cohn: Well, I think about it in that, first of all, there are big differences between the brick and mortar world and the online world. And I think one of them actually lines up with what Clare was just saying: the immense amount of data collection, the privacy concerns that we have. There is a world of difference between handing your ID to a store clerk to buy a Playboy, who probably can't remember your face or your name two seconds later, and creating a permanent online record of your internet browsing history. That is just a gigantically different scale of problem. And it has gotten worse. It's a pervasive issue.

And there's also the fact that, technically, few laws, and I'm not sure if any, actually require somebody to check your ID before you can buy a Playboy. You can get in trouble if you sell a minor a Playboy or what have you, but there is no law affirmatively requiring it, at least in most places. I can't really say –

Nico Perrino: And you can get in trouble because, under the First Amendment, there is an obscene-as-to-minors standard, which covers, essentially, depictions of sex or nudity.

Ari Cohn: Right. Right. I wouldn't say any depictions, because there are certain things that have redeeming social, political, educational value, even as to children, or minors I should say.

Nico Perrino: But the standard is much lower for kids as opposed to adults.

Clare Morell: Yes, it is. And obscenity, we all know, is not protected speech.

Ari Cohn: But porn is not obscenity for adults. Porn might be obscene as to minors.

Clare Morell: Wow. We could get into a whole conversation on pornography because the pornography of today online, I don't think parents realize, is extremely violent. I mean, it is grotesque. It is every definition –

Nico Perrino: I'm sure parents do realize. I think 60% of Americans have watched porn in their lives. That would probably include [inaudible - crosstalk] [00:38:45].

Clare Morell: Well, I'm just saying what their kids are seeing online today. I mean, all of that is obscenity. It is horrendous. And I don't really want to get into the details on this podcast.

Nico Perrino: Yeah. That was the last podcast.

Clare Morell: Good to know that there is one, because I think parents don't necessarily realize how easily kids are coming across this when they're not even meaning to, the accidental exposure that's happening, because this stuff is all over their social media or easily clicked through to from social media. And the fact is that it's violent, it's grotesque. It's depicting things like incest, and it has no educational value of any type.

Ari Cohn: It’s not your grandma’s porn is what you’re saying?

Clare Morell: It's not a Playboy. So, we have to clarify, I think, a little bit when we're talking about online pornography that it is obscenity.

Ari Cohn: But that’s a subjective opinion. Under the law, under the First Amendment, obscenity doctrine –

Nico Perrino: It’s no longer I know it when I see it?

Ari Cohn: Yeah, well, mercifully. Obscenity doctrine has kind of waned a little bit.

Nico Perrino: Well, it's become essentially a dead letter, at least as applied to adults, but it still has teeth as applied to minors.

Ari Cohn: Yeah. As to minors, it still exists.

Nico Perrino: Whether you agree with that or not, I mean, the courts haven't enforced an obscenity statute, at least at the Supreme Court, in a long time.

Ari Cohn: I'm not going to sit here and be the person arguing that kids should have unfettered access to porn. So, don’t worry.

Nico Perrino: We don't have to go down that rabbit hole? Okay.

Ari Cohn: Yeah, exactly. But when I say subjective, I meant as to adults. It's perfectly reasonable for someone to say porn today is gross and I don't like it. I think that is a supportable opinion. Whether or not that has legal effect is another matter.

Ari Cohn: Well, and you were talking about the technological stuff, the data verification.

Nico Perrino: I do want to talk about that, yeah, sure.

Clare Morell: Well, I was just going to say I think there are ways to do it. I'm not saying every adult now needs to go hand over their government ID to Facebook or Meta. But there are actually ways technologically of implementing this that are protective of user privacy, because I'm concerned about that too, Ari. And I will just give an example from Louisiana. Increasingly, states are offering digital IDs. And Louisiana worked with a third party vendor, in conjunction with the state, to provide digital licenses to residents of Louisiana.

And the company that creates these digital IDs has now integrated this to basically provide an anonymous method of age verification to comply with the state's law restricting minors from adult websites. You can also YouTube this; there's like a demo of it on YouTube that shows you exactly how it would be done. But if someone is trying to access an adult website, they can click, “I have a digital ID,” through Louisiana Wallet. And then, all it does is ask Louisiana Wallet whether this person is above 18 or not, without revealing any underlying information about them to the website.

And then, it creates like a little token. And you click through. And maybe it takes all of 30 seconds, and no information about you has been revealed to that website whatsoever. And it's, I think, being integrated with the government the same way that we get our physical license from the Department of Motor Vehicles. States are increasingly offering digital IDs so people can have them in their Apple Wallet and go through airports. And so, there are ways of doing this that are anonymous and don't hand over or compromise further information.
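The Louisiana Wallet style flow described above, where the website only ever learns a signed yes/no answer, can be sketched roughly as follows. This is a toy illustration, not the actual Louisiana Wallet protocol; the class names and the HMAC scheme are invented for this sketch, and a real system would use standard public-key signatures and a vetted protocol.

```python
# Toy sketch of third-party age attestation: the verifier holds the identity
# data; websites only ever see a signed "over 18" flag plus a nonce.
import hmac
import hashlib
import secrets

class WalletService:
    """Stands in for the state-linked digital ID provider (hypothetical)."""

    def __init__(self):
        self._signing_key = secrets.token_bytes(32)   # known only to the service
        self._birth_years = {}                        # identity data never leaves here

    def register(self, user_id, birth_year):
        self._birth_years[user_id] = birth_year

    def issue_age_token(self, user_id, current_year=2024):
        """Return a token asserting only 'over 18 or not' - nothing else."""
        over_18 = (current_year - self._birth_years[user_id]) >= 18
        nonce = secrets.token_hex(8)                  # makes each token one-time
        payload = f"over18={over_18};nonce={nonce}"
        sig = hmac.new(self._signing_key, payload.encode(), hashlib.sha256).hexdigest()
        return payload, sig

    def verify(self, payload, sig):
        """The website checks the token; it never sees name, ID, or birth date."""
        expected = hmac.new(self._signing_key, payload.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, sig) and "over18=True" in payload

wallet = WalletService()
wallet.register("alice", birth_year=1990)
wallet.register("bobby", birth_year=2012)

payload, sig = wallet.issue_age_token("alice")
print(wallet.verify(payload, sig))   # True: adult, access allowed

payload, sig = wallet.issue_age_token("bobby")
print(wallet.verify(payload, sig))   # False: minor, access blocked
```

Note the design point at issue in the discussion that follows: the website learns nothing about the user, but the wallet service necessarily knows which age checks it was asked to perform.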

Ari Cohn: But that's not anonymous. That ignores half the problem. And that is the website might have no idea who you are, but the government now knows that you're accessing porn.

Nico Perrino: Or the third party website.

Ari Cohn: Yeah, or the third party. And there's a whole lot of people that say, well, some of these laws say the government can't have access to the data. And I say if you trust that, I have a bridge in Brooklyn to sell you. But also, if you remember the IRS scandal where conservative 501(c)(4)s were being targeted, there have been times when the government has used information that it shouldn't have to go after people that it doesn't like. Once that data exists, it will ultimately, inevitably, invariably be used for that purpose in the future at some point by someone, because that is just what happens.

Clare Morell: I don't think you can know that. And also, I think that they've tried to put protective silos in place. It's not as though that information is getting recorded or kept anywhere. And I think in the laws that are crafted, you can be very clear, like a lot of states have done: if any of this underlying information is retained on anybody, you have the right to then bring a lawsuit for that. And so, they're trying to actually build into these laws protections for people's privacy.

Ari Cohn: But how does that work? How do you ever prove compliance if you have to delete all of the compliance data? That means the government has to trust the platform or the website when they say, “Oh, see. Here in our records, it says we verified the IDs.” And I don't know one state regulator who's going to take any of these websites at their word. But if you have to delete the compliance data, then you can never prove compliance. That data has to exist; it must exist.

Nico Perrino: I do want to level set a little bit by asking Clare: presumably, you don't believe it's bulletproof with regard to privacy, right? Even companies that are tasked with maintaining privacy, LastPass, for example, which is a password protection tool, got hacked, right? But at the same time, can you ever expect –

Clare Morell: Sure.

Nico Perrino: To the extent they're trying to put in all these checks and balances, Ari, to address what is widely perceived to be harm, is it enough to have like a reasonable assurance of privacy? Are they doing enough, right?

Ari Cohn: Well, I think it's worth taking a step back and talking about why age verification is problematic in the first place beyond necessarily just the privacy implications because that actually helps explain why that isn't sufficient. We've been down this road before.

Nico Perrino: Age verification 101, right?

Ari Cohn: Yeah. In the late ‘90s, Congress was trying to do basically the same thing the states are trying to do now. And they tried really hard. They tried with the Communications Decency Act, which got struck down because it was absurdly overbroad.

Clare Morell: It was very overbroad. I'll admit that, too.

Ari Cohn: I can't believe anyone thought that was going to work. They tried the Child Online Protection Act. They tried it multiple times, in fact. And it kept getting struck down in part because of age verification.

Nico Perrino: So, these two decisions, just for our listeners, are Reno v. ACLU, which is the 1997 Supreme Court case, right, and then Ashcroft v. ACLU in 2004.

Ari Cohn: And then, there is the later Ashcroft v. ACLU litigation in the Third Circuit, which was the second round, where cert was then denied after the Third Circuit had a couple rounds of striking it down. So, there are two Supreme Court decisions and two or three relevant, non-reviewed Third Circuit or Appellate Court cases.

Nico Perrino: And these have struck down age verification requirements. But if I'm not mistaken, much of it had to do with just the realization that there wasn't the technological means to ensure privacy.

Ari Cohn: Well, it's not necessarily about privacy. So, one of the things that happened is Congress said you can be prosecuted for sending indecent materials to a minor. But one way you can protect yourself from this is if you age gate content. And one of the ways you can do that is by credit card verification. And the court took a look at it and said there are two problems with this. First of all, there is the chilling effect of asking someone to provide their identification before they access sensitive materials.

Nico Perrino: So, the chilling effect idea is this idea that you go to that website, you're asked to do this thing, so you're providing information and it might make you think twice before you actually access the material you would otherwise want to access.

Ari Cohn: You are constitutionally entitled to access. I'm going to take this in two parts and explain why, even if there is some kind of –

Nico Perrino: But on the constitutionally entitled point, adults are constitutionally entitled to access that, but minors are not, right, which is where the rubber hits the road on the age verification.

Ari Cohn: But the problem is that if you're trying to verify age, you have to verify everyone's age and then, you end up impacting the rights of adults as well.

Clare Morell: Not necessarily. I'm just going to keep saying not necessarily, because I do think that 20 years after Ashcroft, the technology has changed vastly, for the good and for the bad. The bad being that now, kids have way more access to pornography, potentially violent pornography, things that are obscene and are not protected speech, because the smartphone and social media have brought it to their fingertips 24/7. The ease of access is immense. And for the good, though, we've also improved the way that age verification can be done. And Ashcroft really said there was a compelling government interest to protect children from this content.

However, they said that age verification at that time was not the least restrictive means and filters would be more effective. However, I think we can all see that filters have not been effective. Particularly, if we're getting into the weeds on the technology, filters don't operate inside of social media apps. Third party apps are blocked from accessing these social media platforms. And so, kids can get, within Snapchat, five clicks to Pornhub and access the website from within the app, and no filter can block it. So, parents are really rendered helpless with filters.

They can block external websites on browsers, but they're not able to block where most kids are coming across this today, which is on social media itself. And so, I think putting age gates in place means then if a kid tries to click through to Porn Hub, they're blocked before having [inaudible - crosstalk] [00:48:52].

Ari Cohn: But the question was whether we can do it without impacting the rights of adults. And you said not necessarily, but nothing you just said shows that we don't impact the rights of adults.

Clare Morell: I have a whole brief written on this, on ways that age verification can be done. There are things like zero knowledge proofs, utilizing cryptographic techniques. Again, I would really argue for a third party authentication method, not handing these things over to the websites themselves. And also, again, the technology exists. There are ways you could actually have people do it on the device itself. Device level age verification is increasingly being pushed, where you would have to provide information one time to the device, and then it would create a secure token or zero knowledge proof key to show that this person is over 18, and then the device could delete that scan of your government ID.

So, there are ways to do this securely without creating more information about you; all the website gets is a thumbs up or thumbs down.

Ari Cohn: But that's only half the problem.

Nico Perrino: Last point and then, I have another question.

Ari Cohn: Because the chilling effect isn't just from the fact that the record exists. The chill actually happens when an adult tries to access protected information and then is asked to provide their ID. And what happens after they provide it is part of the equation. But the moment they are asked and they say to themselves, “Do I really want to do this,” whether or not they're thinking about what happens to the data afterwards and they decide not to, that is where the harm occurs. So, it is in fact the act of being asked that is part of when the chill happens.

So, I don't think that even if any of these technologies can address to a reasonable extent the privacy concerns, and I'm not yet convinced they do, but assuming that they would, I don't think that really solves the whole problem.

Nico Perrino: So, we all know that adults have First Amendment rights and minors, do we agree, have some First Amendment rights? They're more narrow.

Clare Morell: Limited.

Ari Cohn: Only slightly.

Nico Perrino: So, to the extent the content we're talking about restricting here is protected by the First Amendment, the government has to show what in order to justify restrictions on it? I'm going back now and I'm thinking back to the Supreme Court's decision in Brown v. Entertainment Merchants Association, which addressed a California law that I believe if I'm recalling correctly, restricted people under the age of 18 from buying violent video games without parental consent. And the Supreme Court held and I think it was Scalia writing for the majority here that the government does not have a free floating power to restrict the ideas to which children may be exposed.

So, all of this kind of age verification stuff assumes that the government has the right to restrict access to this content that, and we can disagree here, may or may not be protected by the First Amendment vis a vis minors rights. How do you look at this, Clare?

Clare Morell: I have a couple of points on the Brown case. And first, I guess, thinking about it as it relates to restricting access to adult websites online, obscenity is not protected speech. And when the Fifth Circuit upheld Texas's law restricting adult websites, it found that because that law was so narrow as to deal with just obscenity, whereas COPA and the CDA were much broader than obscenity, the precedent to rely upon was Ginsberg v. New York from the 1960s, I believe, instead of Ashcroft. And they argued that this type of law is actually subject to rational basis review, not strict scrutiny.

Ari Cohn: Which ignored the law entirely.

Clare Morell: No. It looked and it said it's narrowly tailored to just obscenity, which is not protected speech.

Ari Cohn: But obscenity as to minors. You have to be clear about it. It's obscenity as to minors, but not as to adults. And if the content is being accessed by both, you can't treat it just as obscenity as to minors and regulate it for adults under the standard for minors. That ignores Ashcroft entirely. It ignored binding Supreme Court precedent.

Nico Perrino: The Supreme Court held in Bolger v. Youngs Drug Products in 1983 that the government cannot limit discourse to that which would be suitable for a sandbox. But the courts are at this point divided on this question.

Clare Morell: They are.

Nico Perrino: So, the Fifth Circuit has ruled one way that you have some –

Ari Cohn: The Fifth Circuit is divided with everyone on pretty much everything because it's a lawless place.

Clare Morell: Hold on a second.

Nico Perrino: Shots fired. Where are you writing that brief for? What circuit are you writing that brief for, Ari?

Ari Cohn: The Ninth.

Clare Morell: We can debate that more, because I also think that even if you take the Ashcroft precedent, it needs to be revisited because, again, of the technological changes since then. But I do want to get back to your point on Brown v. Entertainment Merchants Association. If we're looking at social media laws, the Arkansas court that struck down Arkansas's parental consent law did rely on Brown, but it didn't distinguish this: the social media laws are not content based restrictions the way the law in Brown was, say, targeting violent video games. They're treating social media as a form of communication.

These are akin to time, place, manner restrictions. These laws are trying to say the manner, the form of communication of social media, not particular types of content, is inherently harmful to children and should, therefore, have an age restriction with parental consent. And so, it's being, I think, conflated with a content based restriction, but it's actually content neutral, and it's dealing with the form of communication.

Nico Perrino: So, are restrictions on movies not a content based restriction, then, if it's just a form of communication?

Clare Morell: I mean, arguably, yes. Movie is a form –

Ari Cohn: But that is not what the courts upheld.

Clare Morell: I'll give it this. It's different than a movie. It's different than a video game in that it's creating a contractual relationship between the social media company and a child that a parent is not a part of. And these laws are trying to rely on contract law to say these minors are agreeing to this whole host of terms and conditions. They're entering into a contract with a company without parental involvement. And so, I think you have to distinguish the nature of these things: it's an ongoing relationship, not like a one-time purchase of a video game or a one-time viewing of a movie.

We're not trying to restrict kids away from particular categories of content on social media. We're trying to say that it's a form of communication that has been shown to be inherently harmful to minors. And so, it's putting a time, place, and manner restriction on it. And it's not to say kids can't get on social media, but just a parent has to be part of that process consenting to that. And so, I think there are fundamental differences that often get misinterpreted.

Nico Perrino: Make your point and then, I have a question for you, Ari.

Ari Cohn: But it hasn't been proven to the extent that the First Amendment requires, first of all. And second of all, I would dispute that it's content neutral, because I guarantee you that if Facebook, for instance, was full of only educational videos and cat videos and fun, positive, happy go lucky, everyday content that we love kids to have access to, and kids were spending all day on it and saying, “I can't sleep. I'm too engrossed in these educational articles,” nobody would be passing these laws. Nobody would be passing these laws.

Nobody tried to pass these laws when I read so much as a kid and I would bring a book into the bathroom when my mom would yell like, “Ari, you've been in there reading for like 45 minutes. Get out of the damn bathroom.”

Nico Perrino: But I think the argument is this isn't just a one way communication where you're watching cat videos and the viewer is viewing them. There's social credibility that comes from the likes and retweets, and you're sharing content in a two way exchange. You also have direct messages with people, which I know has raised some concerns from activists. So, it's a little bit different than just essentially sitting in front of a TV, which is kind of what you're describing.

Ari Cohn: It's not an exact analogy. I will say it's not an exact analogy. But I do think if you look at the trajectory of what people are saying about why social media is harmful, it invariably comes back primarily to the content. And when you go to DMs, part of what's wrong with the DMs is the content of messages that people get. It is fundamentally, I believe, arguably content based.

Nico Perrino: Some of the things that activists are concerned about, you could say, are content, to the extent solicitation is a form of content.

Ari Cohn: Well, that's true. But do we ban kids from going to the park because somebody might drive up in a panel van and offer them candy?

Nico Perrino: Well, no.

Ari Cohn: And people are very upset right now that parents who let their kids go to the park alone are getting arrested and shit like that. There's a whole list of things that people are really upset that kids can't do on their own anymore that somehow those same people are very upset that kids can do on the internet.

Clare Morell: Well, I will just say, too, you even said these platforms aren't primarily geared towards children. These are inherently adult environments, in the same way as the places we restrict kids from. Sure, a kid could have an edifying conversation in a strip club, hypothetically. But it's a really dangerous environment for kids. And there are other ways for kids to have those conversations. Again, the internet as a whole is not what we're trying to block out. It's that this format and the environment that kids are being exposed to is very adult. And so, we're trying to say, in the same way in the real world we're not just letting kids wander into bars and strip clubs and casinos without parental involvement,

The same should apply to the online world.

Ari Cohn: But if you're talking about strip clubs and casinos, that is content based. So, you can't say it's not content based but then, draw an analogy to something that is content based.

Clare Morell: But it's a place of interaction. It's not just about –

Ari Cohn: But even so, I think format comes close to content in a way. If we said, “Kids are getting paper cuts. We are going to ban kids from reading books or newspapers,” I don't think the courts are gonna be like, “Reasonable time, place, and manner restriction,” because this isn't Nam. We still have rules.

Clare Morell: Well, yeah, but it's very different.

Nico Perrino: I think we have a disagreement.

Ari Cohn: But here's why: it fails even intermediate scrutiny.

Nico Perrino: We're about to talk about scrutiny. Holy cow. So, let's try and break it down simply, because Clare has referenced rational basis. So, I think there's maybe a disagreement here about what level of scrutiny, what rigor, the courts need to apply in analyzing the government interest in these regulations. It sounds like, Clare, what you're saying is that one of the lower levels of scrutiny, or the lowest, rational basis scrutiny, is what can or should be applied. And Ari, you just mentioned intermediate scrutiny. There's also strict scrutiny, which is often applied in the First Amendment context.

Does it all kind of boil down to how you define these media? I can never remember if media is plural or singular.

Ari Cohn: I'm on record every way with that one.

Nico Perrino: And whether it's a content based restriction or not. I don't know that we're going to square this. Reasonable minds can disagree that’s why we’re here.

Clare Morell: It’s complicated.

Ari Cohn: We’re here to solve all of the problems right now.

Clare Morell: I'll just briefly say it's not only about the level of scrutiny, in the sense that Justice Thomas, in his dissenting opinion in Brown, said that the freedom of speech as originally understood does not include a right to speak to minors, or a right of minors to access speech, without going through the minor's parents or guardians. And so, I think there's also a fundamental debate about that, regardless of what level of scrutiny you're applying.

Ari Cohn: I don't know. The majority pretty well tore him a new one over that one.

Clare Morell: But there's a dissent to recognize there. And I think, Ari, it might just come down to a philosophical disagreement between us, which is that does the government exist to protect kids from their parents, or does the government exist to protect parents’ rights to protect their children?

Ari Cohn: I think that's a false dichotomy.

Nico Perrino: Well, it's actually a good dichotomy because I think it gets into some of the voluntary measures that are available that I do want to discuss. But were you looking to get in a last point on this?

Ari Cohn: I think the argument that Brown is maybe not applicable in that it applied strict scrutiny is one that I disagree with, but I think at least it is arguable. So, I want to take a second to address whether the court would have applied intermediate scrutiny if the law were content neutral. I don't think that the social media laws, actually, would satisfy intermediate scrutiny either, for a few reasons. I'm not sure they'd be narrowly tailored. And they don't have to be the least restrictive means under –

Nico Perrino: So, what do those mean, narrowly tailored and least restrictive means? This is my role on this podcast: to make sure that all of our non-lawyers who listen understand what the heck is being talked about here. Also, I'm a non-lawyer, so I need to know what's being talked about here.

Ari Cohn: A law that restricts speech. At least if you're in rational basis land, basically, everything survives. If the government wasn't on drugs –

Nico Perrino: These are court-created doctrines, right?

Ari Cohn: Exactly.

Clare Morell: Yeah, but there's a legitimate interest.

Ari Cohn: Right. Rational basis, basically, is these people aren’t, literally, insane –

Nico Perrino: The government has some interest in whatever they're trying to regulate.

Ari Cohn: Yeah. If they had, literally, any reason they could possibly think of to think that this law would help, it survives.

Nico Perrino: Rational basis, there we go.

Ari Cohn: It is extremely easy. Intermediate scrutiny is a little bit different. It has to be reasonable, narrowly tailored. It shouldn't affect way more speech than is actually necessary.

Nico Perrino: To meet the compelling government interest.

Ari Cohn: Right. It doesn't have to be the least restrictive means, which I'll get to in a second under strict scrutiny. It also has to leave open ample alternative channels for communication. And that's one of the other reasons I think this would fail is that, yes, there is the rest of the internet. There is the real world. But one of the factors in deciding whether a time, place, and manner restriction leaves open ample alternative channels for communication is the intended audience and the format of the speech. So, the government can't say, “Well, I'm going to prohibit you from speaking in this place because you can go online and do it.”

The government can't do it the other way either. Part of the reason people are on social media is because it allows you to reach the world in a way that is easy with the user base of that platform. So, I don't know that a restriction on minors' social media use, even if viewed as content neutral, leaves open ample alternative channels for communication. Then, you get strict scrutiny. It adds that it has to be the least restrictive means. So, it cannot be anything more than the lowest possible thing required to fix the problem.

Clare Morell: Yeah. I'll push back. I'm not conceding that social media laws are content-based laws.

Ari Cohn: I don’t take you as conceding to that.

Clare Morell: Just to be clear.

Nico Perrino: For the record, folks.

Clare Morell: Yeah. For the record. The first thing is I think that we have to treat these as fundamentally laws about contracts. And it's because of the nature of social media. I understand it's a platform where people communicate. But kids are consenting to have their data collected by these companies. They're agreeing to terms and obligations that a parent is not involved in at all. And so, if we're looking at this as a contract, then it's content neutral, and it's about contract formation between a child and the company. And a parent needs to be involved in that process. So, I don't want to concede that this should be treated as a content-based restriction.

Nico Perrino: Can I ask a clarifying question on that? So, you would be okay, then, if minors could access Facebook just to look and didn’t create a contract? I think you used to be able to do that with Twitter before Elon Musk locked it down.

Clare Morell: Create an account.

Nico Perrino: Because they're not creating a contract, theoretically, right? They're just accessing a website like any other website, and there might be cookies associated with it.

Clare Morell: It's an interesting question I honestly haven't spent a lot of time considering, because I think a lot of the way that social media is harmful to kids is because of the types of interactions, the ongoing kind of atmosphere that they're entering through creating an account. I think we talked about this earlier, but YouTube wouldn't fall under any of these laws because kids aren't creating an account. They're just going on and viewing YouTube. And so, yes, to answer your question. I might personally not be a fan of kids going on and accessing tons and tons of YouTube videos.

But in terms of the law, we're narrowly looking at kids creating accounts, ongoing relationships with these companies. And so, these laws would not capture a kid looking at a Facebook page or watching videos on YouTube, if that answers your question.

Ari Cohn: Nico, how excited are you for your first two hour episode?

Nico Perrino: We’re not going two hours. We're not going two hours. So, I do want to move on to the next topic, because we could talk about, God forbid, scrutiny and forum analysis and all this other fun stuff. But I want to talk now about voluntary measures that might be available for parents who are concerned about the potential harms of social media or porn on their kids. And this actually might tie in with the level of scrutiny, right, Ari? Because if you're applying strict scrutiny, and it has to be the least restrictive means, and voluntary measures are available, then there are other means that are less restrictive for parents to address the problem.

I think, in the Playboy case, right, the Supreme Court, this was in 2000, said that if these voluntary means are met with a collective yawn, that doesn't justify government involvement.

Ari Cohn: And also, the government is not permitted to assume that, if given full information, parents will refuse to act. Before I go into that, I'll say one of the other things that relates to this is that when it comes to the least restrictive means and whether the regulation is actually necessary, the government can't just say there is a problem with the less restrictive means. It has the burden to affirmatively prove that what it wants to do is better. And government is just really –

Nico Perrino: Well, isn’t it always going to be better when the government compels people to do things, right? Well, like at least more effective presumably, right?

Ari Cohn: Yeah. But I'm talking about it vis-à-vis the particular law, in a sense.

Nico Perrino: I see what you're saying. Let's talk about voluntary measures.

Ari Cohn: Because I think that “if given full information” kind of points us to a thing that maybe we can find common ground on. I think if, perhaps, there is a push towards making platforms make these parental controls easier to understand, creating more help for parents to use them, making them more robust, there'll be a lot less objection to that. I am very much in favor of platforms creating tools that help parents. I am very much in favor of the platforms finding a way, if they have a problem with platform security and privacy and things like that, to allow and enable third-party developers to create tools that help parents.

Nico Perrino: But you want them to be voluntary. You want this to be a voluntary choice that these platforms made as opposed to a choice that's imposed on them by the government.

Ari Cohn: I think that my objection to the government forcing platforms to do that would be significantly less than my objection to the government saying minors can't be on social media. I think it is an easier constitutional lift, in a sense, not to mandate their use, but to mandate making them available. I think it's probably a lesser constitutional problem on the whole. I'm not entirely sure. And how much I would agree with it would all, again, come down to the legislative language. But I think it's an easier sell for sure.

Nico Perrino: You, presumably, Clare, don't think that the voluntary measures are sufficient; they exist currently, but they're not working.

Clare Morell: Yes. No, I do not think that they’re sufficient. In fact, that's the whole premise of the book that I'm writing.

Nico Perrino: Let’s plug that book again.

Clare Morell: It's called The Tech Exit, A Manifesto for Freeing Our Kids. And I'm making a maximalist argument in that book, because the current strategies of screen time limits and parental controls are not sufficient. The notion of parental controls is actually a myth. If you look at the settings that these apps have, a parent actually doesn't control them. It's like, they might default to this for minors. Oh, but then minors can just go into the settings and change it. Oh, here's how to talk to your minor about how to put these settings on, like restricting DMs from strangers.

But the child can change those at any time. If we're using the word control, it's a myth. The parents aren't the ones with the ultimate control; the children are. And the companies aren't actually trying to help parents. And then, it's maddening, because then you say, well, the companies aren't actually trying to help parents, so I'm going to go out and buy private parental control monitoring software. Oh, but wait, actually it can't access the social media apps where kids are spending the majority of their time.

So, the app can tell me my kid spent two hours on Instagram today. I have no idea what they saw on Instagram. I have no idea who they talked to or what they did. And so, parents don't actually have oversight in the way that we do in the real world. And so, I just want to get back to this point that we're trying to put parents back in the equation. In the virtual world, they've been completely left out. It is like the Wild West online, where parents aren't actually part of overseeing it the way that you would be in a home, where I could see what my kids are watching on TV.

Or if my kid says, “Hey, mom, I wanna go participate in this political rally downtown,” I could say, “You know, that actually just doesn't sound that safe. Maybe there's other ways you can engage in the political process without going to this dangerous rally.” There's none of that parental guidance or oversight whatsoever in the online world. And so, again, I just want to get back to that what these solutions are trying to do is put parents back in the equation because the voluntary measures that they have been offered are completely ineffective. And again, just even if we're going to talk about adult content that kids are accessing online, you could say, “Well, we'll just put filters in place.”

And filters work in a web browser, though not perfectly. But again, where most kids are coming across this adult, mature content is within the social media apps themselves. And the companies aren't allowing external filters access. And so, parents don't actually have control, is my point. And so, putting them back into the equation is, again, what the solutions I'm advocating for are trying to do.

Nico Perrino: Kids are pretty smart with this stuff, right? My son is 3 years old. He knows how to scroll and pick videos. But let me ask one question that's always been on my mind. And it might just be something that's easier for me to say because my kids are 3 and 1 years old. But sure, the filtering stuff is hard. But that presumes they have a tablet or a phone or access to this stuff in the first place.

Ari Cohn: And nobody can afford a $1,000 phone on their own.

Nico Perrino: And at least my kids, I don't know, they're always in controlled environments. They're either with me or at school. And I have a choice.

Ari Cohn: That's cute that you think school is a controlled environment.

Nico Perrino: One of them goes to daycare at the House of Representatives. I think it's the most controlled environment of any daycare in the country. But there's a choice to get your kids a phone, right? And they're expensive. I think often you need a credit check in order to get a data package. So, I think part of it, Clare, right, is that there's the social pressure on parents to get them a phone.

Clare Morell: Absolutely.

Nico Perrino: But does that justify the government getting involved to help alleviate that social pressure?

Clare Morell: Yes. In part, yes, in the sense that there are collective effects, as Jonathan Haidt and others have documented, because it's not just an individual-level effect that these technologies are having. So, even if I make the decision, which I personally will make, that my kids are not going to get a smartphone. They're not going to get access to social media.

Nico Perrino: Until 18?

Clare Morell: Until 18.

Nico Perrino: Oh, wow. On the record, okay.

Clare Morell: You can hold me to it. But I would like to say, I think a big part of what we didn't really even get into today is that the smartphone is a lot of why these things are so problematic. It's not just kids accessing social media for a limited amount of time on a desktop computer where everybody in the house can walk by and see it. It is the combination. And Haidt really gets at that. It's this phone-based childhood. So, yes, a parent could say, “Well then, I'm not going to give them a smartphone.” But the social pressures are immense. And it's not even just social. Schools require kids to use QR codes and apps for things in class.

The broader, I guess, economic, educational, and social forces in society are all pushing towards a smartphone-based childhood. And so, we do need help, I think, in the form of government solutions to try to push phones out of childhood. But to just get back to your point, even if I say I'm going to be creative, I'm going to find workarounds, I'm not going to accept that they need a smartphone to do this assignment, and I as the parent am going to do everything I can to push against these great pressures towards it.

It still doesn't alleviate the fact that my child's social environment is all mediated through these online platforms and interactions that I, as an individual parent, can't combat. And so, I think that is why it does necessitate government intervention when we recognize, and I guess I need to continue to push this point about the harms, when we recognize something's inherently harmful for children. And I hear you trying to push back on some of the benefits, Ari.

Ari Cohn: It's also not proven to be inherently harmful.

Clare Morell: I think it is. And I will say that it's kind of like saying, “Oh, you know, Kool-Aid is a great source of vitamin C for kids.” And it's like, well, there's a lot of really bad things in there, and there are other good sources of vitamin C. And so I think likewise.

Nico Perrino: But you do acknowledge there are some benefits? The surgeon general did [inaudible - crosstalk] [01:16:01].

Clare Morell: I would say that as to the whole, yes. Of course, you could say, “Oh, there's this one educational video on TikTok that was helpful to my kid.” But it doesn't outweigh the fact that the environment it creates is very harmful by the nature of the technology, and there are other means for kids to access educational content, again, when mediated through the parent. And I think my biggest hang-up with social media is that the parents are not actually involved. They don't have oversight or control the way they do over other things online or in the real world.

And so, I think his analogy to big tobacco, alcohol, and other substances or technologies that are dangerous for kids is a good analogy, and it's a way to consider when government action is needed. I think there are two conditions: when we recognize as a society that something is harmful for kids and shouldn't be left up to individual parents, and when there are collective aspects to that harm that individual parents on their own can't mitigate. To me, that pushes for why we need a public policy solution.

Ari Cohn: So, first of all, I'm old enough to remember when parents had the fortitude to say I don't care if everyone else is doing it. You're not doing it. And to the extent parents have lost that, I think that is a problem with the parents. And my parents were never shy about being like, “I don't give a shit what your friends are doing. You're going to do what I tell you to do.”

Clare Morell: Yeah, my parents, too. I grew up without a cell phone for that reason.

Ari Cohn: But you know what? There were times in my childhood that I really hated them for it. And there are times looking back that I'm like, “You know what? I'm really glad you did that.”

Nico Perrino: Did you grow up with a lot of technology?

Ari Cohn: My parents were huge techies. My dad worked for Motorola for 25 plus years. We always had, basically, the technology as it was coming out.

Nico Perrino: Did Motorola make those Razr phones? Was that Motorola?

Ari Cohn: Yes. But then, that was well after. We had their first like big one, too. We had broadband internet when it was first out.

Nico Perrino: You had access to this?

Ari Cohn: We had computers. We did, with some controls. And this actually goes into exactly a point that I wanted to make about something that you and Clare were talking about, about kids finding a way around things. And I will say, I was a terror as a child. And I very easily found a way around every piece of filtering software my dad ever installed. It involved some hokery pokery on the computer. But I was smart and I figured it out, even back then. Kids today are, obviously, gonna figure out a way around that, too. And that actually undermines the constitutionality of government intervention. And I'm going to tie it all together with a neat bow for you here.

Nico Perrino: Yeah, very quickly because we gotta wrap up here.

Ari Cohn: One of the reasons age verification was found not to be narrowly tailored in the original cases was because it did not actually verify age. Anyone can put in any credit card number. The same thing happens now for what it's worth.

Nico Perrino: Well, it sounds like a pretty crappy credit card authenticator.

Ari Cohn: On the internet, no one knows you're a dog. Anyone could put any information in. If I'm using Yoti, I can handle –

Nico Perrino: Sure. But when I’m trying to put my credit card in and if I put a six instead of a seven, it’s usually like –

Ari Cohn: No, no, I mean, you can take somebody else's credit card.

Nico Perrino: I see what you're saying. Okay.

Ari Cohn: If somebody's kid is signing up with Yoti, they can hand their phone to their older brother or whatever to do the facial identification. There is actually still no way to effectively verify who is using the account. And it's actually also fairly easy to trick selfie verification using free software that anyone can learn how to use in about 10 minutes.

Clare Morell: I think it's pretty hard to falsify a driver's license.

Ari Cohn: But you can always use your brother’s driver’s license. The problem is that there is no way to ensure that the person entering the information is actually the person who's going to use the account.

Clare Morell: But that puts up a huge barrier. That's a very small percentage.

Nico Perrino: And still, growing up in high school, I think I passed the statute of limitations, we would have a friend's brother go and buy our alcohol if we wanted to get alcohol. There's no way to verify that that alcohol doesn't go into the mouth of someone under 21.

Ari Cohn: Right. But that's also not a speech restriction. So, there's a difference there in the constitutional analysis. So, it wasn't narrowly tailored because it was effectively unavailable. I think that problem still exists. But that was more of an aside to the point of, actually, if kids are going to find a way around it, that also impacts whether something is narrowly tailored and appropriately tied to fixing the problem.

Nico Perrino: And it depends what level of scrutiny you apply to it, right? Then, we get back to where we started.

Ari Cohn: All right, Nico. Do you want this two hour episode or not?

Nico Perrino: No, no. But I'm just trying to pull it all together, right?

Ari Cohn: I'll go there.

Nico Perrino: But Clare, did you want to have a last word here?

Clare Morell: All I would say is that if you want parents to be part of the solution, there need to be some public policy changes in the law to actually make that possible. And so, when we say, “Oh, we're leaving it up to parents with these voluntary measures,” they're meaningless in the sense that parents don't have any real oversight. And we're trying to say Congress, it probably would take a while for them to actually age-restrict social media the way I would want them to. But states are doing this. Like, seven states have passed these laws saying we want parents to be part of kids forming these online accounts so that it's up to the parent, Ari.

Ari Cohn: But most of them have been enjoined as unconstitutional.

Clare Morell: No, but the battle is still going on in the courts. And again, yeah, there are ways that you have to be careful in how you craft the legislation. Not all laws are created equal. But there are ways to do this that are constitutional, that are not restrictive of adults’ rights to speech. And I think that that is what we need. We can't just let, basically, I will say, the big tech companies weaponize the First Amendment against children. We have to say, actually, parents have a right to oversee their children's activities online.

Ari Cohn: I will not deny that there is potentially a constitutional way to get some of this done. I have not seen it yet and I don't think legislators are interested in putting in the hard work to figure it out considering what we've seen so far. But you know what? I also think, again, I'll be the first to admit that parenting is hard. I made it extremely hard. But parenting is tough. And parents have to adapt, too. I think some parents are failing in some ways. And I don't think that it's necessarily a moral failing. I think life has gotten more complicated.

It is more difficult. But that necessitates some hard work on both ends. I think parents do have the right to do it. I'm not sure the government has the right to force other people to do things so that parents can accomplish that. I think that is maybe a bridge too far in most cases. Again, there are instances like providing more information on parental tools and stuff that you could maybe persuade me are a much easier lift constitutionally.

But I don't want to get into the position where the legislatures, Congress or states have this freewheeling authority to basically shape the entire world and all of the expression on the internet to help parents who are unwilling to do the hard work to actually pay attention to their kids. Maybe that sounds glib. I don’t mean it that way.

Nico Perrino: Ari, you usurped the last word. Clare's taking it back.

Clare Morell: Well, I was saying I think that the government has an interest in protecting children, all children, including those who may not have as involved parents. And I think that is what these laws are also trying to do. This is a justice issue in some ways. Not every child does grow up in a home with involved parents trying to oversee what they're doing online. And we know there's actually a socioeconomic disparity of screen time. And so, I think that a lot of these solutions are trying to –

Ari Cohn: But that’s why these laws harm those kids, too.

Clare Morell: It doesn't harm them. It's trying to protect all children from what we know to be a very inherently harmful thing, which is social media. And I think we can make analogies, like the surgeon general has, to other things, when we recognize as a society that children should be protected from this. And we want to empower parents to protect their own kids. If you want to make the argument that it should be up to parents, it's not effectively up to them at the moment. Kids can get onto these accounts without any parental –

Nico Perrino: Fifteen seconds, Ari. We’re going to actually wrap it up.

Ari Cohn: Kids can always do things that evade their parents’ rules, and we can't legislate that out of existence. But I think the point about kids who grow up in homes that are abusive, not supportive, with uninvolved parents, that is one of the reasons why requiring parental consent for every website they sign up for that involves communications with another person is incredibly dangerous. You can end up cutting those kids off from the internet, from other kids like them.

Nico Perrino: You're saying it's an outlet for kids who are in bad homes.

Ari Cohn: In bad situations, yeah. Or just lonely situations, frankly.

Clare Morell: That hasn't been borne out in the data.

Nico Perrino: I don't think we're going to solve it here. This conversation is not going away anytime soon. It's still something that we're seeing debated in state legislatures and, of course, then in the courts. So, I want to thank you both, Ari and Clare, for coming on the show. This has been fun, Ari. Thanks for coming, Clare.

Clare Morell: Yeah. Thank you. Thank you for having us. Thanks for a great conversation.

Ari Cohn: My pleasure. It's been great.

Nico Perrino: That was Clare Morell, a senior policy analyst at the Ethics and Public Policy Center, and Ari Cohn, free speech counsel at Tech Freedom. I am Nico Perrino. And this podcast is recorded and edited by a rotating roster of my FIRE colleagues, including Aaron Reese and Chris Malby. To learn more about So to Speak, you can subscribe to our YouTube channel or our Substack page, both of which feature video conversations of this episode. And if you look at the video, you can see our new podcast studio. We've recorded the last couple of episodes here. It's great.

You can also follow us on X by searching for the handle Free Speech Talk. We post the full video of this conversation there as well. If you have feedback on this or anything else related to the podcast, you can send it over to sotospeak@thefire.org. Again, sotospeak@thefire.org. And if you enjoyed this episode, please consider leaving us a review on Apple Podcasts, Google Play, or wherever you get your podcasts. Reviews help us attract new listeners to the show. And until next time, I thank you all again for listening.
