
So to Speak podcast transcript: 'Private Censorship' with J.P. Messina

'Private Censorship' with J.P. Messina

Note: This is an unedited rush transcript. Please check any quotations against the audio recording.

JP Messina: If I’m a neo-Nazi, and I’m not, I promise, the idea that AT&T could deny me service because sometimes I’m gonna be talking on the phone about my neo-Nazi stuff I think makes people rightly uneasy. This is one of the core modes in which we communicate with one another. It’s general purpose; it’s not only gonna stop my neo-Naziing but stop my communicating with Grandma to take her to the hospital. Sometimes probably neo-Nazis do that stuff, too, I suppose.

Nico Perrino: You’re listening to So to Speak, the free speech podcast brought to you by FIRE, the Foundation for Individual Rights and Expression. All right. Hi, folks. Welcome back to So to Speak, the free speech podcast where every other week we take an uncensored look at the world of free expression through personal stories and candid conversations. I am, as always, your host, Nico Perrino. The First Amendment to our United States Constitution reads, “Congress shall make no law abridging the freedom of speech.” Later, of course, the First Amendment was applied to the states, not just Congress, restricting their ability to abridge the freedom of speech as well. But private institutions, unlike the government, are generally free to restrict speech, at least if they so choose. It’s their free association right to do so. With that said, I hear from many of you, our listeners, who write to me to complain about private censorship.

Censorship on social media, within the workplace, or by institutions like PayPal or even music venues. And this makes sense. Right? We spend most of our time talking with one another expressing ourselves in the context of private institutions. As a result, that’s where we will feel speech restrictions most acutely. But is it right to call these restrictions censorship if they aren’t imposed by the government? Is there value in encouraging free speech norms even within private institutions that aren’t required by law to respect them? In short, how should we think about private censorship and its role within a liberal society? Fortunately, we have JP Messina with us today to I’m sure answer all of these questions definitively as philosophers like himself do.

He is an assistant professor of philosophy at Purdue University and the author of a new book titled Private Censorship. And that’s it, folks, there is no subtitle. I like the power move, JP. Welcome onto the show.

JP Messina: Thanks, Nico, great to be here.

Nico Perrino: Also joining us is Aaron Terr. He’s a recurring guest on So to Speak. He is FIRE’s director of public advocacy. And he has often taken the lead in helping those of us at FIRE navigate the tricky thicket of private censorship. Aaron, welcome onto the show.

Aaron Terr: Thanks for having me back on, Nico.

Nico Perrino: JP, why don’t you explain the origin story of this book?

JP Messina: Yeah. So, well, obviously, it’s the next natural step for a graduate student who had written a dissertation on Immanuel [inaudible] [00:03:19] political philosophy to start thinking about issues of free speech. So, I don’t really understand the question. More seriously, my first job out of my Ph.D. program was at Wellesley College’s Freedom Project, which was basically an organization designed to bring heterodox thinkers to campus, challenge students’ ideas. They had some internship programs and things like that. So, I was teaching a class in the Philosophy Department there called Toleration and Its Limits. And the philosophy employment situation was a bit precarious. I had to find ways of incentivizing myself to write a lot of new stuff. And so, I devised a rule that I’d write a paper every time I taught a class. So, while I was teaching this class, Toleration and Its Limits, I wrote a paper on state speech restrictions.

It ended up getting published under the title Freedom of Expression and the Liberalism of Fear: A Defense of the Darker Mill. So, the simple title is new to me. I gave a talk on that paper at the Freedom Project’s winter session event. It was about an hour-long talk. And I think we went about an hour and a half, two hours into Q&A. In which Q&A, we didn’t talk about state censorship or state speech restrictions at all. We talked about people getting fired for things that they said. We talked about social media platforms deplatforming people for their speech. We talked about social norms and how oppressive they can be sometimes. And in talking about all of that, I was sort of off the cuff. I hadn’t thought in any detail about any of that.

And that never feels all that good in a Q&A situation. So, I went back to my office. I thought about it a little bit. I took a job about five or six months later at the University of New Orleans in a policy institute. It didn’t seem especially irresponsible to continue thinking about these issues in the context of that job. And so, I had a conversation with Lucy Ringle at Oxford and said, “How about a book on private censorship?” And she seemed interested. And so, I wrote up a proposal and, yeah, and then I wrote the thing, and the world focused on these issues to a degree that I couldn’t have imagined when I began writing. So, –

Nico Perrino: Were the people who were asking you those questions at Wellesley were they concerned about private censorship? I mean, do you get the sense that I get from my listeners that that’s mostly what people are concerned about in America right now? Maybe the First Amendment problem is solved, and we don’t have to worry about government censorship anymore. But that’s not often what I’m getting asked about.

JP Messina: Yeah, exactly. I mean, for me, I was sort of flabbergasted. I thought, well, obviously, the core issue of censorship is the First Amendment issue. And the really interesting thing to talk about for me is all of these different ways in which academics seemed interested in moving away from the U.S. First Amendment exceptionalism in the free speech domain. For the students in the class or in the lecture, that was just not true. They were interested and concerned about comedians getting their specials canceled and things of that nature.

Nico Perrino: This isn’t a new concern, though, I’m assuming. I’ve read John Stuart Mill and I’ve read his second chapter of On Liberty. I’ve seen him talk about, or heard him, or read about him talking about – he’s dead, so he can’t speak anymore, but the tyranny of the majority, of course. And you have Alexis de Tocqueville trotting through America, the American colonies, or America – excuse me, we weren’t the colonies anymore when he came – talking to people about this sort of conformity that they often feel pressured to abide by. So, when you were writing this book, were you looking at some of this historically?

JP Messina: Definitely. So, the book is very Millian in spirit, as a lot of my free speech work is, because a lot of the U.S. scholarship on free speech takes Mill as a serious jumping-off point either to beat up on him or to try to save him from those who would beat up on him. But Tocqueville is also interesting to me. He talks about America being characterized by a silence that people in Western Europe just can’t even fathom. And so, it’s not a new concern. But I think the degree to which it is the central concern that people have about free speech and the degree to which our democracies have it or enjoy it is new or more recent. And I don’t know if that’s because the problem is worse now. I suspect probably not. Or just we’re much more aware of it because we see a lot of this stuff happening in very public ways and it’s become interesting to report on it and talk a lot about it.

Nico Perrino: So, Aaron, you’re the director of public advocacy at FIRE, so you do or spearhead a lot of the non-litigation advocacy within the organization and have been tasked by the organization in many respects to respond to these questions regarding private censorship. What have been the biggest challenges for an organization that defends free speech values in figuring out what to weigh in on with private censorship?

Aaron Terr: Yeah. I think we said from the beginning that we’re a free speech organization, not a First Amendment organization. So, we are concerned about the ways that people’s effective ability to speak freely can be affected not only by government restraints but by the actions of private organizations or individuals. And that’s because free speech depends not just on the legal protections for the right to speak but on cultural norms that support the exercise of the First Amendment right. So, it really requires, I think, a culture that values open discussion, tolerates dissenting and unpopular views, and that prefers dialogue and persuasion to cancellation, de-platforming, or intimidating your ideological opponents into silence. So, I think that’s kind of the guiding principle when we look at cases involving actions by private actors. But they are such difficult cases to analyze, and I think JP gets into this in his book. Because of this tension between, on the one hand, the freedom of association that we do want to protect for private individuals and private associations of individuals that can come together for some common purpose, or vision, or views that they want to promote and advance. But then also, recognizing that private organizations, and businesses, and large corporations have a lot of influence over our lives. And the actions that they take can really have a significant effect on how free people feel to contribute to public conversation on important issues.

And just recognizing that we don’t wanna live in a society that imposes too high a cost on people for expressing their opinions. We don’t want people to hold back from participating in democracy and contributing to the public conversation on important topics because they fear massive social backlash. Because that’s not a good formula for producing knowledge or developing smart policy. It instead gets you a society where bad and wrong ideas have a much greater shot at prevailing because the moment that they gain a critical mass of support, opposing them becomes fatal to your career or reputation. And I think also when it comes to things like comedy shows and entertainment, a lot of people value free expression in that realm. And giving artists really wide latitude to explore the limits of their creativity, and imagination, and pushing the envelope, pushing boundaries. That really gives something valuable to society and it’s something that we want to protect.

Nico Perrino: Well, let’s talk a little bit about some of the incidents of private censorship that have caught headlines. I think often it’s best to just talk about examples to make some of these theories concrete. You mentioned music venues. I’m thinking back to, for example, when the music venue First Avenue in Minnesota in 2022 canceled an appearance by Dave Chappelle. He was gonna be there. He was gonna do a comedy show. They said that they were getting backlash not just from people within the community but also from within their own organization. Their employees said they weren’t gonna work the show. For those of our listeners who aren’t familiar with First Avenue, this was a place where Prince played a lot. It kind of touts itself as a place for free expression, or has at least historically, and it’s been viewed as such.

But here you have employees rising up saying, “We don’t want this comedian here.” You have people in the community, “This comedian shouldn’t come here.” What does a business owner – I mean, how should we think about this? What is a business owner supposed to do in those circumstances? Right? You have this value of free expression that you uphold, but you also have a business to run, and you can’t run it without employees who are willing to do the work. So, JP, what sort of questions does that raise in your mind when you think about private censorship?

JP Messina: Just a whole ton of questions. So, I think focusing on the point of businesses is one of the good places to start. So, they can’t function without revenue. And they can’t have revenue without employees that are willing to work with them. On the other hand, they also often have missions, and sometimes keeping their workforce happy is incompatible with those missions. We know from some of the work in diversity theory that it’s actually pretty good to have disagreement among the ranks of an organization. It can – especially where complex problem solving is an issue, it can improve decision-making. And so, there are some bottom-line type reasons for valuing ideological diversity in the workplace. A lot of these cases, though, seem to be cases where there is a wing of the workforce that is interested in a certain ideological perspective and imposing their will on the rest of the organization. And so, I think that there you’ve got to make some really tough decisions as a business owner. Do you hold the line on your core values?

So, if you’re right about First Avenue and that they really bill themselves as a free speech organization, do you hold the line on that and say, “Look. I’m sorry. This isn’t really up for debate. We’re gonna have this show. It’s sold out or whatever. We think it’s gonna be an important thing for our organization to host this sort of show.” Or do you say, “Well, look, the employment market’s tighter than it has been. It’s gonna be hard to replace these people. They’re going to actually leave. Their threats are credible. Maybe we let Dave Chappelle find somewhere else to play this time and we work on our corporate culture a little bit.”

I think organizations in a complex, civil society where any one given organization is just a part of the fabric are never solely responsible for fixing or failing to fix the kind of free speech culture that Aaron rightly points to the importance of. But if every organization defects and cedes to these kinds of demands, suddenly, it’s not so hard to imagine what Tocqueville might have been thinking about when he thought about the incredible silence in certain social contexts. So, I mean, I think that organizations do have a moral duty in various cases to at least absorb some costs for maintaining the kind of free speech culture that we have reason to value if we care about the kinds of stuff that Aaron was talking about. Genuine inquiry, discovery of the truth. Things like comedic values. Jerry Seinfeld was just on recently talking about how difficult it is to be a comic in today’s climate. And that’s probably not that the norms are so much changing but that they’re extending to domains which were sort of normatively free to a substantial degree before the current moment.

JP Messina: Yeah.

Nico Perrino: Go ahead.

Aaron Terr: No. I was just gonna say that I think so much opposition to comedy in particular comes from just misunderstanding it. Either willfully or not. But comedy often operates through exaggeration, and irony, and subtlety. And should not always be taken at face value. And it’s why comedy is often absent in authoritarian societies, other than whatever the authoritarian likes. Nothing that would be threatening to his hold on power, because they will often deliberately misinterpret comedic messages or take them out of context to justify censorship. Or they place certain topics off-limits so that any joke about race is racist by definition, and intent and context just go out the window. And I think that really is – we’re losing something huge when we lose creative freedom, particularly in the context of comedy. It can just be so effective at undermining the legitimacy of authoritarians’ power or views that have kind of become dogmatic in society. And that’s a really valuable service.

Of course, some comedy doesn’t do any of that and it’s just funny and entertaining. But, yeah, I mean, there’s something very valuable there that we want to protect.

Nico Perrino: I wanna flip this a little bit because Coinbase, for example. The cryptocurrency exchange famously told its employees, “We’re a mission-first organization. We serve customers who are interested in trading and acquiring cryptocurrencies, so we’re not gonna be doing what all these other corporations are doing, which is issue statements on the social and political issues of the day, and we expect our employees to not bring their politics into the workplace as well.” Is that a sort of private censorship concern that we should be worried about? I mean, is it violating free speech norms by essentially telling employees there are certain things that we don’t want you talking about in the workplace? JP?

JP Messina: Yeah, well, it could be. I mean, I don’t know enough about the structure of Coinbase, its organization, how many members it has, how intimate their relations are, or anything like that. I mean, the way that I think about some of these business cases where people are basically told you can’t say this in the workplace is that if they have a real business purpose in making the workplace restriction, that’s probably gonna be something that we should recognize that they have a right to do. Outside the workplace, things are a little bit trickier I think for all sorts of organizations. But I think –

Nico Perrino: Well, when you say outside of the workplace, you mean people speaking on their time.

JP Messina: Right. On Twitter.

Nico Perrino: Yeah.

JP Messina: Not invoking Coinbase’s name. So, I think for all of these kinds of private organization rights kinds of questions, it’s important to think along two dimensions. One dimension is the intimacy of the organization, which is language that I take from Supreme Court doctrine and extend in ways that it hasn’t typically been extended to cover non-profit enterprises as well as social clubs and organizations. Where intimacy concerns things like regular contact between group members, the size of the organization, the degree to which the organization’s central activities are restricted to members, the turnover rate at the organization, and things of that nature. And I think that when you tell an organization that functions like that that it can’t create rules about what speech it’s going to tolerate on or off the job site from its members, that raises serious associational concerns. The second dimension is an expression dimension. And this is another sort of language that I borrow from Supreme Court doctrine, where some firms are constituted for expressive purposes. So, you think about media organizations and things of that nature. Those, too, have sort of expressive liberties that would be curtailed if we said, “No, you can’t have rules about what your employees can or can’t say.” And so, large organizations that are non-expressive are the kinds that – they have the least sort of interests in expression and association.

And so, the [inaudible] [00:21:42] claims to it. But they still have reason to just want to focus on what they’re focusing on. And so, when Coinbase says to its employees, not knowing much about the composition of the organization, “We don’t want you talking about this stuff in the workplace. We want you to focus on cryptocurrency,” that seems to me perfectly appropriate for a private employer as long as they’re not also going out and saying, “You can’t Tweet about it. And we don’t wanna know your opinions about it. And if we find out that you have opinions about it that go in this or that direction, your job is on the line.”

Nico Perrino: So, those are employees working within an organization. I wanna also ask about users of different services. Right? So, you could say, “Nobody at – at Twitter we’re a mission-first organization,” for example, “We don’t want folks bringing politics into the workplace.” But then you have Twitter’s mission that it provides to its users, which as it was advertised in Twitter 1.0 and is definitely advertised under Elon Musk is, “This is a platform for open expression. This is the Town Square. We’re a free speech platform.” So, users have this sort of expectation that censorship isn’t gonna happen there. But censorship does happen on X formerly Twitter. It does happen on Facebook.

It does happen on a lot of these services that do kind of espouse this sort of expressive value. Aaron, so can you explain what rights users have on these private platforms that ostensibly claim to be free speech platforms, but sometimes do censor?

Aaron Terr: Yeah. Well, it’s important to start off with noting that the platforms themselves have a First Amendment right to editorial discretion, which gives them the right to decide on their own private platforms what users can say. Now, that doesn’t mean that the decisions they make about what rules they adopt and enforce on the platform to restrict speech are immune from criticism.

Nico Perrino: Or hypocritical.

Aaron Terr: Or hypocritical.

Nico Perrino: What they say they are trying to do.

Aaron Terr: Right. And inconsistent with their claimed commitments to free expression, which platforms like X make. So, I think – so, users don’t have really – they don’t have First Amendment rights against the platform. But I think in many cases, they have an argument that the platforms are acting inconsistently with their policies, or their policies are too vague. And that tends to lead to arbitrary or viewpoint-discriminatory enforcement. Or that the platforms lack transparency about how they enforce the rules. They don’t provide users with a meaningful opportunity to appeal decisions made against them – adverse actions against them that restrict their speech or ban them from the platform. So, I think FIRE has recommended things that platforms can do to uphold their stated commitments to free speech, and to maintain their legitimacy, and avoid, to the extent that they can, accusations that they’re enforcing the rules in biased or arbitrary ways. And I think that there’s probably a lot of work that they can do to better fall in line with the principles that they claim to support.

Nico Perrino: Well, you still have to define what you mean by freedom of speech for these platforms. Right? Because they don’t allow pornography in many cases.

JP Messina: Sure.

Nico Perrino: They don’t allow animal crush videos, beheading videos. These are all things protected by the First Amendment, but –

Aaron Terr: Yeah. I think you have to acknowledge that social media platforms are also businesses. They’re trying to attract users and advertisers. So, if there’s certain speech, like the types that you just named, that is going to repel most users and advertisers and ultimately lead to the platform’s failing, then, yeah, it’s reasonable for them to take that into account when they’re deciding which speech they’re gonna allow on their platforms. But I think things like making sure that the rules you do have on the books are narrow and well-defined, so people have reasonable notice of what speech isn’t allowed, is something that all platforms can do. And I don’t think it’s going to hurt them. I think it can only help them and keep users. And to the extent that they can provide users with due process, I think that would also be a benefit all around. And ideally, the rules that they enforce wouldn’t discriminate against particular political viewpoints. They would just allow, to the extent possible, really robust discussion on political issues and other issues of public concern.

Nico Perrino: JP, do you have thoughts on social media companies? And Aaron also brought up advertisers. And I know you have at least one case study in your book where you talk about the influence of advertisers on the kind of private censorship.

JP Messina: Yeah. I have a couple of thoughts. So, I think largely I agree with what Aaron says about vagueness, inconsistency – I mean, some of what platforms say about the speech that they’re gonna tolerate can border on fraud. So, you don’t wanna set people up to wind up violating your rules. You don’t wanna enforce your rules in a discriminatory manner. Ideally, you build the relevant distinctions into the rules themselves so that you’re enforcing them in a way that seems to be transparent and fair. I do think though that these companies do have to take a stand on the question that you raised, Nico, which is how do they understand free speech? So, Musk has said that he understands free speech at one point or another as whatever the law says it is. That’s not good enough, frankly, because the law’s confusing and contradictory, and sometimes it’s wrong.

Nico Perrino: And it’s different in every – and Twitter is a global company. It’s different in every country.

JP Messina: One hundred percent. And so, I think that there are competing understandings of free speech. There are understandings of free speech according to which some kinds of utterances compromise free speech. They’re First Amendment protected, so they’re not outside the bounds of the First Amendment conception of free speech. But things like hate speech have analyses in the scholarly literature according to which these are things that compromise their targets’ speech rights by silencing them, by making them check out of the conversation, and things of that nature. And I think some social media platforms take a stand on this without informing their users that they just understand free speech in a way that’s different than the First Amendment understanding of free speech. They’re totally free to have their own idiosyncratic understandings of free speech. But if they use the language of the public square in their advertising materials, when they’re defining their community guidelines, when they’re setting their terms of service, they invite people to think that they have liberties on the platforms they lack. And that causes a whole bunch of confusion. And certainly, I think, a lot of the controversies here would go away if they were just a little bit more transparent about what they take the idea of free speech to mean. That also does a nice thing in terms of the competitive space of these platforms.

So, if you think that what platforms offer is basically different moderation packages, which is I think the right view on the matter, then if Twitter says we believe free speech is “P,” and you think that’s a crazy view of free speech, now, you have a way of differentiating your social media platform from the Twitter platform in a way that might attract users that agree with you and don’t agree with Musk’s interpretation.

Nico Perrino: What do you all make of the argument now that social media is ubiquitous in our lives and is so essential to kind of the day-to-day participation in the conversation and in the world economy that it’s essentially a public utility, right? There’s a Supreme Court case from a couple of years ago. It didn’t rule that social media companies are public utilities, but this is the Packingham case where they essentially said that social media’s important. And that if you’re gonna deny someone who is convicted of a crime access to social media after they’ve done their time that this is a significant burden on their participation in kind of the United States economy. So, what you have, for example, are states like Texas that are saying, “You can’t discriminate based on viewpoint.” This case is being litigated at the Supreme Court and we’re waiting on a decision that we’ll probably get sometime next month. You also have states responding to news events like Florida, for example, where they said that presidential candidates can’t be kicked off social media platforms. That social platforms also can’t take down reporting from news outlets. This is in response to deplatforming Donald Trump in the wake of January 6th. This is in response to Twitter taking down, I believe, the New York Post in response to their stories about the Hunter-Biden laptop. So, we have other contexts where things are deemed to be so important in our lives, so essential that we identify them as public utilities.

But then you’re also violating the free association rights now. And you’re saying that once you get to be such a successful business that you are essentially gonna have your moderation practices dictated by the government. Aaron, how do we navigate that tricky thicket?

Aaron Terr: Right. I think that that’s the key point at the end there, is that as much criticism as people might have of the way that social media platforms moderate content or restrict speech, it doesn’t immediately follow that the answer is to place the government in the driver’s seat and take the reins, because the consequences of doing that are going to be – are almost certainly gonna be far worse than the status quo. And also, the point you raised about, again, the freedom of association that these platforms have as private entities. And I think there’s something that’s not right about the idea that – freedom of association is important because it allows people to come together to, say, form a web forum or community group, or social media platform online according to their own values, and interests, and principles that they want to promote. But the moment that it becomes too popular, the moment that too many people like it, all of a sudden you lose your First Amendment rights of free association and editorial discretion. And the state can come in and oversee the way that you moderate speech on the platform. I don’t think that that’s right. And these Texas and Florida laws that are now the subject of Supreme Court litigation, we talked about them in previous broadcasts. I think of all the problems that these laws have, particularly this idea that you can have a law that says you can’t discriminate against speech based on viewpoint. That means if you allow one viewpoint to be expressed on the platform, you have to allow the opposite viewpoint.

And you get these results like if Twitter decides to allow speech on its platform that is critical of terrorism, then it must also allow ISIS to operate on the platform and spread its propaganda. And I don’t think that the government should be coming in and forcing private platforms to do things like that. And the last thing I would mention is that there is no social media monopoly. As much market share as some platforms might have, none of them has a monopoly. And I think we have seen over previous years that the dominance of any particular platform isn’t inevitable. There still is some choice that users have about which platform to use. I mean, right? Who is still on Friendster or Myspace today? Right? And where was TikTok five years ago? Right?

Everyone was saying, “Well, Facebook, and Twitter, and YouTube. That’s all you have.” And then TikTok just kind of comes out of nowhere and now has become one of the most popular platforms in the country and around the world. So, I don’t really buy the argument that – and that’s not to say that there aren’t some barriers to entry, and you have network effects where once a platform has become very popular, that’s where everyone wants to be. And so, it’s hard for new entrants to come in, but it’s not impossible. And there are platforms that come along with different features and different things that make them appealing to users, and there is some competition there. So, I think that’s an important thing to keep in mind, is that there is still some choice between platforms that people can make based on how they moderate speech and the other features that they have.

Nico Perrino: Yeah. People have choice. They can leave if they want because this is a, at least somewhat, competitive market. But how do we think about companies that aren’t operating within a competitive market like the social media space is? How do we think about internet service providers, for example? These are the backbone of the internet. They are core infrastructure. But in 2017, you had Cloudflare, which is a core internet service provider. It helps websites fight back against denial-of-service attacks. And if you get a denial-of-service attack, you essentially can’t function on the internet. It announced that it was going to stop supporting The Daily Stormer, which is a neo-Nazi website.

And Matthew Prince, who’s the CEO of Cloudflare, after they made the announcement that they were gonna do that, and this is kind of in the wake of Charlottesville and whatnot, said, “After today, make no mistake. It will be a little bit harder for us at Cloudflare to argue against a government somewhere pressuring us into taking down a site that they don’t like.” He was very uneasy with the sort of authority they had to, as he puts it, “essentially take down a website.” So, JP, do you think about companies like Cloudflare differently than you might think about companies like X or Facebook?

JP Messina: Yeah. I mean, going back to the previous question for just a second. One note that I think people miss about the Packingham case, in particular, is that it was a politician’s community on Facebook. And they said that the politician can’t oust people and censor them. So, it was a very narrow sort of decision. Taking up the question of ISPs. It does get more complicated. It makes them look a little bit more like telephone service providers. So, if I’m a neo-Nazi and I’m not, I promise, the idea that AT&T could deny me service because sometimes I’m gonna be talking on the phone about my neo-Nazi stuff, I think makes people rightly uneasy. This is one of the core modes in which we communicate with one another. It’s general purpose.

It’s gonna not only stop my neo-Naziing but stop my communicating with Grandma to take her to the hospital. Sometimes, probably neo-Nazis do that stuff, too, I suppose. So, there, I think, in the phone case, because there’s no way of discriminating on a case-by-case basis about the particular content, and you’re just saying to individuals across the board, “Sorry. We’re not gonna let you use the telephone lines, the radio towers at this point,” the censorship concerns are really significant. And it makes some sense to create duties to serve through the law. Now, I mean, that’s a technological problem. Maybe AI solves it, and you just censor the particular conversations about neo-Naziing, the rallies, and the propaganda, and the calls to meet at the bar or whatever to conspire. And, yeah, I mean, there I think you’ve got something analogous with what you’ve got with the ISPs denying particular organizations the ability to host their particular websites for their particular content on their platforms. I think it’s less concerning than it is – so, the idea of subjecting these sorts of organizations to duties to serve is less concerning than it is to subject X or what have you to these duties to serve.

At least because these products are genuinely such that we don’t really, as consumers, have strong preferences as between providers. What’s important is that we have the service. Whereas, you might have a strong preference as to which package of social media content moderation you take up, such that you really don’t want to entrench one or a couple of players. One of the things that I always worry about with the public utility regulation of social media is that you really do risk the entrenchment of these companies as natural monopolies. You then stop competition. You stop better products from being developed. You give to a particular platform a kind of importance in the public sphere that it need not have. I think there’s a danger in thinking that these platforms, any one particular platform, is more important than it is. And I think a lot of those worries are not there in the case of ISPs. On the other hand, meeting spaces in general, and if you think of ISPs as providing certain kinds of digital meeting spaces to people, access to them has long been at the behest of private individuals and private organizations. And there is a way in which I think it’s an affront to say to somebody you have to host this content that you don’t wanna host.

And it’s public-facing in a way that telecommunications isn’t public-facing. So, when you host the content, you rightly get backlash from people who ask, “Why are you associating with these other people?” On the other hand, I don’t welcome a future in which we have politicians jawboning ISPs in the way that they jawbone Twitter and Meta and ask them to remove content. And I think that that’s one of the risks that you get if you have ISPs taking a more active role. So, I think there are really good reasons, on the level of ethics, for ISPs to just be basically non-discriminatory with respect to content.

Nico Perrino: And there’s also just the reputational hit that’s gonna come from serving these websites is not gonna be the same as it is in other contexts, because often people on the outside don’t know what websites they’re serving. Like I don’t know, JP or Aaron, what phone service you use. I don’t know what internet provider you use at home. I don’t know what banking service you use. Whereas, I do know, Aaron, that you’re on X because I see you on there. And so, if you’re saying things on X and I want X to censor you and they don’t, then that could create a reputational harm that they might have an interest in redressing. I do wanna just, for point of clarification, speak quickly about the Packingham case because I think we have – we’re getting confused as to what that case actually addressed. It addressed a North Carolina statute that prohibited registered sex offenders from using social media websites. And what the court essentially held, and that’s why I think it’s important for this conversation, is that, and this is from the opinion of the court, North Carolina with one broad stroke bars access to what for many are the principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge. Now, you think about sex offenders being in society, there are restrictions you place on sex offenders. But what they’re saying here is that the restriction that barred them totally from access to social media is not the least restrictive means to accomplish their aim.

Although, you can understand why North Carolina would do something like this. The idea being that someone on social media might be able to reach out to someone who’s underage to re-commit a crime, for example. But they’re saying, “You bar these people from social media, they can’t engage in the modern economy. They can’t find jobs. They can’t engage in community events.” And that’s why the law was struck down, I think. JP, what you were referencing in part was the Supreme Court case we got this year, Lindke v. Freed, where it’s politicians blocking users on social media platforms. And to the extent that these politicians have personal accounts on these social media platforms that they may or may not use for public business, to announce different policies, to use as town forums. Now the court has essentially held that each forum is going to be different depending on how the politician uses it. But it can become a public forum in which these politicians cannot censor folks.

Aaron, I wanna ask you. It seems like we’re talking about a spectrum of different institutions here. On the one hand, you have an X that holds itself out to be a public square where there is some sort of expectation of free expression there. On the other hand, you have these ISPs perhaps that aren’t serving any sort of expressive end. They are just providing your ability to access the internet. So, how do we think about this different spectrum when we’re kind of analyzing whether there are certain norms we think should exist within any given context?

Aaron Terr: Yeah. I think that social media platforms are inherently expressive in a way that ISPs and the deeper internet infrastructure like a domain name registry aren’t, and that’s good reason to – it makes sense to draw those distinctions. Twitter, X, Facebook, they have community standards that they want to enforce that affect the speech that is on their platforms and that is viewable to the public at large. And, again, I think we want to have space for individuals to come together to create platforms in pursuit of particular values that they –

Nico Perrino: Either wide-ranging speech, like perhaps X, or narrower speech interests, like Farmers Only.

Aaron Terr: Right. Right.

Nico Perrino: Right? You can’t join this dating app unless you’re a farmer.

Aaron Terr: Yeah. Whereas, a phone company or an ISP is generally just really providing the means to connect to the internet or use a particular means of communication. It’s not in the business of curating this huge compilation of public-facing speech like X or Facebook. And I think the other important distinction is that when you go into deeper internet infrastructure, often the alternatives are more limited. You don’t necessarily have the alternatives that you do with, say, social media platforms. I think there are people who live in certain areas, say a rural community, where there’s actually only one ISP that you can use. So, if that ISP decides that it’s gonna restrict certain speech, then you’re just kind of shit out of luck if you live in that area. By the way, I also really like the point that JP made about jawboning, which is that even though we’re talking about private entities, you can’t really fully extricate the looming threat of state censorship. That threat can expand or contract depending on what the private entities do. So, in other words, the moment that an ISP, like JP mentioned, does start getting into the game of policing user speech, it creates this opening for government officials to come in and try to exploit that to suppress the speech that they don’t like. So, now, you –

Nico Perrino: Well, we’re kind of seeing that in Congress right now, where they brought in front of them a number of university presidents of private institutions. We’re talking about Columbia, Harvard, MIT, and Penn. And essentially are pressuring them to incorporate new speech codes within their student codes of conduct.

Aaron Terr: Right.

Nico Perrino: And then the latest incident, Columbia said that it was gonna investigate a professor for what we at FIRE would perceive to be probably academic – [inaudible] [00:48:39] freedom of speech standards.

Aaron Terr: Right. And I think if these institutions stood resolutely in defense of freedom of expression, then they would be subject to less pressure from government officials to censor speech. So, that’s a pretty important point. And then we often talk about how censorship is a slippery slope, too. Right? There might be some people listening who think, “Well, who cares if Stormfront can’t operate on the internet? Do we really need neo-Nazis to have freedom of speech online?” It just, again, it’s similar to the argument like creating this opening for jawboning. Once you start going down that road, it rarely is going to stop. Right?

There’s no limiting principle. No reason why it has to stop with the neo-Nazis. And it generally is not going to, because people are going to see the opening. And they’re gonna start arguing that the views that they think are important and dangerous to society should be censored, too. There’s Eugene Volokh, who coined the term censorship envy. Right? Which refers to this phenomenon where people tend to react to censorship of speech not by appealing to neutral free speech principles, but by trying to get even. The unfairness of the platform’s decision to restrict speech is often seized upon by activists who will ask, “Why are you treating these people more favorably than us?” And rather than demand free speech for all, they want to expand the circle of censorship.

And over time, yeah, this tendency does tend to expand the range of prohibited speech. You know, an eye for an eye leaves the whole world blind, I guess. Or maybe a tongue for a tongue leaves the whole world mute. Is that a better metaphor?

Nico Perrino: Quote him on it, folks. But, yeah, the point about the slippery slope is important. And you might care about the neo-Nazi website today, but it reminds me of that H.L. Mencken quote, “The trouble with fighting for human freedom is that one spends most of one’s time defending scoundrels. For it is against scoundrels that oppressive laws are first aimed.” JP, I’ll let you respond to anything that Aaron said if you like. But also, I wanna ask you about search engines, which are a bit different, right, because it’s not always outright censorship. It’s just burying your results deep into the 10th page of Google, and I don’t know if I’ve ever been to the 10th page of Google for any search I’ve conducted on that search engine. And then, now, you also have new forms of search emerging as a result of artificial intelligence, where you just ask the artificial intelligence a question and it returns one answer for you. And you don’t know what sort of mechanisms are happening in the background or what sort of true and relevant information is being suppressed because whoever wrote the algorithm doesn’t like it.

JP Messina: Yeah. Great. So, thanks for the correction on Packingham. The line on that one that I’m supposed to have is that it’s really important what the court didn’t say, which is that it would be a problem if social media platforms decided not to serve sex offenders because that would really say that they thought that being on social media platforms was crucial to having First Amendment rights. What they said was that states can’t tell them that they can’t be on there. So, that’s supposed to be the line on that with these Supreme Court cases. One gets one’s wires crossed. I think just sort of closing up the ISP conversation. The way I often think about speech is in the metaphor of a funnel. And so, you want a really big mouth at the top and then it’s okay to have lots and lots of filters. And so, the fact that there are neo-Nazi sites on the internet is not so concerning to me and shouldn’t really be so concerning to many people because, for the most part, the internet is mediated for people.

It’s mediated by social media companies and moderated by social media companies. And people’s access to those sorts of content is such that they can usually avoid it at a pretty low cost. Unless they go looking for it, they’re probably not going to find it. And that’s one of the interesting things about search censorship. So, you mentioned the fact that very often if this is gonna happen, it’s not gonna be because anybody removes content. If there’s explicit search censorship going on, it’s gonna be content that’s just put a little bit further down the results page. You say you’ve never been to Page 10. I don’t know if I’ve been to Page 7, just to one-up you there. But I think there are real questions of fact in the search domain. I don’t know how much censorship Google’s actually engaged in. I do know that SEO, search engine optimization, is hard and that it lags a lot.

And it can seem really puzzling when you first start a website that it isn’t showing up early on the search page even with very particular query terms. But pretty often if you get the optimization stuff right, that problem will resolve itself. Nevertheless, it’s another domain in which you have really high market concentration, and people reach for the language of public utilities regulation. And I just think it’s really hard to run a good search algorithm. I think the core value in search is returning results that are relevant to users. And when you start deciding what users should want to find, the search engine becomes much less useful. And so, there are really strong incentives for these particular kinds of organizations not to bury results that their users are gonna find relevant. And it’s kind of a mystery to people right now why Google search has deteriorated in quality. I think lots of people, if you go look on Reddit and stuff, there are lots of people talking about how it’s harder to find things on Google than it used to be. And there might be other explanations, but the solution that people have lighted upon for the time being is to add the term Reddit to the search, and then you’ll get actual people talking about it and giving their own links to stuff.

And I do this all of the time. I use DuckDuckGo, I use Bing, I use Google search. Sometimes, I use AI. But I do very specific things where I know that there are communities of actual users who actually care and share the values that I have who will be able to point me where I wanna go. The miracle of Google before the current moment is that you didn’t have to do any of that. It was good at returning results that were relevant. And so, it’s a question about why that is. Is that because they’re exercising a heavier hand, deciding a little more to err on the side of doing what they take to be, I think, a good thing, which is not reinforcing stereotypes, making sure that they’re exposing people who might be in echo chambers to information that they might not otherwise be exposed to? Doing sort of what a lot of people see as their corporate social responsibility.

AI, yeah, I worry about this. There are interesting questions here. There are definitely queries where you know the answer, that you’ll plug into ChatGPT, or Gemini, or something of this nature, and you’ll get a certain kind of response where you know that the answer or the argument is over on the other side, and it just won’t give it to you. And so, to return to the beginning of our conversation about Tocqueville and Mill and the early distinctions that they were drawing. For Tocqueville, there was something disturbing about the disappearance of diversity sort of regardless of where that disappearance came from, what occasioned the disappearance. Whereas, for Mill, I think he focused a lot more on certain forces of conformity. So, if you were subject to lots of social punishment, that was a problem for Mill, and a free speech problem, and a problem for individuality. But if conformity had other kinds of sources, it wasn’t necessarily concerning. Right? If people just start to agree about things, that’s actually a sign of progress for Mill.

And I think one of the things that generative AI makes me worried about is that there might be a whole lot of conformity that is totally not coming from social censorship or anything like that. But it’s just coming from people’s laziness and their desire to sort of offload their thinking to a large language model. The designer of which holds certain values that the person might not share. But they’re indifferent. They’re nihilists. They don’t care that much about coming up with their own views or even discovering what they are. And so, they just say, “Oh, yeah, that’s my view now.” And I see this in student work. I had like 18 AI plagiarism cases. And it’s like, “Yeah, this is just really uninteresting, and boring, and homogenous.” And so, I think on that point, Tocqueville wins and Mill’s worse off.

Nico Perrino: Is that because people don’t, most people perhaps, don’t care about having an original thought? They just wanna know how to get through or what the right answer is.

JP Messina: I think partly it’s because of that. I think maybe I have some bad selection effects in my own case, where I’m asking them to do work that they have no intrinsic interest in doing. But I think – I’ve been disturbed by the number of academics and professional writers who want to offload aspects of their creative process to these tools. And I don’t know how much they’re sensitive to the fact that writing is a process of discovery. It’s not just a way of recording what you think. But it’s a way of learning what you think. And so, yeah, I don’t know. The problem probably has multiple sources. But one of them is just that we’ve got a lot on our plates and we’re looking for shortcuts, and maybe we’re not thinking about the various ways in which the shortcuts we take can make us more homogenous, less diverse, and less interesting than we might otherwise be.

Nico Perrino: Well, to a degree, AI, artificial intelligence, is just a tool. So, if you have no interest in being original in the first place, whereas before artificial intelligence you might just plagiarize. Right? You might just go look at someone else’s work. I think, what is Claudine Gay, the former president of Harvard, alleged to have done? She’s alleged to have plagiarized the acknowledgments to one of her papers or books. Right? So, this just makes it easier in a certain respect. But I wanna close up here by asking a final question that speaks to one of the critiques I hear of free speech culture most often. And that is there are no bright lines to determine when someone is falling on the liberal side of a free speech culture.

It’s very easy when you’re talking about the government because it’s either government censorship or it’s not. Right? But in the private context, you have all these different interests at play. And furthermore, we also have this kind of reverence in the past for boycotts. The way ostracism was used to fight racism or sexism, for example. And then you also just have the vested interest that you wanna have a civil and ethical conversation. JP, you talk a little bit about this in your book. Right? And there are just certain norms of conversation. I don’t wanna bring to my dinner party at my home someone who’s just gonna constantly be interrupting people, saying offensive things. Farting in the middle of the meal, for example.

I guess that’s a form of self-expression. And it actually speaks to another private censorship controversy we had recently with the dean of UC Berkeley’s law school, Erwin Chemerinsky, who hosted an end-of-year dinner, as he often does, for students. And one student decided to make it a political rally, bringing her own sound amplification to talk about the Israeli/Palestinian conflict.

JP Messina: I thought you were gonna say that one of the students farted into the microphone.

Nico Perrino: Well, you know, that could have been the punchline to her speech. But she didn’t get there because Erwin and his wife stopped her. Right. And there’s this whole debate as to whether this was a First Amendment violation or not. And Erwin’s response is, “No. This is my private home. I can have who I want here, and I can tell them to leave whenever I want here.” So, you do have a cadre of First Amendment free speech advocates who are saying, “Let’s just focus on government censorship because everything else is too hard.” Aaron, JP, by way of closing remarks, do you think that’s the right way to think about this in a liberal society? And do you begrudge people who take that approach? Let’s start with you, Aaron. Give JP the last word as the author.

Aaron Terr: No. I think we can acknowledge that it’s difficult. But that doesn’t mean we have to give up the whole project of trying to cultivate a culture of free speech in addition to ensuring that people’s First Amendment rights against state intrusion on their freedom to speak are guaranteed. Yeah, I mean, I think that there is a risk in taking the social tyranny argument too far. And, for example, to point to someone simply coming under heavy criticism as an example of that or, to use the modern term, cancel culture.

Nico Perrino: Well, no, I think there’s a distinction even there. Right? It’s like there’s a distinction between robust criticism and pairing that criticism with an effort to destroy someone’s life either through trying to get their employer to fire them or getting a bank to de-bank them, for example.

Aaron Terr: Right. Right. Right. Exactly.

Nico Perrino: It’s like that second step.

Aaron Terr: I agree. And I was just throwing a bone to the other side, because I think that sometimes I do see people labeling things cancel culture that really to me just look like sharp criticism. And I think, like Jonathan Rauch, when he wrote a piece in the summer of 2020 when this issue of cancel culture was being widely debated and discussed, he tried to draw some helpful distinctions about how we think about criticism versus cancellation, saying that criticism marshals evidence and arguments in a rational effort to persuade, whereas canceling seeks to organize and manipulate the social or media environment to isolate, de-platform, or intimidate ideological opponents. So, really the difference between using tools of persuasion on one hand. And on the other, exerting power to enforce conformity to prevailing opinions. But, certainly, right, you should have the right to boycott businesses to express disapproval. But I think more frequently we see these people boycotting institutions that are nominally dedicated to free expression, like a comedy club, or an entertainment venue, or a college or university. I think the more frequent – we’ve also seen a lot of these calls within the publishing industry, too. Right? The more frequent those calls become, I think that says something concerning about cultural attitudes toward free speech.

And that we still want institutions to resist these calls to the extent that they seek to uphold liberal values like free expression. I also just think that a lot of this is due to the fact that social media has changed the game. I think in many cases, when you see calls for people to be fired on social media, companies really overestimate the costs associated with retaining an employee who catches the attention of a social media mob for two days. And so, when hundreds of people are flooding your company’s mentions on Twitter, it can seem like the whole world is against you and you get tunnel vision. But really the hysteria is just coming from a small subculture on a platform that most Americans don’t even use. And most that do probably aren’t very active on it. So, you get a situation where the tail’s wagging the dog. Just wait out the storm for a day or two. You’ll probably be fine. So, I think I might be digressing a little bit from the original question. But, no, I don’t think that we should avoid having to discuss these difficult issues of free speech culture just because there aren’t necessarily any hard boundaries.

I think we should acknowledge that. Again, the actions of private organizations and individuals can have a significant influence on how free people truly feel to speak their minds and contribute to public discourse. And to the extent that we value democracy, right, and democratic deliberation, and the search for truth, the more input we have coming from people, the better. Allowing [inaudible] [01:06:30] to become entrenched in society in a way that people can’t challenge because of social pressure, that’s a bad thing. So, yeah, I mean, I think it’s something we can distinguish, while at the same time acknowledging that private censorship and state censorship, yeah, they’re two different things. Right? There are reasons that we treat them differently. State censorship is totalizing in a way that private censorship isn’t. Going back to the Packingham case, an important point there was that the state was saying you can’t use any social media on the internet. Much different from just getting kicked off Facebook and possibly being able to go elsewhere and voice your views. And plus, the consequences of state censorship – you can get thrown in jail by the government – are generally worse than just getting, say, kicked off of Twitter or losing a platform.

But, still, that doesn’t mean that we – we can care about both. Even if we care about them to a different extent or acknowledge that one is a bigger threat than the other.

Nico Perrino: JP, final thoughts on that.

JP Messina: Yeah. I mean, I agree with much of what Aaron just said. But maybe I’ll dwell a little bit on, first, the point that he raises at the end about the severity of the sanctions, and second, on part of what makes some of this so hard. So, on that first point about the severity of the sanctions. One of the pieces of feedback I got when I was writing the book was that I was just way underestimating this – this is from someone who had been, and I won’t invite further vitriol against them by specifying who it was, but somebody who had been canceled for something that he had done, or subject to social media mobs and loss of professional opportunities. So, I mean, Aaron’s point about, in the [inaudible] [01:08:14], well, nobody’s going to jail if they’re getting privately censored. And so, that’s a major kind of thing when it comes to criminal speech sanctions.

And this person said, “That’s just naive. You don’t understand how bad it is to lose your most important social connections, your job, your livelihood, your esteem in the community, your reputation, all of that stuff.” And Mill agreed. He said it can be worse than being in a jail cell for a little bit and coming out to find your social world basically intact. And so, I don’t know, at the end of the day, having never been canceled, having never been to jail, which of these things is worse. But I think it’s really important to listen to people who have been through this and to hear them when they say that it really does have these effects that are about as bad as they imagine. On the flip side, I think it’s also important to note that there’s no remedy for them, for the kinds of harms that they’re suffering when their speech excites this kind of social punishment. Because in order to offer a remedy, you would have to force people to restore their reputation and you would have to force them to not have opinions. And we don’t have the mind control to do that, and it wouldn’t be a good thing if we did. So, to some degree, the distinction survives, but for a slightly different reason. On to the point about how hard it is –

Nico Perrino: That’s a really interesting point, because if you’re accused of a crime and you’re convicted and you serve your time, we have all these systems and kind of this idea as a society that we want people to be able to come back into society fully. You see some companies, for example, who make a point of announcing that they’re gonna hire convicted felons, even, who have come out and served their time. And we also understand that people who are convicted had an opportunity to present their case. That they were able to make the best arguments as to why they should not be found guilty. But in cancel culture, you don’t have any of that, it seems. You don’t have an opportunity, often, to present your case if you don’t have your own megaphone. And you don’t have a system that you trust to ensure that the right facts are sorted through and that the conclusion that this person did something wrong is right. So, you really don’t have any recourse. And when you think about people who are canceled, I think there are few and far between who have been able to kind of come out of it whole or successful. I mean, I think people like Louis C.K., right, are still persona non grata in communities.

And people can have disagreements about how bad his alleged activity was. But if he had murdered someone, he could go to jail, and then come out, and we would think he’s perhaps reformed. But I don’t see any way out of it for him in the current state of things, for example.

Aaron Terr: Yeah, no. I think that’s a great point, too. And I also think the point you made about the severity of sanctions, JP, is a really good one. I definitely didn’t mean to minimize how severe the sanctions of private censorship can be, particularly in the context of private employment, where you can lose your job and livelihood for something that you say. Or even, like you mentioned, just the cost of losing your entire social circle. So, yeah, my point, I guess, was more just – even if we accept the premise that sanctions in the private censorship context tend to be less severe than they would be if the state weren’t constrained by the First Amendment, it doesn’t mean that we shouldn’t care about it. And, certainly, the consequences can be severe. And if you have a whole lot of Americans who are afraid of losing their jobs for speaking out on public issues, then that is something that is really concerning. And we shouldn’t want jobs in this country to come at the cost of democratic participation.

Nico Perrino: And sorry, JP, you were gonna make one more point.

JP Messina: Yeah. I mean, just quickly, I don’t wanna give the impression that I’m super [inaudible] [01:12:26] about people’s prospects to re-enter society after incarceration. I think there are various ways in which that’s extremely, extremely difficult. But it is true that there’s at least the mechanism and the aspiration there, in the formal processes designed to facilitate that sort of thing. But, yeah, on just how difficult this is. I think that it’s difficult even in the intrapersonal case. So, I think there are aptness considerations when you speak. And I don’t think that my experience in my own intellectual life and conversation is one where I just say the first thing that comes into my head. One of the things that I worry about is how it’s gonna be received and whether I’m going to be punished. But another thing is whether it’s going to come off as insensitive, or whether it’s going to be alienating to the particular persons that I’m talking about certain issues with. And so, if I can’t get it right in my own case, what hope do we have to get it right in the interpersonal case, where we’re thinking about what somebody else did and whether the way that they exercised their expressive liberties was [inaudible] [01:13:40].

Where misunderstandings can happen more easily. Where social punishment emerges not just as a mechanism of censorship but as a response to censorship, and where vigorous criticism, especially for the shy and those less accustomed to the rough and tumble of seminar-style discussions, can be experienced [inaudible] [01:14:11] as sort of totally shutting them down. As in encouraging them to sort of step back from the conversations and maybe not try again. So, it’s really difficult. But I do think it’s really important, and I think we’ve got a lot of learning from one another to do. And I think sometimes the kind of vigorous criticism that I might offer to one of my colleagues in a report or something like that would be the exact wrong thing to do with respect to one of my students who’s just learning how this stuff works. And so, sensitivity to context is something that really matters. Trying to figure out when to hold back in conversation is something that really matters. Helping one another see when we should be holding back in conversation really matters. But I think it also matters that we do those kinds of things in ways that respect well-known principles of necessity and proportionality. That means we refrain from escalating situations with doxing campaigns, campaigns to get people fired, and campaigns to sort of silence people and say, “You’re now persona non grata.” Those should be a kind of last resort in these cases, with other less intrusive means being preferred.

So, I think we’ve got a lot to learn, and I think it’s hard. But I don’t think we should give up. I think that would be foolish. And I think it would be foolish because if we give up on the norms front, we’re much more likely to give up also on the formal censorship front, the state censorship front. And I could explain why I think that, but I’ll just leave it at letting you know that it’s something that I think.

Nico Perrino: I think we will leave it there. For folks who want to read more about what JP thinks, you can check out his book, Private Censorship. JP, Aaron, thanks for coming on the show.

JP Messina: Thanks for having me.

Aaron Terr: Thanks, Nico.

Nico Perrino: JP is an assistant professor of philosophy at Purdue University. And, again, the author of the new book, Private Censorship. Aaron is FIRE’s director of public advocacy. This podcast is hosted by me, Nico Perrino, and produced by Sam Neuhauser and myself. It’s edited by a rotating roster of my colleagues, including Aaron Reese, Chris Maltbie, and Sam. You can learn more about So to Speak by subscribing to our YouTube channel, where you can also find a video of this conversation. And we are on Twitter, which you can find by searching for the handle freespeechtalk. If you have feedback, please email it to us at sotospeak@thefire.org. And that is sotospeak@thefire.org.