So to Speak Podcast Transcript: Section 230 co-author, Rep. Christopher Cox
Note: This is an unedited rush transcript. Please check any quotations against the audio recording.
Nico Perrino: Welcome back to So to Speak: The Free Speech Podcast, where every other week we take an uncensored look at the world of free expression through personal stories and candid conversations. I am your host Nico Perrino.
A few episodes ago we discussed Section 230 of the Communications Decency Act, which is an American law passed in 1996 that says, essentially, that websites cannot be held legally liable for content posted on their sites by users. Some argue that the law created the modern internet allowing companies like Facebook, Twitter, Amazon, and Yelp to flourish. But some critics argue that Section 230’s liability shield allows harmful content to proliferate on the internet, while the companies that host that content can disclaim any responsibility for it. Indeed, right now there is an effort in Congress to repeal Section 230.
In response, our guest on today’s show published an op-ed in The Wall Street Journal arguing that repealing Section 230 would kill the internet, and he happens to know something about Section 230. He co-authored the legislation. Our guest today is Christopher Cox. He was a 17-year member of the House of Representatives, and from 2005 until 2009 was the chairman of the Securities and Exchange Commission. While a representative, he co-authored Section 230 with then-representative, now senator, Ron Wyden. Representative Cox, welcome onto the show.
Chris Cox: Well, I’m very pleased to join you.
Nico Perrino: So, in 2019 law professor Jeff Kosseff published a book entitled The Twenty-Six Words That Created the Internet. Those words, of course, come from Section 230 which you co-wrote. And I wanted to kind of kick this off by asking if you agree with Professor Kosseff about the impact of your legislation. Did it create the modern internet as we know it today?
Chris Cox: Well, there’s no question it’s a foundational legal underpinning, but we all know the old joke about Al Gore inventing the internet. And Ron Wyden and I did not invent it any more than he did. But having sturdy legal rules definitely created an atmosphere of legal predictability, so that people were willing to invest, and people were willing to build very creative techniques for people to use this new technology, which in the 90s was still in its formative stages, so that what we’ve all become thoroughly accustomed to today could eventually grow and flourish.
So, the ability of what I will call ordinary people – that is, people who don’t own CNN or NBC or FOX or what have you – to express themselves, and the ability of all the rest of us to access that, would not exist if it were not possible for the hosts of that content to feel comfortable that they would not be sued every time someone who posted one of the billion things that are uploaded every day did something that was violative of the law. What Section 230 says is that the people who should be held accountable for that speech are the speakers, the people who say it.
Nico Perrino: So, the 26 words that Jeff Kosseff is referencing come from subsection (c)(1) of Section 230. It says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Now, I’ve got the law in front of me. It’s actually much longer than that, but that seems to be the crux. What were you trying to accomplish in writing Section 230?
Chris Cox: Well, first, let me seize on the point that you have very wisely made which is that the 26 words that Jeff Kosseff in his very useful book refers to are a part of Section 230, but there is more. And that will become relevant I think in our discussion this hour, because early on at the beginning of court interpretation of Section 230, there was a fascination with the 26 words, but some of the rest of the statute was overlooked for a while. I will race to a conclusion of that topic and say that the courts these days seem to have sorted it all out, but it was an issue at first.
So, what we were thinking when Ron and I introduced this legislation and discussed it with our colleagues in the House and the Senate for a full year before it became law was how we could adapt existing laws governing what to that point were traditional means of point-to-point communications like radio, television, newspapers, magazines, and so on, where all of the users were passive receptacles and there was one point of origination for all of the content. And furthermore, the editors of that content had the physical and human capacity to review what was being published before it went out.
The law addressed that situation by saying that the editors have a responsibility to make sure that what they publish is truthful and not in other ways violative of the law. So, that’s the foundation of libel law. How to adapt that to a technology that is instantaneous – instant communication is one of its critical features – and global? And because of that global capacity, it will involve not just a few hundred or even thousands of pieces of communication on a daily basis, but literally millions. It was millions in the 90s. Today it’s billions. So, how to adapt the laws so that that kind of communication can flourish, while still making sure that we have responsibility in the law for violations?
Nico Perrino: I’ve always wondered though if the theory is actually correct, that these 26 words created the internet because the United States is one country. And to my knowledge, we’re the only country with Section 230. Now, there might be laws similar to it in other countries, but I know there are other countries without Section 230. So, why is it significant that America has a Section 230, and other countries don’t, and it hasn’t broken the internet so to speak? Or maybe it has in ways that I’m just not aware of.
Chris Cox: Well, so you have to take yourself back to prehistoric times in internet-technology terms, to the early 1990s, and recall that the United States was truly the leader in this technology. While neither I nor Ron Wyden nor Al Gore invented the internet, the United States of America actually did. And so, without our nation, without our country, there would not have been this technology, at least not when it emerged and the way it emerged. So, in the early days of the internet, the United States was setting the pattern for all the rest of the nations of the world for how this was going to operate.
And the norms that we adopted were very closely examined by the rest of the world. And the basis for legal liability that Section 230 set out became, in the early days of the internet, the basis for planetary communication. It was called the World Wide Web, and that was the ambition of idealists: that we would really have a global medium of communication. And there was a lot of friction removed, so instead of a long-distance phone call – we don’t even think of long-distance phone calls as genuinely expensive now, but they used to be.
Instead of all of the expense that might be involved in reaching many people in many different parts of the world, you now had access to a global community, global markets, and so on, almost for free. Just with the cost of your own internet access. And that pattern maintained for quite a while before China built the great firewall and before Russia decided to have its own version of Wikipedia and make things more hospitable to their way of viewing the world, and so on. We now have a Balkanization of the internet, and we have many parts of the world, North Korea being a limiting case, that are closed off from anything that we might be doing here in the United States.
Nico Perrino: Did you see, or could you have seen it going the other direction, closer to that China, Russia, North Korea model in the early days of the internet? Was that part of the fear?
Chris Cox: Yes. We had very much in mind that we were dealing with something that was called back in the day the World Wide Web. We knew that it was important what America did since we were the first movers. And so, we wrote the legislation with a view to making it adaptable and hospitable to the entire global community, if you will.
Nico Perrino: Well, when you use the word hospitable, I was reading a retrospective that you wrote on Section 230 and in it, you write that representative Wyden and yourself were determined that good faith content moderation should not be punished. And you go on to say that, “We named our bill the Internet Freedom and Family Empowerment Act to describe its two main components, protecting speech and privacy on the internet from government regulation and incentivizing blocking and filtering technologies that individuals could use to become their own censors in their own households.”
When we think about Section 230 today, we generally only think about that first part, the liability shield that’s applied to online content providers or hosts I should say. You don’t think about that second part which is incentivizing and empowering these platforms to also allow for blocking and filtering technologies and empowering families and parents to use them within their own homes. Why do you think that latter part has been forgotten or do you disagree with the premise?
Chris Cox: No, no, you were precisely correct in describing the way we intended the legislation to work and the emphasis that is placed on it. There is in Section 230, beyond the 26 words, an initial section comprising congressional findings. And our findings were very much focused on the importance of making sure that users had control over the information that they received. At the time – remember, we were both in the House at the time, Ron and I – there was alternative legislation in the Senate.
And the Senate’s approach was that the FBI and the Department of Justice and maybe the FCC were going to be in charge of surveilling the internet and deciding what could be published on the internet without legal liability so that children would be protected. And the simple way of explaining the ramifications of that is that only things that were fit for children would be found on the internet. That was a great challenge to many of us in the House at the time who thought that we would be destroying the potential of the internet for education, for the spreading of medical information, for all the wonderful things that we take advantage of today if we dumbed it down to a child’s level.
And so, our fight was to explain to people, first of all, that putting the government in charge of prescreening everything on the internet would truly bring it to a screeching halt, because they just didn’t understand the medium – many of these people in the Senate at the time didn’t even have email. It would overwhelm government capacity to have to review and prescreen all of this, just as it would the platforms themselves.
So, we had to find a different way to make people who were using the internet the kings of their castles, as it were. But they could limit at the portals of their own home what came in and so on. And we imagined that technology in the future would be better at handling that even than it might have been in the 1990s. So, yes, that was an important feature of Section 230 at the time.
Nico Perrino: So, Section 230 also allows these internet platforms to moderate content, right? Wasn’t there a court case around this time or just preceding the passage of 230 where a court held essentially that if internet platforms moderate some content, then they’re legally liable for all of it? And so, they were put in this bind where they either had to moderate all content or no content. And wasn’t 230 passed in part to help address that concern or at least a section of it?
Chris Cox: Yes, you are precisely correct. And that is important both historically and very much so today, because if Section 230 were repealed, we would go back to that state of the law pre-Section 230. Interpreting libel law, which was an issue in the cases that you allude to, under the existing rules of the 1990s, the results that the courts of New York State reached in two separate cases were as follows. One platform, which was called CompuServe, was held not liable for what users posted on their forums, because they made no effort whatsoever to moderate anything that went on, on the platform.
So, they didn’t have any rules about harassment or bullying or excessive violence or swearing or what have you. Prodigy in a separate case was held liable where the facts were very much the same. They were being sued for content on their platform posted by a user, but they were held responsible because they advertised themselves as a family-friendly place. Their advertisements on TV showed mom, dad, and kids all smiling in front of computer screens and surfing the web together. One of their big financial backers was Sears Roebuck, which was a big company at the time.
So, this is a very mainstream kind of business, and they wanted to make sure it was family-friendly. They had no capacity, of course, any more than any other platform to stop everything in advance or even to read it. They didn’t have an opportunity to read most of this stuff before it was published. But because they made the attempt, and specifically because they had rules against harassment, against violence, against profanity, and so on, the court said, “You are now no longer like a library. You are like a newspaper. You have editorial control over what your users are posting and therefore, you will be liable.”
And the liability that they faced on account of one user post was in the multi-millions. So, they settled out of court, and it was very clear the incentive that would have existed if Congress had not stepped in and changed that result. The only way that platforms could allow people like you and me to post online in real time would be to have no rules whatsoever; to have an anything-goes atmosphere. And the internet would very quickly become a sewer, and that’s exactly what we were trying to prevent.
Nico Perrino: We were talking earlier about the open internet and the dream that was the open internet in the 1990s. And in your retrospective, you describe one of your goals for Section 230 to be not just the protection of speech but also privacy. And it seems like today we’re revisiting some of the questions that were litigated not only in court but also in Congress around the time of the passage of Section 230 – in particular, surrounding age verification and its use to try to restrict access to certain content deemed sensitive for minors on the internet.
I’m just wondering kind of what you make of that continuing debate. You said that you had speculated that filtering technologies and blocking technologies would advance with the passage of time. It seems like folks still have a concern surrounding the types of content that can be accessed on the internet.
Is that more a problem of parents not using the technology that’s available to them, and then retreating or asking the government to play parent for them instead? Or is it a realistic need, reflecting the situation that the only way to actually block this content deemed sensitive or unlawful for minors is to require age verification – which, if you think about it, requires identity verification, sometimes or often with a government ID?
Now, they do say that there are technologies to prevent the kind of capture and holding of this information after the identity is verified, but then you also have a question of compliance. So, if an internet platform is sued for allowing a minor to access its platform, it needs to be able to prove that it asked for age verification which requires you to look at a log. So, I’m just wondering how you think of the current debate happening around identity verification, age verification, and access to the internet, in light of the debates that you were having 30 years ago.
Chris Cox: Yes, these are very complicated questions. They become more complicated with each passing year as technology itself does so many more things. It’s a much more complex environment, legally and substantively, than it ever has been. But the questions and the ways that we think about this can be reduced to some long-standing principles, privacy being an important thing for us to be vigilant about protecting. Just to use a couple of examples: I worked in the federal government for many decades, and I started out in the White House.
My appointment to be a lawyer for President Reagan in the White House required that I undergo a full field FBI background investigation. And the FBI went and knocked on the doors of every neighbor I ever had. The way you put together FBI reports is to write down what everybody says. So, they talked to hundreds of people. And whatever people say, whether it’s true or not, gets written down and put in these reports. Those reports ultimately comprise quite a repository of everybody you’ve ever known, and so on.
Fast-forward to today: I’m an AT&T customer. Perhaps a lot of people listening to this podcast use AT&T. And AT&T, we just learned through recent FCC filings, lost so much data a few years ago in a data breach that for millions of their customers, every phone call to every person was offloaded to bad actors. We don’t know who they are – whether they’re state actors, and so on – but this is such a sophisticated breach that that’s a good guess. That’s an amazing invasion of privacy.
In both cases, I’m left without a remedy because my FBI background file along with those of hundreds of thousands of people who work for the federal government, which were stored at the Office of Personnel Management, were hacked. And we’re pretty sure we know who did that. That information is now lost forever. I have essentially no way to vindicate my privacy rights there, nor do I as an AT&T customer as best I can tell.
So, we can say it’s terrific that when you give your information to a platform that their rules are they’re never going to give it up, but good luck because history suggests over a period of many decades that it happened before, it’s happened recently, and it will happen again. So, our privacy is not very safe once we put things online. That’s thing one.
Nico Perrino: And then thing two I mean is what does this make of the free and open internet, that dream that was in the 1990s if you can only access certain aspects of the internet by submitting a government ID? Is that a concern for you as well?
Chris Cox: Yeah. So, that’s thing two: who is doing the policing? If it’s the government, we are put in the most precarious of positions, because the idea that the government can tell you what you can read is antithetical to the entire American tradition. The power that we would cede to any government in that way would be mostly unthinkable, I think, to millions of Americans.
So, then what about private actors? Should private actors be in a position where they can demand that we show identification, that we disclose very important, private, personal information about ourselves before we can read what they have on their sites? And to that, I would say perhaps surprisingly, maybe. If we have a sufficiently vibrant market with choice, and we have thousands, perhaps even millions of platforms for all different interests and purposes and so on, then it’s hard to imagine why that would be a bad thing. That would be one model.
And people might feel safer, particularly if we’re talking about children here, to have a platform where that was necessary in order to access things on the platform, because then you could let your kids have free rein on the particular site with a lot less worry. So, I said at the outset, all of these things are getting more complicated. They are. And so, the solutions necessarily have to be more nuanced, and we have to allow for perhaps many different ways of handling the same problem in different contexts.
Nico Perrino: You mentioned a competitive marketplace and an open marketplace with lots of different companies vying to compete for users. One of the arguments that you get from folks on the other side is that this isn't a competitive marketplace. You have just a couple of big players, and they get this benefit that you shouldn’t get if you’re effectively operating as a monopoly. What do you say to that?
Chris Cox: Well, so this is properly understood as an antitrust argument. And this is a perennial problem in antitrust. These are not easy questions. When does an actor have market power? There’s a little bit of special nuance in the tech sector, or in the online world, because of what we call network effects. And yet, even with network effects, we’ve seen the one big thing come and go. So, it can happen very quickly. It can happen overnight, where people flee from one platform to another. So, antitrust is perhaps the wrong tool to chase after these things, because antitrust is a litigation mechanism.
And these cases in federal court, even normal cases in federal court these days if you take them through trial can take seven years easily, and big antitrust cases take longer. And that is an eternity in the world of tech. So, it’s fair to say that antitrust could be part of this, but it doesn't really solve the problem. We need solutions that work much more quickly than that.
Nico Perrino: You had mentioned earlier in the conversation that when you were looking at drafting Section 230 that some of your colleagues in the House of Representatives didn’t even have email, so their knowledge of the emerging internet might have not been very robust. Were you an early adopter of these technologies? What led you to kind of understand the unique challenge here?
Chris Cox: Yeah, that was just sort of serendipity, that both Ron and I were computer geeks early on. I had a business that I started in the 1980s, when there was still a Soviet Union, that translated the Soviet Union’s daily newspaper – their largest newspaper, called Pravda, which means truth in Russian, and that was the one thing that was absent from that newspaper. But my thought was that exposing what the Soviet Union – and Pravda was subtitled the official organ of the Communist Party of the Soviet Union – was saying about America would be of interest to Americans.
And I studied Russian in college, and that’s how I initially got started on this. And so, I read Pravda once in a while, and I thought, holy cow, people have no idea what they’re saying behind our backs, as it were, because we didn’t read Russian. So, we translated this on a daily basis – the first time, and I think the last time, it’s ever been done for any newspaper in the world – and sold it in 26 countries around the world. But in order to build this business in the 1980s, I had to figure out how to get 50 translators who were located hither and yon around the country to work together.
So, we got them all linked up by computer modem and bought IBM PCs for everybody. And just having to decide between a Mac and a PC was an early business decision. Ultimately, when I ran for Congress, I wrote the software that I used to track donors and campaign contributions and addresses, and basically built my own database, and had fun writing software in that way. That made me hardly Bill Gates or anything, but it did make me a little more aware of what was going on in this world than people who hadn’t yet figured out email.
Nico Perrino: Did you have colleagues in the House coming up to you asking you how email works or how the internet works as you were considering the legislation?
Chris Cox: I don’t want to pick on Congress because it was not a particularly dark cave where the world hadn’t yet entered. They were like everyone else. These were early days is all you can say about it. The Senate was a little slower than the House at adopting the technology. And that was largely an age thing because many people that serve for seven, eight, 10 years in the House then run for Senate in their states, then they stay another 10 or 20 years. So, the ages on average are higher in the Senate and it was more likely that people were not early adopters over there.
And that was part of the challenge, and it’s why we took a year to talk these things through. I don’t want to leave anyone with the impression though that age was the determinant of whether you were any good at this, because some of the greatest allies that Ron and I had were some of the older guys like Pat Leahy. He was terrific.
Nico Perrino: Yeah, my wife used to work for the House of Representatives. And just reflecting on her day-to-day work, so much of it is tied in with the internet. I imagine it must have taken a considerable effort to update the infrastructure even just of the day-to-day operations of the House of Representatives in order to do the sort of constituent correspondence that is so much a part of how staffers work on a day-to-day basis. And you were there largely for that transition. I can only imagine what it was like.
Chris Cox: Well, one thing that helped was that we had the first change in party control in 40 years in the House of Representatives. And so, it was an occasion for a new look at things. And it would have happened either way: if Republicans had been in control for 40 years and the Democrats took over, they would start with a skeptical eye toward the way things had been and ask, did they need to be that way, and what can we change or improve or do differently? And so, there was a spring cleaning that took place beginning after the election at the end of ’94 and into early ’95.
And Newt Gingrich who became Speaker was very keen on ginning up the Library of Congress to put everything under the sun online and to digitize things. And people eventually of both parties, all parties, we had a couple of other parties represented, decided that that was a good idea. And of course, it is a good idea, and it remains one.
Nico Perrino: Do you think we needed Section 230? I ask because I’m wondering if the First Amendment could have accomplished the same ends as Section 230, albeit presumably more slowly and at greater cost to the pace of internet innovation.
Chris Cox: Well, the First Amendment is, let’s call it, an important cousin of Section 230, because they both work to support freedom of expression and also our freedom to read things and to hear things and watch things, but they play very different roles. I would say Section 230 is a complement to the First Amendment.
It vindicates First Amendment values, but the First Amendment standing alone would not have solved the CompuServe-versus-Prodigy problem that you and I discussed earlier in this broadcast, where the courts had decided that there would be legal liability for content produced by other people if you had any rules of the road. That would be the law today, and the First Amendment would be fine with that.
Nico Perrino: There’s a critic of Section 230 who I’ve been having a continuing dialogue with who argues that there are practical arguments for the need for Section 230. The expansive amount of content that’s produced on the internet on a day-to-day basis, and the difficulty these internet platforms have in moderating it – it’s a practical impossibility to effectively moderate all of the content. Therefore, if we want to have the free flow of information on the internet, then you can’t make them liable for every piece of content posted.
But they say they don’t see a first principles argument for it if that makes sense, something that you might derive from the First Amendment. Do you disagree with that? Do you see a first principles argument for what you did in Section 230 rather than a practical one?
Chris Cox: Yes, and I would go further, and I would say that Section 230 is closer to first principles than is this approach of secondary liability that would exist without it. Because remember, in the old paradigm, where we’ve got a newspaper editor, maybe even with a green eyeshade, looking at typewritten things that are submitted before deciding to publish them, that person is taking on secondary liability for what somebody else wrote. And libel here is a paradigm case. There are other things that Section 230 covers, basically all forms of illegality, but it’s easy to discuss this in the libel context because of the Prodigy example.
So, if I’m the newspaper editor and I don’t read this carefully enough and something slips past me that’s a lie, my newspaper and I are going to be liable for it. That’s liability that I think we all agree should be imposed first on whoever told the lie. And then we have derivative liability for the newspaper. So, derivative liability is not a first principle; it is an add-on to the fundamental rule of law that people should be responsible for their own actions.
Because it is impossible, in any fair way, for the law to expect that an editor, or a team of editors, or even a team of imperfect algorithm-driven detectives looking for bad things, can accurately screen out everything we don’t like, liability should rest with whoever created the content; because it’s impossible to do that screening with perfection, it’s unfair for the law to demand it. And therefore, we need to revert to first principles and ask, well, who is in the best position to stop this? And the person who is in the best position to stop it is the person who did it, and that’s the way Section 230 allocates liability.
Nico Perrino: Section 230 has developed into kind of a boogeyman in public discourse. Anyone who has a problem with technology seems to pin some of the blame on Section 230, but has it always been that way? Over the 30 years or so since its passage, how do you chart kind of the public’s perception of Section 230 to the extent it had any kind of awareness early on of what it was? Have you noticed a dramatic shift and when did that kind of turning point come?
Chris Cox: I’d say that the inflection point was when there was a notorious case involving sex trafficking, involving a website called Backpage.com. And what Backpage.com was doing illegally was so serious that it attracted a Senate investigation under a committee led by Rob Portman of Ohio. The committee put together detailed evidence that the people at Backpage.com knew the way that they were making money was to promote sex trafficking, and that they helped the people who were using their site for that purpose to disguise from law enforcement what they were doing.
That under Section 230 would make Backpage.com totally liable because beyond the 26 words – and I promised we would get to this – is another part of Section 230 that says that if you are complicit, even in part, in the creation of problematic content, then you have no protection whatsoever. And yet, a federal court in the First Circuit said that Section 230 shielded this behavior. And there was a documentary movie made about this, about how awful it was.
And once the court said Section 230 was the reason that we couldn't stop child sex trafficking, not surprisingly a whole lot of hell was raised in Congress and in state legislatures and in the media and so on. That decision was wrong. Because this was the Court of Appeals and it didn’t go to the Supreme Court, it was not technically reversed, but a few years later the Jane Doe plaintiffs were substituted. They had new Jane Doe plaintiffs, a new case, same defendant Backpage.com, and this time around the court got it right. But by that time the damage had been done, and Section 230 was understood to be a free pass for illegal behavior on the internet.
Nico Perrino: You write in your retrospective that some courts have extended Section 230 immunity to internet platforms that seemed complicit in illegal behavior, generating significant controversy about both the law’s intended purpose and its application. That seems to be what you’re getting at here in talking about the Backpage case. But nearly 30 years later, having seen how courts have interpreted Section 230, how it’s played out in the court of public opinion, and how the internet has developed, frankly, are there any changes you would make to Section 230?
Chris Cox: I don’t think it’s necessary now because the courts have finally sorted this out. But to keep the Backpage decision from ever having happened in the first place – and it wasn’t the only court decision that ran off the rails a little bit in interpreting Section 230 – if I could go back in time like Marty McFly in Back To The Future and hold the pen once more, I think I would state in much plainer English the way that these different sections and subsections of Section 230 work together.
Because the portion that I just alluded to, which says that if you are complicit in the creation of illegal content you have no protection, is actually contained in a definitions section. It’s not upfront in the statute’s architecture. It’s not upfront with the 26 words. And so, yeah, I think we would put that in the headline.
Nico Perrino: We recently got a couple of decisions from the Supreme Court on social media content moderation. There were two cases, the NetChoice cases coming out of Florida and Texas, in which those states passed laws dictating what platforms must do or can't do. For example, Texas passed a law banning viewpoint discrimination on social media platforms, and Florida banned certain moderation actions against political officeholders, public officials, and news publications.
The Supreme Court ended up somewhat punting on these cases, saying that the facial challenge wasn’t properly analyzed by the lower courts, but at the same time it did opine on the First Amendment principles at stake. And I’m wondering if you would be willing to opine on those two cases and what implications they have, if any, for Section 230. Is there a Section 230 argument here? Could these laws have been preempted by Section 230, for example?
Chris Cox: Well, first of all, let me take issue with critics who were saying the Supreme Court punted, because I don’t think they did. Standing is a jurisdictional question, and it’s the court’s responsibility in every case to consider standing first, before they get to anything else. It doesn’t mean they can’t provide guidance when they end up sending a case back, as they did here, but it does mean that they can’t move on and finally adjudicate the case if there’s a question about standing.
So, I don’t fault the court for that in any wise, nor do I fault the court for failing to provide guidance, because they did provide it. In fact, some of the dissenters were upset and claimed that they said too much – that if they were going to decide the case on standing, they should have sent it away, and that was that.
Nico Perrino: Yeah, I think it was Alito who said that all discussion of First Amendment principles was mere dicta.
Chris Cox: Right. And so, if one were to criticize the court, it wouldn’t be on the grounds that it had failed to provide guidance, because surely it did. Neither of these cases, Florida or Texas, was a Section 230 case. They were both decided under the First Amendment, as they should have been, because this is an example of a situation where, with or without Section 230, the First Amendment is going to prevail. And what the First Amendment tells us, among other things, is that you cannot be forced to say things you don’t want to say.
So, that aspect of being an internet platform – it’s not the government; these are private platforms operated by private individuals, private investors, and, I should add, a private community of users – all of that is beyond the reach of the federal government to say you can or can’t say this or that. And as for forcing speech from the platforms themselves, or forcing them to carry speech that they would rather not because it violates their terms of use, their rules of the road, the First Amendment is very clear on that. And when these cases make their way back to the Supreme Court, given the guidance that’s already been provided, there’s no question that’s how it’s going to turn out.
Nico Perrino: How do you think all the principles that we’ve been discussing in this conversation will apply in the age of artificial intelligence? Already, most social media platforms use algorithmic boosting, amplifying, artificial intelligence essentially, to recommend content. And that’s only going to be further supercharged as artificial intelligence becomes more powerful. Does Section 230 protect the implementation of these artificial intelligence tools to kind of moderate the content that a user sees on the platform?
Chris Cox: If artificial intelligence is creating content, then there is no Section 230 protection, for the simple reason that Section 230 doesn’t protect anybody who creates content. So, the question will always be, when you’re performing a first analysis under Section 230, whether or not this is content creation. One application of artificial intelligence is these ChatGPT-type applications. They are manifestly creating content, and therefore they would not have any Section 230 protection. It’s possible to imagine other uses of artificial intelligence that don’t necessarily involve literal content creation.
And so, it’s a little bit more difficult to answer it across the board. But this is a great illustration of what we were talking about before, the complexity of ever-improving or evolving technology. The bad actors will have more tools as will the good actors. That cat-and-mouse game is going to continue. Will we be able to stay ahead of the hackers and so on? And all the opportunities for mischief will multiply as will all the benefits that we can receive if we are wise about the use of these technologies. So, nothing is unfortunately simple in this high-tech world in which we live.
Nico Perrino: Yeah. Actually, just thinking about it in real time here, you can think of the outputs from a ChatGPT, which is a text-based artificial intelligence, or a Midjourney, which is an image-based artificial intelligence, as content that’s being produced by the platforms that coded these tools. But you could also think of it as content that necessarily requires the end user’s participation to create, because the user writes the prompts that then produce the output. And the outputs themselves are derivative of the different materials that the model draws on.
So, content that’s found across the internet, frankly. It consolidates that wide learning from across the internet to produce the output. So, I can see the argument – and I’m sure that the platforms will make this argument to avoid any sort of liability as a content creator – that they’re just scouring the internet, responding to user prompts, and the artificial intelligence is creating an almost transformative use out of preexisting user-generated content. These are fascinating questions that are going to have to be sorted out here.
Chris Cox: Well, in at least a rudimentary way this question has already gone to the US Supreme Court in the case of Gonzalez against Google. And in that case, at least when it was search that we were talking about, the court decided, very much for the reasons that you just laid out, that if the user is responsible, based on the user’s inputs, for what’s coming back to the user, and the software is doing its level best to respond to that, then that’s not the sort of thing that would make the platform liable. Whereas if the platform itself were speaking, that would be a different result.
Nico Perrino: What do you anticipate happening with Section 230 moving forward? There’s been a lot of talk about reforming it, sunsetting it, deleting it from the US Code – for years, decades even – and nothing has materialized from that. And so, I’m wondering – and you don’t need to speculate on this – whether a lot of it is just posturing.
That we know nothing is ever going to change, because actually eliminating Section 230 would so fundamentally transform the internet that companies like Amazon and Facebook and Yelp – companies that we depend upon for our day-to-day lives at this point – would become fundamentally broken business models. Maybe I’m catastrophizing there, but I’m just wondering what you see happening moving forward and whether you think there’s ever a realistic chance that Section 230 will either be reformed or sunset.
Chris Cox: Yeah, in terms of whether it’s posturing that’s going on in Congress – Congress is a big place. It has 535 men and women. There’s always somebody in there who’s going to be posturing, but I credit all the people who are looking at this issue with doing their level best to find solutions. As we said, these are difficult problems. And so, the fact that I might disagree with some of their proposed solutions is really no different from where I found myself back in the 1990s where, as I said, the Senate had a very different way of going at this.
They were going to criminalize speech on the internet that was not fit for children and have the FBI surveil it and so on. That’s always been the way that business has been done in this country. We have different points of view, and we sort them out, and I think it’s good faith on all sides, with the exception of the few people – we all have our favorites – who seem to march to a different drummer and like to posture. That said, what’s going to ultimately help in this discussion is people realizing what it means to sunset Section 230 or to repeal it – to take it away, as legislation that is at least being discussed right now would do.
Reverting to the pre-Section 230 status quo wouldn’t just hurt platforms. And understandably, people care much more about their constituents than they do about the businesses. But the people who are now used to going to YouTube, let’s say, to find a how-to video to fix their washing machine are going to find that it’s not there anymore, because there’s liability associated with posting a washing machine video. If it’s wrong and your basement floods, the website could be sued, if the rule is that websites are responsible for the user-created content that they carry – for free, by the way. They carry this content for free.
Is it worth the risk for them to do that? And the answer in most cases is going to be no. It’s possible that the giant platforms will continue to stay in that business, but no question their rules of the road are going to clamp down dramatically on what they’re willing to carry. And so, I would say by an order of magnitude, you’d see less web content. And it wouldn’t all be speech we don’t like. There would be a lot of babies going out with the bathwater.
Nico Perrino: People compare the advent of the internet to the invention of the printing press. Just this transformational technology that allowed people to become more connected with each other and allowed information to be more readily accessed by a wider swath of the human population. Some have said that the internet is an even more significant invention than the printing press.
If you take Jeff Kosseff’s proposition that we discussed at the top of this conversation at face value – that Section 230 is what created the internet – and we look at your experience and your history as a long-time congressional representative and a former chairman of the SEC, where does Section 230 fit into your legacy? How do you view it?
Chris Cox: Well, so in terms of creating the internet, I love it as a way to sell books and as a way to draw people’s attention to the importance of the legislation, and I’m very flattered of course, but it’s not terribly different from people claiming that they created a million new jobs because they changed a provision in the tax code. It is true that changing provisions in the tax code can facilitate job creation, but it’s also true that at best that is a necessary but not sufficient condition. You have to have people willing to create jobs. They have to go do the work. And nobody in Congress is doing that heavy lift.
The best that Congress can do is provide helpful incentives. There’s a lot of ways to do that. And creating a predictable legal environment was unquestionably a useful incentive. And so, to that extent Section 230 deserves credit, but it didn’t create the modern internet. The flipside might be a little closer to true, which is without something like that, we probably wouldn’t have had it.
Nico Perrino: Yeah. I think you’re being somewhat humble there, because I have a hard time imagining you would get the Amazons or the Facebooks or the Instagrams or the Xs of the world without it, along with all the downstream consequences that have come from them.
Chris Cox: Let me interpolate that. Not only would you perhaps not have those very successful businesses that we all know about, but there are literally millions of websites in the United States that we all have access to on our mobile devices, on our PCs, and otherwise. And a lot of us rely on our favorite websites for this, that, and the other thing. And a lot of that content is user-created content. All of that is governed by Section 230.
So, platforms that don’t charge users for access to their sites and that provide this kind of user-created content – a lot of it from amazingly good Samaritans who spend their time producing very professional videos and documents and all sorts of wonderful things that we all take advantage of every day – all that stuff, I think, would be the first to go, because those are the people who can’t possibly self-insure. And by the way, there is no insurance market for the kinds of liability that would be created by repealing Section 230.
Nico Perrino: Yeah, to say nothing of the hit that many people’s retirement plans would take if some of these companies that depend on Section 230 could no longer do so. But I will ask you one more time, and I’ll allow you to sidestep if you want to: where do you see Section 230 in your legacy?
Chris Cox: Well, I would answer this in a very heartfelt way, but in a way that perhaps will surprise you. I’m most happy about my involvement in Section 230 not because of the real-world impact that it had, but because of the way it came about – the way the law was written in a totally bipartisan fashion. In my 18 years in Congress and all my years in Washington, I have never been part of anything like it, let alone the leader of the effort. Before it became law, Section 230, as you said, was originally introduced as a freestanding bill called the Internet Freedom and Family Empowerment Act.
And I introduced it with my friend on the other side of the aisle, Ron Wyden from Oregon. By the time it came to a floor vote, after much discussion with our colleagues, the fact that we had talked it out – that it was a novel question that people didn’t have partisan knee-jerk responses to – meant that the debate time was split not between Democrats and Republicans, not between people for the bill and against it, but between people on both sides of the aisle who all agreed with the bill and who all disagreed with the Senate approach. And the Senate at that point had passed its version by, if memory serves, over 95 out of 100 votes.
So, this was a really remarkable thing. And then we went over to the Senate side, and I mentioned the name of one senator, Pat Leahy. He was key in that effort. But senior members on both sides of the aisle eventually came around to understanding that this was a good way to do things, and it got included in the Telecommunications Act, and that was passed overwhelmingly. And it was signed at a wonderful signing ceremony at the Library of Congress – great festivities, a big crowd, President Clinton and Al Gore were there – and the president signed it.
Anyway, at a time like this, after an attempted presidential assassination, when people are talking about the need for national unity, that was probably my best moment in congressional service – seeing that sort of unity, in a positive way, produce legislation. So, I hope that as people reconsider Section 230, they could at least aspire to doing it that way, rather than making it a political hammer to go after their opponents with.
Nico Perrino: Well, Representative Cox I think that’s the right note to end on. Thank you for this conversation.
Chris Cox: Very happy to join you.
Nico Perrino: That is Christopher Cox. He was a long-time congressional representative, a former chairman of the SEC, and the co-author of Section 230. This podcast is recorded and edited by a rotating roster of my FIRE colleagues, including Aaron Resse and Chris Molpie. You can learn more about So to Speak by subscribing to our YouTube channel or our Substack page. You can also follow us on X by searching for the handle freespeechtalk, and you can also find us on Facebook. Feedback can be sent to sotospeak@thefire.org.
Again, that email address is sotospeak@thefire.org. If you enjoyed this episode, please consider leaving us a review on Apple Podcasts, Google Play, Spotify, or wherever else you get your podcasts. Reviews help us attract new listeners to the show. And until next time, I thank you all again for listening.